9 - 12 years
25.0 - 30.0 Lacs P.A.
Delhi NCR, Bengaluru, Hyderabad
Posted: 1 month ago
Hybrid
Full Time
This role is ideal for someone with strong technical skills in cloud computing, data engineering, and analytics, who is passionate about working with cutting-edge GCP technologies to build robust and scalable data solutions.

Key Responsibilities:

Data Architecture and Design:
- Design and implement scalable, reliable, high-performance data pipelines on Google Cloud.
- Define and implement data architecture strategies to store, process, and analyze large datasets efficiently.
- Create optimized schemas and ensure data structures meet business requirements.

Data Pipeline Development:
- Build and maintain ETL (Extract, Transform, Load) pipelines using tools such as Google Cloud Dataflow, Apache Beam, and Cloud Dataproc (see the Beam sketch below).
- Work with Google Cloud Storage (GCS), BigQuery, and Pub/Sub for data ingestion, storage, and analysis (see the Pub/Sub sketch below).
- Automate and orchestrate data workflows using tools such as Apache Airflow or Google Cloud Composer (see the Airflow sketch below).

Data Processing and Transformation:
- Develop and manage batch and real-time data processing solutions.
- Transform raw data into useful formats for analysis or machine learning models using BigQuery, Dataflow, or Dataproc.

Collaboration:
- Work with data scientists, analysts, and other stakeholders to understand business needs and deliver data solutions.
- Provide data support for machine learning and AI models, ensuring the data is clean, structured, and properly ingested.

Optimization and Monitoring:
- Monitor and optimize the performance of data pipelines, ensuring minimal downtime and efficient use of resources.
- Troubleshoot data issues and resolve bottlenecks in pipeline or storage systems.

Required Skills and Qualifications:
- Experience with GCP: Strong experience with GCP services such as BigQuery, Google Cloud Storage (GCS), Dataflow, Dataproc, Pub/Sub, Cloud Composer, and Cloud Functions.
- Programming Languages: Proficiency in Python, Java, or Scala for developing data pipelines and processing.
- ETL Tools: Experience with ETL frameworks and tools such as Apache Beam, Airflow, or Cloud Data Fusion.
- Data Modeling and Warehousing: Understanding of data modeling, relational databases, and data warehousing concepts.
- SQL and NoSQL Databases: Strong proficiency in SQL, with experience in data analysis using BigQuery or other relational databases; familiarity with NoSQL databases is a plus.
- Cloud Infrastructure: Knowledge of cloud architecture and infrastructure best practices on GCP.
- Data Analytics and BI Tools: Experience with data visualization tools such as Google Data Studio, Tableau, or Looker is a plus.
- DevOps Practices: Experience with CI/CD pipelines, version control systems (e.g., Git), and automated testing.

Preferred Skills:
- Experience with containerized environments, including Docker and Kubernetes.
- Familiarity with machine learning tools such as AI Platform on GCP.
- Ability to manage large datasets efficiently and design solutions that scale with growing data volumes.

Education and Certifications:
- A bachelor's or master's degree in Computer Science, Information Technology, Data Science, or a related field.
- Google Cloud Certified Professional Data Engineer or other relevant certifications are preferred.
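For illustration, here is a minimal sketch of the kind of batch ETL pipeline described above, using the Apache Beam Python SDK to read CSV files from GCS and load them into BigQuery. The project, bucket, table, and schema names are placeholders, not details from this posting.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    """Turn one CSV line into a BigQuery-ready row (hypothetical schema)."""
    user_id, event, ts = line.split(",")
    return {"user_id": user_id, "event": event, "event_ts": ts}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",       # "DirectRunner" runs the same code locally
        project="my-project",          # placeholder project ID
        region="asia-south1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/raw/events-*.csv")
            | "ParseCSV" >> beam.Map(parse_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="user_id:STRING,event:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Because Beam separates pipeline logic from the runner, the same code can be tested locally with DirectRunner and deployed to Dataflow unchanged.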
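Orchestration with Cloud Composer amounts to authoring Airflow DAGs. A minimal sketch of a nightly BigQuery transformation task, assuming Airflow 2.x with the apache-airflow-providers-google package installed; the dataset, table, and schedule are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="nightly_events_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",   # every night at 02:00
    catchup=False,
) as dag:
    # Aggregate raw events into a daily summary table (hypothetical SQL).
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_daily_events",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_event_counts AS
                    SELECT DATE(event_ts) AS day, event, COUNT(*) AS n
                    FROM analytics.events
                    GROUP BY day, event
                """,
                "useLegacySql": False,
            }
        },
    )
```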
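On the ingestion side, producers publish events to Pub/Sub for streaming pipelines to consume. A minimal sketch using the google-cloud-pubsub client; the project, topic, and event fields are placeholders.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "raw-events")  # placeholder names

# Publish one JSON-encoded event; a streaming Dataflow job would
# read these messages from a subscription on this topic.
event = {"user_id": "u123", "event": "page_view", "event_ts": "2024-01-01T00:00:00Z"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print(f"Published message ID: {future.result()}")
```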