9.0 - 14.0 years
22 - 37 Lacs
Pune
Hybrid
We're Hiring: Senior GCP Data Engineer (L4) for a client || Immediate joiners only
Location: Pune | Walk-in Drive: 5th July 2025
Are you a seasoned Data Engineer with 9-12 years of experience and a passion for building scalable data solutions on Google Cloud Platform? Join us for an exciting walk-in opportunity!
Key Skills Required: GCP Data Engineering, BigQuery, SQL, Python (Cloud Composer, Cloud Functions, data ingestion), Dataproc + PySpark, Dataflow + Pub/Sub, Apache Beam, Spark, Hadoop
What You'll Do: Architect and implement end-to-end data pipelines on GCP. Work with BigQuery, Bigtable, Cloud Storage, Spanner, and more. Automate data ingestion, transformation, and augmentation. Ensure data...
Posted 2 months ago
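As a minimal illustration of the pipeline work the posting above describes (names and schema are hypothetical, not taken from the ad): a record-level transformation step of the kind a Dataflow/Beam DoFn or a Composer-scheduled task might apply before loading rows into BigQuery.

```python
# Hypothetical sketch: normalise a raw event into a target BigQuery row
# shape, plus a small augmentation (a UTC processing timestamp).
from datetime import datetime, timezone

def transform_record(record: dict) -> dict:
    """Clean and augment one raw event before loading."""
    return {
        # Normalisation: trim stray whitespace from the key field.
        "user_id": str(record["user_id"]).strip(),
        # Coerce to float and round to 2 decimal places (currency).
        "amount": round(float(record.get("amount", 0.0)), 2),
        # Augmentation: stamp each row with its processing time (UTC).
        "processed_at": datetime.now(timezone.utc).isoformat(),
    }

raw = {"user_id": " 42 ", "amount": "19.999"}
row = transform_record(raw)
```

In a real pipeline this function would sit inside the ingestion stage; keeping it pure (dict in, dict out) makes it easy to unit-test independently of any GCP service.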
2.0 - 6.0 years
4 - 8 Lacs
Mumbai, Bengaluru, Delhi / NCR
Work from Office
Must-have skills: GCP, support, Python
Forbes Advisor is looking for:
Role Summary: We are seeking a proactive and detail-oriented Data Support Engineer to monitor production processes, manage incident tickets, and ensure seamless operations in our data platforms. The ideal candidate will have experience in Google Cloud Platform (GCP), Airflow, Python, and SQL, with a strong focus on enabling developer productivity and maintaining system reliability.
Key Responsibilities:
Production Monitoring: Monitor and ensure the smooth execution of production data pipelines and workflows. Identify and promptly address anomalies or failures in the production environment. Perform first-level invest...
Posted 2 months ago
4.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Major skillset: GCP, PySpark, SQL, Python, Cloud Architecture, ETL, Automation
4+ years of experience in Data Engineering and Management, with a strong focus on Spark for building production-ready data pipelines. Experienced in analyzing large data sets from multiple data sources and building automated testing and validations. Knowledge of the Hadoop ecosystem and components like HDFS, Spark, Hive, Sqoop. Strong Python experience. Hands-on SQL and HQL to write optimized queries. Strong hands-on experience with GCP BigQuery, Dataproc, Airflow DAGs, Dataflow, GCS, Pub/Sub, Secret Manager, Cloud Functions, Beam. Ability to work in a fast-paced collaborative environment, work with various sta...
Posted 2 months ago
4.0 - 9.0 years
11 - 19 Lacs
Chennai
Work from Office
Role & responsibilities: Python, Dataproc, Airflow, PySpark, Cloud Storage, DBT, Dataform, NAS, Pub/Sub, Terraform, API, BigQuery, Data Fusion, GCP, Tekton
Preferred candidate profile: Data Engineer in Python - GCP
Location: Chennai only
4+ years of experience
Posted 2 months ago
6.0 - 9.0 years
19 - 25 Lacs
Hyderabad
Hybrid
Role & responsibilities: Driving project delivery proactively, balancing planning, scope, schedule, budget, communications, and risks. Managing and planning resources, responsibilities, and schedules. Establishing effective project controls, procedures, and quality assurance processes. Managing relationships with internal and external stakeholders. Reporting progress, issues, dependencies, and risks to project or programme leadership and committees (as appropriate), and making recommendations to influence decision making in order to maintain progress towards delivery and benefits realisation. Providing management updates to maintain a focus on how the project aligns to wider programme objectives,...
Posted 2 months ago
8.0 - 10.0 years
25 - 27 Lacs
Chennai
Work from Office
Role & responsibilities: Python, Data Warehousing, BigQuery, SQL, Airflow, GCP
Posted 2 months ago
8.0 - 13.0 years
8 - 13 Lacs
Hyderabad, Telangana, India
On-site
We are seeking a Technical Architect specializing in Healthcare Data Analytics with expertise in Google Cloud Platform (GCP). The role involves designing and implementing data solutions tailored to healthcare analytics requirements. The ideal candidate will have experience in GCP tools like BigQuery, Dataflow, Dataprep, and Healthcare APIs, and should stay up to date with GCP updates. Knowledge of healthcare data standards, compliance requirements (e.g., HIPAA), and healthcare interoperability is essential. The role requires experience in microservices, containerization (Docker, Kubernetes), and programming languages like Python and Spark. The candidate will lead the implementation of data a...
Posted 2 months ago
2.0 - 4.0 years
4 - 9 Lacs
Chennai
Work from Office
Role & responsibilities: React, JavaScript, Application Support, BigQuery, Application Testing, Application Design, Coding, Angular, Spring, Application Development, Developer, Java, Web Services
Posted 2 months ago
4.0 - 9.0 years
14 - 20 Lacs
Hyderabad
Work from Office
Interested candidates, share your updated CV to dikshith.nalapatla@motivitylabs.com
Job Title: GCP Data Engineer
Overview: We are looking for a skilled GCP Data Engineer with 4 to 5 years of real hands-on experience in data ingestion, data engineering, data quality, data governance, and cloud data warehouse implementations using GCP data services. The ideal candidate will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment.
Key Responsibilities: Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs. Develop and maintain data ingestio...
Posted 2 months ago
3.0 - 5.0 years
5 - 8 Lacs
Hyderabad
Work from Office
Primary skills: GCP, Python (coding is a must), SQL coding skills, BigQuery, Dataflow, Airflow, Kafka, and Airflow DAGs.
Bachelor's degree or equivalent experience in Computer Science or a related field.
Required: immediate or 15 days.
Job Description: 3+ years of experience as a software engineer or equivalent, designing large data-heavy distributed systems and/or high-traffic web apps. Experience in at least one programming language (Python - 2 years of strong coding is a must) or Java. Hands-on experience designing and managing large data models, writing performant SQL queries, and working with large datasets and related technologies. Experience designing and interacting with APIs (REST/GraphQL). Experience working with...
Posted 3 months ago
2.0 - 6.0 years
4 - 8 Lacs
Faridabad
Work from Office
Job Summary We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities Design and implement scalable data models using Snowflake and Erwin Data Modeler. Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebui...
Posted 3 months ago
4.0 - 9.0 years
0 - 2 Lacs
Chennai
Hybrid
Job Description: We are seeking a skilled and proactive GCP Data Engineer with strong experience in Python and SQL to build and manage scalable data pipelines on Google Cloud Platform (GCP). The ideal candidate will work closely with data analysts, architects, and business teams to enable data-driven decision-making.
Key Responsibilities: Design and develop robust data pipelines and ETL/ELT processes using GCP services. Write efficient Python scripts for data processing, transformation, and automation. Develop complex SQL queries for data extraction, aggregation, and analysis. Work with tools like BigQuery, Cloud Storage, Cloud Functions, and Pub/Sub. Ensure high data quality, integrity, and g...
Posted 3 months ago
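To make the "complex SQL queries for data extraction and aggregation" responsibility above concrete, here is a hedged sketch (dataset, table, and column names are hypothetical, not from the posting) of building an aggregation query string of the kind an engineer would submit through the BigQuery client (`Client.query(sql)` in the real google-cloud-bigquery library).

```python
# Hypothetical example: construct a BigQuery daily-revenue aggregation.
# Only the query-string construction is shown; running it would require
# the google-cloud-bigquery client and a real project/dataset.
def daily_revenue_sql(dataset: str, table: str) -> str:
    """Build a BigQuery query aggregating revenue per event date."""
    return (
        f"SELECT DATE(event_ts) AS day, SUM(amount) AS revenue "
        f"FROM `{dataset}.{table}` "
        f"GROUP BY day ORDER BY day"
    )

sql = daily_revenue_sql("analytics", "transactions")
```

Keeping query construction in a small pure function like this lets the SQL be asserted in unit tests before it ever touches the warehouse.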
3.0 - 6.0 years
5 - 10 Lacs
Mumbai
Work from Office
Job Summary: We are seeking a highly skilled MLOps Engineer to design, deploy, and manage machine learning pipelines in Google Cloud Platform (GCP). In this role, you will be responsible for automating ML workflows, optimizing model deployment, ensuring model reliability, and implementing CI/CD pipelines for ML systems. You will work with Vertex AI, Kubernetes (GKE), BigQuery, and Terraform to build scalable and cost-efficient ML infrastructure. The ideal candidate must have a good understanding of ML algorithms and experience in model monitoring, performance optimization, Looker dashboards, and infrastructure as code (IaC), ensuring ML models are production-ready, reliable, and continuously imp...
Posted 3 months ago
6.0 - 10.0 years
15 - 30 Lacs
Pune, Maharashtra, India
On-site
We are looking for an experienced GCP and BigQuery professional to join our team in India. The ideal candidate will have a solid background in data engineering and analytics, with expertise in designing scalable data solutions on the Google Cloud Platform. Responsibilities Design, develop and maintain scalable data pipelines using Google Cloud Platform (GCP) and BigQuery. Analyze and interpret complex datasets to provide actionable insights to stakeholders. Collaborate with data engineers and analysts to optimize data storage and retrieval processes. Implement data quality checks and ensure the accuracy of data in BigQuery. Create and manage dashboards and reports to visualize data findings ...
Posted 3 months ago
5.0 - 10.0 years
8 - 10 Lacs
Chennai, Tamil Nadu, India
On-site
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming languages: Java; scripting languages like Python, Shell Script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing like Dataflow)
Mandatory key skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, Data Processing, Java
Posted 3 months ago
4.0 - 9.0 years
10 - 20 Lacs
Gurugram
Work from Office
Roles and Responsibilities: Design, develop, test, deploy, and maintain ETL processes using SSIS to extract data from various sources. Develop complex SQL queries to retrieve data from relational databases such as SQL Server. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Troubleshoot issues related to ETL process failures or performance problems. Ensure compliance with security standards by implementing the Denodo Platform for data masking.
Desired Candidate Profile: 4-9 years of experience in ETL development with expertise in Agile methodology. Strong understanding of .NET Core, C#, Microsoft Azure, BigQuery, SSRS (SQL Repor...
Posted 3 months ago
7.0 - 10.0 years
0 Lacs
Pune, Chennai, Bengaluru
Work from Office
As a GCP data engineer, the colleague should be able to design scalable data architectures on Google Cloud Platform, using services like BigQuery and Dataflow. They write and maintain code (Python, Java), ensuring efficient data models and seamless ETL processes. Quality checks and governance are implemented to maintain accurate and reliable data. Security is a priority, enforcing measures for storage, transmission, and processing, while ensuring compliance with data protection standards. Collaboration with cross-functional teams is key to understanding diverse data requirements. Comprehensive documentation is maintained for data processes, pipelines, and architectures. Responsibilities ex...
Posted 3 months ago
8.0 - 13.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Qualification & Experience: Minimum of 8 years of experience as a Data Scientist/Engineer with demonstrated expertise in data engineering and cloud computing technologies.
Technical Responsibilities: Excellent proficiency in Python, with a strong focus on developing advanced skills. Extensive exposure to NLP and image processing concepts. Proficient in version control systems like Git. In-depth understanding of Azure deployments. Expertise in OCR, ML model training, and transfer learning. Experience working with unstructured data formats such as PDFs, DOCX, and images. Strong familiarity with data science best practices and the ML lifecycle. Strong experience with data pipeline development,...
Posted 3 months ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Responsibilities: A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Manag...
Posted 3 months ago
3.0 - 6.0 years
8 - 14 Lacs
Bengaluru
Work from Office
- Minimum of 3 years of hands-on experience.
- Python/ML, Hadoop, Spark: minimum of 2 years of experience.
- At least 3 years of prior experience as a Data Analyst.
- Detail-oriented with structured thinking and an analytical mindset.
- Proven analytic skills, including data analysis, data validation, and technical writing.
- Strong proficiency in SQL and Excel.
- Experience with BigQuery is mandatory.
- Knowledge of Python and machine learning algorithms is a plus.
- Excellent communication skills with the ability to be precise and clear.
- Learning ability: ability to quickly learn and adapt to new analytic tools and technologies.
Key Responsibilities: Data Analysis: Perform comprehe...
Posted 3 months ago
8.0 - 10.0 years
20 - 30 Lacs
Chennai
Hybrid
Role & responsibilities: GCP services - BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Redis Memory, Airflow, Cloud Storage. 2+ years in data transfer utilities. 2+ years in Git or any other version control tool. 2+ years in Confluent Kafka. 1+ years of experience in API development. 2+ years in an Agile framework. 4+ years of strong experience in Python and PySpark development. 4+ years of shell scripting to develop ad hoc jobs for data importing/exporting.
Preferred candidate profile: Python, Dataflow, Dataproc, GCP Cloud Run, Dataform, Agile software development, BigQuery, Terraform, Data Fusion, Cloud SQL, GCP, Kafka, Java. We would like to inform you that only im...
Posted 3 months ago
7.0 - 12.0 years
20 - 35 Lacs
Pune
Work from Office
7+ years of experience with hands-on experience in Java and GCP; shell script and Python knowledge is a plus. In-depth knowledge of Java and Spring Boot. Experience in GCP Dataflow, Bigtable, BigQuery, etc.
Posted 3 months ago
5.0 - 10.0 years
15 - 22 Lacs
Chennai
Work from Office
Data Warehousing, Python, BigQuery, SQL, Airflow, GCP
Skills preferred: analytical, problem solving
Experience required: 5+ years of experience in data warehousing; at least 2 years of experience in GCP BigQuery; able to build complex SQL in BigQuery
Required candidate profile: Must: Python, data warehousing, and BigQuery. Experience: 6 yrs. Candidate needs to take a HackerRank test (first level). Hybrid (11 days work from office). Chennai. Immediate to 15 days. CTC up to 22 LPA.
Posted 3 months ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
GCP Data Engineer: BigQuery, SQL, Python, Talend ETL programmer; GCP or any cloud technology.
Job Description: Experienced GCP data engineer (BigQuery, SQL, Python, Talend ETL; GCP or any cloud technology). Good experience in building pipelines of GCP components to load data into BigQuery and cloud storage buckets. Excellent data analysis skills. Good written and oral communication skills. Self-motivated, able to work independently.
Posted 3 months ago