3.0 - 5.0 years
10 - 13 Lacs
chennai
Work from Office
3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
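As a minimal, hypothetical sketch of the SQL skills this listing names (a CTE plus a window function), run from Python via the BigQuery client; the project, dataset, and table names are placeholders:

    from google.cloud import bigquery  # pip install google-cloud-bigquery

    client = bigquery.Client()  # uses application-default credentials

    # A CTE plus a window function: rank each customer's orders by amount.
    sql = """
    WITH recent_orders AS (
      SELECT customer_id, order_id, amount
      FROM `my-project.sales.orders`  -- placeholder table
      WHERE order_date >= '2024-01-01'
    )
    SELECT customer_id, order_id, amount,
           RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS amount_rank
    FROM recent_orders
    """

    for row in client.query(sql).result():
        print(row.customer_id, row.order_id, row.amount_rank)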
Posted 3 weeks ago
4.0 - 8.0 years
12 - 22 Lacs
hyderabad
Hybrid
Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems. Key Responsibilities: Design, develop, test, and maintain scalable ETL data pipelines using Python. Work extensively on Google Cloud Platform (GCP) services such as: Dataflow for real-time and batch dat...
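As a minimal illustrative sketch (not this employer's codebase) of the kind of batch ETL pipeline the posting describes, written with the Apache Beam Python SDK that Dataflow executes; the bucket paths are placeholders:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Read raw CSV lines, keep non-empty records, and write them back out.
    # Passing --runner=DataflowRunner (plus project/region/temp_location)
    # in the pipeline options would run the same code on Dataflow.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
         | "Read" >> beam.io.ReadFromText("gs://my-bucket/raw/events.csv")   # placeholder
         | "Parse" >> beam.Map(lambda line: line.split(","))
         | "DropEmpty" >> beam.Filter(lambda row: row and row[0])
         | "Format" >> beam.Map(lambda row: ",".join(row))
         | "Write" >> beam.io.WriteToText("gs://my-bucket/clean/events"))    # placeholder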
Posted 3 weeks ago
3.0 - 5.0 years
18 - 20 Lacs
chennai
Hybrid
Overview: TekWissen is a global workforce management provider with operations throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet. Job Title: Software Engineer III - Core Engineer III Location: Chennai Work Type: Hybrid Position Description: Leverage modern frameworks, open source tools, and cloud technologies to develop software effectively through paired programming and other methodologies like test-driven development, continuous integration and deployment. Dev...
Posted 3 weeks ago
4.0 - 8.0 years
10 - 20 Lacs
chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Hope you are doing well! This is Narmadha from the Getronics Talent Acquisition team. We have multiple opportunities for GCP Data Engineers for our automotive client at the Chennai Sholinganallur location. Please find below the company profile and job description. If interested, please share your updated resume, recent professional photograph, and Aadhaar proof at the earliest to narmadha.baskar@getronics.com Company: Getronics (Permanent role) Client: Automobile Industry Experience Required: 4+ Years in IT and minimum 3+ years in GCP Data Engineering Location: Chennai Work Mode: Hybrid Positio...
Posted 3 weeks ago
3.0 - 7.0 years
14 - 18 Lacs
pune
Hybrid
Job Title: GCP Data Engineer Location: Pune, India Experience: 3 to 7 Years Job Type: Full-Time Job Summary: We are looking for a highly skilled GCP Data Engineer with 3 to 7 years of experience to join our data engineering team in Pune. The ideal candidate should have strong experience working with Google Cloud Platform (GCP), including Dataproc and Cloud Composer (Apache Airflow), and must be proficient in Python, SQL, and Apache Spark. The role involves designing, building, and optimizing data pipelines and workflows to support enterprise-grade analytics and data science initiatives. Key Responsibilities: Design and implement scalable and efficient data pipelines on GCP, leveraging D...
Posted 3 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
mumbai
Work from Office
Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Google BigQuery Good to have skills: Microsoft SQL Server, Google Cloud Data Services Minimum 3 year(s) of experience is required Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, e...
Posted 3 weeks ago
2.0 - 5.0 years
13 - 17 Lacs
hyderabad
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and interface directly with the customer. Required education: Bachelor's Degree Preferred education: Master's Degree Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love colla...
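For context on one of the services this listing names (Pub/Sub), a minimal hypothetical publisher in Python; the project and topic names are placeholders:

    import json
    from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-project", "order-events")  # placeholders

    # Publish one JSON message and wait for the server-assigned message ID.
    payload = json.dumps({"order_id": 123, "status": "created"}).encode("utf-8")
    future = publisher.publish(topic_path, payload)
    print(future.result())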
Posted 3 weeks ago
5.0 - 10.0 years
14 - 24 Lacs
hyderabad
Work from Office
Experience - 5+ Years Job Summary We are looking for an experienced Data Engineer with a strong background in SQL and Python, and hands-on experience with at least one BI tool. You will be responsible for designing and building scalable data pipelines, transforming complex datasets, and supporting analytics and reporting needs across the organization. Key Responsibilities: Design, develop, and maintain robust ETL/ELT pipelines using SQL and Python. Optimize and troubleshoot complex SQL queries to ensure high performance and scalability. Build and maintain data models and data marts to support business analytics and decision-making. Collaborate with data analysts, data scientists, and busine...
Posted 3 weeks ago
5.0 - 7.0 years
13 - 17 Lacs
bengaluru
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and interface directly with the customer. Required education: Bachelor's Degree Preferred education: Master's Degree Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love colla...
Posted 3 weeks ago
4.0 - 9.0 years
20 - 35 Lacs
pune, gurugram, bengaluru
Hybrid
Salary: 20 to 35 LPA Exp: 5 to 8 years Location: Gurgaon/Pune/Bangalore Notice: Immediate to 30 days! Roles and Responsibilities Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases. Troubleshoot issues related to data processing workflows and provide timely resolutions. Desired Candidate Profile 5-9 years of experience in Data Engineering with expertise in GCP & BigQue...
Posted 3 weeks ago
6.0 - 10.0 years
18 - 20 Lacs
chennai
Hybrid
Overview: TekWissen is a global workforce management provider with operations throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet. Job Title: Software Engineer III - Core Engineer III Location: Chennai Work Type: Hybrid Position Description: Employees in this job function are responsible for designing, developing, testing and maintaining software applications and products to meet customer needs. They are involved in the entire software development lifecycle inclu...
Posted 3 weeks ago
3.0 - 8.0 years
6 - 10 Lacs
mumbai, pune, gurugram
Work from Office
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. About The Role: We're seeking a talented Data Engineer with hands-on experience in the Google Cloud data ecosystem and a...
Posted 3 weeks ago
3.0 - 5.0 years
5 - 15 Lacs
pune, bengaluru
Work from Office
About the Role: We are looking for a skilled GCP Data Engineer to design, build, and optimize data pipelines and architectures on Google Cloud Platform (GCP). The ideal candidate will have strong experience in data warehousing, ETL processes, and cloud-native data tools. Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes using GCP services. Work with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Composer to implement data solutions. Build and optimize data models to support analytics and reporting needs. Collaborate with data analysts, data scientists, and business teams to understand data requirements. Ensure data quality, governance, and sec...
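As a minimal hypothetical sketch of the Composer (Apache Airflow) orchestration this role involves; the DAG ID and task commands are placeholders:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # A daily two-step DAG skeleton of the kind Cloud Composer schedules.
    with DAG(
        dag_id="daily_sales_load",        # placeholder pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo extract")
        load = BashOperator(task_id="load", bash_command="echo load")
        extract >> load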
Posted 3 weeks ago
2.0 - 3.0 years
5 - 5 Lacs
kochi, chennai, thiruvananthapuram
Work from Office
Role Proficiency: This role requires proficiency in data pipeline development, including coding, testing, and implementing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be adept at using ETL tools such as Informatica, Glue, Databricks, and Dataproc, along with coding skills in Python, PySpark, and SQL. Works independently according to work allocation. Outcomes: Operate with minimal guidance to develop error-free code, test applications, and document the development process. Understand application features and component designs to develop them in accordance with user stories and requirements. Code, debug, test, document, and communicate the stages of product...
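As a minimal hypothetical sketch of the ingest-wrangle-join work this role describes, using PySpark; the file paths and column names are placeholders:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("wrangle_example").getOrCreate()

    # Placeholder source files; in practice these would be GCS/HDFS paths.
    orders = spark.read.csv("orders.csv", header=True, inferSchema=True)
    customers = spark.read.csv("customers.csv", header=True, inferSchema=True)

    # Join, derive a month column, and aggregate spend per month and country.
    monthly = (orders
               .join(customers, "customer_id", "left")
               .withColumn("order_month", F.date_trunc("month", F.col("order_date")))
               .groupBy("order_month", "country")
               .agg(F.sum("amount").alias("total_amount")))

    monthly.write.mode("overwrite").parquet("monthly_totals")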
Posted 3 weeks ago
2.0 - 3.0 years
5 - 5 Lacs
thiruvananthapuram
Work from Office
Role Proficiency: This role requires proficiency in data pipeline development, including coding, testing, and implementing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be adept at using ETL tools such as Informatica, Glue, Databricks, and Dataproc, along with coding skills in Python, PySpark, and SQL. Works independently according to work allocation. Outcomes: Operate with minimal guidance to develop error-free code, test applications, and document the development process. Understand application features and component designs to develop them in accordance with user stories and requirements. Code, debug, test, document, and communicate the stages of product...
Posted 3 weeks ago
4.0 - 9.0 years
15 - 20 Lacs
chennai
Work from Office
Adobe Campaign & Data Analytics with Cloud Migration Expertise Job Summary Synechron seeks a highly skilled and proactive Associate Specialist with experience in Adobe Campaign and related digital marketing technologies. The role is central to designing, implementing, and maintaining marketing campaign tracking solutions, as well as building analytics dashboards to support marketing and finance teams. You will work closely with cross-functional teams within an agile environment to develop scalable data processing solutions, automate workflows, and ensure seamless deployment and support of marketing applications across testing, staging, and production environments. Your expertise will directl...
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
coimbatore, tamil nadu
On-site
Role Overview: You should have strong experience in PySpark, Hadoop, Hive, and Java. Additionally, GCP experience in migrating on-prem applications to the cloud, or a good understanding of GCP services related to application migration, is required. Knowledge of Python is necessary for GCP migration, and familiarity with SQL is essential. Hands-on experience with the Hadoop ecosystem and Google Cloud Platform, specifically Dataproc, Composer, and Airflow, is expected. Experience in Spark with Python is a plus. Key Responsibilities: - Understand the Hadoop ecosystem and YARN architecture. - Hands-on experience in Spark with Python, loading and manipulating large datasets using Spark & Hive into Hadoo...
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Role Overview: As a Technical-Specialist Big Data (PySpark) Developer, you will be responsible for designing, developing, and unit testing software applications in an Agile development environment. You are expected to ensure the delivery of high-quality, maintainable, scalable, and high-performing software applications. Your strong technological background, expertise in Python and Spark technology, and ability to work independently will be crucial in guiding and mentoring junior resources within the team. Your role will involve extensive use of Continuous Integration tools and practices to support Deutsche Bank's digitalization journey. Key Responsibilities: - Design and propose solutions fo...
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
chennai, tamil nadu
On-site
Role Overview: As a Full Stack Data Engineer, you will collaborate with Data Scientists and Product Development teams to create cutting-edge data products that align with Company Objectives. Your responsibilities will include landing data, developing new data products, enhancing existing solutions, and validating solutions with Analytics & Business partners for production release. Key Responsibilities: - Work with Data Scientists and Product Development teams to develop industry-leading data products - Land data and develop new data products - Enhance existing data products - Collaborate with Analytics & Business partners to validate solutions for production release - Utilize GCP services su...
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
Role Overview: As a Lead II Data Engineering at UST, you will design, build, and maintain data pipelines and infrastructure to support the Data Science team. Leveraging GCP tools and technologies, you'll handle data ingestion, transformation, and optimization for machine learning (ML) applications. Your role will involve ensuring high data quality and collaborating with cross-functional teams to deliver data-driven solutions that drive business innovation. Key Responsibilities: - Architect, build, and maintain scalable, secure data pipelines using GCP tools such as BigQuery, Dataflow, Dataproc, and Pub/Sub. - Write Python and SQL scripts to transform and load data for ML pipelines. - Optimiz...
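As a minimal hypothetical sketch of the transform-and-load step the role describes, loading prepared files from Cloud Storage into BigQuery with the Python client; the bucket and table names are placeholders:

    from google.cloud import bigquery  # pip install google-cloud-bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    # Load Parquet feature files from GCS into a BigQuery table for ML use.
    load_job = client.load_table_from_uri(
        "gs://my-bucket/features/*.parquet",     # placeholder GCS path
        "my-project.ml_features.training_set",   # placeholder destination table
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes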
Posted 1 month ago
4.0 - 9.0 years
10 - 20 Lacs
pune, bengaluru
Hybrid
Key Responsibilities: Design, build, and optimize scalable ETL/ELT pipelines using PySpark on both Hadoop and GCP environments. Utilize Hadoop ecosystem components including Hive, HDFS, YARN, and Oozie for data storage, processing, and workflow orchestration. Implement and manage GCP data services such as BigQuery, Cloud Storage, Dataflow, and Cloud Composer to support data ingestion, transformation, and analytics workloads. Ensure the performance, scalability, and reliability of data pipelines to meet business and operational requirements. Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to understand data needs and deliver high-qu...
Posted 1 month ago
5.0 - 10.0 years
11 - 15 Lacs
chennai
Work from Office
Overview Join the Prodapt team in supporting a unified, scalable, and secure Jupyter-based environment for data science and machine learning. You will help build, maintain, and optimize the platform that empowers analysts, engineers, and scientists to explore data, develop models, and collaborate at scale. Responsibilities Develop, maintain, and enhance the Notebook platform built on JupyterLab, supporting both cloud and on-premises deployments. Integrate and optimize connections to diverse data sources (BigQuery, Hive, Teradata, Hadoop, Spark, etc.). Enable and support distributed data processing and analytics using Spark/PySpark and Dataproc. Implement and maintain platform features for co...
Posted 1 month ago
0.0 - 5.0 years
6 - 10 Lacs
chennai
Work from Office
Roles and Responsibility Design and implement data pipelines using Google Cloud Platform services. Develop and maintain large-scale data architectures and systems. Collaborate with cross-functional teams to identify and prioritize project requirements. Ensure data quality and integrity by implementing robust testing and validation procedures. Optimize data processing workflows for improved performance and efficiency. Troubleshoot and resolve complex technical issues related to data engineering. Job Requirements Strong understanding of Google Cloud Platform services and technologies. Experience with data modeling, design, and architecture. Excellent problem-solving skills and attention to det...
Posted 1 month ago
3.0 - 8.0 years
9 - 16 Lacs
gurugram, bengaluru
Work from Office
Qualifications: The candidate should have extensive production experience (3-4 years) in GCP; other cloud experience would be a strong bonus. Exposure to enterprise application development is a must. Roles & Responsibilities: 3-6 years of IT experience range is preferred. Able to effectively use GCP managed services, e.g., Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS - at least 4 of these services. Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, Cloud Functions. Strong experience in Big Data technologies - Hadoop, Sqoop, Hive, and Spark, including DevOps. Good hands-on expertise in either Python or Java programming. Good understanding of GCP core services like Googl...
Posted 1 month ago