8.0 - 10.0 years
0 Lacs
mumbai, maharashtra, india
Remote
Position Title: Sr Infrastructure Engineer - Integration | Function/Group: Digital and Technology | Location: Mumbai | Shift Timing: Regular | Role Reports to: D&T Manager - Integration | Remote/Hybrid/In-Office: Hybrid ABOUT GENERAL MILLS We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we've been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mi...
Posted 1 month ago
5.0 - 7.0 years
20 - 25 Lacs
chennai
Work from Office
Position Description: Representing the Ford Credit (FC) Data Engineering Organization as a Google Cloud Platform (GCP) Data Engineer, specializing in migration and transformation, you will be a developer on a global team building a complex data warehouse in the Google Cloud Platform. This role involves designing, implementing, and optimizing data pipelines, ensuring data integrity during migration, and leveraging GCP services to enhance data transformation processes for scalability and efficiency. This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will work on analyzin...
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
You have over 10 years of experience in data engineering, specializing in cloud-based solutions. Your role involves designing solutions, reviewing team work, and providing guidance. Proficiency in Google Cloud Platform (GCP) and its various data services such as BigQuery, DBT & Streaming, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer is essential. Your track record should demonstrate your ability to create scalable data pipelines and architectures. Experience with ETL tools, processes, and implementing ETL processes to transfer data to GCP warehouses like BigQuery is required. Your technical skills should include proficiency in DBT & Streaming, Dataflow, Cloud Storage, Cloud Composer,...
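As a small illustration of the streaming skills listed above, here is a minimal sketch of publishing an event to Pub/Sub, the usual entry point for a Dataflow streaming pipeline into BigQuery; the project ID, topic name, and payload are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch: publish a JSON event to a Pub/Sub topic.
# Project and topic names are hypothetical examples.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-analytics-project", "order-events")

event = {"order_id": "o-123", "amount": 42.5}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print(f"Published message id: {future.result()}")  # blocks until the publish is acknowledged
```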
Posted 2 months ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Google Cloud DevOps Engineer specializing in Terraform and CI/CD Pipeline, you will play a crucial role in provisioning GCP resources based on architectural designs that align with business objectives. Your responsibilities will include monitoring resource availability and usage metrics to provide guidelines for cost and performance optimization. You will be expected to assist IT and business users in resolving GCP service-related issues and provide guidance on cluster automation and migration approaches. Additionally, your role will involve provisioning GCP resources for data engineering and data science projects, including automated data ingestion, migration, and transformation. Key R...
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer at Aptiv, you will play a crucial role in designing, developing, and implementing a cost-effective, scalable, reusable, and secure ingestion framework. Your primary responsibility will be to work closely with business leaders, stakeholders, and source system Subject Matter Experts (SMEs) to understand and define the business needs, translate them into technical specifications, and ingest data into Google Cloud Platform, specifically BigQuery. You will be involved in designing and implementing processes for data ingestion, transformation, storage, analysis, modeling, reporting, monitoring, availability, governance, and security of high volumes of structured and unstructure...
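To make the kind of ingestion step described above concrete, here is a minimal sketch of a batch load from Cloud Storage into BigQuery using the google-cloud-bigquery client; the project, bucket, and table names are hypothetical and stand in for whatever the real ingestion framework would use.

```python
# Minimal sketch: load CSV files landed in GCS into a BigQuery staging table.
# Project, bucket, and table identifiers are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                                   # let BigQuery infer the schema
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-landing-bucket/sales/2024-01-01/*.csv",   # hypothetical landing path
    "my-analytics-project.staging.sales_raw",
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

table = client.get_table("my-analytics-project.staging.sales_raw")
print(f"Loaded table now has {table.num_rows} rows")
```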
Posted 2 months ago
2.0 - 6.0 years
0 Lacs
chennai, tamil nadu
On-site
As a GCP Data Engineer specialized in Data Migration & Transformation, you will be responsible for designing and constructing robust, scalable data pipelines and architectures on Google Cloud Platform (GCP), particularly focusing on BigQuery. Your primary tasks will involve migrating and transforming large-scale data systems and datasets to GCP while emphasizing performance, scalability, and reliability. It will be crucial for you to automate data lineage extraction and ensure data integrity across various systems and platforms. Collaborating closely with architects and stakeholders, you will play a key role in implementing GCP-native and 3rd-party tools for data ingestion, integration, and ...
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Role: GCP Data Engineer Experience: 5-9 Years Notice: 15 Days or less Interview Mode: First Round Virtual / Second Round Face to Face (Mandatory) Location: Bangalore Job Description Data Ingestion, Storage, Processing and Migration Acquire, cleanse, and ingest structured and unstructured data on the cloud platforms (in batch or real time) from internal and external data sources Combine data from disparate sources into a single, unified, authoritative view of data (e.g., Data Lake) Create, maintain and provide test data to support fully automated testing Enable and support data movement from one system / service to another system / service. Reporting: Design, Develop and maintain high performance...
Posted 2 months ago
9.0 - 11.0 years
0 Lacs
pune, maharashtra, india
On-site
Data Engineer (ETL, Python, SQL, GCP) Position Overview Job Title: Data Engineer (ETL, Python, SQL, GCP) Corporate Title: AVP Location: Pune, India Role Description The Engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills are developed through addressing enhancements and fixes to products; build reliability and resiliency into solutions through early testing, peer reviews, and automating the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines. Successful candi...
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
We are urgently hiring a Senior BigQuery Developer (Google Cloud Platform) with a minimum of 5-8 years of experience, based in Hyderabad. In this role, you will be responsible for designing, developing, and maintaining robust, scalable data pipelines and advanced analytics solutions using BigQuery and other GCP-native services. Your primary focus will be on designing, developing, and optimizing BigQuery data warehouses and data marts to support analytical and business intelligence workloads. You will also need to implement data modeling and best practices for partitioning, clustering, and table design in BigQuery. Integration of BigQuery with tools such as Dataform, Airflow, Cloud Compos...
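To illustrate the partitioning and clustering practices this posting calls for, here is a minimal sketch that runs a BigQuery DDL statement through the Python client; the dataset, table, and column names are hypothetical examples, not part of the role description.

```python
# Minimal sketch: create a day-partitioned, clustered BigQuery table via DDL.
# Dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE TABLE IF NOT EXISTS `analytics_mart.orders`
(
  order_id     STRING,
  customer_id  STRING,
  order_ts     TIMESTAMP,
  amount       NUMERIC
)
PARTITION BY DATE(order_ts)          -- prune scans to the relevant days
CLUSTER BY customer_id, order_id     -- co-locate rows that are commonly filtered together
"""

client.query(ddl).result()  # run the DDL as a query job and wait for completion
```

Partitioning on the timestamp column keeps scan costs proportional to the date range queried, while clustering narrows the blocks read for the most common filter columns.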
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Data Engineer, you will be responsible for designing and developing scalable data engineering solutions using Google Cloud Platform (GCP) and PySpark. Your main focus will be on optimizing Spark jobs for performance, scalability, and efficient resource utilization. You will also be involved in developing, maintaining, and enhancing ETL pipelines using BigQuery, Apache Airflow, and Cloud Composer. Collaborating with data scientists, analysts, and DevOps teams to translate business requirements into technical solutions will be a key aspect of your role. Ensuring data integrity and security by implementing data governance, compliance, and security best practices will be crucial. Monitoring...
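As a sketch of the PySpark work described above, the snippet below reads Parquet from Cloud Storage, aggregates, and writes to BigQuery; it assumes the spark-bigquery connector is available on the cluster, and the bucket, dataset, and column names are hypothetical.

```python
# Minimal PySpark sketch: read from GCS, aggregate, write to BigQuery.
# Assumes the spark-bigquery connector is on the cluster classpath.
# Bucket, dataset, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-aggregation").getOrCreate()

events = spark.read.parquet("gs://my-datalake/events/dt=2024-01-01/")

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "country")
    .agg(F.count("*").alias("event_count"))
    .repartition("event_date")          # reduce small files and skew before writing
)

(daily.write
      .format("bigquery")
      .option("table", "analytics.daily_events")
      .option("temporaryGcsBucket", "my-temp-bucket")
      .mode("append")
      .save())
```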
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
You will be responsible for designing and implementing cloud-native and hybrid solutions using GCP services such as Compute Engine, Kubernetes (GKE), Cloud Functions, BigQuery, Pub/Sub, Cloud SQL, and Cloud Storage. Additionally, you will define cloud adoption strategies, migration plans, and best practices for performance, security, and scalability. You will also be required to implement and manage Terraform, Cloud Deployment Manager, or Ansible for automated infrastructure provisioning. The ideal candidate should have expertise as a GCP data architect with network domain skills in GCP (Dataproc, Cloud Composer, Dataflow, BigQuery), Python, and Spark/PySpark, and hands-on experience in the network ...
Posted 2 months ago
11.0 - 16.0 years
0 Lacs
karnataka
On-site
It is exciting to be part of a company where individuals genuinely believe in the purpose of their work. The commitment to infuse passion and customer-centricity into the business is unwavering. Fractal stands out as a key player in the field of Artificial Intelligence. The core mission of Fractal is to drive every human decision within the enterprise by integrating AI, engineering, and design to support the most esteemed Fortune 500 companies globally. Recognized as one of India's top workplaces by The Great Place to Work Institute, Fractal is at the forefront of innovation in Cloud, Data, and AI technologies, fueling digital transformation across enterprises at an unprecedented pace exceed...
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Engineer at our company, you will be responsible for designing, developing, and maintaining scalable and efficient data pipelines using GCP services such as Dataflow, Cloud Composer (Airflow), and Pub/Sub. Your role will involve designing and implementing robust data models optimized for analytical and operational workloads within GCP data warehousing solutions like BigQuery. You will also be tasked with developing and implementing ETL processes to ingest, cleanse, transform, and load data from various sources into our data warehouse and other data stores on GCP. Furthermore, you will play a key role in building and managing data warehousing solutions on GCP, ensuring data integrit...
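To ground the Cloud Composer (Airflow) and BigQuery responsibilities above, here is a minimal sketch of a DAG that loads a file from Cloud Storage and then runs a transform query; the bucket, dataset, table names, and SQL are hypothetical placeholders.

```python
# Minimal Composer (Airflow) DAG sketch: ingest CSV from GCS, then transform in BigQuery.
# Bucket, dataset, table names, and SQL are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_ingest_and_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="my-landing-bucket",
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="staging.orders_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    transform = BigQueryInsertJobOperator(
        task_id="transform_to_mart",
        configuration={
            "query": {
                "query": "SELECT * FROM `staging.orders_raw` WHERE amount > 0",
                "destinationTable": {
                    "projectId": "my-analytics-project",
                    "datasetId": "mart",
                    "tableId": "orders",
                },
                "writeDisposition": "WRITE_APPEND",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform  # transform only runs after the raw load succeeds
```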
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Consultant Delivery (Data Engineer) at Worldline, you will be an integral part of the Data Management team, contributing to a significant Move to Cloud (M2C) project. Your primary focus will be migrating our data infrastructure to the cloud and enhancing our data pipelines for improved performance and scalability. You will have the opportunity to work on a critical initiative that plays a key role in the organization's digital transformation. To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, along with a minimum of 5 years of experience as a Data Engineer. Your expertise should include a strong emphasis on cloud-...
Posted 2 months ago
2.0 - 6.0 years
4 - 8 Lacs
faridabad
Work from Office
Job Summary We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities Design and implement scalable data models using Snowflake and Erwin Data Modeler. Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebui...
Posted 2 months ago
5.0 - 7.0 years
0 Lacs
pune, maharashtra, india
On-site
Job description Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to...
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
CloudWerx is seeking a dynamic Senior Engineer, Data to join our vibrant Data Analytics & Engineering Team in Hyderabad, India. As a Senior Cloud Data Engineer, you will play a crucial role in architecting and implementing state-of-the-art data solutions that drive business transformation. Working with a diverse client base, ranging from startups to industry leaders, you will tackle complex data challenges using the latest Google Cloud Platform (GCP) technologies. This role offers a unique blend of technical expertise and client interaction, allowing you to not only build sophisticated data systems but also consult directly with clients to shape their data strategies and witness the real-wor...
Posted 2 months ago
0.0 years
0 Lacs
hyderabad, telangana, india
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's , our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to , our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology ...
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
telangana
On-site
The job requires a GCP DevOps professional with a minimum of 5 to 8 years of experience. The position is based in either Pune or Hyderabad and follows a hybrid work mode. The ideal candidate should be able to join within a notice period of immediate to a maximum of 60 days. This is a full-time employment opportunity with LTIMindtree. Key skills required for the role include proficiency in GCP, Kubernetes, Terraform, BigQuery, Dataflow, Cloud Composer, as well as experience with Ansible, Jenkins, or Cloud Build. For any queries or further information, please reach out to Prabal Pandey at Prabal.Pandey@alphacom.in.
Posted 2 months ago
5.0 - 10.0 years
10 - 20 Lacs
bengaluru
Work from Office
Greetings from Sun Technologies. Position: Platform Engineer Experience: 5+ Years Work Location: HBR Layout, Bangalore Work Mode: Work from Office (5 days) Job Type: Permanent/Full-time Position Interview Mode: Virtual Shift Details: 24/7 Rotational (2-way cab with dinner from company) Mandatory Skills: GCP, Terraform, Cloud Composer, Dataflow Good to have: Artificial Intelligence Platform Engineer responsibilities: Design, implement, and manage data pipelines using Apache Airflow (Cloud Composer) on GCP. Build and optimize ETL/ELT workflows using Dataflow (Apache Beam). Define and deploy infrastructure-as-code using Terraform to manage GCP resources. Automate pipeline deployments, monitori...
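As an illustration of the Dataflow (Apache Beam) workflow building mentioned above, here is a minimal Beam pipeline sketch in Python; the runner options, bucket paths, table name, and field layout are hypothetical and would normally come from pipeline configuration.

```python
# Minimal Apache Beam sketch: read CSV from GCS, filter/shape rows, write to BigQuery.
# Project, buckets, table, and the 4-field record layout are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",            # use "DirectRunner" for local testing
    project="my-analytics-project",
    region="us-central1",
    temp_location="gs://my-temp-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadCsv" >> beam.io.ReadFromText("gs://my-landing-bucket/events/*.csv", skip_header_lines=1)
        | "ParseFields" >> beam.Map(lambda line: line.split(","))
        | "FilterValid" >> beam.Filter(lambda f: len(f) == 4 and f[3] != "")
        | "ToTableRow" >> beam.Map(lambda f: {
            "event_id": f[0], "user_id": f[1], "event_ts": f[2], "amount": float(f[3]),
        })
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-analytics-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,  # table pre-created elsewhere
        )
    )
```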
Posted 2 months ago
4.0 - 6.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Dear Candidates, HCL is hiring for a Data Engineer who is an expert in GCP & DBT. Interested candidates, kindly share your updated resume to [HIDDEN TEXT] Responsibilities: Take end-to-end responsibility to build, optimize and support existing and new data products towards the defined target vision. Be a champion of the DevOps mindset and principles, able to manage CI/CD pipelines and Terraform as well as cloud infrastructure; in our context, this is GCP (Google Cloud Platform). Ensure that our data products work as independent units of deployment and that non-functional aspects of the data products follow the defined standards for security, scalability, observability, and performance. Work close ...
Posted 2 months ago
11.0 - 16.0 years
0 Lacs
bengaluru, karnataka, india
On-site
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business. Fractal is one of the most prominent players in the Artificial Intelligence space. Fractal's mission is to power every human decision in the enterprise, bringing AI, engineering, and design to help the world's most admired Fortune 500 companies. Fractal has consistently been rated among India's best companies to work for by The Great Place to Work Institute. Cloud, Data, and AI technologies are seeing tremendous innovation and are driving digital transformation across all the enterprises at an unprecedented pace (more than...
Posted 2 months ago
5.0 - 10.0 years
5 - 10 Lacs
bengaluru, karnataka, india
On-site
Job Summary: We are seeking a talented GCP Data Engineer to join our team and help us design and implement robust data pipelines and analytics solutions on Google Cloud Platform (GCP). The ideal candidate will have strong expertise in BigQuery, Dataflow, Cloud Composer, and Dataproc, along with experience in AI/ML tools such as Google Vertex AI or Dialogflow. Key Responsibilities: Design, develop, and maintain data pipelines and workflows using Dataflow, Cloud Composer, and Dataproc. Develop optimized queries and manage large-scale datasets using BigQuery. Collaborate with cross-functional teams to gather requirements and translate business needs into scalable data solutions. Implement best ...
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
We are looking for a GCP Cloud Engineer for a position based in Pune. As a GCP Data Engineer, you will be responsible for designing, implementing, and optimizing data solutions on Google Cloud Platform. Your expertise in GCP services, solution design, and programming skills will be crucial for developing scalable and efficient cloud solutions. Your key responsibilities will include designing and implementing GCP-based data solutions following best practices, developing workflows and pipelines using Cloud Composer and Apache Airflow, building and managing data processing clusters using Dataproc, working with GCP services like Cloud Functions, Cloud Run, and Cloud Storage, and integrating mul...
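To illustrate the Dataproc part of the responsibilities above, here is a minimal sketch of submitting a PySpark job to an existing Dataproc cluster with the google-cloud-dataproc client; the project, region, cluster name, and script path are hypothetical.

```python
# Minimal sketch: submit a PySpark job to an existing Dataproc cluster and wait for it.
# Project, region, cluster, and GCS script path are hypothetical.
from google.cloud import dataproc_v1

region = "us-central1"
client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "etl-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://my-code-bucket/jobs/daily_aggregation.py"},
}

operation = client.submit_job_as_operation(
    request={"project_id": "my-analytics-project", "region": region, "job": job}
)
result = operation.result()          # block until the job reaches a terminal state
print(f"Job finished with state: {result.status.state.name}")
```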
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
navi mumbai, maharashtra
On-site
As a GCP Data Engineer at our organization, you will be a key member of our growing data team. We are looking for a highly skilled and experienced individual who is passionate about data and has a strong track record of designing, building, and maintaining scalable data solutions on Google Cloud Platform (GCP). Your role will involve transforming raw data into actionable insights, enabling data-driven decision-making throughout the organization. Your responsibilities will include designing, developing, implementing, and maintaining ETL/ELT data pipelines using various GCP services and programming languages. You will leverage Google BigQuery as a primary data warehouse, design optimal schemas...
Posted 2 months ago