3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Scala Engineer at our company, you will be responsible for supporting data processing and transformation efforts tied to reporting across our Insurance, Pensions & Investments (IP&I) business. You will primarily work on applications built in Scala using the Spark framework and contribute to the migration of our Annuities platform to Google Cloud Platform (GCP). Your role will involve incremental enhancements on existing applications, working within Agile methodologies (Scrum and/or Kanban), and collaborating with actuarial and accounting colleagues to deliver technical solutions to financial business problems. Your contribution plays a crucial role in Helping Britain Prosper. Key Respon...
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
As an experienced GCP Data Engineer at Quantiphi, you will be responsible for designing and implementing end-to-end data pipelines using BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Composer for both batch and streaming data ingestion. Your key responsibilities will include building and maintaining data models to support marketing, campaign performance, and user journey analytics, integrating data from Google Ads, Campaign Manager 360 (CM360), Display & Video 360 (DV360), Search Ads 360, Google Analytics 4, and developing ETL/ELT pipelines to ensure data quality and reliability. You will collaborate with marketing, data science, and analytics teams to design and support marketing dashboar...
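For readers less familiar with the stack this posting names, a minimal sketch of the Pub/Sub-to-BigQuery ingestion pattern may help. The Apache Beam pipeline below is illustrative only; the project, topic, table, and schema are hypothetical and not taken from the role.

```python
# Hypothetical Apache Beam (Dataflow) streaming pipeline: Pub/Sub -> BigQuery.
# Project, topic, table, and schema names are invented for illustration.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/campaign-events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:marketing.campaign_events",
                schema="campaign_id:STRING,user_id:STRING,event_ts:TIMESTAMP,cost:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline can be run on Dataflow by supplying the usual runner options (runner, project, region, temp_location) instead of the defaults.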
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a GCP Data Engineer, you will be responsible for developing data warehousing projects on GCP platforms. The role requires at least 3 years of experience in this field. You should possess strong analytical and problem-solving skills, effective business communication abilities, and the capacity to work independently from beginning to end. Key Responsibilities: - Demonstrating strong expertise in SQL & PL/SQL - Utilizing GCP services & BigQuery knowledge; GCP certification is considered an added advantage - Having good experience in GCP Dataproc, Cloud Composer, DAGs, and Airflow - Exhibiting proficiency in Teradata or any other database - Python knowledge is an added advantage - Leading...
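Postings like this one lean heavily on Cloud Composer orchestration, so a minimal sketch may be useful. The DAG below is hypothetical: the DAG id, project, dataset, and SQL are invented, and it simply schedules one BigQuery load per day.

```python
# Hypothetical Cloud Composer (Airflow) DAG running a daily BigQuery load.
# DAG id, project, dataset, and SQL are placeholders, not taken from the posting.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_fact_sales = BigQueryInsertJobOperator(
        task_id="load_fact_sales",
        configuration={
            "query": {
                # {{ ds }} is Airflow's templated execution date
                "query": (
                    "INSERT INTO `my-project.warehouse.fact_sales` "
                    "SELECT * FROM `my-project.staging.sales` "
                    "WHERE load_date = '{{ ds }}'"
                ),
                "useLegacySql": False,
            }
        },
    )
```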
Posted 1 month ago
8.0 - 15.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description (Posting): We are seeking a highly skilled and experienced AI/ML Architect to lead the design, development, and deployment of innovative AI/ML solutions. You will be responsible for translating business requirements into scalable, robust, and production-ready AI/ML architectures, primarily leveraging the Google Cloud Platform (GCP). You will be a technical leader, mentoring other engineers and driving best practices in MLOps. Total Years of Experience: 8-15 years. Relevant Experience: 7+ years in AI/ML Architecture and Development. Key Responsibilities: Solution Architecture: Design and architect end-to-end AI/ML solutions tailored to specific business needs, ensuring scalability, performance, ...
Posted 1 month ago
6.0 - 10.0 years
15 - 19 Lacs
Pune
Work from Office
Responsibilities: * Design, implement & optimize data solutions using BigQuery, Data Build Tool & Python. * Collaborate with cross-functional teams on ETL projects with focus on robustness & data governance. Provident fund
Posted 1 month ago
8.0 - 10.0 years
0 Lacs
India
On-site
Job Description: We are seeking a skilled GCP Data Engineering Developer with strong hands-on experience in designing and implementing data migration solutions from traditional SQL databases to Google Cloud Platform (GCP). The ideal candidate will have a solid understanding of banking data structures and compliance requirements, and will contribute to building secure, scalable, and efficient data pipelines for banking analytics and operations. Key Responsibilities (Data Migration & Engineering): Develop and implement data migration workflows from SQL-based systems (e.g., Oracle, SQL Server, MySQL) to GCP services like BigQuery, Cloud SQL, Cloud Spanner, and Cl...
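As a rough illustration of the SQL-to-BigQuery migration work described (a sketch under assumed names, not the employer's actual pipeline), one simple pattern for a modest table is to extract with SQLAlchemy/pandas and load with the BigQuery client; larger or continuous migrations would more likely use Dataflow or Datastream.

```python
# Hypothetical extract-and-load step for a single table: relational source -> BigQuery.
# Connection string, table, and dataset are placeholders.
import pandas as pd
import sqlalchemy
from google.cloud import bigquery

engine = sqlalchemy.create_engine("mysql+pymysql://user:password@db-host:3306/corebank")
df = pd.read_sql("SELECT * FROM transactions WHERE txn_date >= '2024-01-01'", engine)

client = bigquery.Client(project="my-gcp-project")
job_config = bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE")
load_job = client.load_table_from_dataframe(
    df, "my-gcp-project.banking.transactions", job_config=job_config
)
load_job.result()  # block until the load job finishes
print(f"Loaded {load_job.output_rows} rows into banking.transactions")
```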
Posted 1 month ago
9.0 - 13.0 years
0 Lacs
Pune, Maharashtra
On-site
**Job Description**: **Role Overview**: As a Data Engineer at Deutsche Bank, you will be responsible for developing and delivering engineering solutions to achieve business goals. You will work on medium to large-sized projects with strict deadlines in a cross-application mixed technical environment. Your role will involve collaborating with a geographically dispersed team to build reliability and resiliency into solutions through testing, peer reviews, and automating the delivery life cycle. This position is crucial for the buildout of the Compliance tech internal development team in India. **Key Responsibilities**: - Design, develop, and maintain data pipelines using Python and SQL programming...
Posted 1 month ago
15.0 - 17.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. About the Role We are seeking a Senior Technical Program Manager (TPM) to lead strategic initiatives across the LUMI Big Data Platform, focusing on data engineerin...
Posted 1 month ago
6.0 - 13.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from HCLTECH! We are hiring for the GCP Data Engineer role in #Chennai #Bangalore #Hyderabad. Role: GCP Data Engineer. Experience: 6 to 13 years. Notice Period: Immediate to 45 days. Location: Chennai, Bangalore, Hyderabad. Interview Mode: F2F. Venue: Sholinganallur, Chennai. Date: 08-Nov-2025. Job Description - Mandatory Skills: GCP BigQuery; Cloud Composer or Airflow (any one); Dataflow, Dataproc, or Datafusion (any one); Python or PySpark (any one); DBT; SQL. Nice to have: GKE, Spanner, Harness, Gen AI, Looker. Interested candidates, please share your resume to [HIDDEN TEXT] along with the below details: Candidate Name, Number, Email, Total Experience, Relevant ...
Posted 1 month ago
11.0 - 16.0 years
0 Lacs
Karnataka
On-site
As a company deeply committed to bringing passion and customer focus to the business, Fractal is a prominent player in the Artificial Intelligence space with a mission to power every human decision in the enterprise. Fractal brings AI, engineering, and design expertise to help the world's most admired Fortune 500 companies. Rated as one of India's best companies to work for by The Great Place to Work Institute, we are at the forefront of innovation in Cloud, Data, and AI technologies, driving digital transformation across enterprises at an unprecedented pace. **Responsibilities:** - Evaluate the current technology landscape and recommend a forward-looking, short, and long-term technology str...
Posted 1 month ago
5.0 - 8.0 years
6 - 9 Lacs
Hyderabad, Telangana, India
On-site
Job Title: GCP Data Engineer/BigQuery Developer. Relevant Experience (in years): 5-8. Location: Hyderabad, Pune. Technical/Functional Skills: 1) Expertise in GCP services: Airflow/Cloud Composer, Google Cloud Storage, BigQuery. 2) Experience in developing, maintaining, and supporting Airflow/Composer DAGs. 3) Good hands-on experience in Python programming and working with GCP APIs. 4) Experience with data engineering services in GCP: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Functions, Composer, Storage, etc. 5) Experience with writing complex queries and stored procedures in BigQuery. 6) Experience in cost/process/runtime optimization of Airflow/Composer DAGs and/or stored procedures/queries in Bi...
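Point 6 above, cost and runtime optimization, often starts with a dry run. The snippet below is a small, hypothetical example of estimating a query's scan size before scheduling it; the project, dataset, and query are invented.

```python
# Hypothetical cost check: a BigQuery dry run estimates bytes scanned before a
# query (or the DAG task that wraps it) actually runs. Names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

query = """
SELECT campaign_id, SUM(cost) AS total_cost
FROM `my-project.marketing.spend`
WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'  -- partition filter limits the scan
GROUP BY campaign_id
"""

dry_run = client.query(
    query, job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
)
print(f"Estimated scan: {dry_run.total_bytes_processed / 1e9:.2f} GB")
```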
Posted 1 month ago
8.0 - 11.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Qualifications: 8-11 years of IT experience is preferred. BE/B.Tech/MCA/MS-IT/M.Tech or any other engineering degree in a related field. The candidate should have extensive production experience (5 years) with GCP; other cloud experience would be a strong bonus. Strong background in data engineering, with 4-5 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc. Exposure to enterprise application development is a must. Roles & Responsibilities: Able to effectively use GCP managed services, e.g., Dataproc, Dataflow, Pub/Sub, Cloud Functions, Cloud Composer, BigQuery, Bigtable (at least 4 of these services). Strong experience in Big Data technologies: Hadoop, Sqoop, H...
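For the Dataproc item in that list, a short, hypothetical sketch of submitting a PySpark job to an existing cluster might look like the following; the project, region, cluster name, and GCS paths are placeholders.

```python
# Hypothetical submission of a PySpark job to an existing Dataproc cluster using the
# google-cloud-dataproc client. Project, region, cluster, and GCS paths are invented.
from google.cloud import dataproc_v1

region = "asia-south1"
job_client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "analytics-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/sessionize.py"},
}

operation = job_client.submit_job_as_operation(
    request={"project_id": "my-project", "region": region, "job": job}
)
result = operation.result()  # blocks until the Spark job completes
print(f"Dataproc job finished with state: {result.status.state.name}")
```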
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior Cloud Data Engineer at CloudWerx, you will play a crucial role in architecting and implementing cutting-edge data solutions using the latest Google Cloud Platform (GCP) technologies. Your work will involve collaborating with a diverse range of clients to solve complex data challenges and drive business transformation. Here is an insight into your impact and the qualifications required for this role: **Role Overview:** You will lead technical discussions with clients, design and optimize data pipelines, mentor junior engineers, and drive the adoption of cutting-edge GCP technologies. Your role will also involve identifying opportunities for process improvements and automation, col...
Posted 1 month ago
12.0 - 14.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: We are the movers of the world and the makers of the future. We get up every day, roll up our sleeves, and build a better world together. At Ford, we're all a part of something bigger than ourselves. Are you ready to change the way the world moves? At Ford Motor Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a Senior Data Engineering Manager, you will lead a team responsible for the strategic development and delivery of data products and application support. You will oversee the integration of da...
Posted 1 month ago
12.0 - 14.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description As a Senior Data Engineering Manager, you will lead a team responsible for the strategic development and delivery of data products. You will oversee the integration of data from various sources, guiding the evolution of our analytical data landscape, including the merging of historical data from legacy platforms with data ingested from new platforms. Your role involves directing the analysis and manipulation of large datasets, ensuring the activation of data assets to enable enterprise platforms and analytics within Google Cloud Platform. You will also lead the design and implementation of data transformation and modernization initiatives on Google Cloud Platform, ensuring th...
Posted 1 month ago
8.0 - 10.0 years
0 Lacs
India
On-site
Cloud Leader - CloudOps. Experience: 8 to 10 years; B.Sc (Computers), BE, B.Tech, or equivalent experience. Location: Mangalore (On-site). Skills Required: 5+ years of technical lead or technical project management experience in cloud-based environments. Proven experience leading cross-functional teams in Cloud Operations or Cloud Migration. Strong understanding of cloud services. Proficient with ticketing tools like Jira SM, ServiceNow, Freshservice. Excellent communication, leadership, and stakeholder management skills. Strong knowledge of ITIL processes. Strong hands-on experience with GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer, Dataproc, Cloud Functions. Program...
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
Role Overview: FIS Clouds is seeking a Senior BigQuery Developer in Hyderabad. Your role will involve designing, developing, and maintaining robust, scalable data pipelines and advanced analytics solutions using BigQuery and other GCP-native services. Collaborating closely with data scientists, analysts, and business stakeholders, you will ensure efficient and secure access to enterprise data. Key Responsibilities: - Design, develop, and optimize BigQuery data warehouses and data marts to support analytical and business intelligence workloads. - Implement data modeling and best practices for partitioning, clustering, and table design in BigQuery. - Integrate Bi...
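The partitioning and clustering practices mentioned in the responsibilities usually reduce to DDL choices. The example below is a hypothetical illustration run through the Python client, with invented dataset and column names.

```python
# Hypothetical DDL illustrating partitioning and clustering in BigQuery, run through
# the Python client. Dataset, table, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

ddl = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.page_events`
(
  event_ts   TIMESTAMP,
  user_id    STRING,
  page       STRING,
  duration_s FLOAT64
)
PARTITION BY DATE(event_ts)   -- date partitions let queries prune whole days
CLUSTER BY user_id, page      -- co-locate rows that are commonly filtered together
"""

client.query(ddl).result()
```

Queries that filter on `event_ts` and `user_id` then scan only the relevant partitions and blocks, which is where most of the cost savings come from.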
Posted 1 month ago
6.0 - 10.0 years
14 - 24 Lacs
Bengaluru
Hybrid
Job Description: The Google Cloud DevOps Engineer will be responsible for automating infrastructure provisioning and configuration management using Terraform and Ansible. The role involves designing, implementing, and maintaining CI/CD pipelines on GCP using Azure DevOps. The ideal candidate will have extensive experience with GCP resources, particularly in data engineering, and possess strong scripting skills in Python and Bash. Responsibilities: Automate infrastructure provisioning and configuration management using Terraform and Ansible. Design, implement, and maintain CI/CD pipelines on GCP using Azure DevOps. Manage and optimize GCP resources, including Compute Engine, Data Fusion, Data...
Posted 1 month ago
6.0 - 10.0 years
14 - 24 Lacs
Pune
Hybrid
Job Description: The Google Cloud DevOps Engineer will be responsible for automating infrastructure provisioning and configuration management using Terraform and Ansible. The role involves designing, implementing, and maintaining CI/CD pipelines on GCP using Azure DevOps. The ideal candidate will have extensive experience with GCP resources, particularly in data engineering, and possess strong scripting skills in Python and Bash. Responsibilities: Automate infrastructure provisioning and configuration management using Terraform and Ansible. Design, implement, and maintain CI/CD pipelines on GCP using Azure DevOps. Manage and optimize GCP resources, including Compute Engine, Data Fusion, Data...
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
We are urgently seeking an experienced Corporate Trainer to deliver an intensive, hands-on program on Big Data & Google Cloud Platform (GCP) for corporate professionals in Bhubaneswar. The ideal candidate should have strong technical expertise and real-time industry exposure in Big Data, Cloud, and Data Engineering domains. Training Details Location: Bhubaneswar (Offline Classroom Mode) Start Date: 1st Week of November 2025 Duration: 36 Days Audience: Corporate / Industry Professionals TOC: Available Compensation: ₹3,000 per day Skills & Topics Covered Big Data Frameworks: Hadoop, Spark (Core, SQL, Streaming, Advanced) Google Cloud Platform (GCP): Dataproc, Dataflow, BigQuery, Cloud Composer...
Posted 1 month ago
8.0 - 12.0 years
1 - 3 Lacs
Hyderabad
Work from Office
Position: Senior GCP Cloud Engineer / Specialist Experience: 8+ years Client Information: Shared upon successful screening Salary: 1.5 to 3.0 Lakh/month Shift: Night Shift (8:30 PM to 5:30 AM IST) Location: Aditya Trade Center, Ameerpet, Hyderabad Preferred Certification: GCP Certified (hands-on experience is prioritized due to prevalence of manipulated test-taking) Key Skills & Expertise: Core GCP Services: Cloud Run, Cloud Composer, Cloud Functions, Dataflow, Dataproc Security & Integration: IAM, service accounts, GCP Access Keys management, integration with systems like Salesforce, implementation of cloud security best practices Other: Strong understanding of GCP architecture and deplo...
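On the security and integration side, "service accounts and access key management" typically means something like the following hypothetical snippet, where a client authenticates with a key file; the path and project are placeholders.

```python
# Hypothetical service-account usage: authenticate a GCP client with a key file
# rather than default credentials. The key path and project are placeholders;
# in many environments workload identity is preferred over exported keys.
from google.oauth2 import service_account
from google.cloud import bigquery

credentials = service_account.Credentials.from_service_account_file(
    "/secrets/pipeline-runner.json",
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)

client = bigquery.Client(project="my-project", credentials=credentials)
print([dataset.dataset_id for dataset in client.list_datasets()])
```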
Posted 1 month ago
4.0 - 6.0 years
10 - 18 Lacs
Gurugram, Bengaluru
Hybrid
Job Title: Data Engineer - GCP BigQuery. Location: Bangalore / Gurgaon. Experience: 4-6 years. Employment Type: Full-time. About the Role: We are looking for a skilled Data Engineer with strong experience in Google Cloud Platform (GCP) and BigQuery to design, build, and optimize data pipelines and analytics solutions. The ideal candidate will have hands-on experience in large-scale data processing, ETL development, and data modeling, and will collaborate closely with data analysts, data scientists, and business stakeholders. Key Responsibilities: Design, develop, and maintain scalable ETL/ELT data pipelines on GCP (BigQuery, Dataflow, Cloud Composer, Pub/Sub, etc.). Build and optimize data models an...
Posted 1 month ago
10.0 - 18.0 years
15 - 25 Lacs
Chennai
Work from Office
POSITION TITLE: Lead MSBI Developer / Data Engineer. This is a senior, hands-on role for a technical specialist focused on BI platform modernization. The primary mission is to lead the end-to-end analysis of our legacy Microsoft SSAS cubes and SSIS ETL workflows, create definitive technical documentation, and then use that knowledge to support the migration to a modern cloud data platform (GCP). Skills Required: demonstrated ability to document complex systems, ability to communicate and work with cross-functional teams and all levels of management, Microsoft SQL Server, MSSQL, ETL. Skills Preferred: Cloud Composer, Airflow, PySpark, BigQuery, Google Cloud Platform - BigQuery, Dataflow, ...
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Senior Data Scientist with experience in Google Cloud Platform (GCP) at this organization, your role involves leading complex data science projects, developing and deploying advanced machine learning (ML) models, and utilizing GCP's ecosystem for strategic business decisions. Your technical expertise, business acumen, and collaboration with team members will be crucial for success. **Responsibilities:** - **End-to-End Project Leadership:** Lead and manage data science projects from problem definition to model deployment and monitoring. - **Advanced Analytics and Modeling:** - Design and implement advanced statistical and ML models for solving complex business problems. - Utili...
Posted 1 month ago
4.0 - 9.0 years
16 - 31 Lacs
Pune, Bengaluru, Mumbai (all areas)
Hybrid
Design, deploy & monitor ML pipelines on GCP using Vertex AI, Dataflow, BigQuery. Migrate on-prem/Hadoop models to GCP. Implement Feature Store, MLflow tracking, & post-model monitoring for Data & Model Drift. Required Candidate profile 4–9 yrs of hands-on experience in MLOps using Python, Spark/PySpark, and GCP. Skilled in CI/CD, Docker, Terraform, and ML pipeline automation for model deployment, monitoring, and retraining.
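A hedged sketch of one step in the stack this role describes (not the employer's actual code): registering and deploying a model on Vertex AI with the google-cloud-aiplatform SDK, using placeholder project, region, bucket, and container values.

```python
# Hypothetical Vertex AI step: register a trained model and deploy it to an endpoint
# using the google-cloud-aiplatform SDK. Project, region, bucket, and container image
# are placeholders, not details from the posting.
from google.cloud import aiplatform

aiplatform.init(
    project="my-project",
    location="asia-south1",
    staging_bucket="gs://my-ml-artifacts",
)

model = aiplatform.Model.upload(
    display_name="churn-xgboost",
    artifact_uri="gs://my-ml-artifacts/models/churn/v3/",
    serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-7:latest",
)

endpoint = model.deploy(machine_type="n1-standard-4", min_replica_count=1)
print(f"Deployed to endpoint: {endpoint.resource_name}")
```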
Posted 1 month ago