3.0 - 8.0 years
6 - 10 Lacs
kolkata
Work from Office
About The Role. Project Role: Cloud Services Engineer. Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure the Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends. Must-have skills: Hitachi Data Systems (HDS). Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As a Cloud Services Engineer, you will serve as a vit...
Posted 1 month ago
7.0 - 10.0 years
7 - 11 Lacs
hyderabad, pune, bengaluru
Work from Office
Skills: GCP, DevSecOps, Jenkins, Kubernetes, Google Cloud Platform and its services such as Dataproc, Dataflow, BigQuery; CI/CD tools; Infrastructure-as-Code (IaC) tools; Hadoop ecosystem; Python programming; Agile/Scrum methodologies. Experience: 7-10 years with a developer background, 3+ years of experience using automation scripting languages (Python, Ansible, Bash or similar) and 5+ years of experience with infrastructure-as-code development. Notice Period: Immediate. Education: full-time graduation, at least B.Tech.
Posted 1 month ago
8.0 - 13.0 years
6 - 10 Lacs
hyderabad, bengaluru
Work from Office
Skill: Extensive experience with Google Data Products (Cloud Data Fusion, BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Dataprep, etc.). Expertise in Cloud Data Fusion, BigQuery & Dataproc. Experience in MDM, Metadata Management, Data Quality and Data Lineage tools. E2E data engineering and lifecycle (including non-functional requirements and operations) management. Experience with SQL and NoSQL modern data stores. E2E solution design skills: prototyping, usability testing and data visualization literacy. Excellent knowledge of the software development life cycle.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
nagpur, maharashtra
On-site
You are applying for the position of Senior GCP Cloud Administrator for Forbes Advisor, a part of the Forbes Marketplace initiative. As a Senior GCP Cloud Administrator, your role will involve managing and optimizing various aspects of Google Cloud Platform (GCP) to ensure efficient data management and operational practices. **Responsibilities:**
- Manage and configure roles/permissions in GCP IAM by following the principle of least-privilege access
- Optimize the BigQuery service through slot assignments, SQL queries, FinOps practices, troubleshooting critical data queries, etc.
- Collaborate with various teams for efficient data management and operational practices in GCP
- Create automation...
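To make the BigQuery/FinOps duties above concrete, here is a minimal sketch, assuming the google-cloud-bigquery Python client, of dry-running a query to estimate the bytes it would scan before any slots are spent; the project, dataset, and query are hypothetical, not taken from the posting.

```python
# Dry-run a BigQuery query to estimate cost before executing it for real.
# Project and table names below are placeholders, not from the posting.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

sql = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-analytics-project.events.raw_events`          -- hypothetical table
    WHERE event_date = @run_date
    GROUP BY user_id
"""

dry_run_config = bigquery.QueryJobConfig(
    dry_run=True,                # validate and estimate only; consumes no slots
    use_query_cache=False,
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01"),
    ],
)

job = client.query(sql, job_config=dry_run_config)
print(f"Estimated bytes processed: {job.total_bytes_processed:,}")
```

A query that scans far more bytes than expected can then be reviewed (partition filters, clustering, slot assignment) before it ever runs.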
Posted 1 month ago
6.0 - 10.0 years
20 - 30 Lacs
hyderabad, pune, bengaluru
Work from Office
Location: Bangalore, Chennai, Hyderabad, Pune, Kolkata. 6+ years of experience in Google Cloud on Big Data / Analytics / Data Lake / Data Warehouse: Dataproc, Dataprep, Dataplex, Cloud Bigtable, Dataflow, Cloud Composer, BigQuery, Databricks, Kafka, NiFi, CDC processing, Snowflake, Datastore, Firestore, Docker, App Engine, Spark, Cloud Data Fusion, Apigee API Management, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, PySpark, Flume, Impala. Must Have: Design & implement ETL/data pipelines and Data Lake / Data Warehouse in Google Cloud. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data so...
Posted 1 month ago
3.0 - 8.0 years
0 - 0 Lacs
hyderabad, chennai, bengaluru
Hybrid
3+ years of experience in ETL & Data Warehousing. Should have experience in developing Data Engineering solutions with Airflow, GCP BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, etc. Should have built solution automations in any of the above ETL tools. Should have executed at least 2 GCP cloud data warehousing projects. Should have worked on at least 2 projects using Agile/SAFe methodology. Should have mid-level experience in PySpark and Teradata. Should have working experience with DevOps tools like GitHub, Jenkins, Cloud Native, etc., with semi-structured data formats like JSON, Parquet and/or XML files, and have written complex SQL queries for data...
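As an illustration of the Airflow-plus-BigQuery work this role describes, here is a minimal sketch assuming Airflow 2.4+ with the Google provider installed; the DAG id, bucket, and destination table are placeholders, not from the posting.

```python
# Daily DAG that loads Parquet files landed in Cloud Storage into BigQuery.
# Bucket and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bq_daily_load",            # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                        # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-landing-bucket",                       # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.parquet"],
        destination_project_dataset_table="example_project.dwh.orders",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",
    )
```

The `{{ ds }}` template keeps each run idempotent: rerunning a day reloads only that day's partition of files.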
Posted 1 month ago
10.0 - 15.0 years
20 - 35 Lacs
pune
Work from Office
Job Title: Big Data Architect GCP. Experience: 10–12 Years. Location: Pune. Role: We are looking for a Big Data Architect with deep GCP expertise to define, design, and lead cloud-native data platforms for enterprise clients. You will be responsible for architecting large-scale data lakes, warehouses, and pipelines on GCP, modernizing on-premise workloads, and guiding technical teams to deliver robust and scalable solutions. Key Responsibilities: Architect modern data platforms on GCP using BigQuery, Dataproc, Dataflow, Pub/Sub, Cloud Composer, and GCS. Lead data modernization and migration programs from on-premise systems (Teradata, Netezza, Exadata, Hadoop) to GCP. Define end-to-end solution arch...
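As a small illustration of the warehouse-design decisions such an architect owns, here is a hedged sketch, assuming the google-cloud-bigquery client, that creates a date-partitioned, clustered fact table; the project, schema, and field choices are illustrative, not from the posting.

```python
# Create a date-partitioned, clustered BigQuery table so large scans can be
# pruned by partition and clustered key. All names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "example-project.dwh.sales_fact"              # hypothetical table
schema = [
    bigquery.SchemaField("sale_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("sale_date", "DATE", mode="REQUIRED"),
]

table = bigquery.Table(table_id, schema=schema)
# Partition by the business date and cluster by customer to reduce bytes scanned.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="sale_date",
)
table.clustering_fields = ["customer_id"]

table = client.create_table(table)                        # raises if it already exists
print(f"Created {table.full_table_id}")
```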
Posted 1 month ago
5.0 - 10.0 years
6 - 9 Lacs
gurugram
Work from Office
GA4 & Digital Data Integration: Design, build, and deploy robust ETL and data management processes specifically for ingesting, transforming, and loading high-volume digital analytics data from Google Analytics 4 (GA4) into BigQuery.
BigQuery Data Architecture: Develop and optimize BigQuery datasets, tables, and views to support various analytical needs, ensuring efficient querying and data integrity.
Data Pipeline Development: Design, build, and deploy ETL job workflows with reliable error/exception handling and rollback frameworks, primarily utilizing GCP services.
GCP Resource Management: Monitor and optimize data processing and storage resources on GCP, with a focus on BigQuery perfo...
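A hedged example of the GA4-to-BigQuery work mentioned above: flattening the event_params array of the standard GA4 daily export using the Python BigQuery client. The project and dataset names are placeholders; the events_* schema is the standard GA4 export schema.

```python
# Flatten GA4 export data: pull page_location out of the event_params array.
# Project/dataset are hypothetical; the schema follows the standard GA4 export.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT
      event_date,
      event_name,
      user_pseudo_id,
      (SELECT value.string_value
         FROM UNNEST(event_params)
        WHERE key = 'page_location') AS page_location
    FROM `example-project.analytics_123456789.events_*`    -- hypothetical dataset
    WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240107'
      AND event_name = 'page_view'
"""

for row in client.query(sql).result():
    print(row.event_date, row.page_location)
```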
Posted 1 month ago
7.0 - 12.0 years
30 - 45 Lacs
bengaluru
Remote
Role & responsibilities Develop, construct, test and maintain data acquisition pipelines for large volumes of structed and unstructured data. This includes batch and real-time processing (in google cloud). Build large and complex datasets based on business requirements. Construct big data’ pipeline architecture. Identify opportunities for data acquisition via working with stakeholders and business clients. Translate business needs to technical requirements. Leverage a variety of tools in the Google Cloud Ecosystem such as Python, Data Flow, DataStream, CDC (Change Data Capture), Cloud Functions, Cloud Run, Pub Sub, BigQuery, Cloud Storage to integrate systems and data pipelines. Use logs & a...
Posted 1 month ago
6.0 - 11.0 years
15 - 25 Lacs
pune, chennai, bengaluru
Hybrid
Role & responsibilities Job Description : Sr. GCP Data Engineer Experience : 6 to 13 Years Work location : Chennai, Bangalore, Hyderabad, Pune-Hybrid Shift Timing : 2 to 11 PM I nterview proces s : L1 and L2 round Job description: 6+ years experience Should have experience in GCP BigQuery, DataProc(PySpark) Good to have experience on Informatica Preferred candidate profile Share your update resume on Nikita.solunke@ascendion.com and Do revert with below mentioned details. 1) Total year of experience- 2) Relevant year of experience- 3) Current CTC- 4) Expected CTC- 5) Notice Period- 6) Current Location- 7) Preferred Location-
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
chennai, tamil nadu
On-site
Role Overview: You will be responsible for designing, building, and maintaining data solutions, including data infrastructure, pipelines, etc., for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately. Collaborating with business and technology stakeholders to understand current and future data requirements will be a key aspect of your role. Key Responsibilities:
- Collaborate with business and technology stakeholders to understand current and future data requirements
- Design, build, and maintain reliable, efficient, and scalable data infrastructure for data collection, storage, transformation, and analysis
- Plan, design, build, and maintain scalabl...
Posted 1 month ago
6.0 - 9.0 years
5 - 15 Lacs
bengaluru
Work from Office
Role: GCP Architect. Job Description, Role & responsibilities: We have an urgent requirement for a GCP Architect, as per the below JD. We are open to either a contract or a permanent position with us. Overall 8-10 years of experience in the data engineering and AI stack. 4-6 years of experience in architecting modern data warehousing solutions on Google. Hands-on proficiency: Google BigQuery, Google Looker, Google GCS, Cloud Run, Vertex AI. Working knowledge of Databricks and Databricks Unity Catalog (GCP experience preferred). Experience in designing and building solutions on Vertex AI RAG, Evaluation, and Agent Builder for LLM apps. ML workflow ops with Vertex AI Pipelines, Model Registry, and a CI/CD overview. Experience in working w...
Posted 1 month ago
2.0 - 5.0 years
4 - 8 Lacs
hyderabad
Work from Office
About The Role: Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets. About The Role - Grade Specific: The role involves leading and managing a team of data engineers, overseeing data engineering projects, ensuring technical excellence, and fostering collaboration with stakeholders. They play a critical role in driving the success of data engineering initiatives and ensuring the delivery of reliable and high-quality data solutions to support the organization's data-driven objectives. Skills (competencies): Ab Initio, Agile (Software Developme...
Posted 1 month ago
5.0 - 10.0 years
5 - 15 Lacs
hyderabad, chennai, bengaluru
Hybrid
GCP Dataflow, GCP Cloud Composer, GCP BigQuery, GCP Cloud Storage, Dataproc. Java, Python, Scala. ETL/ELT, Big Data Hadoop ecosystem, ANSI SQL. DevOps, CI/CD, API, Agile. GCP Datastream, Dataform, Data Fusion, Workflows, Pub/Sub, and DMS.
Posted 1 month ago
3.0 - 7.0 years
11 - 15 Lacs
bengaluru
Work from Office
At Sogeti, we believe the best is inside every one of us. Whether you are early in your career or at the top of your game, we'll encourage you to fulfill your potential, to be better. Through our shared passion for technology, our entrepreneurial culture, and our focus on continuous learning, we'll provide everything you need to do your best work and become the best you can be. Part of the Capgemini Group, Sogeti makes business value through technology for organizations that need to implement innovation at speed and want a local partner with global scale. With a hands-on culture and close proximity to its clients, Sogeti implements solutions that will help organizations work faster, better, and sm...
Posted 1 month ago
6.0 - 11.0 years
15 - 30 Lacs
pune, bengaluru, delhi / ncr
Hybrid
Locations: Gurgaon, Noida, Pune, Mumbai, Chennai, Bangalore. Experience: 5-13 Years. Mandatory Skills: BigQuery, Airflow, GCS, Python, SQL, PySpark (optional). Skill Requirements - Essential skills and experience: Excellent programming skills in Python with object-oriented design. Excellent knowledge of current computing trends and technologies. Knowledge of or working experience in Generative AI (GenAI) technologies is preferred. Experience in designing and implementing cloud infrastructure, platforms, and applications. Hands-On Experience with Infrastructure as Code using Terraform/Pulumi/TypeScript in GCP. Hands-On Experience with Google Cloud Platform & product deployment and automation. Hands-O...
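Since the posting lists Terraform/Pulumi/TypeScript for IaC and Python as the core language, here is a hedged Pulumi-for-Python sketch of provisioning two GCP resources (it would run via `pulumi up` inside a configured Pulumi project); the resource names, locations, and dataset id are assumptions, not from the posting.

```python
# Infrastructure-as-code sketch with Pulumi's Python SDK: a landing bucket and
# a BigQuery dataset. Names and locations are hypothetical.
import pulumi
import pulumi_gcp as gcp

# Regional bucket for raw landing files (hypothetical name/location).
landing_bucket = gcp.storage.Bucket(
    "landing-bucket",
    location="ASIA-SOUTH1",
    uniform_bucket_level_access=True,
)

# BigQuery dataset the pipelines would load into (hypothetical id).
analytics_dataset = gcp.bigquery.Dataset(
    "analytics-dataset",
    dataset_id="analytics_staging",
    location="asia-south1",
)

pulumi.export("bucket_name", landing_bucket.name)
pulumi.export("dataset_id", analytics_dataset.dataset_id)
```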
Posted 1 month ago
2.0 - 6.0 years
7 - 11 Lacs
pune, chennai, bengaluru
Work from Office
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. About The Role: The Google Cloud Infrastructure Support Engineer will be responsible for ensuring the reliability, performance, and securit...
Posted 1 month ago
2.0 - 6.0 years
6 - 10 Lacs
pune
Work from Office
Req ID: 332236. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Consulting - Technical Analyst with ETL and GCP using PySpark to join our team in Pune, Maharashtra (IN-MH), India (IN). Key Responsibilities: Data Pipeline Development: Designing, implementing, and optimizing data pipelines on GCP using PySpark for efficient and scalable data processing. ETL Workflow Development: Building and maintaining ETL workflows for extracting, transforming, and loading data into various GCP services. GCP Service Utilization: Leve...
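A minimal PySpark sketch of the ETL pattern this analyst role describes, assuming the job runs on Dataproc with the spark-bigquery connector available; bucket and table names are hypothetical, not from the posting.

```python
# Extract CSVs from Cloud Storage, apply basic transforms, load to BigQuery.
# Assumes a Dataproc cluster with the spark-bigquery connector on the classpath.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV files landed in Cloud Storage (hypothetical path).
raw = (
    spark.read.option("header", True)
    .csv("gs://example-landing-bucket/orders/2024-01-01/")
)

# Transform: deduplicate, cast, and keep only valid amounts.
cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_amount", F.col("order_amount").cast("double"))
    .filter(F.col("order_amount") > 0)
)

# Load: write to BigQuery via the connector, staging through a temp bucket.
(
    cleaned.write.format("bigquery")
    .option("table", "example_project.dwh.orders")          # hypothetical table
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)
```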
Posted 1 month ago
3.0 - 5.0 years
5 - 15 Lacs
pune
Work from Office
About the Role: We are looking for a skilled Google Cloud Platform (GCP) Engineer who can design, implement, and manage scalable cloud solutions. The ideal candidate will have strong expertise in cloud infrastructure, data engineering, and automation to support business-critical applications and data pipelines. Key Responsibilities: Design, implement, and manage GCP infrastructure services (Compute Engine, Cloud Storage, IAM, VPC, Pub/Sub, etc.). Develop, optimize, and maintain ETL/ELT pipelines using BigQuery, Dataflow, Composer (Airflow), and Cloud Functions. Automate infrastructure provisioning using Terraform, Deployment Manager, or equivalent tools. Ensure security, scalability, and r...
Posted 1 month ago
3.0 - 5.0 years
5 - 15 Lacs
hyderabad
Work from Office
We are looking for an experienced Data Engineer with expertise in Google Cloud Platform (GCP) to design, build, and maintain scalable data pipelines and cloud-native solutions. The ideal candidate should have strong hands-on experience in GCP services, SQL, and data engineering best practices. Key Responsibilities: Design and develop ETL/ELT pipelines using Dataflow, Dataproc, and Cloud Composer (Airflow). Build and optimize data warehouses and data lakes using BigQuery, Cloud Storage, and Pub/Sub. Collaborate with analysts, data scientists, and business stakeholders to deliver reliable data solutions. Implement data quality, governance, and security standards. Automate workflows, improve p...
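To illustrate the Dataflow side of these responsibilities, here is a short Apache Beam sketch: read JSON lines from Cloud Storage, filter, and write to BigQuery. All project, bucket, and table names, and the runner configuration, are assumptions for the example.

```python
# Apache Beam pipeline (runnable on Dataflow or locally with DirectRunner).
# All resource names are hypothetical; the target table must already exist.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",                 # or "DirectRunner" for local tests
    project="example-project",               # hypothetical project
    region="asia-south1",
    temp_location="gs://example-temp-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromText("gs://example-landing-bucket/events/*.json")
        | "ParseJson" >> beam.Map(json.loads)
        | "KeepPurchases" >> beam.Filter(lambda e: e.get("event_type") == "purchase")
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:analytics.purchases",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```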
Posted 1 month ago
4.0 - 8.0 years
11 - 15 Lacs
kolkata, mumbai, new delhi
Work from Office
MLOps Engineer. Location: Hyderabad & Chennai. Workplace Type: Hybrid. About the Role: We are seeking a highly skilled and motivated MLOps Engineer to join our growing data science team. In this role, you will be responsible for building, deploying, and maintaining machine learning models in production using Google Cloud Platform (GCP). You will work closely with data scientists and engineers to ensure the reliability, scalability, and performance of our ML systems. The ideal candidate has a strong background in software engineering, data engineering, and machine learning, with hands-on experience using GCP services such as BigQuery, Airflow, and Dataproc. You will be instrumental in automating our ...
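A hedged sketch of one deployment task implied by this posting: registering a trained model with Vertex AI and deploying it to an endpoint via the google-cloud-aiplatform SDK. The project, region, artifact path, and serving image are placeholder values, not from the posting.

```python
# Register a trained model in Vertex AI and deploy it to an online endpoint.
# All identifiers below are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="example-project", location="asia-south1")

model = aiplatform.Model.upload(
    display_name="churn-classifier",                     # hypothetical model
    artifact_uri="gs://example-models/churn/v1/",        # hypothetical artifact path
    serving_container_image_uri=(
        # Placeholder prebuilt sklearn serving image; swap for your framework.
        "asia-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
)

endpoint = model.deploy(
    machine_type="n1-standard-2",
    min_replica_count=1,
    max_replica_count=2,
)
print(f"Deployed to endpoint: {endpoint.resource_name}")
```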
Posted 1 month ago
4.0 - 8.0 years
10 - 15 Lacs
pune
Work from Office
About The Role: Job Title: Data Engineer (ETL, Python, SQL, GCP). Corporate Title: AVP. Location: Pune, India. Role Description: The Engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills develop through addressing enhancements and fixes to products; build reliability and resiliency into solutions through early testing, peer reviews and automating the delivery life cycle. The successful candidate should be able to work independently on medium to large-sized projects with strict deadlines. Successful candidates should be able to work in a cross appli...
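Because the role stresses building reliability through early testing, here is a tiny pytest-style sketch, not from the posting, that unit-tests a pure transformation function before it is wired into any GCP pipeline; the record shape and field names are invented for illustration.

```python
# A pure transform plus a unit test: the kind of early testing the role asks for.
# The trade-record shape is hypothetical.
from datetime import date


def parse_trade_row(raw: dict) -> dict:
    """Normalize one raw trade record: typed amount, ISO date, upper-cased currency."""
    return {
        "trade_id": raw["trade_id"],
        "amount": round(float(raw["amount"]), 2),
        "currency": raw["currency"].upper(),
        "trade_date": date.fromisoformat(raw["trade_date"]),
    }


def test_parse_trade_row_normalizes_fields():
    raw = {
        "trade_id": "T-1",
        "amount": "100.456",
        "currency": "eur",
        "trade_date": "2024-01-31",
    }
    parsed = parse_trade_row(raw)
    assert parsed["amount"] == 100.46
    assert parsed["currency"] == "EUR"
    assert parsed["trade_date"] == date(2024, 1, 31)
```

Keeping transforms pure like this lets them be tested in milliseconds in CI, independently of BigQuery or Dataflow.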
Posted 1 month ago
1.0 - 5.0 years
6 - 10 Lacs
bengaluru
Work from Office
Job Title: Data Engineer. Experience: 5–8 Years. Location: Delhi, Pune, Bangalore (Hyderabad & Chennai also acceptable). Time Zone: Aligned with UK Time Zone. Notice Period: Immediate Joiners Only. Role Overview: We are seeking experienced Data Engineers to design, develop, and optimize large-scale data processing systems. You will play a key role in building scalable, efficient, and reliable data pipelines in a cloud-native environment, leveraging your expertise in GCP, BigQuery, Dataflow, Dataproc, and more. Key Responsibilities: Design, build, and manage scalable and reliable data pipelines for real-time and batch processing. Implement robust data processing solutions using GCP services and open-source tec...
Posted 1 month ago
12.0 - 20.0 years
18 - 22 Lacs
mumbai, mangaluru, bengaluru
Hybrid
Location: Bangalore, Mumbai, Mangalore, Udupi, Delhi/NCR Region. About Niveus Solutions: Niveus Solutions is a distinguished Google Cloud Premier Partner, dedicated to empowering businesses with transformative cloud solutions. We pride ourselves on leveraging the comprehensive suite of Google Cloud technologies, spanning from robust infrastructure modernization to cutting-edge AI/ML, advanced analytics, and secure hybrid/multi-cloud strategies. Our team is composed of highly skilled industry experts committed to solving complex business challenges and accelerating digital transformation for our diverse clientele. About the Role: We are seeking an exceptionally accomplished and visionary Prin...
Posted 1 month ago
6.0 - 10.0 years
5 - 15 Lacs
pune, bengaluru, delhi / ncr
Work from Office
Role: Data Engineer. Required Technical Skill Set - experience in at least one of them (broad skills mentioned here; not all of them are required): Very good experience in implementing data platform modernization using GCP data services. Good data modelling skills on SQL/NoSQL-based data platforms. Must-Have: • Google Data Engineer (overall 5+ years with 2 years of relevant experience) • Experience working in GCP-based Big Data deployments (batch/real-time) leveraging components like GCP BigQuery, Airflow, Google Cloud Storage, Data Fusion, Dataflow, Dataproc etc. • Good skills in the Python language and PySpark • Good skills in Linux • Exposure to creation of CI/CD pipelines for promoting big data release deploymen...
Posted 1 month ago