5.0 - 13.0 years
0 Lacs
pune, maharashtra
On-site
You are a highly skilled and experienced Cloud Architect/Engineer with deep expertise in Google Cloud Platform (GCP). Your primary responsibility is to design, build, and manage scalable and reliable cloud infrastructure on GCP. You will leverage various GCP services such as Compute Engine, Cloud Run, BigQuery, Pub/Sub, Cloud Functions, Dataflow, Dataproc, IAM, and Cloud Storage to ensure high-performance cloud solutions. Your role also includes developing and maintaining CI/CD pipelines, automating infrastructure deployment using Infrastructure as Code (IaC) principles, and implementing best practices in cloud security, monitoring, performance tuning, and logging. Collaboration with cross-f...
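Among the services this role lists, Pub/Sub is one of the simplest to illustrate. Below is a minimal, hedged sketch of publishing an event with the google-cloud-pubsub client; the project, topic, and payload names are hypothetical placeholders, not anything specified by the posting.

```python
# Minimal Pub/Sub publish sketch; project and topic names are placeholders.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "infra-events")

event = {"service": "cloud-run", "status": "deployed"}
future = publisher.publish(
    topic_path,
    json.dumps(event).encode("utf-8"),  # message body must be bytes
    source="ci-pipeline",               # attributes are plain string key/values
)
print("Published message id:", future.result())
```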
Posted 3 months ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
You will be working as a Technical Lead Data Engineer for a leading data and AI/ML solutions provider based in Gurgaon. In this role, you will be responsible for designing, developing, and leading complex data projects primarily on Google Cloud Platform and other modern data stacks. Your key responsibilities will include leading the design and implementation of robust data pipelines, collaborating with cross-functional teams to deliver end-to-end data solutions, owning project modules, developing technical roadmaps, and implementing data governance frameworks on GCP. You will be required to integrate GCP data services like BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, and G...
Posted 3 months ago
5.0 - 7.0 years
5 - 14 Lacs
Pune, Gurugram, Bengaluru
Work from Office
• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP
• Building data pipelines for huge volumes of data
• Dataflow, Dataproc, and BigQuery
• Deep understanding of ETL concepts
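As a rough sketch of the kind of pipeline described, here is a small PySpark job that aggregates CSV files from Cloud Storage and loads the result into BigQuery. The bucket, dataset, and column names are hypothetical, and it assumes the spark-bigquery connector that ships with Dataproc clusters.

```python
# PySpark sketch: aggregate CSVs from GCS and load the result into BigQuery.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_agg").getOrCreate()

orders = (spark.read
          .option("header", True)
          .csv("gs://my-bucket/raw/orders/*.csv"))   # hypothetical input path

daily = (orders
         .withColumn("amount", F.col("amount").cast("double"))
         .groupBy("order_date")
         .agg(F.sum("amount").alias("total_amount"),
              F.countDistinct("customer_id").alias("unique_customers")))

(daily.write
 .format("bigquery")                                 # spark-bigquery connector
 .option("table", "my-project.analytics.daily_orders")
 .option("temporaryGcsBucket", "my-temp-bucket")     # staging bucket for the load
 .mode("overwrite")
 .save())
```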
Posted 3 months ago
5.0 - 8.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Skills desired: strong SQL (multi-pyramid SQL joins); Python (FastAPI or Flask framework); PySpark; commitment to work in overlapping hours; GCP knowledge (BigQuery, Dataproc, and Dataflow); Amex experience preferred (not mandatory); Power BI preferred (not mandatory). Keywords: Flask, PySpark, Python, SQL
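A minimal sketch of the FastAPI-plus-BigQuery combination this listing asks for; the table, dataset, and field names are hypothetical, and the endpoint is only illustrative.

```python
# FastAPI endpoint backed by a parameterized BigQuery query (illustrative only).
from fastapi import FastAPI
from google.cloud import bigquery

app = FastAPI()
bq = bigquery.Client()

@app.get("/spend/{customer_id}")
def customer_spend(customer_id: str):
    query = """
        SELECT SUM(amount) AS total_spend
        FROM `my-project.analytics.transactions`
        WHERE customer_id = @cid
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("cid", "STRING", customer_id)]
    )
    row = next(iter(bq.query(query, job_config=job_config).result()))
    return {"customer_id": customer_id, "total_spend": row.total_spend}
```

Run locally with `uvicorn app:app`; the parameterized query avoids string interpolation and keeps the endpoint safe against injection.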
Posted 3 months ago
4.0 - 8.0 years
12 - 18 Lacs
Hyderabad
Hybrid
Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results. Job Summary We are seeking a talented and passionate Python Developer t...
Posted 3 months ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Engineer, VP at our Pune location in India, you will be responsible for managing and performing work across various areas of the bank's IT Platform/Infrastructure. Your role will involve analysis, development, and administration, with possible oversight of engineering delivery for specific departments. Your day-to-day tasks will include planning and developing engineering solutions to achieve business goals, ensuring reliability and resiliency in solutions, and promoting maintainability and reusability. You will play a key role in architecting well-integrated solutions and reviewing engineering plans to enhance capability and reusability. You will collaborate with a cross-functio...
Posted 3 months ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a GCP Senior Data Engineer/Architect, you will play a crucial role in our team by designing, developing, and implementing robust and scalable data solutions on the Google Cloud Platform (GCP). Collaborating closely with Architects and Business Analysts, especially for our US clients, you will translate data requirements into effective technical solutions. Your responsibilities will include designing and implementing scalable data warehouse and data lake solutions, orchestrating complex data pipelines, leading cloud data lake implementation projects, participating in cloud migration projects, developing containerized applications, optimizing SQL queries, writing automation scripts in Pytho...
Posted 3 months ago
8.0 - 13.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced GCP Data Engineer with 8+ years of expertise in designing and implementing robust, scalable data architectures on Google Cloud Platform. Your role involves defining and leading the implementation of data architecture strategies using GCP services to meet business and technical requirements. As a visionary GCP Data Architect, you will be responsible for architecting and optimizing scalable data pipelines using Google Cloud Storage, BigQuery, Dataflow, Cloud Composer, Dataproc, and Pub/Sub. You will design solutions for large-scale batch processing and real-time streaming, leveraging tools like Dataproc for distributed data processing. Your responsibilities also include ...
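A bare-bones sketch of the real-time pattern this posting mentions: an Apache Beam streaming pipeline reading from Pub/Sub into windowed per-key counts. The subscription path, field names, and 60-second window are hypothetical choices for illustration.

```python
# Streaming Beam sketch: Pub/Sub -> fixed windows -> per-key counts.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

options = PipelineOptions(streaming=True)  # add runner flags (e.g. DataflowRunner) as needed

with beam.Pipeline(options=options) as p:
    (p
     | "ReadPubSub" >> beam.io.ReadFromPubSub(
           subscription="projects/my-project/subscriptions/events-sub")
     | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
     | "KeyByEventType" >> beam.Map(lambda rec: (rec["event_type"], 1))
     | "Window" >> beam.WindowInto(FixedWindows(60))     # 60-second windows
     | "CountPerType" >> beam.CombinePerKey(sum)
     | "Log" >> beam.Map(print))
```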
Posted 3 months ago
12.0 - 15.0 years
35 - 60 Lacs
Chennai, Bengaluru
Hybrid
Job Description: Job Title: GCP Solution Architect Location: Chennai | Bangalore Experience: 12-15 years in IT Key Responsibilities: Architect and lead GCP-native data and AI solutions tailored to AdTech use cases such as real-time bidding, campaign analytics, customer segmentation, and lookalike modeling. Design high-throughput data pipelines, audience data lakes, and analytics platforms leveraging GCP services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, Vertex AI, etc. Collaborate with ad operations, marketing teams, and digital product owners to understand business goals and translate them into scalable and performant solutions. Integrate with third-party AdTech and MarTech platform...
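To make the campaign-analytics angle concrete, here is a small hedged sketch of a parameterized BigQuery query; the dataset, table, and column names are hypothetical.

```python
# Parameterized BigQuery query sketch for campaign-level impression counts.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT campaign_id, COUNT(*) AS impressions
    FROM `my-project.adtech.impression_events`
    WHERE event_date BETWEEN @start_date AND @end_date
    GROUP BY campaign_id
    ORDER BY impressions DESC
"""
job_config = bigquery.QueryJobConfig(query_parameters=[
    bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    bigquery.ScalarQueryParameter("end_date", "DATE", "2024-01-31"),
])

for row in client.query(query, job_config=job_config).result():
    print(row.campaign_id, row.impressions)
```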
Posted 3 months ago
5.0 - 10.0 years
25 - 35 Lacs
Noida, Pune, Bengaluru
Work from Office
Description: We are seeking a proficient Data Governance Engineer to lead the development and management of robust data governance frameworks on Google Cloud Platform (GCP). The ideal candidate will bring in-depth expertise in data management, metadata frameworks, compliance, and security within cloud environments to ensure high-quality, secure, and compliant data practices aligned with organizational goals. Requirements: 4+ years of experience in data governance, data management, or data security. Hands-on experience with Google Cloud Platform (GCP) including BigQuery, Dataflow, Dataproc, and Google Data Catalog. Strong command over metadata management, data lineage, and data quality tools ...
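One small, hedged illustration of routine governance work on GCP: tagging a BigQuery dataset with classification labels via the Python client. The project, dataset, and label values are hypothetical; real metadata management would typically also involve Data Catalog/Dataplex, which this sketch does not cover.

```python
# Apply classification labels to a BigQuery dataset (illustrative governance step).
from google.cloud import bigquery

client = bigquery.Client()

dataset = client.get_dataset("my-project.marketing_analytics")  # hypothetical dataset
dataset.labels = {"data_domain": "marketing", "contains_pii": "true", "owner": "data-governance"}

# Only the 'labels' field is sent in the update request.
client.update_dataset(dataset, ["labels"])
print("Labels applied:", dataset.labels)
```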
Posted 3 months ago
4.0 - 7.0 years
18 - 20 Lacs
Pune
Hybrid
Job Title: GCP Data Engineer Location: Pune, India Experience: 4 to 7 Years Job Type: Full-Time Job Summary: We are looking for a highly skilled GCP Data Engineer with 4 to 7 years of experience to join our data engineering team in Pune. The ideal candidate should have strong experience working with Google Cloud Platform (GCP), including Dataproc and Cloud Composer (Apache Airflow), and must be proficient in Python, SQL, and Apache Spark. The role involves designing, building, and optimizing data pipelines and workflows to support enterprise-grade analytics and data science initiatives. Key Responsibilities: Design and implement scalable and efficient data pipelines on GCP, leveraging D...
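A minimal Cloud Composer (Airflow) sketch of the orchestration described: a daily DAG that submits a PySpark job to an existing Dataproc cluster. The project, cluster, bucket, and schedule are hypothetical, and it assumes the apache-airflow-providers-google package is installed.

```python
# Airflow DAG sketch: run a PySpark job on an existing Dataproc cluster each day.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PYSPARK_JOB = {
    "reference": {"project_id": "my-project"},
    "placement": {"cluster_name": "etl-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform.py"},
}

with DAG(
    dag_id="daily_spark_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_transform = DataprocSubmitJobOperator(
        task_id="run_transform",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="my-project",
    )
```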
Posted 3 months ago
7.0 - 10.0 years
20 - 27 Lacs
Noida
Work from Office
Job Responsibilities:
Technical Leadership:
• Provide technical leadership and mentorship to a team of data engineers.
• Design, architect, and implement highly scalable, resilient, and performant data pipelines; experience with GCP technologies (e.g., Dataproc, Cloud Composer, Pub/Sub, BigQuery) is a plus.
• Guide the team in adopting best practices for data engineering, including CI/CD, infrastructure-as-code, and automated testing.
• Conduct code reviews and design reviews, and provide constructive feedback to team members.
• Stay up to date with the latest technologies and trends in data engineering.
Data Pipeline Development:
• Develop and maintain robust and efficient data pipelines to ingest, proc...
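On the automated-testing point above, here is a small hedged sketch of unit-testing a Beam transform with Beam's own testing utilities; the transform and sample values are hypothetical.

```python
# Unit test sketch for a Beam transform using TestPipeline and assert_that.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to

def normalize(record):
    """Example transform under test: trim whitespace and upper-case the value."""
    return record.strip().upper()

def test_normalize():
    with TestPipeline() as p:
        output = (p
                  | beam.Create(["  alpha", "beta  "])
                  | beam.Map(normalize))
        assert_that(output, equal_to(["ALPHA", "BETA"]))
```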
Posted 3 months ago
7.0 - 10.0 years
1 - 6 Lacs
Chennai
Work from Office
Key Responsibilities: Design and develop large-scale data pipelines using GCP services (BigQuery, Dataflow, Dataproc, Pub/Sub). Implement batch and real-time ETL/ELT pipelines using Apache Beam and Spark. Manage and optimize BigQuery queries, partitioning, clustering, and cost control. Build distributed processing jobs on Dataproc (Hadoop/Spark) clusters. Develop and maintain streaming data pipelines with Pub/Sub and Dataflow. Work with Cloud Spanner to support highly available and globally scalable databases. Integrate data from various sources, manage schema evolution, and ensure data quality. Collaborate with data analysts, data scientists, and business teams to deliver scalable data solut...
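For the partitioning, clustering, and cost-control item, a hedged sketch of the kind of table definition involved, issued through the BigQuery client; the project, dataset, and columns are hypothetical.

```python
# Create a date-partitioned, clustered BigQuery table and require partition filters.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.events`
(
  event_ts    TIMESTAMP,
  user_id     STRING,
  campaign_id STRING,
  amount      NUMERIC
)
PARTITION BY DATE(event_ts)                  -- prune scans by date
CLUSTER BY campaign_id, user_id              -- co-locate frequently filtered keys
OPTIONS (require_partition_filter = TRUE)    -- block unfiltered full-table scans
"""
client.query(ddl).result()
```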
Posted 3 months ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Data Specialist, you will be responsible for utilizing your expertise in ETL Fundamentals, SQL, BigQuery, Dataproc, Python, Data Catalog, Data Warehousing, and various other tools to contribute to the successful implementation of data projects. Your role will involve working with technologies such as Cloud Trace, Cloud Logging, Cloud Storage, and Data Fusion to build and maintain a modern data platform. To excel in this position, you should possess a minimum of 5 years of experience in the data engineering field, with a focus on the GCP cloud data implementation suite, including BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage. Your strong understanding of very la...
Posted 3 months ago
5.0 - 10.0 years
4 - 9 Lacs
Chennai, Bengaluru
Work from Office
Dear Candidate, This is with reference to your profile on the job portal. Deloitte India Consulting has an immediate requirement for the following role. Notice period: immediate to 4 weeks (max). Location: Any. Job Description – Skill: GCP Data Engineer. In case you are interested, please share your updated resume along with the following details (mandatory) to Smouni@deloitte.com: Candidate Name, Mobile No., Email ID, Skill, Total Experience, Education Details, Current Location, Requested Location, Current Firm, Current CTC, Expected CTC, Notice Period/LWD, Feedback
Posted 3 months ago
15.0 - 20.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Project Role: AI/ML Engineer. Project Role Description: Develops applications and systems that utilize AI to improve performance and efficiency, including but not limited to deep learning, neural networks, chatbots, and natural language processing. Must-have skills: Google Cloud Machine Learning Services. Good-to-have skills: Google Pub/Sub, GCP Dataflow, Google Dataproc. Minimum 2 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an AI/ML Engineer, you will engage in the development of applications and systems that leverage artificial intelligence to enhance performance and efficiency. Your typical day will involve collaborating with cros...
Posted 3 months ago
4.0 - 8.0 years
16 - 25 Lacs
Gurugram
Hybrid
Bachelor's/Master's degree in Computer Science, Management of Information Systems, or equivalent. 2+ years of experience in GCP (BigQuery, Dataproc, Dataflow). 4 or more years of relevant software engineering experience (Big Data: Python, SQL, Hadoop, Hive, Spark) in a data-focused role. Strong experience in Big Data, Python, SQL, Spark, and cloud platforms (GCP/AWS/Azure). Experience in designing and building highly scalable and reliable data pipelines using Big Data tools (Airflow, Python, Redshift/Snowflake). Software development experience with proficiency in Python, Java, Scala, or another language. Good knowledge of Big Data querying tools such as Hive; experience with Spark/PySpark. Ability to ana...
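As a tiny sketch of the Hive/Spark querying side mentioned here; the database, table, and column names are hypothetical.

```python
# Spark SQL sketch querying a Hive-managed table from PySpark.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("clickstream_daily_counts")
         .enableHiveSupport()          # read tables registered in the Hive metastore
         .getOrCreate())

daily_counts = spark.sql("""
    SELECT dt, COUNT(*) AS events
    FROM analytics.clickstream
    WHERE dt >= '2024-01-01'
    GROUP BY dt
    ORDER BY dt
""")
daily_counts.show(truncate=False)
```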
Posted 3 months ago
4.0 - 9.0 years
5 - 14 Lacs
Pune, Chennai, Bengaluru
Work from Office
Dear Candidate, This is with reference to your profile on the job portal. Deloitte India Consulting has an immediate requirement for the following role. Job Summary: We are looking for a skilled GCP Data Engineer to design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform. The ideal candidate will have hands-on experience with GCP services, data warehousing, ETL processes, and big data technologies. Key Responsibilities: Design and implement scalable data pipelines using Cloud Dataflow, Apache Beam, and Cloud Composer. Develop and maintain data models and data marts in BigQuery. Build ETL/ELT workflows to ingest, transform, and load data from various so...
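A hedged sketch of the batch ETL pattern named here: an Apache Beam pipeline reading CSV lines from Cloud Storage and appending rows to BigQuery. The bucket, table, and schema are hypothetical.

```python
# Batch Beam sketch: read CSV lines from GCS, shape them, and append to BigQuery.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def to_row(line):
    order_id, customer_id, amount = line.split(",")
    return {"order_id": order_id, "customer_id": customer_id, "amount": float(amount)}

with beam.Pipeline(options=PipelineOptions()) as p:
    (p
     | "ReadCsv" >> beam.io.ReadFromText("gs://my-bucket/raw/orders.csv",
                                          skip_header_lines=1)
     | "ToRow" >> beam.Map(to_row)
     | "WriteBQ" >> beam.io.WriteToBigQuery(
           "my-project:analytics.orders",
           schema="order_id:STRING,customer_id:STRING,amount:FLOAT",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
```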
Posted 3 months ago
12.0 - 20.0 years
25 - 40 Lacs
Kolkata, Hyderabad, Pune
Work from Office
GCP Data Architect
Posted 3 months ago
5.0 - 10.0 years
12 - 22 Lacs
Kolkata, Hyderabad, Pune
Work from Office
GCP Engineer, Lead GCP Engineer
Posted 3 months ago
4.0 - 9.0 years
10 - 20 Lacs
Bengaluru
Remote
Job Description: Job Title: Apache Beam Software Engineer Work Mode: Remote Base Location: Bengaluru Experience Required: 4 to 6 Years Job Summary: We are looking for a Software Engineer with hands-on experience in Apache Beam, Google Cloud Dataflow, and Dataproc, focusing on building reusable data processing frameworks. This is not a traditional data engineering role. The ideal candidate will have strong software development skills in Java or Python and experience in building scalable, modular data processing components and frameworks for batch and streaming use cases. Key Responsibilities: Design and develop framework-level components using Apache Beam, GCP Dataflow, and Dataproc. B...
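Since this role is about reusable framework components rather than one-off pipelines, here is a hedged sketch of a composite Beam PTransform that can be shared across batch and streaming jobs; the field names and class name are hypothetical.

```python
# Reusable composite PTransform: decode JSON payloads and drop incomplete records.
import json
import apache_beam as beam

class ParseAndValidate(beam.PTransform):
    """Framework-style building block usable in both batch and streaming pipelines."""

    def __init__(self, required_fields):
        super().__init__()
        self.required_fields = list(required_fields)

    def expand(self, pcoll):
        return (pcoll
                | "DecodeJson" >> beam.Map(lambda raw: json.loads(raw))
                | "DropIncomplete" >> beam.Filter(
                      lambda rec: all(field in rec for field in self.required_fields)))

# Usage inside any pipeline:
#   records = raw_lines | ParseAndValidate(required_fields=["event_id", "event_ts"])
```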
Posted 3 months ago
4.0 - 8.0 years
10 - 14 Lacs
Chennai
Work from Office
Role Description: Provides leadership for the overall architecture, design, development, and deployment of a full-stack cloud-native data analytics platform. Designs and augments solution architecture for data ingestion, data preparation, data transformation, data load, ML and simulation modelling, Java BE and FE, state machine, API management, and intelligence consumption using data products on cloud. Understands business requirements and helps develop high-level and low-level data engineering and data processing documentation for the cloud-native architecture. Develops conceptual, logical, and physical target-state architecture, engineering, and operational specs. Works with the customer, user...
Posted 3 months ago
7.0 - 12.0 years
11 - 15 Lacs
Noida
Work from Office
Primary Skill(s): Lead Data Visualization Engineer with experience in Sigma BI. Experience: 7+ years of experience in Data Visualization with Sigma BI, Power BI, Tableau, or Looker. Job Summary: Lead Data Visualization Engineer with deep expertise in Sigma BI and a strong ability to craft meaningful, insight-rich visual stories for business stakeholders. This role will be instrumental in transforming raw data into intuitive dashboards and visual analytics, helping cross-functional teams make informed decisions quickly and effectively. Key Responsibilities: Lead the design, development, and deployment of Sigma BI dashboards and reports tailored for various business functions. Tra...
Posted 3 months ago
8.0 - 12.0 years
22 - 32 Lacs
Noida, Pune, Bengaluru
Hybrid
Build and optimize ELT/ETL pipelines using BigQuery, GCS, Dataflow, Pub/Sub, and orchestration services (Composer/Airflow)
• Hands-on experience in building ETL/ELT pipelines and developing software code in Python
• Experience in working with data warehouses, data warehouse technical architectures, and reporting/analytic tools
• Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data
• Demonstrate extensive skills and success in the implementation of technology projects within a professional environment, with a particular focus on data engineering
• Eager to learn and explore new services within GCP to enhance skills and contribution to projects
• ...
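A compact sketch of one ELT step from the stack listed above: a Composer/Airflow task loading files from GCS into BigQuery. The bucket, table, and schedule are hypothetical, and it assumes the apache-airflow-providers-google package.

```python
# Airflow task sketch: load newline-delimited JSON from GCS into a BigQuery table.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="gcs_to_bq_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="my-landing-bucket",
        source_objects=["events/{{ ds }}/*.json"],        # one folder per run date
        destination_project_dataset_table="my-project.analytics.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```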
Posted 3 months ago
5.0 - 10.0 years
8 - 18 Lacs
Hyderabad
Work from Office
Role: GCP Data Engineer Location: Hyderabad Duration: Full time Roles & Responsibilities:
* Design, develop, and maintain scalable and reliable data pipelines using Apache Airflow to orchestrate complex workflows.
* Utilize Google BigQuery for large-scale data warehousing, analysis, and querying of structured and semi-structured data.
* Leverage the Google Cloud Platform (GCP) ecosystem, including services like Cloud Storage, Compute Engine, AI Platform, and Dataflow, to build and deploy data science solutions.
* Develop, train, and deploy machine learning models to solve business problems such as forecasting, customer segmentation, and recommendation systems.
* Write clean, efficient, and w...
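One hedged illustration of the model-training side, using BigQuery ML so the model lives next to the warehoused data; the dataset, feature, and label names are hypothetical, and a real segmentation or forecasting workload might use a different model type or Vertex AI instead.

```python
# BigQuery ML sketch: train a simple churn classifier directly in the warehouse.
from google.cloud import bigquery

client = bigquery.Client()

client.query("""
    CREATE OR REPLACE MODEL `my-project.analytics.churn_model`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
    SELECT tenure_months, monthly_spend, support_tickets, churned
    FROM `my-project.analytics.customer_features`
""").result()

# Evaluate the trained model.
for row in client.query(
    "SELECT * FROM ML.EVALUATE(MODEL `my-project.analytics.churn_model`)"
).result():
    print(dict(row))
```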
Posted 3 months ago