Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 1 month ago
5.0 - 7.0 years
7 - 9 Lacs
Gurugram
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 1 month ago
5.0 - 9.0 years
20 - 35 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 20 to 35 LPA. Exp: 5 to 8 years. Location: Gurgaon (Hybrid). Notice: Immediate to 30 days. Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases. Troubleshoot issues related to data processing workflows and provide timely resolutions. Desired Candidate Profile: 5-9 years of experience in Data Engineering with expertise in GCP and BigQuery data engineering. Strong understanding of GCP Cloud Platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc. Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.
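The "complex SQL queries" this role calls for typically mean analytic SQL with window functions. A minimal sketch, using an in-memory SQLite table as a stand-in for a Cloud SQL or BigQuery dataset (the table and column names are invented for illustration):

```python
import sqlite3

# In-memory stand-in for a Cloud SQL / BigQuery table (schema is invented).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer_id TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('c1', '2024-01-05', 100.0),
  ('c1', '2024-02-10', 250.0),
  ('c2', '2024-01-20', 80.0),
  ('c2', '2024-03-01', 120.0);
""")

# A window-function query of the kind such roles ask for: running
# revenue per customer, ordered by order date.
rows = conn.execute("""
SELECT customer_id,
       order_date,
       SUM(amount) OVER (
         PARTITION BY customer_id ORDER BY order_date
       ) AS running_total
FROM orders
ORDER BY customer_id, order_date
""").fetchall()

for row in rows:
    print(row)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` shape carries over to BigQuery Standard SQL, where partitioned tables make the `PARTITION BY` choice a cost question as well as a logical one.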
Posted 1 month ago
4.0 - 8.0 years
20 - 35 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 20 to 35 LPA. Exp: 4 to 8 years. Location: Gurgaon. Notice: Immediate to 30 days. Key Skills: GCP, Cloud, Pub/Sub, Data Engineer.
Posted 1 month ago
8.0 - 13.0 years
35 - 90 Lacs
Pune, Chennai, Bangalore/Bengaluru
Work from Office
We are looking for a competent cloud architect to manage our cloud architecture and positioning in the cloud environment. This is a strategic role responsible for maintaining all cloud systems, including front-end platforms, servers, storage, and management networks.
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Remote
Job Description Job Title: Offshore Data Engineer Base Location: Bangalore Work Mode: Remote Experience: 5+ Years Job Description: We are looking for a skilled Offshore Data Engineer with strong experience in Python, SQL, and Apache Beam. Familiarity with Java is a plus. The ideal candidate should be self-driven, collaborative, and able to work in a fast-paced environment. Key Responsibilities: Design and implement reusable, scalable ETL frameworks using Apache Beam and GCP Dataflow. Develop robust data ingestion and transformation pipelines using Python and SQL. Integrate Kafka for real-time data streams alongside batch workloads. Optimize pipeline performance and manage costs within GCP services. Work closely with data analysts, data architects, and product teams to gather and understand data requirements. Manage and monitor BigQuery datasets, tables, and partitioning strategies. Implement error handling, resiliency, and observability mechanisms across pipeline components. Collaborate with DevOps teams to enable automated delivery (CI/CD) for data pipeline components. Required Skills: 5+ years of hands-on experience in Data Engineering or Software Engineering. Proficiency in Python and SQL. Good understanding of Java (for reading or modifying codebases). Experience building ETL pipelines with Apache Beam and Google Cloud Dataflow. Hands-on experience with Apache Kafka for stream processing. Solid understanding of BigQuery and data modeling on GCP. Experience with GCP services (Cloud Storage, Pub/Sub, Cloud Composer, etc.). Good to Have: Experience building reusable ETL libraries or framework components. Knowledge of data governance, data quality checks, and pipeline observability. Familiarity with Apache Airflow or Cloud Composer for orchestration. Exposure to CI/CD practices in a cloud-native environment (Docker, Terraform, etc.).
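The "reusable ETL framework" idea above can be sketched without the Beam SDK: a tiny pipeline abstraction in plain Python that echoes Beam's read → transform → write shape. This is a toy model under invented names, not the Apache Beam API itself:

```python
from typing import Callable, Iterable, List

# Toy pipeline abstraction echoing Apache Beam's composable transforms.
# Stage names and record shapes are invented for illustration.
class Pipeline:
    def __init__(self, records: Iterable[dict]):
        self.records = list(records)

    def transform(self, fn: Callable[[dict], dict]) -> "Pipeline":
        return Pipeline(fn(r) for r in self.records)

    def filter(self, pred: Callable[[dict], bool]) -> "Pipeline":
        return Pipeline(r for r in self.records if pred(r))

    def collect(self) -> List[dict]:
        return self.records

raw = [
    {"user": "a", "amount": "10"},
    {"user": "b", "amount": "bad"},   # malformed record, will be dropped
    {"user": "c", "amount": "30"},
]

def parse(rec: dict) -> dict:
    # Normalize the amount field; flag unparseable rows with None.
    try:
        return {**rec, "amount": float(rec["amount"])}
    except ValueError:
        return {**rec, "amount": None}

clean = (
    Pipeline(raw)
    .transform(parse)
    .filter(lambda r: r["amount"] is not None)
    .collect()
)
print(clean)
```

In real Beam the same chain would be `PCollection | beam.Map(parse) | beam.Filter(...)`, with the runner (e.g., Dataflow) handling distribution, but the framework-design question, keeping each stage small, pure, and reusable, is the same.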
Tech stack : Python, SQL, Java, GCP (BigQuery, Pub/Sub, Cloud Storage, Cloud Compose, Dataflow), Apache Beam, Apache Kafka, Apache Airflow, CI/CD (Docker, Terraform)
Posted 1 month ago
5.0 - 9.0 years
20 - 27 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Designing and implementing data processing systems using Microsoft Fabric, Azure Data Analytics, Databricks, and other distributed frameworks (e.g., Hadoop, Spark, Snowflake, Airflow). Writing efficient and scalable code to process, transform, and clean large volumes of structured and unstructured data. Designing data pipelines: Snowflake Data Cloud uses data pipelines to ingest data into its system from sources like databases, cloud storage, or streaming platforms. A Snowflake Data Engineer designs, builds, and fine-tunes these pipelines to make sure that all data is loaded into Snowflake correctly.
Posted 1 month ago
4.0 - 8.0 years
20 - 25 Lacs
Bengaluru
Work from Office
The Data Services team at NetApp helps customers protect and govern their data wherever it lives. We are responsible for driving preference for our primary storage platforms through a consolidated set of data protection and governance offerings. We are seeking a dynamic and experienced Product Manager. The ideal candidate will have a strong background in product management with a basic understanding of the Data Protection market, and is expected to work closely with enterprise storage, cloud storage, product marketing, and customers to define the vision and roadmap for the space, and to be excellent at driving planning and execution for these broad areas. Job Requirements: Experience in product management is required. Drive product strategy, roadmap planning, and execution for backup and recovery. Collaborate with engineering, marketing, sales, and other cross-functional teams to ensure successful product development, launch, and go-to-market strategies. Build simple but usable products that delight our users and buyer personas alike. Collaborate with senior leadership to align product strategy with business objectives and contribute to the company's overall growth and success. Strong leadership skills - demonstrated ability to lead from the front. Exceptional problem-solving abilities, with a demonstrated track record of delivering revenue-driven product launches. Customer obsession - demonstrated building a strong understanding of customer problems and building capabilities that help resolve their pain points. If you are a highly motivated and experienced product management leader who thrives in a collaborative and innovative environment, we would love to hear from you. Join our team and play a pivotal role in shaping the future of our products and driving our company's success.
Job Responsibilities: Work closely with top customers to understand their needs. Drive product strategy, roadmap planning, and execution for backup and recovery. Collaborate with engineering, marketing, sales, and other cross-functional teams to ensure successful product development, launch, and go-to-market strategies. Build simple but usable products that delight our users and buyer personas alike. Collaborate with senior leadership to align product strategy with business objectives and contribute to the company's overall growth and success. Education: A minimum of 4+ years of experience in product management or a related technical role; 4 to 8 years of experience is preferred. A Bachelor of Science degree in Electrical Engineering or Computer Science, a Master's degree, a PhD, or equivalent experience is required.
Posted 1 month ago
4.0 - 8.0 years
6 - 16 Lacs
Hyderabad, Chennai
Hybrid
Role & responsibilities: Bachelor's degree or four or more years of work experience. Four or more years of work experience. Experience with Data Warehouse concepts and the Data Management life cycle. Experience in any DBMS. Experience in shell scripting, Spark, Scala. Experience in GCP/BigQuery, Composer, Airflow. Experience in real-time streaming. Experience in DevOps.
Posted 1 month ago
5.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 1 month ago
2.0 - 7.0 years
6 - 9 Lacs
Chennai
Work from Office
Job Title: GCP Data Engineer Location: Chennai, India Job type: FTE Mandatory Skills: Google Cloud Platform - BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, API Job Description: 2+ years in GCP services: BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Redis Memorystore, Airflow, Cloud Storage. 2+ years in data transfer utilities. 2+ years in Git / any other version control tool. 2+ years in Confluent Kafka. 1+ years of experience in API development. 2+ years in an Agile framework. 4+ years of strong experience in Python and PySpark development. 4+ years of shell scripting to develop ad hoc jobs for data importing/exporting.
Posted 1 month ago
3.0 - 5.0 years
10 - 11 Lacs
Pune
Work from Office
We are seeking an experienced Azure Data Factory Engineer to design, develop, and manage data pipelines using Azure Data Factory. The ideal candidate will possess hands-on expertise in ADF components and activities, and have practical knowledge of incremental data loading, file management, API integration, and cloud storage solutions. This role involves automating data workflows, optimizing performance, and ensuring the seamless flow of data within our cloud environment. Key Responsibilities: Design and Develop Data Pipelines: Build and maintain scalable data pipelines using Azure Data Factory, ensuring efficient and reliable data movement and transformation. Incremental Data Loads: Implement and manage incremental data loading processes to ensure that only updated or new data is processed, optimizing data pipeline performance and reducing resource consumption. File Management: Handle data ingestion and management from various file sources, including CSV, JSON, and Parquet formats, ensuring data accuracy and consistency. API Integration: Develop and configure data pipelines to interact with RESTful APIs for data extraction and integration, handling authentication and data retrieval processes effectively. Cloud Storage Management: Work with Azure Blob Storage and Azure Data Lake Storage to manage and utilize cloud storage solutions, ensuring data is securely stored and easily accessible. ADF Automation: Leverage Azure Data Factory's automation capabilities to schedule and monitor data workflows, ensuring timely execution and error-free operations. Performance Optimization: Continuously monitor and optimize data pipeline performance, troubleshoot issues, and implement best practices to enhance efficiency. Collaboration: Work closely with data engineers, analysts, and other stakeholders to gather requirements, provide technical guidance, and ensure successful data integration solutions.
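The incremental-loading responsibility above usually comes down to a watermark pattern: remember the highest modification timestamp you have seen, and on each run pick up only rows newer than it. A minimal sketch with an in-memory source (row shapes and names are invented; in ADF the watermark would live in a control table and the filter in a lookup/copy activity):

```python
# Watermark-based incremental load. Source rows and the watermark store
# are in-memory stand-ins; ISO-8601 strings compare chronologically.
source_rows = [
    {"id": 1, "modified": "2024-05-01T10:00:00"},
    {"id": 2, "modified": "2024-05-02T09:30:00"},
    {"id": 3, "modified": "2024-05-03T18:45:00"},
]

def incremental_load(rows, watermark: str):
    """Return rows newer than the watermark, plus the advanced watermark."""
    fresh = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in fresh), default=watermark)
    return fresh, new_watermark

# First run: everything after the initial watermark is loaded.
batch, wm = incremental_load(source_rows, "2024-05-01T23:59:59")
# Second run with no new rows: nothing is re-loaded.
batch2, wm2 = incremental_load(source_rows, wm)
print(len(batch), len(batch2))  # 2 0
```

The key property is idempotence: re-running with an unchanged source moves no data, which is what makes scheduled pipelines cheap to retry.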
Qualifications: Educational Background: Bachelor's degree in Computer Science, Information Technology, or a related field (B.E., B.Tech, MCA, MCS). Advanced degrees or certifications are a plus. Experience: Minimum 3-5 years of hands-on experience with Azure Data Factory, including designing and implementing complex data pipelines. Technical Skills: Strong knowledge of ADF components and activities, including datasets, pipelines, data flows, and triggers. Proficiency in incremental data loading techniques and optimization strategies. Experience working with various file formats and handling large-scale data files. Proven ability to integrate and interact with APIs for data retrieval and processing. Hands-on experience with Azure Blob Storage and Azure Data Lake Storage. Familiarity with ADF automation features and scheduling. Soft Skills: Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Ability to work independently and manage multiple tasks effectively.
Posted 1 month ago
0.0 - 5.0 years
5 - 13 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Design, develop, and deploy modular cloud-based systems. Develop and maintain cloud solutions in accordance with best practices. Ensure efficient functioning of data storage and process functions in accordance with company security policies. Required Candidate profile Azure, AWS, and GCP certifications preferred. Troubleshooting and analytical skills. Strong communication and collaboration skills. Client management skills to discuss systems as needed. Perks and benefits Free meals and snacks. Bonus. Vision insurance.
Posted 1 month ago
0.0 - 2.0 years
5 - 11 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Design, develop, and deploy modular cloud-based systems. Develop and maintain cloud solutions in accordance with best practices. Ensure efficient functioning of data storage and process functions in accordance with company security policies. Required Candidate profile Azure, AWS, and GCP certifications preferred. Troubleshooting and analytical skills. Strong communication and collaboration skills. Client management skills to discuss systems as needed. Perks and benefits Free meals and snacks. Bonus. Vision insurance.
Posted 1 month ago
8.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
What you’ll be doing: Assist in developing machine learning models based on project requirements. Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality. Perform statistical analysis and fine-tuning using test results. Support training and retraining of ML systems as needed. Help build data pipelines for collecting and processing data efficiently. Follow coding and quality standards while developing AI/ML solutions. Contribute to frameworks that help operationalize AI models. What we seek in you: 8+ years of experience in the IT industry. Strong in programming languages like Python. Hands-on experience with one cloud (GCP preferred). Experience working with Docker. Environment management (e.g., venv, pip, poetry). Experience with orchestrators like Vertex AI Pipelines, Airflow, etc. Understanding of the full ML cycle end to end. Data engineering and feature engineering techniques. Experience with ML modelling and evaluation metrics. Experience with TensorFlow, PyTorch, or another framework. Experience with model monitoring. Advanced SQL knowledge. Aware of streaming concepts like windowing, late arrival, triggers, etc. Storage: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases. Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices. Schedule: Cloud Composer, Airflow. Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink. CI/CD: Bitbucket + Jenkins / GitLab. Infrastructure as code: Terraform. Life at Next: At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress.
Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth. Perks of working with us: Clear objectives to ensure alignment with our mission, fostering your meaningful contribution. Abundant opportunities for engagement with customers, product managers, and leadership. You'll be guided by progressive paths while receiving insightful guidance from managers through ongoing feedforward sessions. Cultivate and leverage robust connections within diverse communities of interest. Choose your mentor to navigate your current endeavors and steer your future trajectory. Embrace continuous learning and upskilling opportunities through Nexversity. Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies. Embrace a hybrid work model promoting work-life balance. Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones. Embark on accelerated career paths to actualize your professional aspirations. Who are we? We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!
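The streaming concepts this posting lists (windowing, late arrival, triggers) can be illustrated with a toy tumbling-window aggregator. Everything here is a simplified stand-in, event times are bare integers and the watermark is passed in by hand, but the drop-or-include decision is the same one Beam/Dataflow makes with allowed lateness:

```python
from collections import defaultdict

# Tumbling-window aggregation with allowed lateness. Window size and
# lateness bound are invented example values.
WINDOW = 60           # 60-second tumbling windows
ALLOWED_LATENESS = 30 # events older than watermark - 30s are discarded

def window_start(event_time: int) -> int:
    # Map an event time to the start of its tumbling window.
    return (event_time // WINDOW) * WINDOW

def aggregate(events, watermark: int):
    """Sum values per window, dropping events beyond allowed lateness."""
    sums = defaultdict(int)
    dropped = []
    for event_time, value in events:
        if event_time < watermark - ALLOWED_LATENESS:
            dropped.append((event_time, value))  # too late, discard
        else:
            sums[window_start(event_time)] += value
    return dict(sums), dropped

events = [(5, 1), (65, 2), (70, 3), (10, 4)]  # (event_time, value)
result, dropped = aggregate(events, watermark=90)
print(result, dropped)
```

With the watermark at 90 and 30 seconds of lateness, anything older than time 60 is dropped; the two events in the [60, 120) window survive and are summed. Triggers, in a real engine, decide *when* such partial sums are emitted.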
Posted 1 month ago
3.0 - 5.0 years
5 - 8 Lacs
Bengaluru
Work from Office
What you’ll be doing: Assist in developing machine learning models based on project requirements. Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality. Perform statistical analysis and fine-tuning using test results. Support training and retraining of ML systems as needed. Help build data pipelines for collecting and processing data efficiently. Follow coding and quality standards while developing AI/ML solutions. Contribute to frameworks that help operationalize AI models. What we seek in you: Strong in programming languages like Python and Java. Hands-on experience with one cloud (GCP preferred). Experience working with Docker. Environment management (e.g., venv, pip, poetry). Experience with orchestrators like Vertex AI Pipelines, Airflow, etc. Understanding of the full ML cycle end to end. Data engineering and feature engineering techniques. Experience with ML modelling and evaluation metrics. Experience with TensorFlow, PyTorch, or another framework. Experience with model monitoring. Advanced SQL knowledge. Aware of streaming concepts like windowing, late arrival, triggers, etc. Storage: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases. Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices. Schedule: Cloud Composer, Airflow. Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink. CI/CD: Bitbucket + Jenkins / GitLab. Infrastructure as code: Terraform. Life at Next: At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress.
Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth. Perks of working with us: Clear objectives to ensure alignment with our mission, fostering your meaningful contribution. Abundant opportunities for engagement with customers, product managers, and leadership. You'll be guided by progressive paths while receiving insightful guidance from managers through ongoing feedforward sessions. Cultivate and leverage robust connections within diverse communities of interest. Choose your mentor to navigate your current endeavors and steer your future trajectory. Embrace continuous learning and upskilling opportunities through Nexversity. Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies. Embrace a hybrid work model promoting work-life balance. Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones. Embark on accelerated career paths to actualize your professional aspirations. Who are we? We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!
Posted 1 month ago
5.0 - 8.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary We are seeking a skilled and innovative Cloud Engineer to join our team. As a Cloud Engineer, you will be responsible for developing and maintaining cloud-based solutions, with a focus on coding complex problems, automation, and collaborating with the Site Reliability Engineering team for feature deployment in production. You will also be responsible for designing and implementing managed cloud services according to the given requirements. Additionally, you should be able to quickly learn the existing code and architecture. Job Requirements • Proficient with Go, C++, or C#. Experience with Python, Java, or C# is an added advantage. • Hands-on expertise in container-based technologies, preferably Kubernetes and Docker. • Thorough understanding and extensive experience with building orchestration on at least one of the major hyperscaler cloud providers (AWS, Microsoft Azure, Google Cloud Platform). • Experienced with cloud service APIs (e.g., AWS, Azure, or GCP). • Experience working with SMB, NFS, and internet security protocols. • Expertise in REST API design and implementation. • Thorough understanding of Linux or other Unix-like operating systems. • Should own the deliverables end to end: design, implementation, and test automation to ensure a high-quality deliverable. • Experience with CI build systems or automated testing. • Understands design and architecture principles and best practices in cloud, along with cloud health monitoring, capacity metering, and billing. • Highly knowledgeable in infrastructure such as hypervisors and cloud storage, with experience in cloud services including databases, caching, object and block storage, scaling, load balancers, networking, etc. Education • Minimum 5 years of experience, and must be hands-on with coding. • B.E/B.Tech or M.S in Computer Science or a related technical field.
Posted 1 month ago
6.0 - 9.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job Title: Senior GCP Data Engineer Location: Chennai, Bangalore, Hyderabad Experience: 6-9 Years Job Summary: We are seeking a GCP Data & Cloud Engineer with strong expertise in Google Cloud Platform services, including BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. The ideal candidate will have deep experience in SQL coding, data pipeline development, and deploying cloud-native solutions. Key Responsibilities: Design, implement, and optimize scalable data pipelines and services using GCP. Build and manage cloud-native applications deployed via Cloud Run. Develop complex and performance-optimized SQL queries for analytics and data transformation. Manage and automate data storage, retrieval, and archival using Cloud Storage. Implement event-driven architectures using Google Pub/Sub. Work with large datasets in BigQuery, including ETL/ELT design and query optimization. Ensure security, monitoring, and compliance of cloud-based systems. Collaborate with data analysts, engineers, and product teams to deliver end-to-end cloud solutions. Required Skills & Experience: 3+ years of experience working with Google Cloud Platform (GCP). Strong proficiency in SQL coding, query tuning, and handling complex data transformations. Hands-on experience with BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. Understanding of data pipeline and ETL/ELT workflows in cloud environments. Familiarity with containerized services and CI/CD pipelines. Experience in scripting languages (e.g., Python, Shell) is a plus. Strong analytical and problem-solving skills.
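The event-driven architecture this role describes rests on the publish/subscribe pattern. A minimal in-process sketch (topic names and handlers are invented; the real Pub/Sub service delivers messages over the network with acknowledgements and retries):

```python
from collections import defaultdict

# Minimal in-process publish/subscribe broker, illustrating the pattern
# a Pub/Sub-based design relies on. Not the Google Cloud client API.
class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Register a callback for messages on a topic.
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        # Fan out the message to every subscriber of the topic.
        for handler in self.subscribers[topic]:
            handler(message)

broker = Broker()
received = []
broker.subscribe("orders.created", received.append)
broker.publish("orders.created", {"order_id": 42})
broker.publish("orders.deleted", {"order_id": 7})  # no subscriber: no-op
print(received)  # [{'order_id': 42}]
```

The decoupling is the point: publishers never know who consumes an event, which is what lets a Cloud Run service, a Dataflow job, and a BigQuery sink all react to the same topic independently.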
Posted 1 month ago
4.0 - 6.0 years
6 - 9 Lacs
Bengaluru
Work from Office
Job Title: GCP Data Engineer Location: Chennai, Bangalore, Hyderabad Experience: 4-6 Years Job Summary: We are seeking a GCP Data & Cloud Engineer with strong expertise in Google Cloud Platform services, including BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. The ideal candidate will have deep experience in SQL coding, data pipeline development, and deploying cloud-native solutions. Key Responsibilities: Design, implement, and optimize scalable data pipelines and services using GCP. Build and manage cloud-native applications deployed via Cloud Run. Develop complex and performance-optimized SQL queries for analytics and data transformation. Manage and automate data storage, retrieval, and archival using Cloud Storage. Implement event-driven architectures using Google Pub/Sub. Work with large datasets in BigQuery, including ETL/ELT design and query optimization. Ensure security, monitoring, and compliance of cloud-based systems. Collaborate with data analysts, engineers, and product teams to deliver end-to-end cloud solutions. Required Skills & Experience: 4 years of experience working with Google Cloud Platform (GCP). Strong proficiency in SQL coding, query tuning, and handling complex data transformations. Hands-on experience with BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. Understanding of data pipeline and ETL/ELT workflows in cloud environments. Familiarity with containerized services and CI/CD pipelines. Experience in scripting languages (e.g., Python, Shell) is a plus. Strong analytical and problem-solving skills.
Posted 1 month ago
9.0 - 14.0 years
22 - 37 Lacs
Bengaluru
Remote
Minimum 7+ years in data engineering, with 5+ years of hands-on experience on GCP. Proven track record with tools and services such as BigQuery, Cloud Composer (Apache Airflow), Cloud Functions, Pub/Sub, Cloud Storage, Dataflow, and IAM/VPC. Demonstrated expertise in Apache Spark (batch and streaming), PySpark, and building scalable API integrations. Advanced Airflow skills, including custom operators, dynamic DAGs, and workflow performance tuning.

Certifications:
Google Cloud Professional Data Engineer certification preferred.

Key Skills (Mandatory Technical Skills):
- Advanced Python (PySpark, Pandas, pytest) for automation and data pipelines
- Strong SQL, with experience in window functions, CTEs, partitioning, and optimization
- Proficiency in GCP services, including BigQuery, Dataflow, Cloud Composer, Cloud Functions, and Cloud Storage
- Hands-on with Apache Airflow, including dynamic DAGs, retries, and SLA enforcement
- Expertise in API data ingestion, Postman collections, and REST/GraphQL integration workflows
- Familiarity with CI/CD workflows using Git, Jenkins, or Bitbucket
- Experience with infrastructure security and governance using IAM and VPC
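The API-ingestion work this posting emphasizes usually reduces to walking a cursor-paginated endpoint to completion. A minimal, library-free sketch of that loop; the `fetch_page` callable and the `items`/`next_cursor` field names are illustrative stand-ins for a real REST client and response schema:

```python
from typing import Callable, Iterator, Optional

def ingest_all(fetch_page: Callable[[Optional[str]], dict]) -> Iterator[dict]:
    """Yield every record from a cursor-paginated API.

    fetch_page(cursor) is assumed to return a dict shaped like
    {"items": [...], "next_cursor": str | None}.
    """
    cursor = None
    while True:
        page = fetch_page(cursor)
        yield from page["items"]
        cursor = page.get("next_cursor")
        if cursor is None:
            break

# Fake two-page API standing in for a real HTTP client:
pages = {
    None: {"items": [{"id": 1}, {"id": 2}], "next_cursor": "p2"},
    "p2": {"items": [{"id": 3}], "next_cursor": None},
}
records = list(ingest_all(lambda cursor: pages[cursor]))
print([r["id"] for r in records])  # [1, 2, 3]
```

In an Airflow task, the generator would typically feed a batched loader into BigQuery or Cloud Storage, with retries and SLA enforcement handled at the task level rather than inside the loop.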
Posted 1 month ago
4.0 - 9.0 years
8 - 12 Lacs
Noida
Work from Office
Req ID: 327316
We are currently seeking a GCP & GKE - Sr Cloud Engineer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Job Title / Role: GCP & GKE - Sr Cloud Engineer
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 4+ yrs
Total Experience: 4+ Years

Mandatory Skills:

Technical Qualification / Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution
- Prior experience using industry-leading or native discovery, assessment, and migration tools
- Good knowledge of cloud technology, deployment patterns, deployment methods, and application compatibility
- Good knowledge of GCP technologies and their associated components and variations, including the Anthos application platform
- Working knowledge of GCE, GAE, GKE, and GCS
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK
- Creating databases in GCP and in VMs
- Knowledge of BigQuery for data analysis
- Knowledge of cost analysis and cost optimization
- Knowledge of Git and GitHub
- Knowledge of Terraform and Jenkins
- Monitoring VMs and applications using Stackdriver
- Working knowledge of VPN and Interconnect setup
- Hands-on experience setting up HA environments
- Hands-on experience creating VM instances in Google Cloud Platform
- Hands-on experience with Cloud Storage and storage retention policies
- Managing users in Google IAM and granting them appropriate permissions
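The Terraform provisioning experience listed above might look like the following minimal `google_compute_instance` sketch; the project ID, names, region, and machine type are placeholders, not a production configuration:

```hcl
# Minimal GCE instance -- illustrative values only.
provider "google" {
  project = "my-demo-project"   # placeholder project ID
  region  = "asia-south1"
}

resource "google_compute_instance" "demo_vm" {
  name         = "demo-vm"
  machine_type = "e2-small"
  zone         = "asia-south1-a"

  boot_disk {
    initialize_params {
      image = "debian-cloud/debian-12"
    }
  }

  network_interface {
    network = "default"
  }
}
```

The same resource can equally be created from the GCP console or with `gcloud compute instances create`; Terraform's advantage is that the definition is versioned and repeatable.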
GKE:
- Install Tools: set up Kubernetes tools
- Administer a Cluster
- Configure Pods and Containers: perform common configuration tasks for Pods and containers
- Monitoring, Logging, and Debugging
- Inject Data Into Applications: specify configuration and other data for the Pods that run your workload
- Run Applications: run and manage both stateless and stateful applications
- Run Jobs: run Jobs using parallel processing
- Access Applications in a Cluster
- Extend Kubernetes: understand advanced ways to adapt your Kubernetes cluster to the needs of your work environment
- Manage Cluster Daemons: perform common tasks for managing a DaemonSet, such as performing a rolling update
- Extend kubectl with plugins: extend kubectl by creating and installing kubectl plugins
- Manage HugePages: configure and manage huge pages as a schedulable resource in a cluster
- Schedule GPUs: configure and schedule GPUs for use as a resource by nodes in a cluster

Certification: GCP Engineer & GKE

Academic Qualification: B.Tech or equivalent, or MCA

Process / Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired
- Knowledge of quality processes
- Knowledge of security processes

Soft Skills:
- Good communication skills and the ability to work directly with global customers
- Timely and accurate communication
- Demonstrated ownership of technical issues, engaging the right stakeholders for timely resolution
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation
Posted 1 month ago
4.0 - 9.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Req ID: 327258
We are currently seeking a GCP & GKE - Sr Cloud Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Title / Role: GCP & GKE - Sr Cloud Engineer
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 4+ yrs
Total Experience: 4+ Years

Mandatory Skills:

Technical Qualification / Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution
- Prior experience using industry-leading or native discovery, assessment, and migration tools
- Good knowledge of cloud technology, deployment patterns, deployment methods, and application compatibility
- Good knowledge of GCP technologies and their associated components and variations, including the Anthos application platform
- Working knowledge of GCE, GAE, GKE, and GCS
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK
- Creating databases in GCP and in VMs
- Knowledge of BigQuery for data analysis
- Knowledge of cost analysis and cost optimization
- Knowledge of Git and GitHub
- Knowledge of Terraform and Jenkins
- Monitoring VMs and applications using Stackdriver
- Working knowledge of VPN and Interconnect setup
- Hands-on experience setting up HA environments
- Hands-on experience creating VM instances in Google Cloud Platform
- Hands-on experience with Cloud Storage and storage retention policies
- Managing users in Google IAM and granting them appropriate permissions
GKE:
- Install Tools: set up Kubernetes tools
- Administer a Cluster
- Configure Pods and Containers: perform common configuration tasks for Pods and containers
- Monitoring, Logging, and Debugging
- Inject Data Into Applications: specify configuration and other data for the Pods that run your workload
- Run Applications: run and manage both stateless and stateful applications
- Run Jobs: run Jobs using parallel processing
- Access Applications in a Cluster
- Extend Kubernetes: understand advanced ways to adapt your Kubernetes cluster to the needs of your work environment
- Manage Cluster Daemons: perform common tasks for managing a DaemonSet, such as performing a rolling update
- Extend kubectl with plugins: extend kubectl by creating and installing kubectl plugins
- Manage HugePages: configure and manage huge pages as a schedulable resource in a cluster
- Schedule GPUs: configure and schedule GPUs for use as a resource by nodes in a cluster

Certification: GCP Engineer & GKE

Academic Qualification: B.Tech or equivalent, or MCA

Process / Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired
- Knowledge of quality processes
- Knowledge of security processes

Soft Skills:
- Good communication skills and the ability to work directly with global customers
- Timely and accurate communication
- Demonstrated ownership of technical issues, engaging the right stakeholders for timely resolution
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation
Posted 1 month ago
4.0 - 9.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Req ID: 327315
We are currently seeking a GCP & GKE - Sr Cloud Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Title / Role: GCP & GKE - Sr Cloud Engineer
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 4+ yrs
Total Experience: 4+ Years

Mandatory Skills:

Technical Qualification / Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution
- Prior experience using industry-leading or native discovery, assessment, and migration tools
- Good knowledge of cloud technology, deployment patterns, deployment methods, and application compatibility
- Good knowledge of GCP technologies and their associated components and variations, including the Anthos application platform
- Working knowledge of GCE, GAE, GKE, and GCS
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK
- Creating databases in GCP and in VMs
- Knowledge of BigQuery for data analysis
- Knowledge of cost analysis and cost optimization
- Knowledge of Git and GitHub
- Knowledge of Terraform and Jenkins
- Monitoring VMs and applications using Stackdriver
- Working knowledge of VPN and Interconnect setup
- Hands-on experience setting up HA environments
- Hands-on experience creating VM instances in Google Cloud Platform
- Hands-on experience with Cloud Storage and storage retention policies
- Managing users in Google IAM and granting them appropriate permissions
GKE:
- Install Tools: set up Kubernetes tools
- Administer a Cluster
- Configure Pods and Containers: perform common configuration tasks for Pods and containers
- Monitoring, Logging, and Debugging
- Inject Data Into Applications: specify configuration and other data for the Pods that run your workload
- Run Applications: run and manage both stateless and stateful applications
- Run Jobs: run Jobs using parallel processing
- Access Applications in a Cluster
- Extend Kubernetes: understand advanced ways to adapt your Kubernetes cluster to the needs of your work environment
- Manage Cluster Daemons: perform common tasks for managing a DaemonSet, such as performing a rolling update
- Extend kubectl with plugins: extend kubectl by creating and installing kubectl plugins
- Manage HugePages: configure and manage huge pages as a schedulable resource in a cluster
- Schedule GPUs: configure and schedule GPUs for use as a resource by nodes in a cluster

Certification: GCP Engineer & GKE

Academic Qualification: B.Tech or equivalent, or MCA

Process / Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired
- Knowledge of quality processes
- Knowledge of security processes

Soft Skills:
- Good communication skills and the ability to work directly with global customers
- Timely and accurate communication
- Demonstrated ownership of technical issues, engaging the right stakeholders for timely resolution
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation
Posted 1 month ago