
23 Cloud Composer Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 11.0 years

17 - 30 Lacs

Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR

Hybrid


Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of GCP Sr Data Engineer.

We are seeking a highly experienced and visionary Senior Google Cloud Data Engineer to spearhead the design, development, and optimization of our data infrastructure and pipelines on the Google Cloud Platform (GCP). With over 10 years of hands-on experience in data engineering, you will be instrumental in building scalable, reliable, and performant data solutions that power our advanced analytics, machine learning initiatives, and real-time reporting. You will provide technical leadership, mentor team members, and champion best practices for data engineering within a GCP environment.

Responsibilities
• Architect, design, and implement end-to-end data pipelines on GCP using services like Dataflow, Cloud Composer (Airflow), Pub/Sub, and BigQuery (a minimal DAG sketch follows this listing).
• Build and optimize data warehousing solutions leveraging BigQuery's capabilities for large-scale data analysis.
• Design and implement data lakes on Google Cloud Storage, ensuring efficient data organization and accessibility.
• Develop and maintain scalable ETL/ELT processes to ingest, transform, and load data from diverse sources into GCP.
• Implement robust data quality checks, monitoring, and alerting mechanisms within the GCP data ecosystem.
• Collaborate closely with data scientists, analysts, and business stakeholders to understand their data requirements and deliver high-impact solutions on GCP.
• Lead the evaluation and adoption of new GCP data engineering services and technologies.
• Implement and enforce data governance policies, security best practices, and compliance requirements within the Google Cloud environment.
• Provide technical guidance and mentorship to other data engineers on the team, promoting knowledge sharing and skill development within the GCP context.
• Troubleshoot and resolve complex data-related issues within the GCP infrastructure.
• Contribute to the development of data engineering standards, best practices, and comprehensive documentation specific to GCP.

Qualifications we seek in you!

Minimum Qualifications / Skills
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 10+ years of progressive experience in data engineering roles, with a strong focus on cloud technologies.
• Deep and demonstrable expertise with the Google Cloud Platform (GCP) and its core data engineering services (e.g., BigQuery, Dataflow, Cloud Composer, Cloud Storage, Pub/Sub, Cloud Functions).
• Extensive experience designing, building, and managing large-scale data pipelines and ETL/ELT workflows specifically on GCP.
• Strong proficiency in SQL and at least one programming language relevant to data engineering on GCP (e.g., Python).
• Comprehensive understanding of data warehousing concepts, data modeling techniques optimized for BigQuery, and NoSQL database options on GCP (e.g., Cloud Bigtable, Firestore).
• Solid grasp of data governance principles, data security best practices within GCP (IAM, KMS), and compliance frameworks.
• Excellent problem-solving, analytical, and debugging skills within a cloud environment.
• Exceptional communication, collaboration, and presentation skills, with the ability to articulate technical concepts clearly to various audiences.

Preferred Qualifications / Skills
• Google Cloud certifications relevant to data engineering (e.g., Professional Data Engineer).
• Experience with infrastructure-as-code tools for GCP (e.g., Terraform, Deployment Manager).
• Familiarity with data streaming technologies on GCP (e.g., Dataflow, Pub/Sub).
• Experience with machine learning workflows and MLOps on GCP (e.g., Vertex AI).
• Knowledge of containerization technologies (Docker, Kubernetes) and their application within GCP data pipelines (e.g., Dataflow FlexRS).
• Experience with data visualization tools that integrate well with GCP (e.g., Looker).
• Familiarity with data cataloging and data lineage tools on GCP (e.g., Data Catalog).
• Experience in [mention specific industry or domain relevant to your company].
• Proven experience in leading technical teams and mentoring junior engineers in a GCP environment.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit', paying to apply, or purchasing equipment or training.
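For illustration of the Composer-based orchestration named in the responsibilities above, here is a minimal Airflow DAG sketch: a daily GCS-to-BigQuery load followed by a SQL transform. All project, bucket, dataset, and table names are hypothetical placeholders, and a real deployment would add environment-specific configuration and error handling.

```python
# Minimal sketch of a Cloud Composer (Airflow 2.x) DAG using the Google
# provider package; names like example-project are invented placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's files from a GCS landing zone into a staging table.
    load_to_staging = GCSToBigQueryOperator(
        task_id="load_to_staging",
        bucket="example-landing-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.staging.sales",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staging data into a reporting table with a SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_reporting",
        configuration={
            "query": {
                "query": """
                    SELECT region, SUM(amount) AS total_amount
                    FROM `example-project.staging.sales`
                    GROUP BY region
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "reporting",
                    "tableId": "sales_by_region",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_to_staging >> transform
```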

Posted 2 days ago

Apply

5.0 - 10.0 years

0 Lacs

Pune, Bengaluru

Hybrid


Job Summary
We are seeking a highly skilled Hadoop Developer / Lead Data Engineer to join our data engineering team based in Bangalore or Pune. The ideal candidate will have extensive experience with Hadoop ecosystem technologies and cloud-based big data platforms, particularly Google Cloud Platform (GCP). This role involves designing, developing, and maintaining scalable data ingestion, processing, and transformation frameworks to support enterprise data needs.

Minimum Qualifications
Bachelor's degree in Computer Science, Computer Information Systems, or a related technical field.
5-10 years of experience in software engineering or data engineering, with a strong focus on big data technologies.
Proven experience in implementing software development life cycles (SDLC) in enterprise environments.

Technical Skills & Expertise
Big Data Technologies: Expertise in the Hadoop platform, Hive, and related ecosystem tools. Strong experience with Apache Spark (using SQL, Scala, and/or Java). Experience with real-time data streaming using Kafka.
Programming Languages & Frameworks: Proficient in PySpark and SQL for data processing and transformation. Strong coding skills in Python.
Cloud Technologies (Google Cloud Platform): Experience with BigQuery for data warehousing and analytics. Familiarity with Cloud Composer (Airflow) for workflow orchestration. Hands-on with Dataproc for managed Spark and Hadoop clusters.

Responsibilities
Design, develop, and implement scalable data ingestion and transformation pipelines using Hadoop and GCP services (see the PySpark sketch after this listing).
Build real-time and batch data processing solutions leveraging Spark, Kafka, and related technologies.
Ensure data quality, governance, and lineage by implementing automated validation and classification frameworks.
Collaborate with cross-functional teams to deploy and operationalize data analytics tools at enterprise scale.
Participate in production support and on-call rotations to maintain system reliability.
Follow established SDLC practices to deliver high-quality, maintainable solutions.

Preferred Qualifications
Experience leading or mentoring data engineering teams.
Familiarity with CI/CD pipelines and DevOps best practices for big data environments.
Strong communication skills with an ability to collaborate across teams.
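As a sketch of the Spark-on-GCP transformation work described above, here is a minimal PySpark batch job: read raw CSV from a GCS data lake, aggregate, and write to BigQuery. It assumes a Dataproc cluster with the spark-bigquery connector available; bucket and table names are hypothetical.

```python
# Minimal PySpark sketch: GCS landing zone -> aggregate -> BigQuery.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Read raw CSV landed in a GCS data lake (hypothetical path).
orders = (spark.read.option("header", True)
          .csv("gs://example-landing-bucket/orders/2024-01-01/"))

# Basic cleanup and aggregation: cast, drop bad rows, sum per customer.
daily = (orders
         .withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("amount").isNotNull())
         .groupBy("customer_id")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("order_count")))

# Write to BigQuery via the spark-bigquery connector; the indirect write
# method needs a temporary GCS bucket.
(daily.write.format("bigquery")
      .option("table", "example-project.analytics.daily_orders")
      .option("temporaryGcsBucket", "example-temp-bucket")
      .mode("overwrite")
      .save())
```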

Posted 4 days ago

Apply

12.0 - 15.0 years

40 - 45 Lacs

Chennai

Work from Office


Skill & Experience
Strategic planning and direction; maintain architecture principles, guidelines, and standards.
Project & program management.
Data warehousing, Big Data, and Data Analytics & Data Science for solutioning.
Expert in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc.
Strong experience in Big Data: data modelling, design, architecting, and solutioning.
Understands programming languages like SQL, Python, R, and Scala; good Python skills.
Experience with data visualisation tools such as Google Data Studio or Power BI.
Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, agile development, DevOps, data engineering, and ETL data processing.
Strong experience migrating production Hadoop clusters to Google Cloud.
Experience in designing and implementing solutions across the core Google Cloud Platform data components: BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Chennai, Bengaluru

Work from Office


We are looking for a Senior GCP Data Engineer / GCP Technical Lead with strong expertise in Google Cloud Platform (GCP), Apache Spark, and Python to join our growing data engineering team. The ideal candidate will have extensive experience working with GCP data services and should be capable of leading technical teams, designing robust data pipelines, and interacting directly with clients to gather requirements and ensure project delivery.

Project Duration: 1 year, extendable

Role & Responsibilities
Design, develop, and deploy scalable data pipelines and solutions using GCP services like Dataproc and BigQuery (a job-submission sketch follows this listing).
Lead and mentor a team of data engineers to ensure high-quality deliverables.
Collaborate with cross-functional teams and client stakeholders to define technical requirements and deliver solutions aligned with business goals.
Optimize data processing and transformation workflows for performance and cost-efficiency.
Ensure adherence to best practices in cloud data architecture, data security, and governance.

Mandatory Skills: Google Cloud Platform (GCP), especially Dataproc and BigQuery; Apache Spark; Python programming

Preferred Skills: Experience working with large-scale data processing frameworks. Exposure to DevOps/CI-CD practices in a cloud environment. Hands-on experience with other GCP tools like Cloud Composer, Pub/Sub, or Cloud Storage is a plus.

Soft Skills: Strong communication and client interaction skills. Ability to work independently and as part of a distributed team. Excellent problem-solving and team management capabilities.
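For context on the Dataproc work this role emphasizes, here is a sketch of submitting a PySpark job to an existing Dataproc cluster with the google-cloud-dataproc client; every name (project, region, cluster, GCS path) is a placeholder.

```python
# Sketch: submit a PySpark job to a Dataproc cluster and wait for completion.
from google.cloud import dataproc_v1

region = "asia-south1"
client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "example-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/aggregate.py"},
}

# Submit as a long-running operation and block until the job finishes.
operation = client.submit_job_as_operation(
    request={"project_id": "example-project", "region": region, "job": job}
)
result = operation.result()
print("Job finished with state:", result.status.state.name)
```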

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Chennai, Tamil Nadu

Work from Office


Duration: 12 Months
Work Type: Onsite

Position Description:
We are seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise, activating data assets to support Enabling Platforms and Analytics on the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required:
Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
Implement methods for automation of all parts of the pipeline to minimize labor in development and production.
Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products.
Experience working with architects to evaluate and productionize appropriate GCP tools for data ingestion, integration, presentation, and reporting.
Experience working with all stakeholders to formulate business problems as technical data requirements, identifying and implementing technical solutions while ensuring key business drivers are captured in collaboration with product management.
Proficient in machine learning model architecture, data pipeline interaction, and metrics interpretation, including designing and deploying a pipeline with automated data lineage.
Identify, develop, evaluate, and summarize proofs of concept to prove out solutions; test and compare competing solutions and report a point of view on the best solution.
Integration between GCP Data Catalog and Informatica EDC.
Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine. (A streaming pipeline sketch follows this listing.)

Skills Preferred:
Strong drive for results and ability to multi-task and work independently.
Self-starter with proven innovation skills.
Ability to communicate and work with cross-functional teams and all levels of management.
Demonstrated commitment to quality and project timing.
Demonstrated ability to document complex systems.
Experience creating and executing detailed test plans.

Experience Required: 3 to 5 years
Education Required: BE or equivalent
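One of the pipeline patterns named above, streaming Pub/Sub ingestion into BigQuery, can be sketched in a few lines of Apache Beam. Project, subscription, and table names are hypothetical placeholders.

```python
# Illustrative Apache Beam sketch: streaming Pub/Sub -> BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# streaming=True is required for an unbounded Pub/Sub source; run with
# --runner=DataflowRunner (plus project/region/temp_location) in production.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/events-sub")
        | "ParseJson" >> beam.Map(json.loads)
        | "KeepValid" >> beam.Filter(lambda event: "user_id" in event)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
    )
```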

Posted 3 weeks ago

Apply

4.0 - 7.0 years

8 - 14 Lacs

Noida

Hybrid


Data Engineer (L3) | GCP Certified
Employment Type: Full-Time
Work Mode: In-office / Hybrid
Notice: Immediate joiners

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
Design, develop, and support data pipelines and related data products and platforms.
Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
Perform application impact assessments, requirements reviews, and work estimates.
Develop test strategies and site reliability engineering measures for data products and solutions.
Participate in agile development "scrums" and solution reviews.
Mentor junior Data Engineers.
Lead the resolution of critical operations issues, including post-implementation reviews.
Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
Demonstrate SQL and database proficiency in various data engineering tasks.
Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (see the Prefect sketch after this listing).
Develop Unix scripts to support various data operations.
Model data to support business intelligence and analytics initiatives.
Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Keywords: data pipelines, agile development, scrums, GCP data technologies, Python, DAGs, Control-M, Apache Airflow, data solution architecture

Qualifications:
Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
4+ years of data engineering experience.
2 years of data solution architecture and design experience.
GCP Certified Data Engineer (preferred).

Job Type: Full-time
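An Airflow DAG is sketched earlier on this page; for the Prefect alternative named above, a minimal extract-transform-load dependency chain looks like this (Prefect 2.x API; all function bodies are stubs and names are invented).

```python
# Minimal Prefect sketch of an ETL dependency chain with task retries.
from prefect import flow, task

@task(retries=2, retry_delay_seconds=60)
def extract() -> list:
    # Pull rows from a source system (stubbed here).
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]

@task
def transform(rows: list) -> list:
    # Drop rows with missing amounts.
    return [r for r in rows if r["amount"] is not None]

@task
def load(rows: list) -> int:
    # Write to the warehouse (stubbed); return the row count for logging.
    return len(rows)

@flow(name="daily-etl")
def daily_etl() -> int:
    return load(transform(extract()))

if __name__ == "__main__":
    daily_etl()
```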

Posted 3 weeks ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Pune

Work from Office


Job Summary
We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform, primarily BigQuery. You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads.

Key Responsibilities
Gather and analyze business requirements to translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.).
Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost.
Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency (illustrated in the sketch after this listing).
Establish and maintain data modeling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers.
Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt).
Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies.
Conduct impact assessments for schema changes and guide version-control processes for data models.
Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews.
Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations.

Must-Have Skills
6-8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery.
Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault.
Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies).
Strong understanding of ETL/ELT concepts and experience with tools such as Dataflow, Cloud Composer, or dbt.
Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance.
Experience with data governance practices: data cataloging, lineage, and metadata management (e.g., Data Catalog).
Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions.

Good to Have
Experience working on Azure Cloud (Fabric, Synapse, Delta Lake).

Education
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
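As a concrete illustration of the partitioning and clustering responsibility above, here is a sketch that creates a date-partitioned, clustered BigQuery fact table via the Python client; the project, dataset, and table names are made up.

```python
# Sketch: create a partitioned + clustered BigQuery table with DDL.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Partitioning by date and clustering on a common filter column reduce
# bytes scanned (and therefore cost) for typical dashboard queries.
ddl = """
CREATE TABLE IF NOT EXISTS `example-project.analytics.fact_orders`
(
  order_id STRING,
  customer_id STRING,
  order_date DATE,
  amount NUMERIC
)
PARTITION BY order_date
CLUSTER BY customer_id
"""
client.query(ddl).result()  # blocks until the DDL job completes
```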

Posted 3 weeks ago

Apply

8 - 12 years

25 - 40 Lacs

Hyderabad

Remote


Senior GCP Cloud Administrator
Experience: 8 - 12 Years
Salary: Competitive
Preferred Notice Period: Within 30 Days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Remote
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)
Must-have skills: GCP, Identity and Access Management (IAM), BigQuery, SRE, GKE, GCP certification
Good-to-have skills: Terraform, Cloud Composer, Dataproc, Dataflow, AWS

Forbes Advisor (one of Uplers' clients) is looking for a Senior GCP Cloud Administrator who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview
Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP).

Responsibilities:
Manage and configure roles/permissions in GCP IAM following the principle of least privilege.
Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, and troubleshooting and resolving critical data queries (a cost-monitoring sketch follows this listing).
Collaborate with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP.
Create automations and monitoring mechanisms for GCP data-related services, processes, and tasks.
Work with development teams to design the GCP-specific cloud architecture.
Provision and de-provision GCP accounts and resources for internal projects.
Manage and operate multiple GCP subscriptions.
Keep technical documentation up to date.
Proactively stay up to date on GCP announcements, services, and developments.

Requirements:
5+ years of work experience provisioning, operating, and maintaining systems in GCP.
A valid certification, either GCP Associate Cloud Engineer or GCP Professional Cloud Architect.
Hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, and Google Kubernetes Engine (GKE).
Able to provide support and guidance on GCP operations and services depending upon enterprise needs.
Working knowledge of Docker containers and Kubernetes.
Strong communication skills and the ability to work both independently and in a collaborative environment.
Fast learner and achiever who sets high personal goals.
Able to work on multiple projects and consistently meet project deadlines.
Willing to work on a shift basis based on project requirements.

Good to Have:
Experience in Terraform automation over GCP infrastructure provisioning.
Experience in Cloud Composer, Dataproc, Dataflow, Storage, and Monitoring services.
Experience in building and supporting any form of data pipeline.
Multi-cloud experience with AWS.
New Relic monitoring.

Perks:
Day off on the 3rd Friday of every month (one long weekend each month).
Monthly Wellness Reimbursement Program to promote health and well-being.
Paid paternity and maternity leaves.

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of being shortlisted and meeting the client for an interview.

About Our Client: Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
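For the BigQuery FinOps responsibility above, one common pattern is scanning the jobs metadata views for the most expensive recent queries. A sketch, assuming the US region qualifier and a hypothetical project id:

```python
# FinOps sketch: list yesterday's top-10 queries by bytes billed.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
SELECT user_email, job_id,
       total_bytes_billed / POW(1024, 4) AS tib_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND job_type = 'QUERY'
ORDER BY total_bytes_billed DESC
LIMIT 10
"""
for row in client.query(sql).result():
    print(row.user_email, row.job_id, f"{row.tib_billed:.3f} TiB")
```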

Posted 1 month ago

Apply

4 - 9 years

10 - 14 Lacs

Pune

Hybrid


Job Description / Technical Skills
Top skills for this position:
Google Cloud Platform (Composer, BigQuery, Airflow, Dataproc, Dataflow, GCS)
Data warehousing knowledge
Hands-on experience with Python and SQL databases.
Analytical skills to predict the consequences of configuration changes (impact analysis), identify root causes that are not obvious, and understand the business requirements.
Excellent communication with different stakeholders (business, technical, project).
Good understanding of the overall Big Data and Data Science ecosystem.
Experience building and deploying containers as services using Swarm/Kubernetes.
Good understanding of container concepts, such as building lean and secure images.
Understanding of modern DevOps pipelines.
Experience with streaming data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources); see the Spark Structured Streaming sketch after this listing.
Good to have: Professional Data Engineer or Associate Data Engineer certification.

Roles and Responsibilities
Design, build, and manage big data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc.
Performance tuning and analysis of Spark, Apache Beam (Dataflow), or similar distributed computing tools and applications on Google Cloud.
Good understanding of Google Cloud concepts, environments, and utilities to design cloud-optimal solutions for machine learning applications.
Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming, or similar technologies.
Manage the development life cycle for agile software development projects.
Convert proofs of concept into industrialized machine learning models (MLOps).
Provide solutions to complex problems.
Deliver customer-oriented solutions in a timely, collaborative manner.
Proactive thinking, planning, and understanding of dependencies.
Develop and implement robust solutions in test and production environments.
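For the Kafka streaming skill named above, here is a minimal Spark Structured Streaming sketch. It assumes the spark-sql-kafka package is on the classpath; the broker address, topic, and checkpoint path are placeholders.

```python
# Sketch: consume a Kafka topic with Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load())

# Kafka delivers key/value as binary; cast value to string for parsing.
parsed = events.select(F.col("value").cast("string").alias("payload"))

query = (parsed.writeStream
         .format("console")          # swap for a GCS/BigQuery sink in prod
         .outputMode("append")
         .option("checkpointLocation", "/tmp/checkpoints/events")
         .start())
query.awaitTermination()
```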

Posted 1 month ago

Apply


11 - 21 years

25 - 40 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Warm greetings from SP Staffing Services Private Limited!

We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 3 - 20 years
Location: Pan India

Job Description:
Skills: GCP, BigQuery, Cloud Composer, Cloud Data Fusion, Python, SQL
5-20 years of overall experience, mainly in the data engineering space, with 2+ years of hands-on experience in GCP cloud data implementation.
Experience working in client-facing roles in a technical capacity as an Architect; must have implementation experience of a GCP-based cloud data project/program as a solution architect.
Proficiency in using the Google Cloud Architecture Framework in a data context.
Expert knowledge and experience of the core GCP data stack, including BigQuery, Dataproc, Dataflow, Cloud Composer, etc.
Exposure to the overall Google tech stack of Looker, Vertex AI, Dataplex, etc.
Expert-level knowledge of Spark. Extensive hands-on experience working with data using SQL and Python.
Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms (both cloud and on-premise).
Excellent communication skills with the ability to clearly present ideas, concepts, and solutions.

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in or reach me @ 8939853050.

With Regards,
Sankar G
Sr. Executive - IT Recruitment

Posted 1 month ago

Apply

8 - 13 years

25 - 40 Lacs

Bengaluru

Remote


Senior GCP Cloud Administrator
Experience: 8 - 12 Years
Salary: Competitive
Preferred Notice Period: Within 30 Days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Remote
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)
Must-have skills: GCP, Identity and Access Management (IAM), BigQuery, SRE, GKE, GCP certification
Good-to-have skills: Terraform, Cloud Composer, Dataproc, Dataflow, AWS

Forbes Advisor (one of Uplers' clients) is looking for a Senior GCP Cloud Administrator who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview
Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP).

Responsibilities:
Manage and configure roles/permissions in GCP IAM following the principle of least privilege.
Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, and troubleshooting and resolving critical data queries.
Collaborate with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP.
Create automations and monitoring mechanisms for GCP data-related services, processes, and tasks.
Work with development teams to design the GCP-specific cloud architecture.
Provision and de-provision GCP accounts and resources for internal projects.
Manage and operate multiple GCP subscriptions.
Keep technical documentation up to date.
Proactively stay up to date on GCP announcements, services, and developments.

Requirements:
5+ years of work experience provisioning, operating, and maintaining systems in GCP.
A valid certification, either GCP Associate Cloud Engineer or GCP Professional Cloud Architect.
Hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, and Google Kubernetes Engine (GKE).
Able to provide support and guidance on GCP operations and services depending upon enterprise needs.
Working knowledge of Docker containers and Kubernetes.
Strong communication skills and the ability to work both independently and in a collaborative environment.
Fast learner and achiever who sets high personal goals.
Able to work on multiple projects and consistently meet project deadlines.
Willing to work on a shift basis based on project requirements.

Good to Have:
Experience in Terraform automation over GCP infrastructure provisioning.
Experience in Cloud Composer, Dataproc, Dataflow, Storage, and Monitoring services.
Experience in building and supporting any form of data pipeline.
Multi-cloud experience with AWS.
New Relic monitoring.

Perks:
Day off on the 3rd Friday of every month (one long weekend each month).
Monthly Wellness Reimbursement Program to promote health and well-being.
Paid paternity and maternity leaves.

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of being shortlisted and meeting the client for an interview.

About Our Client: Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives. We support your pursuit of success by making smart financial decisions simple, to help you get back to doing the things you care about most.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 2 months ago

Apply

5 - 10 years

15 - 22 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office


GCP Senior Developer
5+ years of total experience.
3+ years of direct hands-on experience with GCP BigQuery and Apache Airflow / GCP Cloud Composer.
Experience designing cloud solutions.
Experience developing scripts for DWH platforms using Hadoop, Python, and PySpark.
Experience in migration activities from DWH to cloud platforms.

Posted 2 months ago

Apply

5 - 9 years

6 - 10 Lacs

Chennai

Work from Office


Job Description

Overview
We are seeking an experienced Senior BigQuery Developer to join our team and lead the design, development, and optimization of data pipelines and BigQuery solutions. The ideal candidate will have a strong background in data engineering, SQL, and Google Cloud Platform (GCP) services, with expertise in building scalable, secure, and high-performance data systems.

Responsibilities
Design, develop, and optimize complex BigQuery solutions to meet business requirements.
Create and maintain ETL/ELT pipelines using tools like Dataflow, Cloud Composer, or Apache Airflow.
Write and optimize advanced SQL queries for performance and scalability in BigQuery (a dry-run cost check sketch follows this listing).
Collaborate with data analysts, scientists, and stakeholders to understand data requirements and deliver actionable insights.
Implement robust data governance, security, and compliance frameworks for BigQuery datasets.
Integrate BigQuery with other GCP services like Cloud Storage, Pub/Sub, and Looker Studio for end-to-end data workflows.
Develop and manage scheduled jobs and scripts for data ingestion and transformation.
Perform root cause analysis and resolve performance issues in data pipelines and BigQuery queries.
Stay updated with new features and advancements in Google Cloud technologies.

Requirements
5+ years of experience in data engineering, with a focus on Google BigQuery.
Proficiency in advanced SQL and experience with query optimization in BigQuery.
Hands-on experience with Google Cloud Platform (GCP) services, including Dataflow, Cloud Functions, and Cloud Composer.
Strong understanding of data warehouse architecture and best practices.
Experience with data modeling and schema design in BigQuery.
Familiarity with programming languages such as Python or Java for automation and scripting.
Experience working with large datasets and real-time data processing.
Knowledge of data governance, security policies, and access controls in GCP.
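A cheap guard that supports the query-optimization responsibility above is BigQuery's dry-run mode, which reports how much data a query would scan without running it. A sketch with a made-up project and table:

```python
# Sketch: estimate a query's scan size before executing it.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")
config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

job = client.query(
    "SELECT customer_id, SUM(amount) "
    "FROM `example-project.analytics.fact_orders` "
    "WHERE order_date = '2024-01-01' GROUP BY customer_id",
    job_config=config,
)
# A dry-run job returns immediately with the estimated bytes scanned.
print(f"Would scan {job.total_bytes_processed / 1e9:.2f} GB")
```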

Posted 2 months ago

Apply

8 - 11 years

10 - 13 Lacs

Hyderabad

Work from Office


10+ years of experience in data engineering, with a focus on cloud-based solutions. Experience in designing solutions; should be able to review work and help the team. Extensive experience with Google Cloud Platform (GCP) and its data services, including BigQuery, DBT and streaming, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer. Proven track record of designing and building scalable data pipelines and architectures. Experience with ETL tools and processes. Design, develop, and maintain robust and scalable data pipelines using GCP services such as Dataflow, Pub/Sub, Cloud Functions, and Cloud Composer. Implement ETL (Extract, Transform, Load) processes to ingest data from various sources into GCP data warehouses like BigQuery.

Posted 3 months ago

Apply

10 - 20 years

30 - 35 Lacs

Bengaluru

Work from Office


Post Trade Automation Technology within the Investment Banking Division of Deutsche Bank is responsible for building and managing the Documentation & Settlement applications that support the Operations Division in managing the trade lifecycle across Interest Rates, Credit, Equities (OTC Derivatives), Foreign Exchange & Money Market businesses. Our group is building cross-asset Documentation and Settlement cloud platforms.

Benefits:
Best-in-class leave policy
Gender-neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Flexible working arrangements
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complementary health screening for those 35 years and above

Your key responsibilities
Analysis, design, and development of FXPCA system components, with enhancements for the GEM business as a priority.
Maintain and enhance existing applications by implementing planned engineering changes as part of an agile feature team.
Implement and comply with bank policies (naming conventions, encryption, security settings, capacity, availability, and other non-functional requirements).
Design, implementation, execution, and results analysis of automated unit, integration, regression, resilience, and performance tests.
Code reviews based on the four-eye principle.
Level-3 technical support as well as problem and root cause analysis.

Your skills and experience
B.Tech/BE or M.Tech/ME/MCA/M.Sc., preferably in Computer Science.
A hands-on technologist with 8-10 years of experience in Java-based technologies, microservices architecture, and CI/CD pipelines.
Java 8/11/17, Spring, Quarkus, REST APIs, JUnit, Linux, shell script, messaging technology such as JMS, MQ, or equivalent, Maven, Artifactory.
Experience with GitHub workflows/actions and cloud deployment, preferably GCP.
Experience working in an Agile/DevOps environment with JIRA and Confluence.
Strong analytical and design skills.
Proficient communication skills (written/verbal).

Desired skills
GCP services including GKE, Cloud SQL, Cloud Composer
TDD, BDD
Python 3.x, Terraform
Experience in Foreign Exchange Settlements / Investment Banking / Financial domain

Posted 3 months ago

Apply

10 - 15 years

12 - 17 Lacs

Pune

Work from Office


The senior engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills develop through addressing enhancements and fixes to products; build reliability and resiliency into solutions through early testing, peer reviews, and automating the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is part of the buildout of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory common commitments to mandated monitors.

Your key responsibilities
Analyzing data sets and designing and coding stable and scalable data ingestion workflows, also integrating them into existing workflows.
Working with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
Work as a senior developer developing analytics algorithms on top of ingested data.
Work as a senior developer for various data sourcing in Hadoop and GCP.
Own unit testing, UAT deployment, end-user sign-off, and prod go-live.
Ensure new code is tested both at unit level and system level; design, develop, and peer review new code and functionality.
Operate as a team member of an agile scrum team.
Root cause analysis skills to identify bugs and issues for failures.
Support prod support and release management teams in their tasks.

Your skills and experience
More than 10 years of coding experience in reputed organizations.
Hands-on experience with Bitbucket and CI/CD pipelines.
Proficient in Hadoop, Python, Spark, SQL, Unix, and Hive.
Basic understanding of on-prem and GCP data security.
Hands-on development experience on large ETL / big data systems; GCP is a big plus.
Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
Basic understanding of data quality dimensions like consistency, completeness, accuracy, and lineage (a completeness-check sketch follows this listing).
Hands-on business and systems knowledge gained in a regulatory delivery environment.

Desired
Banking experience, regulatory and cross-product knowledge.
Passionate about test-driven development.
Prior experience with release management tasks and responsibilities.
Data visualization experience is good to have.
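To make the completeness dimension named above concrete, here is an illustrative PySpark check that fails a pipeline when critical columns have too many nulls. The path, column names, and 1% threshold are hypothetical.

```python
# Sketch: automated completeness check over ingested data.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-completeness-check").getOrCreate()
df = spark.read.parquet("gs://example-bucket/ingested/trades/")

total = df.count()
if total == 0:
    raise ValueError("No rows ingested; aborting quality check")

# Null rate per critical column.
null_rates = {c: df.filter(F.col(c).isNull()).count() / total
              for c in ["trade_id", "trade_date", "notional"]}

failed = {c: rate for c, rate in null_rates.items() if rate > 0.01}
if failed:
    raise ValueError(f"Completeness check failed: {failed}")
print("Completeness check passed:", null_rates)
```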

Posted 3 months ago

Apply

10 - 14 years

25 - 30 Lacs

Pune

Work from Office


Role Description
The Business Architect defines the technical solution design of specific IT platforms and provides guidance to the squad members in order to design, build, test, and deliver high-quality software solutions. A key element in this context is the translation of functional and non-functional business requirements into an appropriate technical solution design, leveraging best practices and consistent design patterns. The Business Architect collaborates closely with Product Owners, Chapter Leads, and squad members to ensure consistent adherence to the agreed-upon application design and is responsible for maintaining appropriate technical design documentation. The Business Architect ensures that the architectures and designs of solutions conform to the principles, blueprints, standards, and patterns that have been established by Enterprise Architecture; in this context the Business Architect collaborates closely with the respective Solution Architect to ensure architecture compliance. The Business Architect also actively contributes to the definition and enrichment of design patterns and standards with the aim of leveraging those across squads and tribes.

Your key responsibilities
Define the technical architecture of IT solutions in line with functional and non-functional requirements, following consistent design patterns and best practices.
Ensure that the solution design is in sync with WM target architecture blueprints and principles, as well as with overarching DB architecture and security standards.
Create appropriate technical design documentation and ensure it is kept up to date.
Provide guidance to the squad members to design, build, test, and deliver high-quality software solutions in line with business requirements.
Responsible for all aspects of the solution architecture (maintainability, scalability, effective integration with other solutions, usage of shared solutions and components where possible, optimization of resource consumption, etc.) with the objective of striking the appropriate balance between business needs and total cost of ownership.
Closely collaborate with Enterprise Architecture to ensure architecture compliance and make sure that any design options are discussed in a timely manner to allow sufficient time for deliberate decision taking.
Present architecture proposals to relevant forums along with the Enterprise Architect at different levels and drive the process to gain the necessary architecture approvals.
Collaborate with relevant technology stakeholders within other squads and across tribes to ensure cross-squad and cross-tribe solution architecture synchronization and alignment.
Contribute to the definition and enrichment of appropriate design patterns and standards that can be leveraged across WM squads and tribes.
Serve as a counsel to designers and developers and carry out reviews of software designs and high-level / detailed-level design documentation provided by other squad members.
Lead technical discussions with CCO, Data Factory, Central Data Quality and Compliance, end-to-end and control functions for technical queries; contribute to peer-level solution architecture reviews, e.g. within a respective chapter.

Your skills and experience
Ability and experience in defining high-level and low-level technical solution designs for complex initiatives; very good analytical skills and the ability to oversee and structure complex tasks.
Hands-on skills with various Google Cloud components like storage buckets, BigQuery, Dataproc, Cloud Composer, and Cloud Functions, along with PySpark and Scala, are essential. Good to have experience in Cloud SQL, Dataflow, Java, and Unix.
Experience implementing a Google Cloud based solution is essential.
Persuasive power and persistence in driving adherence to the solution design within the squad.
Ability to apply the appropriate architectural patterns considering the relevant functional and non-functional requirements.
Proven ability to balance business demands and IT capabilities in terms of standardization, reducing risk, and increasing IT flexibility.
Comfortable working in an open, highly collaborative team.
Ability to work in an agile and dynamic environment and to build up knowledge of new technology/solutions in an effective and timely manner.
Ability to communicate effectively with other technology stakeholders.
Feedback: seeks feedback from others, provides feedback to others in support of their development, and is open and honest while dealing constructively with criticism.
Inclusive leadership: values individuals and embraces diversity by integrating differences and promoting diversity and inclusion across teams and functions.
Coaching: understands and anticipates people's needs, skills, and abilities in order to coach, motivate, and empower them for success.
Broad set of architecture knowledge and application design skills and, depending on the specific squad requirements, in-depth expertise in specific architecture domains (e.g. service and integration architecture, web and mobile front-end architecture, data architecture, security architecture, infrastructure architecture) and related technology stacks and design patterns.
Experience in establishing thought leadership in solution architecture practices and the ability to lead design and development teams in defining, building, and delivering first-class software solutions.
Familiar with current and emerging technologies, tools, frameworks, and design patterns.
Experience in effectively collaborating across multiple teams and geographies.
Ability to appropriately consider other dimensions (e.g. financials, risk, time to market) on top of the architecture drivers in order to propose balanced architecture solutions.

Experience / Qualifications
10+ years of relevant experience as a technology manager within the IT industry; experience in the financial/banking industry preferred.
Minimum 8 years' experience supporting the Oracle platform in a mid-size to large corporate environment.
Preferably from a Banking / Wealth Management background.
Must have experience working in an agile organization.

Posted 3 months ago

Apply

7 - 12 years

10 - 20 Lacs

Pune

Hybrid


Lead Data Engineer
Experience: 7 - 10 Years
Salary: Competitive
Preferred Notice Period: Within 30 days
Shift: 10:00 AM to 6:00 PM IST
Opportunity Type: Hybrid - Pune
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' partners.)

What do you need for this opportunity?
Must-have skills: Python, SQL, GCP, Dataflow, Pub/Sub, Cloud Storage, BigQuery
Good-to-have skills: AWS, Docker, Kubernetes, Generative AI, Azure

Our hiring partner is looking for a Lead Data Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview
We are seeking an experienced and dynamic Lead Data Engineer to join our team. This role is pivotal in advancing our data engineering practices on the Google Cloud Platform (GCP) and offers a unique opportunity to work with cutting-edge technologies, including Generative AI.

Key Responsibilities:
Lead the design, implementation, and optimization of scalable data pipelines and architectures on GCP using key services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage (a Pub/Sub publish sketch follows this listing).
Collaborate with cross-functional teams to define data requirements and develop strategic solutions that address business needs.
Enhance existing data infrastructure, ensuring high levels of performance, reliability, and security.
Drive the integration and deployment of machine learning models and advanced analytics solutions, incorporating Generative AI where applicable.
Establish and enforce best practices in data governance, data quality, and data security.
Mentor and guide junior engineers, fostering a culture of innovation and continuous improvement.
Stay informed about the latest trends in data engineering, GCP advancements, and Generative AI technologies to drive innovation within the team.

Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
7 to 10 years of experience in data engineering, with a strong emphasis on GCP technologies.
Demonstrated expertise in building and managing data solutions using GCP services like BigQuery, Dataflow, and Cloud Composer.
Proficiency in SQL and programming languages such as Python, Java, or Scala.
Strong understanding of data modelling, warehousing concepts, and real-time data processing.
Familiarity with containerization and orchestration tools like Docker and Kubernetes.
Excellent analytical, problem-solving, and communication skills.
Leadership experience with a proven ability to mentor and develop junior team members.

Preferred Qualifications:
GCP Professional Data Engineer certification.
Experience with Generative AI technologies and their practical applications.
Knowledge of additional cloud platforms such as AWS or Azure.
Experience implementing data governance frameworks and tools.

How to apply for this opportunity
Register or log in on our portal. Click 'Apply', upload your resume, and fill in the required details. Then click 'Apply Now' to submit your application. Get matched and crack a quick interview with our hiring partner. Land your global dream job and get your exciting career started!

About Our Hiring Partner: At Inferenz, our team of innovative technologists and domain experts helps accelerate business growth through digital enablement, navigating industries with data, cloud, and AI services and solutions. We dedicate our resources to increasing efficiency and gaining a greater competitive advantage by leveraging various next-generation technologies. Our technology expertise has helped us deliver innovative solutions in key industries such as Healthcare & Life Sciences, Consumer & Retail, Financial Services, and emerging industries.

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. You will also be assigned a dedicated Talent Success Coach during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
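For the Pub/Sub skill named in the responsibilities above, here is a minimal publish sketch using the google-cloud-pubsub client; the project id, topic, event payload, and attribute are placeholders.

```python
# Sketch: publish a JSON event to a Pub/Sub topic.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "raw-events")

event = {"user_id": "u123", "action": "click"}
future = publisher.publish(
    topic_path,
    json.dumps(event).encode("utf-8"),
    source="web",  # message attributes must be strings
)
# result() blocks until the server acknowledges and returns the message id.
print("Published message id:", future.result())
```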

Posted 3 months ago

Apply

4 - 8 years

20 - 25 Lacs

Pune

Hybrid


Experience: 4-8 years
Job Location: Pune, Hybrid
Primary Skills: Terraform, Python, Shell, JSON, automation tools
Secondary Skills: ITSM, JIRA, GitHub, CI/CD practices, JSON, API invocation

Role purpose:
Minimum 4+ years of experience in creating GCP infrastructure with data integration patterns for streaming and batch load processes for large-scale data platforms / data warehouses.
Good knowledge and experience in Terraform, Unix, network basics, Docker, Kubernetes, Google Cloud (VPC, Compute Engine, Load Balancer, Cloud Build), and Helm.
Good knowledge and experience in using CI/CD pipelines, Jenkins, and Cloud Build.
Good understanding of the GCP cloud platform.
Hands-on experience in Terraform.
Knowledge and experience working in agile/lean methodologies.
Preferred: GCP DevOps certified, HashiCorp Terraform certification.

Essential:
Prior work experience in DWH.
Good knowledge of Dataflow, BigQuery, Composer, and Stackdriver.
Relevant work experience (3 to 5+ years).

Desired:
ITIL
Telecom domain knowledge
GCP Data Engineer

Posted 3 months ago

Apply

13 - 20 years

30 - 40 Lacs

Chennai, Hyderabad

Work from Office


Title: GCP Data Engineer
Experience: 14+ Years
Location: Chennai/Hyderabad
Skillset: GCP, BigQuery, Dataflow, Terraform, Airflow, Cloud SQL, Cloud Storage, Cloud Composer, Pub/Sub
If interested, kindly drop your CV to sharmeelasri26@gmail.com

Posted 3 months ago

Apply

6 - 11 years

15 - 30 Lacs

Bengaluru

Work from Office


Role: GCP Data Engineer
Location: Bangalore
Experience: 6-12 years
Mode: Work from office

Job Description:
We are seeking a talented GCP Data Engineer to join our team and help us design and implement robust data pipelines and analytics solutions on Google Cloud Platform (GCP). The ideal candidate will have strong expertise in BigQuery, Dataflow, Cloud Composer, and Dataproc, along with experience in AI/ML tools such as Google Vertex AI or Dialogflow.

Key Responsibilities:
Design, develop, and maintain data pipelines and workflows using Dataflow, Cloud Composer, and Dataproc.
Develop optimized queries and manage large-scale datasets using BigQuery.
Collaborate with cross-functional teams to gather requirements and translate business needs into scalable data solutions.
Implement best practices for data engineering, including version control, CI/CD pipelines, and data governance.
Work on AI/ML use cases, leveraging Google Vertex AI or Dialogflow to create intelligent solutions (a Vertex AI prediction sketch follows this listing).
Perform data transformations, aggregations, and ETL processes to prepare data for analytics and reporting.
Monitor and troubleshoot data workflows to ensure reliability, scalability, and performance.
Document technical processes and provide guidance to junior team members.

Qualifications:
Experience: 3-5+ years of professional experience in GCP data engineering or related fields.
Skills: Proficiency in BigQuery, Dataflow, Cloud Composer, and Dataproc. Exposure to Google Vertex AI, Dialogflow, or other AI/ML platforms. Strong programming skills in Python and SQL, and familiarity with Terraform for GCP infrastructure. Experience with distributed data processing frameworks like Apache Spark is a plus. Knowledge of data security, governance, and best practices for cloud platforms.
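For the Vertex AI use cases named above, calling a deployed model endpoint is a short exercise with the Vertex AI SDK. This is a sketch only: the endpoint id and the instance schema depend entirely on the deployed model, and all names here are invented.

```python
# Sketch: request an online prediction from a deployed Vertex AI endpoint.
from google.cloud import aiplatform

aiplatform.init(project="example-project", location="asia-south1")

# Full resource name of a (hypothetical) deployed endpoint.
endpoint = aiplatform.Endpoint(
    "projects/example-project/locations/asia-south1/endpoints/1234567890"
)

# The instance dict must match the deployed model's expected input schema.
response = endpoint.predict(instances=[{"feature_a": 1.0, "feature_b": 0.3}])
print(response.predictions)
```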

Posted 3 months ago

Apply

4 - 9 years

8 - 12 Lacs

Hyderabad

Hybrid


We're seeking curious engineers to work within our feature teams. You will support customers by ensuring the products we deliver are fit for purpose and meet the quality and standards expected. We're looking for someone who wants to further develop their engineering skillset. We'll support you to take on technical challenges and develop your understanding of how data solutions and services are developed, tested, and implemented. If this excites you, then you would be an asset to our team!

We are pioneering the transformation of current processes from our on-premises platforms and have started our journey of deploying solutions on the Google Cloud Platform. This role sits within our platform's Analytics Lab, which is looking to create strategic data products and build data and analytics capability. We work using agile delivery practices, so a self-led individual capable of accurately estimating and planning their own work would be valued highly.

What You'd Get Involved With:
Design, develop, maintain, and improve data processes to support regulatory and prudential change with high-quality solutions, providing oversight and leadership to help others do the same.
Building data pipelines for current and future analytics and reporting solutions.
Implement and embody engineering standards, using constructive feedback to create opportunities for learning.
Work with the Product Owner and customers to understand, refine, and prioritise items for the feature team backlog.
Use strong problem-solving skills and a combination of technical knowledge, experience, and judgement to identify available options and clearly set out the way forward.

Key skills required:
Passion for software and data engineering, adopting the mindset of a curious engineer.
Experience of DBT, SQL, Python, Java, SAS, or other open-source technologies used for analytics.
Ability to understand business requirements and create business-ready solutions.
Well-developed interpersonal, communication, and influencing skills, particularly the ability to convey key business information arising from complex issues to non-technical people.

Desirable skills include:
Cloud understanding, particularly GCP.
Knowledge of Terraform.
Data engineering background and good knowledge of waterfall and agile development practices.
Insights into industry solutions for data management, storage, and analytics, coupled with experience of financial data, including credit risk, capital, and impairment processes.

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
