
25 Cloud Build Jobs

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer at Deutsche Bank in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong grasp of essential engineering principles and possess root cause analysis skills to address enhancements and fixes in product reliability and resiliency. You should be capable of working independently on medium to large projects with strict deadlines and adapt to a cross-application, mixed technical environment. Your role involves hands-on development experience in ETL, Big Data, Hadoop, Spark, and GCP while following an agile methodology. Collaboration with a geographically dispersed team is essential in this role.

The position is part of the Compliance tech internal development team in India, focusing on delivering improvements in compliance tech capabilities to meet regulatory commitments and mandates. You will be involved in analyzing data sets, designing stable data ingestion workflows, and integrating them into existing workflows. Additionally, you will work closely with team members and stakeholders to provide ETL solutions, develop analytics algorithms, and handle data sourcing in Hadoop and GCP. Your responsibilities include unit testing, UAT deployment, end-user sign-off, and supporting production and release management teams.

To excel in this role, you should have over 10 years of coding experience in reputable organizations, proficiency in technologies such as Hadoop, Python, Spark, SQL, Unix, and Hive, as well as hands-on experience with Bitbucket and CI/CD pipelines. Knowledge of data security in on-prem and GCP environments, cloud services, and data quality dimensions is crucial. Experience in regulatory delivery environments, banking, test-driven development, and data visualization tools like Tableau would be advantageous.

At Deutsche Bank, you will receive support through training, coaching, and a culture of continuous learning to enhance your career progression. The company fosters a collaborative environment where employees are encouraged to act responsibly, think commercially, and take initiative. Together, we strive for excellence and celebrate the achievements of our diverse workforce. Deutsche Bank promotes a positive, fair, and inclusive work environment and welcomes applications from all individuals. For more information about Deutsche Bank and our values, please visit our company website: [https://www.db.com/company/company.htm](https://www.db.com/company/company.htm).
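
For a flavor of the Spark-based ingestion work this posting describes, here is a minimal, illustrative PySpark sketch; the bucket, column, and table names are invented placeholders, not details from the role.

```python
# Minimal PySpark ingestion sketch (illustrative only; the GCS bucket,
# columns, and table names are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("compliance-ingest").getOrCreate()

# Read a raw CSV drop from cloud storage into a DataFrame.
raw = spark.read.option("header", True).csv("gs://example-bucket/trades_raw/")

# Basic cleansing: trim keys, drop records missing a mandatory field.
clean = (
    raw.withColumn("trade_id", F.trim(F.col("trade_id")))
       .filter(F.col("trade_date").isNotNull())
)

# Persist as a partitioned table for downstream analytics workflows.
clean.write.mode("append").partitionBy("trade_date").saveAsTable("compliance.trades")
```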

Posted 21 hours ago

Apply

4.0 - 9.0 years

3 - 15 Lacs

Mumbai, Maharashtra, India

On-site

What You'll Do:
- Design, develop, and deploy cloud-native applications on Google Cloud Platform (GCP).
- Build and maintain CI/CD pipelines using tools like Jenkins, GitHub Actions, or Cloud Build.
- Implement infrastructure as code (IaC) using Terraform or Deployment Manager.
- Collaborate with architects, DevOps engineers, and other developers to deliver scalable cloud solutions.
- Set up monitoring, logging, and alerting with Stackdriver, Prometheus, or similar tools.
- Ensure applications are secure, reliable, and cost-efficient in the cloud environment.
- Support containerized workloads with Docker and Kubernetes (GKE).

What We're Looking For:
- 3+ years of experience in software development with a focus on cloud-based architectures.
- Hands-on experience with Google Cloud Platform (GCP): App Engine, Cloud Functions, Cloud Run, GKE, etc.
- Solid understanding of DevOps principles and tools.
- Proficiency in Python, Go, or Java.
- Experience with CI/CD pipelines, IaC, containerization, and Kubernetes.
- Knowledge of monitoring, logging, and security practices in the cloud.
- GCP certifications (e.g., Professional Cloud Developer or DevOps Engineer) are a big plus!

Nice to Have:
- Experience with multi-cloud environments (Azure, AWS).
- Familiarity with microservices architecture and API management.
- Exposure to Agile/Scrum methodologies.
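
To illustrate the kind of cloud-native service this posting describes, below is a minimal sketch of a Python HTTP handler built on the open-source Functions Framework, deployable to Cloud Functions or Cloud Run; the handler name and payload are hypothetical.

```python
# Minimal HTTP service sketch for Cloud Functions / Cloud Run
# (illustrative; the handler name and response payload are hypothetical).
# Requires: pip install functions-framework
import functions_framework


@functions_framework.http
def handle(request):
    """Respond to an HTTP request routed by the Functions Framework."""
    name = request.args.get("name", "world")  # `request` is a Flask request
    return {"message": f"hello {name}"}, 200
```

Locally this could be served with `functions-framework --target=handle` before deploying.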

Posted 6 days ago

Apply

4.0 - 9.0 years

3 - 15 Lacs

Remote, India

On-site

What You'll Do:
- Design, develop, and deploy cloud-native applications on Google Cloud Platform (GCP).
- Build and maintain CI/CD pipelines using tools like Jenkins, GitHub Actions, or Cloud Build.
- Implement infrastructure as code (IaC) using Terraform or Deployment Manager.
- Collaborate with architects, DevOps engineers, and other developers to deliver scalable cloud solutions.
- Set up monitoring, logging, and alerting with Stackdriver, Prometheus, or similar tools.
- Ensure applications are secure, reliable, and cost-efficient in the cloud environment.
- Support containerized workloads with Docker and Kubernetes (GKE).

What We're Looking For:
- 3+ years of experience in software development with a focus on cloud-based architectures.
- Hands-on experience with Google Cloud Platform (GCP): App Engine, Cloud Functions, Cloud Run, GKE, etc.
- Solid understanding of DevOps principles and tools.
- Proficiency in Python, Go, or Java.
- Experience with CI/CD pipelines, IaC, containerization, and Kubernetes.
- Knowledge of monitoring, logging, and security practices in the cloud.
- GCP certifications (e.g., Professional Cloud Developer or DevOps Engineer) are a big plus!

Nice to Have:
- Experience with multi-cloud environments (Azure, AWS).
- Familiarity with microservices architecture and API management.
- Exposure to Agile/Scrum methodologies.

Posted 6 days ago

Apply

8.0 - 12.0 years

24 - 32 Lacs

Gurgaon, Haryana, India

On-site

We are seeking a GCP DevOps Lead with deep hands-on expertise in Google Cloud Platform and strong leadership skills to drive DevOps practices, automation, and cloud-native delivery. The ideal candidate will bring a solid foundation in CI/CD, Infrastructure as Code, cloud security, and multi-cloud adaptability. This is a strategic role requiring technical depth, leadership acumen, and a passion for innovation.

Key Responsibilities:
- Architect, implement, and manage scalable and secure GCP infrastructure using Terraform and Infrastructure as Code principles.
- Design and maintain CI/CD pipelines using GitLab CI/CD, GitHub Actions, Jenkins, and Cloud Build for seamless and automated deployment.
- Drive DevOps automation and cloud-native practices to improve reliability, scalability, and cost-efficiency.
- Lead and mentor a DevOps team, ensuring delivery excellence and alignment with business and technical goals.
- Collaborate with cross-functional teams (Developers, QA, Security, Platform) to streamline development and deployment workflows.
- Set up and manage containerized environments using Docker and orchestration with Kubernetes/GKE.
- Ensure infrastructure is optimized for security, availability, and cost.

Must-Have Skills:
- Strong hands-on experience with Google Cloud Platform (GCP).
- Proficiency in Infrastructure as Code (IaC) using Terraform.
- Expertise in CI/CD tools: GitLab CI/CD, GitHub Actions, Jenkins, Cloud Build.
- Proficiency in Python, Bash, or similar scripting languages.
- In-depth understanding of cloud security, cost optimization, and performance tuning.
- Proven leadership experience managing DevOps teams and guiding stakeholders.
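
As a concrete, assumed example of the monitoring and alerting work mentioned above, this sketch exposes custom metrics with the Python prometheus-client library; the metric names and update loop are illustrative only.

```python
# Sketch of a custom Prometheus metrics endpoint for deployment automation
# (assumed example; the metric names are hypothetical).
# Requires: pip install prometheus-client
import random
import time

from prometheus_client import Counter, Gauge, start_http_server

DEPLOYS = Counter("deployments_total", "Completed deployments", ["env"])
QUEUE_DEPTH = Gauge("release_queue_depth", "Pending releases awaiting rollout")

if __name__ == "__main__":
    start_http_server(8000)  # exposes /metrics for Prometheus to scrape
    while True:
        DEPLOYS.labels(env="staging").inc()
        QUEUE_DEPTH.set(random.randint(0, 5))  # stand-in for a real measurement
        time.sleep(30)
```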

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be responsible for designing, developing, and maintaining applications and data solutions using Python and Java on cloud platforms such as AWS or GCP. Your role will involve building and managing cloud-native software solutions, as well as designing and implementing DevOps solutions to automate software development and deployment processes. Additionally, you will be tasked with designing and implementing event-driven systems. As part of your responsibilities, you will manage code repositories, handle code merges, conduct quality checks, and automate deployment processes using tools like Git, Maven, Docker, Sonar, Cloud Build, GitHub Actions, and other related technologies.

To qualify for this position, you must have a Bachelor's degree in Computer Science, Information Technology, or a related field. You should possess at least 6 years of experience in building applications and data solutions using Python and Java, along with a minimum of 3 years of experience in developing cloud-native software solutions on AWS or GCP. Furthermore, you should demonstrate expertise in application CI/CD solutions, including managing code repositories, continuous integration, automated deployment, and utilizing tools such as GitHub, Docker, GCP Cloud Build, AWS CodeBuild, GitHub Actions, and similar technologies. Proficiency in building cloud-native application operation solutions is also required, with proven experience and skill sets in tools like Google Stackdriver (GCP Operations Suite), AWS CloudWatch, Prometheus, Grafana, and others.
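
Since the role calls for event-driven systems on GCP, here is a minimal, assumed sketch using the google-cloud-pubsub client; the project, topic, and subscription IDs are placeholders, not real resources.

```python
# Event-driven sketch with Google Cloud Pub/Sub (project, topic, and
# subscription IDs below are placeholders).
# Requires: pip install google-cloud-pubsub
import concurrent.futures

from google.cloud import pubsub_v1

PROJECT = "example-project"

# Publish an event.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, "orders")
publisher.publish(topic_path, b'{"order_id": 42}').result()  # wait for the ack

# Consume events asynchronously.
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, "orders-worker")

def on_message(message):
    print("received:", message.data)
    message.ack()

streaming_pull = subscriber.subscribe(sub_path, callback=on_message)
try:
    streaming_pull.result(timeout=30)  # block while messages stream in
except concurrent.futures.TimeoutError:
    streaming_pull.cancel()  # stop the demo after 30 seconds
```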

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Iamneo, a fast-growing B2B EdTech SaaS company, is seeking a Senior DevOps & Cloud Operations Engineer to take end-to-end ownership of cloud infrastructure and DevOps practices. This role is crucial in driving scalable, secure, and high-performance deployment environments for applications. If you have a passion for innovation and growth and are eager to redefine the future of tech learning, iamneo is the place for you.

As the Senior DevOps & Cloud Operations Engineer, your responsibilities will include:
- Architecting, deploying, and managing scalable cloud infrastructure
- Leading infrastructure optimization initiatives
- Designing and implementing CI/CD pipelines
- Automating infrastructure provisioning and configuration
- Managing containerized environments and implementing configuration management
- Setting up monitoring and alerting systems and writing automation and operational scripts
- Ensuring security controls and compliance
- Conducting infrastructure audits, backups, and disaster recovery drills
- Troubleshooting and resolving infrastructure-related issues
- Collaborating with product and development teams, supporting platform transitions and cloud migration efforts, and mentoring junior engineers

The ideal candidate should have at least 5 years of hands-on experience in DevOps, cloud infrastructure, and system reliability; strong expertise in GCP and Azure; proficiency in CI/CD, infrastructure-as-code, and container orchestration; scripting skills in Bash, Python, or similar languages; a solid understanding of cloud-native and microservices architectures; strong problem-solving and communication skills; and an ownership mindset.

Preferred qualifications include GCP and/or Azure certifications, experience with Agile and DevOps cultural practices, prior experience deploying web applications using Node.js, Python, or similar technologies, and the ability to thrive in fast-paced environments. If you are someone who enjoys working in a multi-cloud, automation-first environment and excels at building robust systems that scale, iamneo welcomes your application.

Posted 1 week ago

Apply

1.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Es Magico:
Es Magico is an AI-first enterprise transformation organisation that goes beyond consulting: we deliver scalable execution across sectors such as BFSI, Healthcare, Entertainment, and Education. With offices in Mumbai and Bengaluru, our mission is to augment the human workforce by deploying bespoke AI employees across business functions, innovating swiftly and executing with trust. We also partner with early-stage startups as a venture builder, transforming 0-to-1 ideas into AI-native, scalable products.

Role: MLOps Engineer
Location: Bengaluru (Hybrid)
Experience: 1-4 years
Joining: Immediate

Key Responsibilities:
- Design, develop, and maintain scalable ML pipelines for training, testing, and deployment.
- Automate model deployment, monitoring, and version control across dev/staging/prod environments.
- Integrate CI/CD pipelines for ML models using tools like MLflow, Kubeflow, Airflow, etc.
- Manage containerized workloads using Docker and orchestrate with Kubernetes or GKE.
- Collaborate closely with data scientists and product teams to optimize the ML model lifecycle.
- Monitor performance and reliability of deployed models and troubleshoot issues as needed.

Technical Skills:
- Experience with MLOps frameworks: MLflow, TFX, Kubeflow, or SageMaker Pipelines.
- Proficient in Python and common ML libraries (scikit-learn, pandas, etc.).
- Solid understanding of CI/CD practices and tools (e.g., GitHub Actions, Jenkins, Cloud Build).
- Familiar with Docker, Kubernetes, and Google Cloud Platform (GCP).
- Comfortable with data pipeline tools like Airflow, Prefect, or equivalent.

Preferred Qualifications:
- 1-4 years of experience in MLOps, ML engineering, or DevOps with ML workflows.
- Prior experience with model monitoring, drift detection, and automated retraining.
- Exposure to data versioning tools like DVC or Delta Lake is a plus.
- GCP certifications or working knowledge of Vertex AI is a strong advantage.

How to Apply:
Send your resume to [HIDDEN TEXT] with the subject line "Application: MLOps Engineer".
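
As an illustration of the MLOps workflow described above, here is a small, assumed sketch that tracks a training run with MLflow; the dataset and hyperparameters are arbitrary, not from the posting.

```python
# MLOps sketch: tracking a training run with MLflow (dataset and
# hyperparameters are illustrative).
# Requires: pip install mlflow scikit-learn
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    # Log parameters, metrics, and the model itself for versioned deployment.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("r2", r2_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")
```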

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Solution Architect & Technical Lead at RebusCode, you will play a crucial role in driving the design and architecture of our Big Data Analytics solutions within the Market Research industry. Your responsibilities will include providing technical leadership, ensuring governance, documenting solutions, and sharing knowledge effectively. Moreover, you will be actively involved in project management and ensuring timely delivery of projects.

To excel in this role, you should have a minimum of 5 years of experience in software development, of which at least 2 years should be in architecture or technical leadership positions. A proven track record of delivering enterprise-grade, cloud-native SaaS applications on Azure and/or GCP is essential for this role.

Your technical skills should encompass a wide range of areas:
- Cloud & Infrastructure: Azure App Services, Functions, Kubernetes; GKE, Cloud Functions; Service Bus, Pub/Sub; Blob Storage, Cloud Storage; Key Vault, Secret Manager; CDN
- Development Stack: C#/.NET 6/7/8, ASP.NET Core Web API, Docker, container orchestration
- Data & Integration: SQL Server, Oracle, Cosmos DB, Spanner, BigQuery, ETL patterns, message-based integration
- CI/CD & IaC: Azure DevOps, Cloud Build, GitHub Actions; ARM/Bicep, Terraform; container registries, automated testing
- Security & Compliance: TLS/SSL certificate management, API gateway policies, encryption standards
- Monitoring & Performance: Azure Application Insights, Log Analytics, Stackdriver, performance profiling, load testing tools

Nice-to-have qualifications include certifications such as Azure Solutions Architect Expert, Google Professional Cloud Architect, and PMP or PMI-ACP. Familiarity with front-end frameworks like Angular and React, as well as API client SDK generation, would be an added advantage. Prior experience in building low-code/no-code integration platforms or automation engines is also beneficial. Exposure to alternative clouds like AWS or on-prem virtualization platforms like VMware and OpenShift will be a plus.

Join us at RebusCode, where you will have the opportunity to work on cutting-edge Big Data Analytics solutions and contribute to the growth and success of our market research offerings.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). Your collaboration with development, operations, and security teams will ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and overcoming intricate infrastructure challenges, we welcome you to apply.

Your responsibilities will include:
- Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP utilizing Infrastructure as Code (IaC) tools like Terraform.
- Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components.
- Developing and maintaining CI/CD pipelines for automated deployment and release management using various tools.
- Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence.
- Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimization solutions.
- Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments.
- Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications.
- Participating in on-call rotations for critical incident response and timely resolution of production issues.
- Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures.
- Keeping up to date with new GCP services, features, and industry best practices to propose and implement improvements.
- Contributing to cost optimization efforts by identifying and implementing efficiencies in cloud resource utilization.

We require you to have:
- A Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server.
- 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications.
- 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code.
- Strong experience with JavaScript and a modern JavaScript framework, with VueJS preferred.
- Proven leadership and mentoring skills with development teams.
- Strong understanding of microservices architecture and serverless computing.
- Experience with relational databases like SQL Server and PostgreSQL.
- Excellent problem-solving, analytical, and communication skills, along with Agile/Scrum environment experience.

What can make you stand out:
- GCP Cloud Certification.
- UI development experience with HTML, JavaScript, Angular, and Bootstrap.
- Agile environment experience with Scrum, XP.
- Relational database experience with SQL Server, PostgreSQL.
- Proficiency in Atlassian tools like JIRA, Confluence, and GitHub.
- Working knowledge of Python, exceptional problem-solving and analytical abilities, and strong teamwork skills.
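
For illustration, here is a minimal Python sketch of one routine GCP task in such a role: uploading a build artifact to Cloud Storage with the official client library. The bucket and object names are placeholders.

```python
# Sketch: uploading a build artifact to Cloud Storage
# (bucket and object names are placeholders).
# Requires: pip install google-cloud-storage
from google.cloud import storage

client = storage.Client()  # uses Application Default Credentials
bucket = client.bucket("example-release-artifacts")
blob = bucket.blob("builds/app-1.0.0.zip")
blob.upload_from_filename("dist/app-1.0.0.zip")
print(f"uploaded to gs://{bucket.name}/{blob.name}")
```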

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As a GCP DevOps Lead, you will be responsible for leading the architecture and infrastructure strategy using Google Cloud Platform (GCP). Your role will involve designing, implementing, and managing CI/CD pipelines, infrastructure as code, and deployment automation to ensure high availability, scalability, and performance of cloud environments. You will guide a team of DevOps engineers in daily operations and project execution, while also implementing and maintaining monitoring, logging, and alerting frameworks such as Stackdriver, Prometheus, and Grafana. Driving security best practices, collaborating with cross-functional teams, and optimizing cost, resource usage, and performance in GCP will be key aspects of your responsibilities.

The ideal candidate should possess 7+ years of total experience in DevOps, Cloud, or Infrastructure roles, with at least 3 years of hands-on experience with Google Cloud Platform (GCP). Strong skills in CI/CD tools like Jenkins, GitLab CI/CD, or Cloud Build, along with proficiency in Docker and Kubernetes (GKE), are essential. Experience with Terraform, Ansible, or Deployment Manager, source control systems like Git and Bitbucket, scripting languages such as Python, Bash, or Go, and knowledge of networking components and monitoring tools are also required. An understanding of DevSecOps practices and security compliance standards will be beneficial in this role.

Posted 2 weeks ago

Apply

10.0 - 20.0 years

10 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills

Posted 2 weeks ago

Apply

10.0 - 18.0 years

12 - 22 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills

Posted 2 weeks ago

Apply

10.0 - 17.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills

Posted 2 weeks ago

Apply

10.0 - 15.0 years

10 - 19 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Senior Data Scientist in the Global Data Science & Advanced Analytics team at Colgate-Palmolive, your role will involve leading projects within the Analytics Continuum. You will be responsible for conceptualizing and developing machine learning, predictive modeling, simulations, and optimization solutions to address business questions with clear dollar objectives. Your work will have a significant impact on revenue growth management, price elasticity, promotion analytics, and marketing mix modeling.

Your responsibilities will include:
- Conceptualizing and building predictive modeling solutions to address business use cases
- Applying machine learning and AI algorithms to develop scalable solutions for business deployment
- Developing end-to-end business solutions from data extraction to statistical modeling
- Conducting model validations and continuous improvement of algorithms
- Deploying models using Airflow and Docker on Google Cloud Platform
- Leading pricing, promotion, and marketing mix initiatives from scoping to delivery
- Studying large datasets to discover trends and patterns
- Presenting insights in a clear and interpretable manner to business teams
- Developing visualizations using frameworks like Looker, PyDash, Flask, PlotLy, and Streamlit
- Collaborating closely with business partners across different geographies

To qualify for this position, you should have:
- A degree in Computer Science, Information Technology, Business Analytics, Data Science, Economics, or Statistics
- 5+ years of experience in building statistical models and deriving insights
- Proficiency in Python and SQL for coding and statistical modeling
- Hands-on experience with statistical models such as linear regression, random forest, SVM, logistic regression, clustering, and Bayesian regression
- Knowledge of GitHub, Airflow, and visualization frameworks
- Understanding of Google Cloud and related services like Kubernetes and Cloud Build

Preferred qualifications include experience with revenue growth management, pricing, marketing mix models, and third-party data. Knowledge of machine learning techniques and Google Cloud products will be advantageous for this role.

Colgate-Palmolive is committed to fostering an inclusive environment where diversity is valued, and every individual is treated with respect. As an Equal Opportunity Employer, we encourage applications from candidates with diverse backgrounds and perspectives. If you require accommodation during the application process due to a disability, please complete the request form provided. Join us in building a brighter, healthier future for all.
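
To make the price-elasticity work concrete, here is a tiny worked sketch: a log-log regression recovers a synthetic elasticity of -1.5. The data is simulated, not from any Colgate-Palmolive system.

```python
# Worked sketch: estimating price elasticity with a log-log regression,
# a standard technique in revenue growth management (data is synthetic).
# Requires: pip install numpy scikit-learn
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
price = rng.uniform(2.0, 6.0, size=500)
# Synthetic demand with a true elasticity of -1.5 plus noise.
units = np.exp(8.0 - 1.5 * np.log(price) + rng.normal(0, 0.1, size=500))

# In log-log form, the slope coefficient is the elasticity estimate.
model = LinearRegression().fit(np.log(price).reshape(-1, 1), np.log(units))
print(f"estimated elasticity: {model.coef_[0]:.2f}")  # close to -1.5
```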

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Senior Specialist in Global Data Science at Colgate-Palmolive, you will play a crucial role in the Global Data Science & Advanced Analytics vertical. This department focuses on working on business cases with significant financial impacts for the company, providing solutions to business questions, recommended actions, and scalability options across markets. Your position as a Data Scientist will involve leading projects within the Analytics Continuum, where you will be responsible for conceptualizing and developing machine learning, predictive modeling, simulations, and optimization solutions aimed at achieving clear financial objectives for Colgate-Palmolive.

Your responsibilities will include building predictive modeling solutions, applying ML and AI algorithms to analytics, developing end-to-end business solutions from data extraction to building business presentations, conducting model validations, and continuously improving algorithms. You will also deploy models using Airflow and Docker on Google Cloud Platform, own Pricing and Promotion and Marketing Mix projects, and present insights to business teams in an easily interpretable manner.

To qualify for this role, you should have a BE/BTech in Computer Science or Information Technology, an MBA or PGDM in Business Analytics or Data Science, additional certifications in Data Science, or an MSc/MStat in Economics or Statistics. You should have at least 5 years of experience in building statistical models, hands-on experience with coding languages such as Python and SQL, and knowledge of visualization frameworks like PyDash, Flask, and PlotLy. An understanding of cloud frameworks like Google Cloud and Snowflake is essential.

Preferred qualifications include experience in managing statistical models for Revenue Growth Management or Marketing Mix models, familiarity with third-party data sources, knowledge of machine learning techniques, and experience with Google Cloud products.

At Colgate-Palmolive, we are committed to fostering an inclusive environment where individuals with diverse backgrounds and perspectives can thrive. Our goal is to develop talent that best serves our consumers globally and ensure that everyone feels a sense of belonging within our organization. We are an Equal Opportunity Employer dedicated to empowering all individuals to contribute meaningfully to our business.

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer (ETL, Big Data, Hadoop, Spark, GCP) at Assistant Vice President level, located in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong understanding of crucial engineering principles within the bank, and to be skilled in root cause analysis through addressing enhancements and fixes in product reliability and resiliency. Working independently on medium to large projects with strict deadlines, you will collaborate in a cross-application technical environment, demonstrating a solid hands-on development track record within an agile methodology. Furthermore, this role involves collaborating with a globally dispersed team and is integral to the development of the Compliance tech internal team in India, delivering enhancements in compliance tech capabilities to meet regulatory commitments.

Your key responsibilities will include analyzing data sets, designing and coding stable and scalable data ingestion workflows, integrating them with existing workflows, and developing analytics algorithms on ingested data. You will also be working on data sourcing in Hadoop and GCP, owning unit testing, UAT deployment, end-user sign-off, and production go-live. Root cause analysis skills will be essential for identifying bugs and issues, and for supporting production support and release management teams. You will operate in an agile scrum team and ensure that new code is thoroughly tested at both unit and system levels.

To excel in this role, you should have over 10 years of coding experience with reputable organizations, hands-on experience with Bitbucket and CI/CD pipelines, and proficiency in Hadoop, Python, Spark, SQL, Unix, and Hive. A basic understanding of on-prem and GCP data security, as well as hands-on development experience with large ETL/big data systems (with GCP experience being a plus), is required. Familiarity with cloud services such as Cloud Build, Artifact Registry, Cloud DNS, and Cloud Load Balancing, along with Dataflow, Cloud Composer, Cloud Storage, and Dataproc, is expected. Additionally, knowledge of data quality dimensions and data visualization is beneficial.

You will receive comprehensive support, including training and development opportunities, coaching from experts in your team, and a culture of continuous learning to facilitate your career progression. The company fosters a collaborative and inclusive work environment, empowering employees to excel together every day. As part of Deutsche Bank Group, we encourage applications from all individuals and promote a positive and fair workplace culture. For further details about our company and teams, please visit our website: https://www.db.com/company/company.htm.

Posted 1 month ago

Apply

8.0 - 11.0 years

7 - 13 Lacs

Chennai

Hybrid

Position Description: We are seeking a highly experienced, talented, and motivated Software Development Engineer in Test (SDET) with a strong focus on end-to-end automation. In this role, you will be instrumental in ensuring the quality, reliability, and performance of our applications by designing, developing, and maintaining comprehensive automated test suites. You'll work closely with development, product, and other QA engineers to build robust test coverage across our UI and API layers, playing a key part in our continuous delivery pipeline. If you're passionate about breaking software, building scalable test frameworks, and love diving into code, this role is for you! This position requires you to be a subject matter expert in full-stack test automation, with a demonstrated ability to build and scale test automation scripts in complex, regulated environments.

Skills Required: Rest Assured, Appium, developing automation scripts, Cucumber, Git, Selenium, Java
Skills Preferred: Cloud Build, JMeter, AI, mobile automation, GCP

Experience Required (Must Have):
• 8+ years of direct, hands-on testing, QA, or automation experience in the finance domain
• Strong scripting and programming knowledge in languages such as Java, Python, JavaScript, or Groovy, with a proven ability to build robust, maintainable automation frameworks and scripts for complex financial applications
• Hands-on experience developing automation scripts for UI using frameworks/tools like Selenium, Appium, and Playwright; experience with BDD frameworks like Cucumber is required (experience with tools like Tosca is also valuable, but the focus is on code-based automation skills)
• Strong experience in API automation using tools/frameworks like SoapUI or Rest Assured, specifically for testing APIs and web services
• Experience designing and automating end-to-end user journeys that simulate real-world financial scenarios across multiple channels and system touchpoints
• Extensive experience with database testing and advanced SQL scripting for data validation and test data management
• Experience with GitHub for version control
• Very strong experience in designing, implementing, and maintaining CI/CD pipelines
• Working experience with mobile cloud platforms like HeadSpin or Perfecto for automated testing
• Experience using test management tools like Xray, TestRail, or ALM
• Experience with Jira
• Ability to work effectively in diversified global teams and projects, collaborating across different time zones and cultures
• Excellent communication, collaboration, and interpersonal skills

Nice to Have:
• Knowledge of performance testing concepts and tools (e.g., JMeter, LoadRunner)
• Exposure to Unix and Linux environments for managing test execution or environments
• Exposure to AI tools like GenAI
• Knowledge of current market trends in automation tools and frameworks, specifically in FinTech

Education Required: Bachelor's Degree

Additional Information:

End-to-End Automation Strategy & Standards:
• Establish, lead, and continuously refine the test automation strategy specifically for Ford Credit's applications and integrated financial products, ensuring rigorous quality standards aligned with business goals, regulatory requirements, and audit needs.
• Design and automate end-to-end user journeys that simulate real-world financial scenarios across multiple channels and system touchpoints.
• Define and implement comprehensive test automation standards, best practices, and guidelines tailored for testing complex, high-transaction financial systems.

Full Stack Automation Development:
• Design, develop, and maintain scalable, robust automated test suites covering the full application stack, including UI (web and desktop applications), APIs, and microservices.
• Develop and expand advanced test automation frameworks, modernizing them to align with DevOps principles and cloud-native architectures.

CI/CD Integration:
• Integrate automated tests into our Continuous Integration/Continuous Deployment (CI/CD) pipelines to enable frequent and reliable releases.

Comprehensive Testing & Validation:
• Build and execute a comprehensive automated testing strategy covering integration, regression, and end-to-end testing, with a strong emphasis on validating core financial applications and financial data accuracy.
• Conduct meticulous software testing, verification, and validation of changes, especially focusing on preventing defects and incidents that could impact financial operations or financial data integrity in production.
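
The posting's API automation stack (Rest Assured) is Java-based; as a language-neutral illustration of the same idea, here is a minimal pytest + requests contract test. The endpoint and response fields are hypothetical.

```python
# API contract-test sketch (the endpoint, URL, and schema are hypothetical).
# Requires: pip install pytest requests
import requests

BASE_URL = "https://api.example.com"  # placeholder service


def test_get_account_returns_expected_fields():
    resp = requests.get(f"{BASE_URL}/v1/accounts/123", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Validate contract fields critical to financial data integrity.
    assert {"account_id", "balance", "currency"} <= body.keys()
    assert isinstance(body["balance"], (int, float))
```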

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Description:

Cloud Infrastructure & Deployment:
- Design and implement secure, scalable, and highly available cloud infrastructure on GCP.
- Provision and manage compute, storage, network, and database services.
- Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.

Architecture & Design:
- Translate business requirements into scalable cloud solutions.
- Recommend GCP services aligned with application needs and cost optimization.
- Participate in high-level architecture and solution design discussions.

DevOps & Automation:
- Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI).
- Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite).
- Enable autoscaling, load balancing, and zero-downtime deployments.

Security & Compliance:
- Ensure compliance with security standards and best practices.

Migration & Optimization:
- Support cloud migration projects from on-premise or other cloud providers to GCP.
- Optimize performance, reliability, and cost of GCP workloads.

Documentation & Support:
- Maintain technical documentation and architecture diagrams.
- Provide L2/L3 support for GCP-based services and incidents.

Required Skills and Qualifications:
- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills

Posted 1 month ago

Apply

10.0 - 12.0 years

12 - 13 Lacs

Pune

Remote

Job Description:

Cloud Infrastructure & Deployment:
- Design and implement secure, scalable, and highly available cloud infrastructure on GCP.
- Provision and manage compute, storage, network, and database services.
- Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.

Architecture & Design:
- Translate business requirements into scalable cloud solutions.
- Recommend GCP services aligned with application needs and cost optimization.
- Participate in high-level architecture and solution design discussions.

Posted 1 month ago

Apply

10.0 - 18.0 years

12 - 22 Lacs

Bengaluru

Hybrid

GCP certified, with senior architect experience in design and architecture:
- GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Containers and orchestration
- Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Grafana or any monitoring tool

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: We are seeking a skilled and proactive GCP Cloud Engineer with 3-5 years of hands-on experience in managing and optimizing cloud infrastructure using Google Cloud Platform (GCP). The ideal candidate will be responsible for designing, deploying, and maintaining secure and scalable cloud environments, collaborating with cross-functional teams, and driving automation and reliability across our cloud infrastructure.

Key Responsibilities:
- Design and implement cloud-native solutions on Google Cloud Platform
- Deploy and manage infrastructure using Terraform, Cloud Deployment Manager, or similar IaC tools
- Manage GCP services such as Compute Engine, GKE (Kubernetes), Cloud Storage, Pub/Sub, Cloud Functions, BigQuery, etc.
- Optimize cloud performance, cost, and scalability
- Ensure security best practices and compliance across the GCP environment
- Monitor and troubleshoot issues using Stackdriver (Cloud Monitoring)
- Collaborate with development, DevOps, and security teams
- Automate workflows and CI/CD pipelines using tools like Jenkins, GitLab CI, or Cloud Build

Technical Requirements:
- 3-5 years of hands-on experience with GCP
- Strong expertise in Terraform, GCP networking, and cloud security
- Proficient in container orchestration using Kubernetes (GKE)
- Experience with CI/CD, DevOps practices, and shell scripting or Python
- Good understanding of IAM, VPCs, firewall rules, and service accounts
- Familiarity with monitoring/logging tools like Stackdriver or Prometheus
- Strong problem-solving and troubleshooting skills

Additional Qualifications:
- GCP Professional certification (e.g., Professional Cloud Architect, Cloud Engineer)
- Experience with hybrid-cloud or multi-cloud architecture
- Exposure to other cloud platforms (AWS, Azure) is a plus
- Strong communication and teamwork skills

Preferred Skills: Google Cloud Platform (GCP), Java, Spring Boot, .NET, Python

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role & responsibilities

Key Skills:
- 3 years of experience building modern applications utilizing GCP services like Cloud Build, Cloud Functions/Cloud Run, GKE, Logging, GCS, Cloud SQL, and IAM.
- Primary proficiency in Python and experience with a secondary language such as Golang or Java.
- In-depth knowledge of and hands-on experience with GKE/Kubernetes.
- A high emphasis on software engineering fundamentals such as code and configuration management, CI/CD automation, and automated testing.
- Working with operations, security, compliance, and architecture groups to develop secure, scalable, and supportable solutions.
- Working on and delivering solutions in a complex enterprise environment.
- Proficiency in designing and developing scalable, decoupled microservices, and adeptness in implementing event-driven architecture to ensure seamless and responsive service interactions.
- Proficiency in designing scalable and robust solutions leveraging cloud-native technologies and architectures.
- Expertise in managing diverse stakeholder expectations and adeptness at prioritizing tasks to align with strategic objectives and deliver optimal outcomes.

Good-to-have knowledge, skills, and experience:
- Ability to integrate Kafka to handle real-time data (see the sketch below).
- Proficiency in monitoring tools.
- Experience using Robot Framework for automated UAT is highly desirable.
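
A minimal, assumed sketch of the Kafka integration mentioned in the good-to-have list, using the kafka-python client; the broker address and topic are placeholders.

```python
# Sketch: publishing real-time events to Kafka from a Python microservice
# (broker address and topic are placeholders).
# Requires: pip install kafka-python
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

producer.send("payments", {"payment_id": 42, "status": "settled"})
producer.flush()  # block until buffered records are delivered
```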

Posted 1 month ago

Apply

2.0 - 3.0 years

4 - 7 Lacs

Hyderabad, Gachibowli

Work from Office

Job Summary:
Synechron is seeking a highly motivated and skilled Senior Cloud Data Engineer (GCP) to join our cloud solutions team. In this role, you will collaborate closely with clients and internal stakeholders to design, implement, and manage scalable, secure, and high-performance cloud-based data solutions on Google Cloud Platform (GCP). You will leverage your technical expertise to ensure the integrity, security, and efficiency of cloud data architectures, enabling the organization to derive maximum value from cloud data assets. This role contributes directly to our mission of delivering innovative digital transformation solutions and supports the organization's strategic objectives of scalable and sustainable cloud infrastructure.

Software Requirements:
Required Skills:
- Proficiency with Google Cloud Platform (GCP) services (Compute Engine, Cloud Storage, BigQuery, Cloud Pub/Sub, Dataflow, etc.)
- Basic scripting skills with Python, Bash, or similar languages
- Familiarity with virtualization and cloud networking concepts
- Understanding of cloud security best practices and compliance standards
- Experience with infrastructure as code tools (e.g., Terraform, Deployment Manager)
- Strong knowledge of data management, data pipelines, and ETL processes

Preferred Skills:
- Experience with other cloud platforms (AWS, Azure)
- Knowledge of SQL and NoSQL databases
- Familiarity with containerization (Docker, GKE)
- Experience with data visualization tools

Overall Responsibilities:
- Design, implement, and operate cloud data solutions that are secure, scalable, and optimized for performance
- Collaborate with clients and internal teams to identify infrastructure and data architecture requirements
- Manage and monitor cloud infrastructure and ensure operational reliability
- Resolve technical issues related to cloud data workflows and storage solutions
- Participate in project planning, timelines, and technical documentation
- Contribute to best practices and continuous improvement initiatives within the organization
- Educate and support clients in adopting cloud data services and best practices

Technical Skills (by Category):
- Programming Languages: Essential: Python, Bash scripts; Preferred: SQL, Java, or other data processing languages
- Databases & Data Management: Essential: BigQuery, Cloud SQL, Cloud Spanner, Cloud Storage; Preferred: NoSQL databases like Firestore, MongoDB
- Cloud Technologies: Essential: Google Cloud Platform core services (Compute, Storage, BigQuery, Dataflow, Pub/Sub); Preferred: cloud monitoring, logging, and security tools
- Frameworks & Libraries: Essential: data pipeline frameworks, Cloud SDKs, APIs; Preferred: Apache Beam, Data Studio
- Development Tools & Methodologies: Essential: Infrastructure as Code (Terraform, Deployment Manager); Preferred: CI/CD tools (Jenkins, Cloud Build)
- Security Protocols: Essential: IAM policies, data encryption, network security best practices; Preferred: compliance frameworks such as GDPR, HIPAA

Experience Requirements:
- 2-3 years of experience in cloud data engineering, cloud infrastructure, or related roles
- Hands-on experience with GCP is preferred; experience with AWS or Azure is a plus
- Background in designing and managing cloud data pipelines, storage, and security solutions
- Proven ability to deliver scalable data solutions in cloud environments
- Experience working with cross-functional teams on cloud deployments
- Alternative experience pathways: academic projects, certifications, or relevant internships demonstrating cloud data skills

Day-to-Day Activities:
- Develop and deploy cloud data pipelines, databases, and analytics solutions
- Collaborate with clients and team members to plan and implement infrastructure architecture
- Perform routine monitoring, maintenance, and performance tuning of cloud data systems
- Troubleshoot technical issues affecting data workflows and resolve performance bottlenecks
- Document system configurations, processes, and best practices
- Engage in continuous learning on new cloud features and data management tools
- Participate in project meetings, code reviews, and knowledge-sharing sessions

Qualifications:
- Bachelor's or Master's degree in computer science, engineering, information technology, or a related field
- Relevant certifications (e.g., Google Cloud Professional Data Engineer, Cloud Architect) are preferred
- Training in cloud security, data management, or infrastructure design is advantageous
- Commitment to professional development and staying updated with emerging cloud technologies

Professional Competencies:
- Critical thinking and problem-solving skills to resolve complex cloud architecture challenges
- Ability to work collaboratively with multidisciplinary teams and clients
- Strong communication skills for technical documentation and stakeholder engagement
- Adaptability to evolving cloud technologies and project priorities
- Organized, with a focus on quality and detail-oriented delivery
- Proactive learner with a passion for innovation in cloud data solutions
- Ability to manage multiple tasks effectively and prioritize in a fast-paced environment
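
As a small illustration of the BigQuery work this role involves, here is an assumed sketch running a parameterized query with the Python client; the project, dataset, and table names are placeholders.

```python
# Sketch: running a parameterized BigQuery query from Python
# (project, dataset, and table names are placeholders).
# Requires: pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("min_amount", "FLOAT64", 100.0)
    ]
)
sql = """
    SELECT customer_id, SUM(amount) AS total
    FROM `example-project.sales.orders`
    WHERE amount >= @min_amount
    GROUP BY customer_id
"""
for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total)
```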

Posted 2 months ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Key Responsibilities:

1. ETL Pipeline Development:
- Design, develop, and maintain scalable ETL processes to extract, transform, and load data from various structured and unstructured sources into GCP-based data warehouses (BigQuery, Cloud SQL, Cloud Storage, etc.).
- Develop efficient SQL queries and scripts to support data transformation, aggregation, and validation.
- Optimize ETL workflows to ensure low-latency data processing and high performance.

2. Google Cloud Dataform & Data Transformation:
- Utilize Google Cloud Dataform to implement SQL-based data transformations in BigQuery following best practices in data modeling, version control, and dependency management.
- Develop modular SQL workflows using Dataform to simplify transformation logic and enhance reusability.
- Integrate Dataform into existing ETL/ELT pipelines to streamline data engineering and analytics workflows.
- Leverage Dataform's automated testing, scheduling, and Git-based version control for collaborative development and data quality assurance.

3. Data Integration & Management:
- Work with diverse data sources (databases, APIs, streaming data, and cloud storage) to integrate data into centralized repositories.
- Ensure data consistency, integrity, and accuracy through rigorous testing and validation.
- Implement incremental data loads, change data capture (CDC), and batch/real-time ETL strategies.
- Leverage GCP services like Dataflow, Dataproc, Cloud Functions, and Pub/Sub to handle data ingestion and transformation.

4. Database & SQL Development:
- Write complex SQL queries, stored procedures, and functions to support analytical and operational data needs.
- Optimize SQL queries for performance tuning and cost efficiency in BigQuery, Cloud SQL, and other relational databases.
- Ensure proper indexing, partitioning, and clustering strategies for optimal query performance.

5. Cloud & DevOps Integration:
- Deploy and monitor ETL workflows using GCP-native tools (Cloud Composer/Airflow, Dataform, Dataflow, Dataprep, etc.).
- Implement CI/CD pipelines for ETL jobs using Terraform, Cloud Build, GitHub Actions, or Jenkins.
- Work with infrastructure and DevOps teams to ensure secure and reliable deployment of ETL solutions in a cloud environment.

6. Data Quality & Governance:
- Implement data validation, data cleansing, and error-handling mechanisms in ETL pipelines.
- Monitor data pipeline performance and ensure timely resolution of issues and failures.
- Work with stakeholders to define data governance policies, metadata management, and access controls.

7. Documentation & Collaboration:
- Maintain comprehensive documentation for ETL workflows, data transformations, and technical design.
- Collaborate with data engineers, data analysts, and business teams to understand data needs and optimize data processing workflows.
- Conduct code reviews and provide mentorship to junior developers when necessary.

Required Skills & Qualifications:

1. Technical Skills:
- ETL Development: Hands-on experience in designing and implementing ETL pipelines. Proficiency in ETL tools such as Apache Airflow (Cloud Composer), Dataflow, or Informatica.
- SQL & Database Management: Strong expertise in SQL (DDL, DML, performance tuning, indexing, partitioning, stored procedures, etc.). Experience working with relational (Cloud SQL, PostgreSQL, MySQL) and NoSQL databases (Bigtable, Firestore, MongoDB, etc.).
- Cloud (GCP) Expertise: Strong hands-on experience with Google Cloud Platform (GCP) services: BigQuery (data warehousing and analytics), Cloud Storage (data lake storage), Cloud Composer/Apache Airflow (workflow orchestration), Cloud Functions (serverless ETL tasks), Cloud Dataflow (Apache Beam-based data processing), Pub/Sub (real-time streaming), Dataproc (Hadoop/Spark-based processing), and Google Cloud Dataform (SQL-based transformations for BigQuery).
- Programming & Scripting: Experience with Python, SQL scripting, and shell scripting for ETL automation. Knowledge of PySpark or Apache Beam is a plus.
- CI/CD & DevOps: Experience in deploying ETL workflows using Terraform, Cloud Build, or Jenkins. Familiarity with Git/GitHub for version control.
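
To ground the Cloud Composer / Airflow orchestration described above, here is a minimal, assumed sketch of a daily ETL DAG; the DAG ID and task logic are placeholders.

```python
# Sketch of a daily ETL DAG for Apache Airflow / Cloud Composer
# (the DAG ID and task bodies are placeholders).
# Requires: pip install apache-airflow
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder: pull from a source system and load into a staging table.
    print("extract + load step")


def transform():
    # Placeholder: run SQL/Dataform transformations on the staged data.
    print("transform step")


with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
    xform = PythonOperator(task_id="transform", python_callable=transform)
    load >> xform  # transform runs only after the load succeeds
```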

Posted 2 months ago

Apply