
279 Cloud SQL Jobs - Page 6

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

Job Title: Senior Engineer
Location: Pune, India
Corporate Title: AVP

Role Description
Investment Banking is a technology-centric business, with an increasing move to real-time processing and a growing appetite from customers for integrated systems and access to supporting data. This means that technology is more important than ever for the business. The IB CARE Platform aims to increase the productivity of both Google Cloud and on-prem application development by providing a frictionless build and deployment platform that offers service and data reusability. The platform provides the chassis and standard components of an application, ensuring reliability, usability and safety, and gives on-demand access to the services needed to build, host and manage applications on the cloud or on-prem. In addition to technology services, the platform aims to have compliance baked in, enforcing controls and security, reducing application-team involvement in SDLC and ORR controls, and enabling teams to focus on application development and release to production faster. We are looking for a platform engineer to join a global team working across all aspects of the platform, from GCP/on-prem infrastructure and application deployment through to the development of CARE-based services. Deutsche Bank is one of the few banks with the scale and network to compete aggressively in this space, and the breadth of investment in this area is unmatched by our peers. Joining the team is a unique opportunity to help build a platform that supports some of our most mission-critical processing systems.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
As a CARE platform engineer you will work across the board on activities to build and support the platform, and liaise with tenants.
To be successful in this role, the key responsibility areas are:
- Manage and monitor cloud computing systems and provide technical support to ensure the systems' efficiency and security.
- Work with platform leads and platform engineers at a technical level.
- Liaise with tenants regarding onboarding and provide platform expertise.
- Contribute to the platform offering as part of sprint deliverables.
- Support the production platform as part of the wider team.

Your skills and experience
- Understanding of GCP and services such as GKE, IAM, identity services and Cloud SQL.
- Kubernetes/service mesh configuration.
- Experience with IaC tooling such as Terraform.
- Proficiency in SDLC/DevOps best practices; GitHub experience, including Git workflow.
- Exposure to modern deployment tooling, such as ArgoCD, desirable.
- Programming experience (such as Java/Python) desirable.
- A strong team player comfortable in a cross-cultural and diverse operating environment.
- Results-oriented, with the ability to deliver under tight timelines and to resolve conflicts in a globally matrixed organization.
- Excellent communication and collaboration skills; comfortable navigating ambiguity to extract meaningful risk insights.

How we'll support you
About us and our teams: please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

7.0 - 12.0 years

45 - 55 Lacs

Bengaluru

Work from Office

Job Title: Lead Solution Architect, VP
Location: Bangalore, India

Role Description
We are seeking a highly skilled and experienced Solution Architect to join the CM Tech team that owns cRDS, the firmwide golden reference data source. As a Solution Architect, you will play a pivotal role in shaping the future of the CM Tech architecture, leading the development of innovative technical solutions, and contributing to the strategic direction of the application. You will be responsible for defining, documenting, and implementing the overall architecture of cRDS and other client onboarding applications, ensuring its scalability, performance, and security while aligning with business requirements and industry best practices.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Define the CMT contribution to RCP solutions; scope solutions to existing and new CMT components.
- Capture and document assumptions made in lieu of requirements/information, for the PO to risk-accept.
- Define high-level data entities and functional decomposition.
- Support component guardians in aligning component roadmaps to product strategy and initiative demand.
- Work with the CTO function to define and document CMT and non-CMT component interactions and interaction contracts, for refinement by engineering teams.
- Identify problems and opportunities, form the business case, and propose solutions.
- Define phased transitions from current state to target state.
- Ensure non-functional requirements are considered, including projections, and that authentication and authorisation are addressed.
- Ensure the solution design is suitable for build estimation and for deriving groomable Jiras.
- Provide guardrails on which requirements a component should and should not cover; act as a point of escalation.
- Hands-on software development; knowledge of solution design and architecture.
- Experience in Agile and Scrum delivery; able to contribute to good software design.
- Participate in daily stand-up meetings.
- Strong communication with stakeholders; articulate issues and risks to management in a timely manner.
- Train and mentor junior team members to bring them up to speed.

Your skills and experience
Must have (strong technical knowledge required):
- 7+ years of experience designing and implementing complex enterprise-scale applications.
- Proven experience designing and implementing microservices architectures; deep understanding of distributed systems and cloud-native technologies.
- Experience with architectural patterns such as event-driven architectures, API gateways, and message queues.
- Strong understanding of Java core concepts, design patterns, and best practices.
- Experience with the Spring Boot framework, including dependency injection, Spring Data, and Spring Security.
- Hands-on experience with a BPM tool (Camunda preferred), including process modeling, workflow automation, and integration with backend systems.
- Experience with Google Cloud Platform, including services like Cloud Run, Cloud SQL, and Cloud Storage, desirable.
- Experience with containerization technologies like Docker and Kubernetes.
- Strong SQL knowledge and experience with advanced database concepts, including relational database design, query optimization, and transaction management.
- Experience with version control systems like Git and collaborative development tools like Jira and Confluence.
- Excellent communication and presentation skills, with the ability to convey complex technical concepts to both technical and non-technical audiences.
- Strong problem-solving skills, with the ability to analyze complex business problems and propose innovative technical solutions.
- Experience collaborating with stakeholders, understanding their needs, and translating them into technical solutions.
- Technical leadership skills and experience mentoring junior engineers.

Nice to have:
- Experience with cloud technologies such as Docker, Kubernetes, OpenShift, Azure, AWS, GCP.
- Additional languages such as Kotlin, Scala and Python.
- Experience with big data / streaming technologies.
- Experience with end-to-end design and delivery of solutions.
- Experience with UI frameworks like Angular or React.
- RDBMS/Oracle design, development and tuning.
- Sun/Oracle or architecture-specific certifications.

How we'll support you
About us and our teams: please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
- Multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc.
- Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and to interface directly with customers.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered, including the purpose/KPIs for which each data transformation was done.

Preferred technical and professional experience
- Experience with AEM core technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices for designing and developing quality, clean code; knowledge of HTML, CSS, JavaScript and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.

Posted 2 months ago

Apply

8.0 - 13.0 years

15 - 20 Lacs

Bengaluru, Karnataka, India

On-site

Srs Business Solutions India is looking for an experienced GCP Cloud Engineer to join our team in Bangalore. We need a professional with extensive experience building modern cloud applications on Google Cloud Platform and a strong foundation in software engineering best practices. This is an excellent opportunity for immediate joiners who are ready to make an impact.

Key Responsibilities
- Build modern applications primarily using a range of GCP services, including Cloud Build, Cloud Functions / Cloud Run, Google Kubernetes Engine (GKE), Logging, Google Cloud Storage (GCS), Cloud SQL, and Identity and Access Management (IAM).
- Demonstrate in-depth knowledge and hands-on experience with GKE/Kubernetes.
- Apply strong software engineering fundamentals, including code and configuration management, CI/CD (continuous integration / continuous delivery) and automation, and automated testing.
- Collaborate effectively with operations, security, compliance, and architecture groups to develop secure, scalable, and supportable cloud solutions.

Skills & Qualifications
- Experience: 8+ years overall, with at least 3 years specifically building modern applications using GCP services.
- Primary proficiency in Python, with experience in a secondary language such as Golang or Java.
- High emphasis on software engineering fundamentals; ability to work effectively with various cross-functional teams.

Note: We are specifically seeking immediate joiners (candidates whose notice period is served or who are currently serving their notice period). Please apply only if you meet this criterion.

Posted 2 months ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

GCP Engineer

A GCP developer should have expertise in components such as Cloud Scheduler, Dataflow, BigQuery, Pub/Sub and Cloud SQL.
- Good understanding of the GCP cloud environment/services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects.
- Knowledge of Java / Java frameworks; has leveraged/worked with technology areas like Spring Boot, Spring Batch, Spring Cloud, etc.
- Experience with API and microservice design principles, applied in actual project implementations for integration.
- Deep understanding of architecture and design patterns; knowledge of implementing event-driven architecture, data integration, event-streaming architecture, and API-driven architecture.
- Well versed in DevOps principles, with working experience in Docker/containerization.
- Experience in solutioning and execution of IaaS-, PaaS-, and SaaS-based deployments.
- Conceptual thinking to create 'out of the box' solutions.
- Good communication; able to work with both the customer and the development team to deliver an outcome.

Mandatory Skills: App-Cloud-Google. Experience: 5-8 years.
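The event-driven pattern this role calls for can be sketched with a minimal in-process publish/subscribe broker. This is illustrative only: the `Broker` class and topic names are invented for this sketch, and GCP Pub/Sub itself is a managed, networked service rather than an in-memory one.

```python
from collections import defaultdict
from typing import Any, Callable

class Broker:
    """Minimal in-process stand-in for a Pub/Sub-style message broker."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: Any) -> None:
        # Each subscriber reacts independently; the publisher knows nothing about them.
        for handler in self._subscribers[topic]:
            handler(event)

broker = Broker()
received: list[int] = []
broker.subscribe("orders.created", lambda e: received.append(e["id"]))
broker.publish("orders.created", {"id": 42, "amount": 99.0})
```

The decoupling shown here (publishers and subscribers share only a topic name) is what lets event-driven systems add new consumers without touching producers.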

Posted 2 months ago

Apply

8.0 - 10.0 years

14 - 18 Lacs

Chennai

Work from Office

GCP Architect

A seasoned architect with a minimum of 12+ years of experience designing medium- to large-scale application-to-application integration solutions leveraging APIs, APIMs, ESBs, and product-based hybrid implementations.
- Good understanding of the GCP cloud environment/services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects.
- Experience/exposure to OpenShift and PCF on GCP, and DevSecOps, will be an added advantage.
- Ability to make critical solution design decisions.
- Knowledge of Java / Java frameworks; has leveraged/worked with technology areas like Spring Boot, Spring Batch, Spring Cloud, etc.
- Experience with API and microservice design principles, applied in actual project implementations for integration.
- Deep understanding of architecture and design patterns; knowledge of implementing event-driven architecture, data integration, event-streaming architecture, and API-driven architecture.
- Has designed integration platforms to meet NFR requirements, and implemented design patterns such as integrating with multiple COTS applications and with multiple databases (SQL-based as well as NoSQL).
- Has worked with multiple teams to gather integration requirements, create integration specification documents, map specifications, write high-level and detailed designs, and guide the technical team through design and implementation.
- Well versed in DevOps principles, with working experience in Docker/containerization.
- Experience in solutioning and execution of IaaS-, PaaS-, and SaaS-based deployments.
- Conceptual thinking to create 'out of the box' solutions.
- Good communication; able to work with both the customer and the development team to deliver an outcome.

Mandatory Skills: App-Cloud-Google. Experience: 8-10 years.

Posted 2 months ago

Apply

10.0 - 20.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description:

Cloud Infrastructure & Deployment
- Design and implement secure, scalable, and highly available cloud infrastructure on GCP.
- Provision and manage compute, storage, network, and database services.
- Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.

Architecture & Design
- Translate business requirements into scalable cloud solutions.
- Recommend GCP services aligned with application needs and cost optimization.
- Participate in high-level architecture and solution design discussions.

DevOps & Automation
- Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI).
- Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite).
- Enable autoscaling, load balancing, and zero-downtime deployments.

Security & Compliance
- Ensure compliance with security standards and best practices.

Migration & Optimization
- Support cloud migration projects from on-premises or other cloud providers to GCP.
- Optimize performance, reliability, and cost of GCP workloads.

Documentation & Support
- Maintain technical documentation and architecture diagrams.
- Provide L2/L3 support for GCP-based services and incidents.

Required Skills and Qualifications:
- Google Cloud certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer.
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.).
- Strong command of Linux, shell scripting, and networking fundamentals.
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools.
- Experience with containers and orchestration: Docker, Kubernetes (GKE).
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana.
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity.

Posted 2 months ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Gurugram

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
- Multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc.
- Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and to interface directly with customers.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered, including the purpose/KPIs for which each data transformation was done.

Preferred technical and professional experience
- Experience with AEM core technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices for designing and developing quality, clean code; knowledge of HTML, CSS, JavaScript and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.

Posted 2 months ago

Apply

3.0 - 8.0 years

6 - 11 Lacs

Bengaluru, Karnataka, India

On-site

7+ years of experience in SQL development and database management.
- Own and drive the end-to-end data migration process from mainframe DB2 to GCP.
- Analyze existing DB2 data structures, stored procedures, and ETL processes.
- Design and implement scalable, secure, and efficient data models (BigQuery, Cloud SQL, etc.).
- Develop and optimize SQL scripts for data extraction, transformation, and loading (ETL).
- Develop, test, and optimize SQL queries for performance, scalability, and maintainability.
- Collaborate with infrastructure, cloud engineering, and business teams to ensure data integrity and performance.
- Monitor and troubleshoot data migration pipelines and resolve data quality issues.
- Proven experience in cloud data migration projects; document data mappings and technical specifications.
- Strong understanding of query execution plans, indexing, and optimization techniques.
- Ensure compliance with data governance, security, and privacy standards.
- Excellent analytical, communication, and documentation skills.
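The extract-transform-load work described above can be illustrated with a small, self-contained SQL sketch. SQLite stands in here purely for the DB2 source and Cloud SQL target, and the table and column names are invented: raw text extracted from the source is cast and cleaned on its way into the target model.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_accounts (acct_no TEXT, balance TEXT);      -- raw extract
    CREATE TABLE accounts (acct_no TEXT PRIMARY KEY, balance REAL);  -- target model
    INSERT INTO staging_accounts VALUES ('A1', ' 100.50 '), ('A2', '250');
""")

# Transform + load: trim whitespace and cast text balances to numeric.
conn.execute("""
    INSERT INTO accounts (acct_no, balance)
    SELECT acct_no, CAST(TRIM(balance) AS REAL) FROM staging_accounts
""")

rows = conn.execute("SELECT acct_no, balance FROM accounts ORDER BY acct_no").fetchall()
```

Keeping a raw staging table separate from the typed target table, as above, is what makes data-quality issues observable and re-runnable during a migration.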

Posted 2 months ago

Apply

3.0 - 8.0 years

6 - 11 Lacs

Chennai, Tamil Nadu, India

On-site

7+ years of experience in SQL development and database management.
- Own and drive the end-to-end data migration process from mainframe DB2 to GCP.
- Analyze existing DB2 data structures, stored procedures, and ETL processes.
- Design and implement scalable, secure, and efficient data models (BigQuery, Cloud SQL, etc.).
- Develop and optimize SQL scripts for data extraction, transformation, and loading (ETL).
- Develop, test, and optimize SQL queries for performance, scalability, and maintainability.
- Collaborate with infrastructure, cloud engineering, and business teams to ensure data integrity and performance.
- Monitor and troubleshoot data migration pipelines and resolve data quality issues.
- Proven experience in cloud data migration projects; document data mappings and technical specifications.
- Strong understanding of query execution plans, indexing, and optimization techniques.
- Ensure compliance with data governance, security, and privacy standards.
- Excellent analytical, communication, and documentation skills.

Posted 2 months ago

Apply

3.0 - 8.0 years

6 - 11 Lacs

Hyderabad, Telangana, India

On-site

7+ years of experience in SQL development and database management.
- Own and drive the end-to-end data migration process from mainframe DB2 to GCP.
- Analyze existing DB2 data structures, stored procedures, and ETL processes.
- Design and implement scalable, secure, and efficient data models (BigQuery, Cloud SQL, etc.).
- Develop and optimize SQL scripts for data extraction, transformation, and loading (ETL).
- Develop, test, and optimize SQL queries for performance, scalability, and maintainability.
- Collaborate with infrastructure, cloud engineering, and business teams to ensure data integrity and performance.
- Monitor and troubleshoot data migration pipelines and resolve data quality issues.
- Proven experience in cloud data migration projects; document data mappings and technical specifications.
- Strong understanding of query execution plans, indexing, and optimization techniques.
- Ensure compliance with data governance, security, and privacy standards.
- Excellent analytical, communication, and documentation skills.

Posted 2 months ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA | Experience: 5 to 8 years | Location: Gurgaon (Hybrid) | Notice: immediate to 30 days

Roles and Responsibilities
- Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
- Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases.
- Troubleshoot issues related to data processing workflows and provide timely resolutions.

Desired Candidate Profile
- 5-9 years of experience in data engineering, with expertise in GCP and BigQuery data engineering.
- Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc.
- Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.

Posted 2 months ago

Apply

10.0 - 20.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description:

Cloud Infrastructure & Deployment
- Design and implement secure, scalable, and highly available cloud infrastructure on GCP.
- Provision and manage compute, storage, network, and database services.
- Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.

Architecture & Design
- Translate business requirements into scalable cloud solutions.
- Recommend GCP services aligned with application needs and cost optimization.
- Participate in high-level architecture and solution design discussions.

DevOps & Automation
- Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI).
- Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite).
- Enable autoscaling, load balancing, and zero-downtime deployments.

Security & Compliance
- Ensure compliance with security standards and best practices.

Migration & Optimization
- Support cloud migration projects from on-premises or other cloud providers to GCP.
- Optimize performance, reliability, and cost of GCP workloads.

Documentation & Support
- Maintain technical documentation and architecture diagrams.
- Provide L2/L3 support for GCP-based services and incidents.

Required Skills and Qualifications:
- Google Cloud certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer.
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.).
- Strong command of Linux, shell scripting, and networking fundamentals.
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools.
- Experience with containers and orchestration: Docker, Kubernetes (GKE).
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana.
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity.

Posted 2 months ago

Apply

5.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Educational qualifications: Bachelor of Engineering, BTech, Bachelor of Technology, BCA, BSc, MTech, MCA. Service line: Application Development and Maintenance.

Responsibilities
A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs to the solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC (Proof of Technology) workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Preferred locations: Bangalore, Hyderabad, Chennai, Pune.
- Experience required, 3 to 5 years: pure hands-on expertise in the skill; able to deliver without any support.
- Experience required, 5 to 9 years: design knowledge, estimation techniques; leading and guiding the team on the technical solution.
- Experience required, 9 to 13 years: architecture, solutioning, and (optionally) proposals.
- Containerization and microservice development on AWS/Azure/GCP is preferred.
- In-depth knowledge of design issues and best practices; solid understanding of object-oriented programming; familiarity with various design and architectural patterns and the software development process.
- Implementing automated testing platforms and unit tests.
- Strong experience building and developing applications using technologies like Python.
- Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs and microservices.
- Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP.
- Good understanding of application development design patterns.

Technical and Professional Requirements: Primary skill: Python. Secondary skills: AWS/Azure/GCP. Preferred skills: Technology-Machine Learning-Python. Generic skills: Technology-Cloud Platform-AWS App Development; Technology-Cloud Platform-Azure Development & Solution Architecting; Technology-Cloud Platform-GCP DevOps.

Posted 2 months ago

Apply

9.0 - 14.0 years

22 - 37 Lacs

Pune

Hybrid

We're hiring: Senior GCP Data Engineer (L4) for a client (immediate joiners only)
Location: Pune | Walk-in drive: 5th July 2025

Are you a seasoned data engineer with 9-12 years of experience and a passion for building scalable data solutions on Google Cloud Platform? Join us for an exciting walk-in opportunity!

Key Skills Required
- GCP data engineering, BigQuery, SQL
- Python (Cloud Composer, Cloud Functions, Python injection)
- Dataproc + PySpark, Dataflow + Pub/Sub
- Apache Beam, Spark, Hadoop

What You'll Do
- Architect and implement end-to-end data pipelines on GCP
- Work with BigQuery, Bigtable, Cloud Storage, Spanner, and more
- Automate data ingestion, transformation, and augmentation
- Ensure data quality and compliance across systems
- Collaborate in a fast-paced, dynamic environment

Bonus Points
- Google Professional Data Engineer or Solution Architect certification
- Experience with SnapLogic, Cloud Dataprep
- Strong SQL and data integration expertise

If interested, please share your CV at Raveena.kalra@in.ey.com
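The ingest → transform → aggregate shape of such end-to-end pipelines can be sketched with plain Python generators; on Dataflow the same stages would become Apache Beam transforms. The event records and stage names below are invented for illustration.

```python
from collections import Counter
from typing import Iterable, Iterator

def ingest() -> Iterator[dict]:
    # Stand-in for reading events from Pub/Sub or Cloud Storage.
    yield from [
        {"user": "a", "ok": True},
        {"user": "b", "ok": False},
        {"user": "a", "ok": True},
    ]

def transform(events: Iterable[dict]) -> Iterator[str]:
    # Filter bad records and project only the field downstream stages need
    # (the moral equivalent of a Beam Filter + Map).
    for e in events:
        if e["ok"]:
            yield e["user"]

def aggregate(users: Iterable[str]) -> Counter:
    # The moral equivalent of a Beam Count.PerElement / GroupByKey stage.
    return Counter(users)

counts = aggregate(transform(ingest()))
```

Because each stage consumes and yields elements lazily, the structure mirrors how Beam streams records through a pipeline rather than materializing intermediate datasets.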

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

About the Role: We are seeking a skilled and detail-oriented Data Engineer with deep expertise in PostgreSQL and SQL to design, maintain, and optimize our database systems. As a key member of our data infrastructure team, you will work closely with developers, DevOps, and analysts to ensure the data integrity, performance, and scalability of our applications.

Key Responsibilities:
- Design, implement, and maintain PostgreSQL database systems for high availability and performance.
- Write efficient, well-documented SQL queries, stored procedures, and database functions.
- Analyze and optimize slow-performing queries and database structures.
- Collaborate with software engineers to support schema design, indexing, and query optimization.
- Perform database migrations, backup strategies, and disaster recovery planning.
- Ensure data security and compliance with internal and regulatory standards.
- Monitor database performance and proactively address bottlenecks and anomalies.
- Automate routine database tasks using scripts and monitoring tools.
- Contribute to data modeling and architecture discussions for new and existing systems.
- Support ETL pipelines and data integration processes as needed.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5 years of professional experience in a database engineering role.
- Proven expertise with PostgreSQL (version 12+ preferred).
- Strong SQL skills with the ability to write complex queries and optimize them.
- Experience with performance tuning, indexing, query plans, and execution analysis.
- Familiarity with database design best practices and normalization techniques.
- Solid understanding of ACID principles and transaction management.

Preferred Qualifications:
- Experience with cloud platforms (e.g., AWS RDS, GCP Cloud SQL, or Azure PostgreSQL).
- Familiarity with other database technologies (e.g., MySQL, NoSQL, MongoDB, Redis).
- Knowledge of scripting languages (e.g., Python, Bash) for automation.
- Experience with monitoring tools (e.g., pgBadger, pg_stat_statements, Prometheus/Grafana).
- Understanding of CI/CD processes and infrastructure as code (e.g., Terraform).
- Exposure to data warehousing or analytics platforms (e.g., Redshift, BigQuery).
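The query-optimization work described in this role usually starts with inspecting a query plan before and after adding an index. A minimal sketch using Python's stdlib sqlite3 as a stand-in for PostgreSQL (on Postgres the equivalent tool is EXPLAIN / EXPLAIN ANALYZE); the table and column names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index, the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# An index on the filtered column lets the planner do an index search instead.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print(plan_before[0][-1])  # e.g. "SCAN orders"
print(plan_after[0][-1])   # e.g. "SEARCH orders USING INDEX idx_orders_customer ..."
```

The same before/after comparison, applied to real workloads via pg_stat_statements, is how slow queries are typically triaged.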

Posted 2 months ago

Apply

7.0 - 11.0 years

7 - 11 Lacs

Chennai

Work from Office

- Strong DB setup and implementation experience with SQL Server
- Strong SQL Server knowledge: architecture and internals
- Working with version management (Bitbucket / Stash etc.) and DevOps for DB
- Experience in data migration
- Experience in ETL; SSIS preferred. Knowledge of other tools an added advantage

Primary skills: AWS Cloud, SQL Server, Performance Tuning, Data Migration
Secondary skills: Splunk Monitoring, Foglight Tool
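Data migration work like this is commonly done in committed batches so a failure mid-run loses at most one batch. A minimal sketch using Python's stdlib sqlite3 as a stand-in for SQL Server; the table name and batch size are illustrative:

```python
import sqlite3

def migrate_table(src, dst, table, batch_size=500):
    """Copy all rows of `table` from src to dst in batches, committing per batch."""
    cols = [row[1] for row in src.execute(f"PRAGMA table_info({table})")]
    placeholders = ", ".join("?" for _ in cols)
    cursor = src.execute(f"SELECT {', '.join(cols)} FROM {table}")
    copied = 0
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:
            break
        dst.executemany(
            f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})", batch
        )
        dst.commit()  # commit per batch limits rework on failure
        copied += len(batch)
    return copied

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER, name TEXT)")
src.executemany("INSERT INTO accounts VALUES (?, ?)", [(i, f"acct-{i}") for i in range(1200)])
dst.execute("CREATE TABLE accounts (id INTEGER, name TEXT)")

print(migrate_table(src, dst, "accounts"))  # 1200
```

A production migration would add row-count and checksum validation on the target after the copy.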

Posted 2 months ago

Apply

0.0 years

9 - 14 Lacs

Noida

Work from Office

Required Skills:
- GCP Proficiency: Strong expertise in Google Cloud Platform (GCP) services and tools, including Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, IAM, Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging, and Error Reporting.
- Cloud-Native Applications: Experience in designing and implementing cloud-native applications, preferably on GCP.
- Workload Migration: Proven expertise in migrating workloads to GCP.
- CI/CD Tools and Practices: Experience with CI/CD tools and practices.
- Python and IaC: Proficiency in Python and Infrastructure as Code (IaC) tools such as Terraform.

Responsibilities:
- Cloud Architecture and Design: Design and implement scalable, secure, and highly available cloud infrastructure solutions using GCP services and tools such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Load Balancing.
- Cloud-Native Application Design: Develop high-level architecture designs and guidelines for the development, deployment, and lifecycle management of cloud-native applications on GCP, ensuring they are optimized for security, performance, and scalability using services like App Engine, Cloud Functions, and Cloud Run.
- API Management: Develop and implement guidelines for securely exposing the interfaces of workloads running on GCP, with granular access control using the IAM platform, RBAC platforms, and API Gateway.
- Workload Migration: Lead the design and migration of on-premises workloads to GCP, ensuring minimal downtime and data integrity.

Skills (competencies)
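The granular access control described in this role combines role bindings with per-permission checks. A minimal, framework-agnostic RBAC sketch in Python; the role names, permissions, and users are illustrative and not tied to GCP IAM's actual API:

```python
# Role -> set of permissions it grants (illustrative names, not real GCP IAM roles).
ROLE_PERMISSIONS = {
    "viewer": {"workload.get", "workload.list"},
    "editor": {"workload.get", "workload.list", "workload.update"},
    "admin": {"workload.get", "workload.list", "workload.update", "workload.delete"},
}

# User -> roles bound to them (analogous to IAM role bindings).
BINDINGS = {
    "alice": {"admin"},
    "bob": {"viewer"},
}

def is_allowed(user: str, permission: str) -> bool:
    """True if any role bound to the user grants the permission."""
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in BINDINGS.get(user, set())
    )

print(is_allowed("alice", "workload.delete"))  # True
print(is_allowed("bob", "workload.update"))    # False
```

An API gateway would evaluate a check like this on every request, with an unknown user defaulting to deny.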

Posted 2 months ago

Apply

6.0 - 10.0 years

6 - 11 Lacs

Mumbai

Work from Office

Primary Skills:
- Google Cloud Platform (GCP): Expertise in Compute (VMs, GKE, Cloud Run), Networking (VPC, Load Balancers, Firewall Rules), IAM (Service Accounts, Workload Identity, Policies), Storage (Cloud Storage, Cloud SQL, BigQuery), and Serverless (Cloud Functions, Eventarc, Pub/Sub). Strong experience in Cloud Build for CI/CD, automating deployments and managing artifacts efficiently.
- Terraform: Skilled in Infrastructure as Code (IaC) with Terraform for provisioning and managing GCP resources. Proficient in modules for reusable infrastructure, state management (remote state, locking), and provider configuration. Experience in CI/CD integration with Terraform Cloud and automation pipelines.
- YAML: Proficient in writing Kubernetes manifests for deployments, services, and configurations. Experience in Cloud Build pipelines, automating builds and deployments. Strong understanding of configuration management using YAML in GitOps workflows.
- PowerShell: Expert in scripting for automation, managing GCP resources, and interacting with APIs. Skilled in cloud resource management, automating deployments, and optimizing cloud operations.

Secondary Skills:
- CI/CD Pipelines: GitHub Actions, GitLab CI/CD, Jenkins, Cloud Build
- Kubernetes (K8s): Helm, Ingress, RBAC, Cluster Administration
- Monitoring & Logging: Stackdriver (Cloud Logging & Monitoring), Prometheus, Grafana
- Security & IAM: GCP IAM Policies, Service Accounts, Workload Identity
- Networking: VPC, Firewall Rules, Load Balancers, Cloud DNS
- Linux & Shell Scripting: Bash scripting, system administration
- Version Control: Git, GitHub, GitLab, Bitbucket
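The Kubernetes Deployment manifests mentioned here are plain YAML documents, and since JSON is a valid subset of YAML, one can be templated with only Python's stdlib. A minimal sketch; the app name and image are illustrative:

```python
import json

def deployment_manifest(name: str, image: str, replicas: int = 2) -> dict:
    """Build a minimal Kubernetes Deployment manifest as a plain dict."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # The selector must match the pod template's labels.
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

manifest = deployment_manifest("web", "gcr.io/example-project/web:1.0")
# JSON is valid YAML, so this output can be piped to `kubectl apply -f -`.
print(json.dumps(manifest, indent=2))
```

In a GitOps workflow the same structure would live as checked-in YAML, with templating handled by Helm or Kustomize rather than ad-hoc scripts.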

Posted 2 months ago

Apply

4.0 - 8.0 years

5 - 15 Lacs

Chennai

Hybrid

GCP Data Engineer (ETL Migration)

We're looking for a GCP Data Engineer with strong ETL migration experience to join our team and lead the transition of legacy data pipelines to Google Cloud Platform.

Important Note: This opportunity is open only to candidates currently residing in Tamil Nadu, including those from other states who are currently working in Tamil Nadu. Please note that this is a client-specific requirement, and profiles not meeting this preference may not be considered for further evaluation.

Key Responsibilities:
- Migrate ETL processes to GCP-native services (BigQuery, Dataflow, Data Fusion, etc.)
- Design, develop & deploy scalable data pipelines
- Analyze existing JCL/ETL jobs and dependencies
- Troubleshoot production issues and support job performance tuning
- Lead estimation, testing, deployment, and monitoring

Must-Have Skills: GCP (BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Airflow), PySpark, Python, SQL, API Integration, Cloud SQL, Postgres

Experience:
- 5+ years in complex SQL and ETL development
- 2+ years in GCP at production scale
- Experience with DataStage, Autosys/Astronomer, GitHub

Ready to drive real-time and batch data solutions at scale? Apply now. #GCPJobs #DataEngineer #ETLMigration #BigQuery #Dataflow #HiringNow
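A legacy-to-cloud ETL migration like this typically re-expresses each job as an extract, transform, load chain. A dependency-free Python sketch of that shape (the record fields and cleaning rules are illustrative; on GCP the same stages would map onto Dataflow/Apache Beam transforms):

```python
def extract(rows):
    """Source stage: yield raw records (stand-in for reading a legacy feed)."""
    yield from rows

def transform(records):
    """Clean and normalize: drop rows without an id, standardize names."""
    for rec in records:
        if rec.get("id") is None:
            continue  # would go to a dead-letter sink in a real pipeline
        yield {"id": rec["id"], "name": rec.get("name", "").strip().title()}

def load(records):
    """Sink stage: collect results (stand-in for a BigQuery/Cloud SQL write)."""
    return list(records)

raw = [
    {"id": 1, "name": "  alice  "},
    {"id": None, "name": "ghost"},
    {"id": 2, "name": "BOB"},
]
result = load(transform(extract(raw)))
print(result)  # [{'id': 1, 'name': 'Alice'}, {'id': 2, 'name': 'Bob'}]
```

Because the stages are generators, records stream through without materializing the whole dataset, which is the same property batch engines exploit at scale.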

Posted 2 months ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Gurugram, Coimbatore

Work from Office

Role: DevOps Engineer
Experience: 5+ years
Location: Gurgaon, Hyderabad & Coimbatore

Key Responsibilities:
- Ensure zero-downtime deployments across production.
- Implement custom Helm deployment and rollback strategies.
- Refactor Terraform modules for simplicity and efficiency.
- Enforce secure CI/CD practices with tools like GitHub Actions.
- Migrate secret management to GCP Secret Manager and Kubernetes Secrets.
- Standardize drift detection and config audits.
- Lead GKE workload IAM scoping using workload identity.
- Maintain infrastructure documentation, SOPs, and disaster recovery playbooks.
- Mentor team members and contribute to DevOps metrics and postmortems.

Requirements:
- 3+ years in DevOps, SRE, or Infrastructure Engineering.
- Strong experience with Terraform and reusable modules.
- Hands-on with Kubernetes (GKE preferred).
- Familiarity with GitHub Actions, Helm, and CI/CD workflows.
- Knowledge of GCP services like Cloud SQL, VPC, IAM.
- Experience with observability tools, especially Datadog.
- Strong attention to deployment quality and operational details.

Desirable Experience:
- Exposure to GitOps (ArgoCD/FluxCD).
- Experience with Kubernetes operators.
- Understanding of SLIs, SLOs, and structured alerting.

Tools & Expectations:
- Terraform / HCP Terraform: infrastructure as code, state management, and drift detection.
- GitHub / GitLab / GitHub Actions: secure CI/CD pipeline setup and governance.
- Helm: application deployment and lifecycle management.
- Kubernetes / GKE: cluster and workload management.
- GCP Services: VPC, IAM, Cloud SQL integration.
- Secret Management: Kubernetes Secrets, CSI Driver, GCP Secret Manager.
- Datadog: observability and alerting.
- Cloudflare: DNS, WAF, and exposure configuration.
- Snyk / SonarQube / Wiz: code and container security in CI/CD.

Interested candidates can share their resume at Neesha1@damcogroup.com
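The drift detection listed above boils down to diffing the desired state (what IaC declares) against the actual state (what the cloud API reports). A minimal sketch; the resource attributes are illustrative, and in practice Terraform's own plan/refresh cycle performs this comparison:

```python
def detect_drift(desired: dict, actual: dict) -> dict:
    """Return {attribute: (desired, actual)} for every attribute that differs."""
    drift = {}
    for key in desired.keys() | actual.keys():
        want, have = desired.get(key), actual.get(key)
        if want != have:
            drift[key] = (want, have)
    return drift

# Desired state from IaC vs. actual state reported by the cloud API.
desired = {"machine_type": "e2-medium", "disk_gb": 50, "tags": ["web"]}
actual = {"machine_type": "e2-medium", "disk_gb": 100, "tags": ["web"]}

print(detect_drift(desired, actual))  # {'disk_gb': (50, 100)}
```

Scheduled as a config audit, a non-empty result would page the team or open a remediation ticket.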

Posted 2 months ago

Apply

5.0 - 10.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Job Title: Senior GCP Data Engineer
Corporate Title: Associate
Location: Bangalore, India

Role Description
Deutsche Bank has set itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation, and Corporate Sustainability. As climate change brings new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products, and various sustainability applications which will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good by leveraging their technology skillset in Cloud / Hybrid architecture. As part of this role, we are seeking a highly motivated and experienced Senior GCP Data Engineer to join our team. In this role, you will play a critical part in designing, developing, and maintaining robust data pipelines that transform raw data into valuable insights for our organization.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Design, develop, and maintain data pipelines using GCP services like Dataflow, Dataproc, and Pub/Sub.
- Develop and implement data ingestion and transformation processes using tools like Apache Beam and Apache Spark.
- Manage and optimize data storage solutions on GCP, including BigQuery, Cloud Storage, and Cloud SQL.
- Implement data security and access controls using GCP's Identity and Access Management (IAM) and Cloud Security Command Center.
- Monitor and troubleshoot data pipelines and storage solutions using GCP's Stackdriver and Cloud Monitoring tools.
- Collaborate with data experts, analysts, and product teams to understand data needs and deliver effective solutions.
- Automate data processing tasks using scripting languages like Python.
- Participate in code reviews and contribute to establishing best practices for data engineering on GCP.
- Stay up to date on the latest advancements and innovations in GCP services and technologies.

Your skills and experience
- 5+ years of experience as a Data Engineer or similar role.
- Proven expertise in designing, developing, and deploying data pipelines.
- In-depth knowledge of Google Cloud Platform (GCP) and its core data services (GCS, BigQuery, Cloud Storage, Dataflow, etc.).
- Strong proficiency in Python & SQL for data manipulation and querying.
- Experience with distributed data processing frameworks like Apache Beam or Apache Spark (a plus).
- Familiarity with data security and access control principles.
- Excellent communication, collaboration, and problem-solving skills.
- Ability to work independently, manage multiple projects, and meet deadlines.
- Knowledge of Sustainable Finance / ESG Risk / CSRD / Regulatory Reporting will be a plus.
- Knowledge of cloud infrastructure and data governance best practices will be a plus.
- Knowledge of Terraform will be a plus.

How we'll support you

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
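Pipelines like those described in this role are built as chains of small transforms; Apache Beam expresses this with PTransforms. A dependency-free Python sketch of the same composition idea (the stage names and data are illustrative):

```python
from functools import reduce

def pipeline(*stages):
    """Compose stages left-to-right; each stage maps an iterable to an iterable."""
    def run(records):
        return reduce(lambda data, stage: stage(data), stages, records)
    return run

# Illustrative stages mirroring Beam's Map / Filter / ParDo style transforms.
parse = lambda lines: (line.split(",") for line in lines)
keep_valid = lambda rows: (row for row in rows if len(row) == 2)
to_celsius = lambda rows: ({"city": c, "temp_c": (float(f) - 32) * 5 / 9} for c, f in rows)

run = pipeline(parse, keep_valid, to_celsius)
result = list(run(["berlin,68", "bad-row", "pune,86"]))
print(result)  # [{'city': 'berlin', 'temp_c': 20.0}, {'city': 'pune', 'temp_c': 30.0}]
```

Each stage only touches one record at a time, which is what lets engines like Dataflow parallelize and distribute the same logic.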

Posted 2 months ago

Apply

10.0 - 15.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Position Overview
As an Engineering Manager, you will lead a team of software engineers in building scalable, reliable, and efficient web applications and microservices for a "News and Competitive Data Analysis Platform". You will drive technical excellence, system architecture, and best practices while fostering a high-performance engineering culture. You'll be responsible for managing engineering execution, mentoring engineers, and ensuring the timely delivery of high-quality solutions. Your expertise in Python, Django, React, Apache Solr, RabbitMQ, Postgres, and other NoSQL cloud databases will help shape the technical strategy of our SaaS platform. You will collaborate closely with Product, Design, and DevOps teams to align engineering efforts with business goals.

Key Responsibilities
- Demonstrates strong technical leadership, typically across multiple teams, by delivering high-quality software solutions ensuring scalability, reliability, and performance.
- Collaborate with cross-functional teams, product managers, designers, and stakeholders to define project requirements and deliver solutions that meet business goals.
- Able to spot the biggest pain points of the systems you're working with and propose solutions to improve them.
- Able to influence the engineering culture and practices of the teams; works with self-confidence with stakeholders outside of their own team as well.
- Experienced mentor providing technical guidance and inspiring the team to achieve goals, helping them to grow and develop skills.
- Conduct constructive code reviews, ensure code quality, and promote best practices in software development so everyone can learn alongside you.

What you need to succeed
- Technical Leadership: Provide architectural guidance, ensuring best practices in Django-based backend development and React-based frontends.
- Engineering Execution: Own the technical roadmap, working with teams on key components like the Python framework for task management, RabbitMQ for messaging, and Google Cloud SQL/PostgreSQL for data storage.
- System Scalability & Performance: Optimize and redesign backend worker-node efficiency for tasks like crawling and notifications, ensuring smooth distributed task execution.
- Database & Search Optimization: Drive performance improvements in PostgreSQL (self-hosted and GCP-managed Cloud SQL) and Apache Solr for indexed content retrieval.
- Cloud & Infrastructure: Oversee the transition to GCP-managed Cloud SQL, replacing Google Datastore and PostgreSQL for metadata.
- Cloud Migration: Strategise and drive product migration from GCP to AWS.
- Cross-Team Collaboration: Partner with DevOps and infrastructure teams to ensure reliable deployments, CI/CD, and production monitoring.
- Mentorship & Growth: Coach and mentor engineers, fostering a culture of ownership, continuous learning, and technical excellence.

Preferred Experience/Skills:
- 8+ years of experience in software development, with at least 2+ years in an engineering leadership role.
- Strong expertise in Python (Django, Celery) and JavaScript (React, frontend development).
- Deep understanding of message brokers (RabbitMQ) and task orchestration.
- Proficiency in SQL and NoSQL databases, including PostgreSQL, Apache Solr, and Google Cloud SQL.
- Experience with scalable architectures, distributed systems, and event-driven designs.
- Knowledge of NLP and semantic search.
- Knowledge of DevOps, CI/CD pipelines, and cloud platforms (GCP, AWS, or Azure).
- Strong background in code quality, security best practices, and performance tuning.
- Ability to influence stakeholders, manage engineering teams, and drive technical innovation.
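The RabbitMQ/Celery-style task orchestration this role describes follows a producer, queue, worker pattern. A minimal in-process sketch using Python's stdlib (task payloads are illustrative; a real system would use a broker with acknowledgements and retries):

```python
import queue
import threading

task_queue = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    """Pull tasks until a None sentinel arrives, recording each result."""
    while True:
        task = task_queue.get()
        try:
            if task is None:
                break
            with results_lock:
                results.append(f"crawled:{task}")
        finally:
            task_queue.task_done()

workers = [threading.Thread(target=worker) for _ in range(3)]
for w in workers:
    w.start()

for url in ["a.example", "b.example", "c.example", "d.example"]:
    task_queue.put(url)
for _ in workers:
    task_queue.put(None)  # one shutdown sentinel per worker
for w in workers:
    w.join()

print(sorted(results))
```

Swapping `queue.Queue` for a RabbitMQ channel and the threads for separate worker processes gives the distributed crawling/notification setup the posting describes.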

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Pune

Work from Office

Job Title: Technology Service Specialist
Corporate Title: Associate
Location: Pune, India

Role Description
Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. TDI PB Germany Service Operations provides 2nd-level application support for business applications used in branches, by mobile sales, or via the internet. The department is overall responsible for the stability of the applications. Incident Management and Problem Management are the main processes that account for the required stability. In-depth application knowledge and understanding of the business processes that the applications support are our main assets.
Within TDI, Partnerdata is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. Through the partnership with Google Cloud (GCP), a number of applications and functionalities were migrated to GCP, from where they will be operated and further developed. Besides the maintenance and the implementation of new requirements, the content focus also lies on the regulatory topics surrounding a partner/client. We are looking for reinforcements for this contemporary and emerging Cloud area of application.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy.

Your key responsibilities
- Ensures that the Service Operations team provides an optimum service level to the business lines it supports.
- Takes overall responsibility for the resolution of incidents and problems within the team; oversees the resolution of complex incidents.
- Ensures that Analysts apply the right problem-solving techniques and processes.
- Assists in managing business stakeholder relationships.
- Assists in defining and managing OLAs with relevant stakeholders.
- Ensures that the team understands OLAs, resources appropriately, and is aligned to business SLAs.
- Ensures relevant Client Service teams are informed of progress on incidents, where necessary.
- Ensures that defined divisional Production Management service operations and support processes are adhered to by the team; makes improvement recommendations where appropriate.
- Prepares for and, if requested, manages team review meetings.
- Makes suggestions for continual service improvement.
- Manages escalations by working with Client Services, other Service Operations Specialists, and relevant functions to accurately resolve escalated issues quickly.
- Observes areas requiring monitoring, reporting, and improvement.
- Identifies required metrics and ensures they are established, monitored, and improved where appropriate.
- Continuously seeks to improve team performance.
- Participates in team training events, where appropriate.
- Works with team members to identify areas of focus where training may improve team performance and incident resolution.
- Mentors and coaches Production Management Analysts within the team by providing career development and counselling, as needed.
- Assists Production Management Analysts in setting performance targets, and manages performance against them.
- Identifies team bottlenecks (obstacles) and takes appropriate actions to eliminate them.
- Provides Level 3 or advanced support for technical infrastructure components.
- Evaluates new products, including prototyping and recommending new products and automation.
- Specifies/selects tools to enhance operational support.
- Champions activities and establishes best practices in their specialist area, working to implement best-of-breed test practices and processes.
- Defines and implements best practices, solutions, and standards related to their area of expertise.
- Builds, captures, and manages the transfer of knowledge across the Service Operations organization.
- Fulfils service requests addressed to L2 Support.
- Communicates with the Service Desk function and other L2 and L3 units.
- Incident, Change, and Problem Management and Service Request fulfilment:
  - Solving customer incidents in time
  - Log file analysis and root cause analysis
  - Participating in major incident calls for high-priority incidents
  - Resolving inconsistencies of data replication
  - Supporting Problem Management to solve application issues
  - Creating/executing service requests for customers, providing reports and statistics
  - Escalating and informing about incidents in a timely manner
- Documentation of tasks, incidents, problems, and changes:
  - Documentation in ServiceNow
  - Documentation in knowledge bases
- Improving monitoring of the application:
  - Adding requests for monitoring
  - Adding alerts and thresholds for occurring issues
- Implementing automation of tasks

Your skills and experience
- Service Operations Specialist experience within a global operations context
- Extensive experience of supporting complex application and infrastructure domains
- Experience managing and mentoring Service Operations teams
- Broad ITIL/best-practice service context within a real-time distributed environment
- Experience managing relationships across multiple disciplines and time zones
- Ability to converse clearly with internal and external staff via telephone and written communication
- Good knowledge of interface technologies and communication protocols
- Willingness to work in DE business hours
- Clear and concise documentation in general, and especially proper documentation of the current status of incidents, problems, and service requests in the Service Management tool
- Thorough and precise work style with a focus on high quality
- Distinct service orientation
- High degree of self-initiative
- Bachelor's degree from an accredited college or university with a concentration in an IT or Computer Science related discipline (or an equivalent diploma or technical qualification)
- ITIL certification and experience with the ITSM tool ServiceNow (preferred)
- Know-how in the banking domain, preferably regulatory topics around know-your-customer processes
- Experience with databases like BigQuery and a good understanding of Big Data and GCP technologies
- Experience with at least: GitHub, Terraform, Cloud SQL, Cloud Storage, Dataproc, Dataflow
- Architectural skills for big data solutions, especially interface architecture
- You work very well in teams but also independently, and you are constructive and target-oriented
- Your English skills are very good and you can communicate professionally but also informally in small talk with the team

Area-specific tasks/responsibilities:
- Handling Incident/Problem Management and Service Request fulfilment
- Analyzing incidents escalated from 1st-level support
- Analyzing errors arising from batch processing and the interfaces of related systems
- Determining and implementing resolutions or workarounds
- Supporting the resolution of high-impact incidents on our services, including attendance at incident bridge calls
- Escalating incident tickets and working with members of the team and developers
- Handling service requests, e.g. reports for Business and Projects
- Providing resolution for open problems, or ensuring that the appropriate parties have been tasked with doing so
- Supporting the handover of new projects/applications into Production Services with Service Transition before the go-live phase
- Supporting on-call support activities
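The log-file and root-cause analysis this role involves often starts by aggregating error signatures to find the dominant failure. A minimal sketch with Python's stdlib; the log format and messages are illustrative:

```python
import re
from collections import Counter

LOG_LINES = [
    "2024-05-01 10:00:01 ERROR db: connection timeout to host pdb-01",
    "2024-05-01 10:00:03 INFO  api: request served in 120ms",
    "2024-05-01 10:00:05 ERROR db: connection timeout to host pdb-01",
    "2024-05-01 10:00:09 ERROR auth: token expired for user 1234",
    "2024-05-01 10:00:12 ERROR db: connection timeout to host pdb-02",
]

def error_signatures(lines):
    """Count ERROR lines by component, collapsing variable tokens like host names."""
    counts = Counter()
    for line in lines:
        match = re.search(r"ERROR (\w+): (.+)", line)
        if match:
            component, message = match.groups()
            # Replace tokens containing digits so identical faults group together.
            signature = re.sub(r"\b[\w-]*\d[\w-]*\b", "<id>", message)
            counts[(component, signature)] += 1
    return counts

top = error_signatures(LOG_LINES).most_common(1)[0]
print(top)  # (('db', 'connection timeout to host <id>'), 3)
```

Ranking by signature rather than raw line points straight at the likeliest root cause (here, a recurring database connection timeout) instead of drowning it in near-duplicates.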

Posted 2 months ago

Apply

10.0 - 15.0 years

30 - 40 Lacs

Noida, Pune, Bengaluru

Hybrid

- Strong experience in Big Data: data modelling, design, architecting & solutioning
- Understands programming languages like SQL, Python, R, and Scala
- Good Python skills; experience with data visualisation tools such as Google Data Studio or Power BI
- Knowledge of A/B Testing, Statistics, Google Cloud Platform, Google BigQuery, Agile Development, DevOps, Data Engineering, and ETL Data Processing
- Strong migration experience of production Hadoop clusters to Google Cloud

Good to Have:
- Expert in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc.
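The A/B testing and statistics knowledge listed above usually comes down to comparing two conversion rates with a two-proportion z-test. A stdlib-only Python sketch; the sample counts are illustrative:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant B converts at 5.5% vs. A's 5.0%, with 20k users per arm.
z = two_proportion_z(conv_a=1000, n_a=20000, conv_b=1100, n_b=20000)
print(round(z, 2))  # 2.24, beyond 1.96, so significant at the 5% level (two-sided)
```

In practice the same test would be run via a library such as statsmodels, with the experiment sized in advance so the arms reach the required power.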

Posted 2 months ago

Apply