
279 Cloud SQL Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

6 - 10 Lacs

Pune

Work from Office

You bring systems design experience with the ability to architect and explain complex systems interactions, data flows, common interfaces and APIs. You bring a deep understanding of and experience with software development and programming languages such as Java/Kotlin and shell scripting. You have hands-on experience with the following technologies as a senior software developer: Java/Kotlin, Spring, Spring Boot, WireMock, Docker, Terraform, GCP services (Kubernetes, Cloud SQL, Pub/Sub, Storage, Logging, Dashboards), Oracle & Postgres, SQL, PgWeb, Git, GitHub & GitHub Actions, plus a GCP Professional Data Engineer certification.
Data Pipeline Development: Designing, implementing, and optimizing data pipelines on GCP using PySpark for efficient and scalable data processing.
ETL Workflow Development: Building and maintaining ETL workflows for extracting, transforming, and loading data into various GCP services.
GCP Service Utilization: Leveraging GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc for data storage, processing, and analysis.
Data Transformation: Utilizing PySpark for data manipulation, cleansing, enrichment, and validation.
Performance Optimization: Ensuring the performance and scalability of data processing jobs on GCP.
Collaboration: Working with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
Data Quality and Governance: Implementing and maintaining data quality standards, security measures, and compliance with data governance policies on GCP.
Troubleshooting and Support: Diagnosing and resolving issues related to data pipelines and infrastructure.
Staying Updated: Keeping abreast of the latest GCP services, PySpark features, and best practices in data engineering.
Required Skills:
GCP Expertise: Strong understanding of GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc.
PySpark Proficiency: Demonstrated experience in using PySpark for data processing, transformation, and analysis.
Python Programming: Solid Python programming skills for data manipulation and scripting.
Data Modeling and ETL: Experience with data modeling, ETL processes, and data warehousing concepts.
SQL: Proficiency in SQL for querying and manipulating data in relational databases.
Big Data Concepts: Understanding of big data principles and distributed computing concepts.
Communication and Collaboration: Ability to effectively communicate technical solutions and collaborate with cross-functional teams.
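
The pipeline duties above boil down to a recurring pattern: read raw files from Cloud Storage, transform them with PySpark, and land curated tables in BigQuery. A minimal sketch of that pattern, assuming hypothetical bucket and table names and that the spark-bigquery connector is available (as it is on Dataproc by default):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical bucket, dataset, and table names for illustration only.
spark = SparkSession.builder.appName("gcs-to-bq-example").getOrCreate()

# Read raw CSV files landed in Cloud Storage.
orders = spark.read.option("header", True).csv("gs://example-bucket/raw/orders/")

# Cleanse and validate: de-duplicate, cast, and filter bad rows.
cleaned = (orders
           .dropDuplicates(["order_id"])
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0))

# Write the curated table to BigQuery via the spark-bigquery connector.
(cleaned.write
 .format("bigquery")
 .option("table", "example_dataset.orders_clean")
 .option("temporaryGcsBucket", "example-temp-bucket")
 .mode("overwrite")
 .save())
```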

Posted 23 hours ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office

As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and solving them as per defined SLAs.
Continuous Learning and Technology Integration: Being eager to learn new technologies and implementing them in feature development.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Relevant experience: 5 years. Proficient in Core Java (Java 17+), Spring Boot, Tomcat/JBoss/WebSphere, RDBMS (MySQL/Oracle), and developing RESTful APIs. Experienced in modernizing and migrating on-premise Java/J2EE applications to Google Cloud. Hands-on with Docker and Kubernetes (GKE) for containerizing and deploying workloads. Skilled in using GCP services like Cloud SQL, GCE, GKE, and setting up CI/CD pipelines. Well-versed with discovery and intake tools such as Delivery Curator for cloud migration planning.
Preferred technical and professional experience: None.

Posted 1 day ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Install, configure, and maintain database systems (e.g., Oracle, SQL Server, MySQL, PostgreSQL, or cloud-based databases). Monitor database performance and proactively tune queries, indexing, and configurations to ensure optimal efficiency. Manage database security, including role-based access control, encryption, and auditing. Oversee backup, recovery, high availability (HA), and disaster recovery (DR) strategies. Perform database upgrades, patching, migrations, and capacity planning. Collaborate with developers to optimize queries, stored procedures, and schema design. Automate routine tasks using scripts (Python, Bash, PowerShell, etc.). Implement and manage replication, clustering, and failover solutions. Maintain documentation of database configurations, policies, and procedures. Stay updated with emerging technologies, best practices, and compliance requirements.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience). 8+ years of hands-on experience as a Database Administrator. Strong expertise in at least one major RDBMS (Oracle, SQL Server, PostgreSQL, MySQL). Proficiency in performance tuning, query optimization, and troubleshooting. Experience in Exadata and RAC setup. Experience with HA/DR solutions such as Always On, Patroni, Data Guard, or replication technologies. Fundamental knowledge of Linux/Unix and different types of storage. Knowledge of cloud database services (AWS RDS/Redshift, Azure SQL, Google Cloud SQL/BigQuery). Solid understanding of data security and compliance standards (GDPR, HIPAA, PCI-DSS). Strong scripting and automation skills (Python, Shell, PowerShell, etc.). Able to perform multi-region replication and create database images in containers. Excellent analytical, problem-solving, and communication skills.
Preferred technical and professional experience: Certifications such as Oracle Certified Professional (OCP), Microsoft Certified: Azure Database Administrator Associate, or AWS Certified Database - Specialty. Familiarity with DevOps practices, CI/CD pipelines, and database version control tools (Liquibase, Flyway). Exposure to big data technologies (Hadoop, Spark).
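
Among the duties above, routine-task automation is the most script-shaped. A small sketch of a nightly PostgreSQL dump in Python, assuming hypothetical connection details and that pg_dump is installed on the host:

```python
import datetime
import subprocess

# Hypothetical connection details; in practice, pull these from a secrets
# manager and authenticate via .pgpass or a service account, not literals.
DB = {"host": "db.example.internal", "port": "5432", "user": "backup_user", "name": "appdb"}

def run_backup(target_dir: str = "/var/backups/postgres") -> str:
    """Run pg_dump in custom format and return the dump file path."""
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    dump_path = f"{target_dir}/{DB['name']}_{stamp}.dump"
    subprocess.run(
        ["pg_dump", "-h", DB["host"], "-p", DB["port"], "-U", DB["user"],
         "-Fc", "-f", dump_path, DB["name"]],
        check=True,  # raise CalledProcessError if pg_dump fails
    )
    return dump_path

if __name__ == "__main__":
    print(f"Backup written to {run_backup()}")
```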

Posted 1 day ago

Apply

8.0 - 12.0 years

30 - 42 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

We are seeking a Technical Lead with strong application development expertise in Google Cloud Platform (GCP). The successful candidate will provide technical leadership in designing and implementing robust, scalable cloud-based solutions. If you are an experienced professional passionate about GCP technologies and committed to staying abreast of emerging trends, apply today.
Responsibilities: Design, develop, and deploy cloud-based solutions using GCP, establishing and adhering to cloud architecture standards and best practices. Hands-on coding experience in building Java applications using GCP-native services like GKE, Cloud Run, Functions, Firestore, Cloud SQL, Pub/Sub, etc. Develop low-level application architecture designs based on enterprise standards. Choose appropriate GCP services meeting functional and non-functional requirements. Demonstrate comprehensive knowledge of GCP PaaS, serverless, and database services. Provide technical leadership to development and infrastructure teams, guiding them throughout the project lifecycle. Ensure all cloud-based solutions comply with security and regulatory standards. Enhance cloud-based solutions to optimize performance, cost, and scalability. Stay up-to-date with the latest cloud technologies and trends in the industry. Familiarity with GCP GenAI solutions and models, including Vertex AI, Codebison, and Gemini models, is preferred but not required. Hands-on experience with front-end technologies like Angular or React is an added advantage.
Requirements: Bachelor's or Master's degree in Computer Science, Information Technology, or a similar field. Must have 8+ years of extensive experience in designing, implementing, and maintaining applications on GCP. Comprehensive expertise in GCP services such as GKE, Cloud Run, Functions, Cloud SQL, Firestore, Firebase, Apigee, GCP App Engine, Gemini Code Assist, Vertex AI, Spanner, Memorystore, Service Mesh, and Cloud Monitoring. Solid understanding of cloud security best practices and experience in implementing security controls in GCP. Thorough understanding of cloud architecture principles and best practices. Experience with automation and configuration management tools like Terraform and a sound understanding of DevOps principles. Proven leadership skills and the ability to mentor and guide a technical team.
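
Several of the GCP-native services named above, Pub/Sub in particular, are driven through thin client libraries. A hedged sketch of publishing an event with the google-cloud-pubsub Python client; the project, topic, and payload are hypothetical:

```python
from google.cloud import pubsub_v1

# Hypothetical project and topic; assumes application-default credentials.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "order-events")

# Data must be bytes; extra keyword arguments become message attributes.
future = publisher.publish(topic_path, data=b'{"order_id": 42}', source="checkout")
print(f"Published message id: {future.result()}")
```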

Posted 1 day ago

Apply

6.0 - 10.0 years

2 - 6 Lacs

Chennai

Remote

We are seeking an experienced Database and Cloud Advisory Specialist with a strong background in SQL Server, Oracle, and optionally MongoDB, to provide strategic and technical guidance for enterprise workloads on Google Cloud Platform (GCP). The ideal candidate will play a pivotal role in designing, advising, and optimizing database and cloud infrastructure, particularly for migration planning and execution on Google Cloud VMware Engine (GCVE).
Key Responsibilities
Database Expertise: Provide expert-level support and recommendations for SQL Server and Oracle databases across various environments. Offer guidance on MongoDB as an optional skill for NoSQL workloads. Perform assessments and capacity planning for databases prior to migration to GCP.
Cloud Advisory (GCP Focus): Act as a cloud advisor for workloads migrating to GCP, specifically on Compute Engine, Cloud SQL, and GCVE. Provide best practices for provisioning, performance tuning, cost optimization, and security on GCP. Design and recommend backup, disaster recovery (DR), and observability strategies for cloud-hosted databases and virtualized environments.
Migration Planning & Execution: Offer high-level and hands-on guidance on developing end-to-end migration strategies from on-prem or other clouds to GCP. Work with application teams and stakeholders to plan and sequence migrations efficiently using industry-proven methodologies.
GCVE Advisory: Guide the configuration and use of Google Cloud VMware Engine (GCVE) for hosting legacy applications. Advise on VM sizing, network configuration, storage options, and hybrid connectivity.
Monitoring & Observability: Recommend tools and practices for observability, including logging, monitoring, and alerting. Ensure visibility into database and application performance post-migration.
Preferred Skills: Strong knowledge of GCP services, especially Cloud SQL, Compute Engine, and GCVE. Experience with database migration tools and methodologies. Excellent communication and stakeholder management skills.
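
For the Cloud SQL assessments this role describes, a connectivity check is a common first step. A sketch using the Cloud SQL Python Connector with SQLAlchemy, assuming a hypothetical instance connection name and credentials (in practice these would come from a secrets manager):

```python
import sqlalchemy
from google.cloud.sql.connector import Connector

# Hypothetical instance connection name and credentials.
connector = Connector()

def getconn():
    return connector.connect(
        "example-project:asia-south1:example-instance",  # project:region:instance
        "pg8000",
        user="advisor",
        password="change-me",
        db="inventory",
    )

engine = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)
with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT version()")).scalar())
```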

Posted 4 days ago

Apply

6.0 - 11.0 years

6 - 15 Lacs

Chennai

Hybrid

Position Description: Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure and pipelines, for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately.
Key Responsibilities: 1) Collaborate with business and technology stakeholders to understand current and future data requirements 2) Design, build and maintain reliable, efficient and scalable data infrastructure for data collection, storage, transformation, and analysis 3) Plan, design, build and maintain scalable data solutions including data pipelines, data models, and applications for efficient and reliable data workflow 4) Design, implement and maintain existing and future data platforms like data warehouses, data lakes, data lakehouses, etc. for structured and unstructured data 5) Design and develop analytical tools, algorithms, and programs to support data engineering activities like writing scripts and automating tasks 6) Ensure optimum performance and identify improvement opportunities
Skills Required: Python, Dataproc, Dataflow, GCP Cloud Run, Agile Software Development, Dataform, Terraform, BigQuery, Data Fusion, GCP, Cloud SQL, Kafka
Experience Required: Bachelor's degree in Computer Science, Engineering, or a related technical field. 5+ years of SQL development experience. 5+ years of analytics/data product development experience. 3+ years of Google Cloud experience with solutions designed and implemented at production scale. Experience working with GCP-native (or equivalent) services like BigQuery, Google Cloud Storage, Dataflow, Dataproc, etc. 2+ years of experience working with Airflow for scheduling and orchestration of data pipelines. 1+ years of experience working with Terraform to provision infrastructure as code. 2+ years of professional development experience in Python.
Experience Preferred: In-depth understanding of Google's product technology (or other cloud platforms) and underlying architectures. Experience with development ecosystems such as Tekton/Cloud Build and Git. Experience working with DBT/Dataform.
Education Required: Bachelor's Degree
Additional Information: You will work on ingesting, transforming, and analyzing large datasets to support the Enterprise Securitization Solution. Experience with large-scale solutions and operationalization of data lakes, data warehouses, and analytics platforms on Google Cloud Platform or other cloud environments is a must. Work in a collaborative environment that leverages paired programming. Work on a small agile team to deliver curated data products. Work effectively with product owners, data champions and other technical experts. Demonstrate technical knowledge and communication skills with the ability to advocate for well-designed solutions. Develop exceptional analytical data products using both streaming and batch ingestion patterns on Google Cloud Platform with solid data warehouse principles. Be the subject matter expert in data engineering with a focus on GCP-native services and other well-integrated third-party technologies.
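
Airflow orchestration, called out in the experience requirements, typically looks like a small DAG wrapping GCP operators. A minimal sketch with a hypothetical DAG id, dataset, and query, using the Google provider's BigQueryInsertJobOperator:

```python
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical DAG id, dataset, and query for illustration.
with DAG(
    dag_id="daily_orders_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    aggregate = BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE example_ds.daily_totals AS "
                    "SELECT order_date, SUM(amount) AS total "
                    "FROM example_ds.orders GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```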

Posted 4 days ago

Apply

4.0 - 6.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Description: At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.
How will you make an impact in this role? We are building an energetic, high-performance team with a nimble and creative mindset to drive our technology and products. American Express (AXP) is a powerful brand, a great place to work, and has unparalleled scale. Join us for an exciting opportunity in Marketing Data Technology (MarTech Data Team) within American Express Technologies. This team specializes in creating and expanding a suite of data and insight solutions to power the customer marketing ecosystem. The team creates and manages various batch/real-time marketing data products that fuel the customer marketing platforms. As part of the team, you will get numerous opportunities to use and learn big data and GCP cloud technologies.
Job Responsibilities: Responsible for delivering features or software functionality independently and reliably. Develop technical design documentation. Function as a core member of an agile team by contributing to software builds through consistent development practices with respect to tools, common components, and documentation. Perform hands-on ETL development for marketing data applications. Participate in code reviews and automated testing. Help other junior members of the team deliver. Demonstrate analytical thinking: recommend improvements and best practices, and conduct experiments to prove or disprove them. Provide continuous support for ongoing application availability. Learn, understand, and participate fully in all team ceremonies, including work breakdown, estimation, and retrospectives. Willingness to learn new technologies and exploit them to their optimal potential, including a substantiated ability to innovate and take pride in quickly deploying working software.
Minimum Qualifications: Bachelor's Degree with a minimum of 4+ years of overall software design and development experience. Expert in SQL and data warehousing concepts. Hands-on expertise with cloud platforms, ideally Google Cloud Platform (GCP). Working knowledge of data storage solutions like BigQuery or Cloud SQL and data engineering tools like Airflow or Cloud Workflows. Experience with other GCP services like Cloud Storage, Pub/Sub, or Data Catalog. Familiarity with Agile or other rapid application development methods. Hands-on experience with one or more programming languages (Java, Python). Hands-on expertise with software development in Big Data (Hadoop, MapReduce, Spark, Hive). Experience with CI/CD pipelines, automated test frameworks, DevOps, and source code management tools (XLR, Jenkins, Git, Sonar, Stash, Maven, Jira, Confluence, Splunk, etc.). Knowledge of shell scripting tools and Ansible is an added advantage. Strong communication and analytical skills, including effective presentation skills. We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: competitive base salaries; bonus incentives; support for financial well-being and retirement; comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location); a flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need; generous paid parental leave policies (depending on your location); free access to global on-site wellness centers staffed with nurses and doctors (depending on location); free and confidential counseling support through our Healthy Minds program; and career development and training opportunities. American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
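
To ground the BigQuery/SQL requirements above, here is a hedged sketch of querying a hypothetical marketing table with the google-cloud-bigquery client, assuming application-default credentials are configured:

```python
from google.cloud import bigquery

# Hypothetical project, dataset, and table; assumes default credentials.
client = bigquery.Client()
query = """
    SELECT segment, COUNT(*) AS customers
    FROM `example-project.marketing.customer_profiles`
    GROUP BY segment
    ORDER BY customers DESC
"""
for row in client.query(query).result():
    print(f"{row.segment}: {row.customers}")
```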

Posted 4 days ago

Apply

1.0 - 3.0 years

2 - 6 Lacs

Pune

Work from Office

Locations: Pune, Indore, Ahmedabad, Hyderabad, Bangalore. We are seeking an experienced Database & Cloud Advisory Specialist with strong expertise in SQL Server and Oracle, along with working knowledge of MongoDB (optional). The role is primarily advisory in nature, focusing on Google Cloud Platform (GCP) services such as Compute Engine, Cloud SQL, Observability, Backup & Disaster Recovery (DR), and GCVE (Google Cloud VMware Engine). The candidate will play a key role in guiding teams and clients through cloud migration planning and ensuring best practices in database management and cloud adoption.
Key Responsibilities
Database Advisory: Provide expert guidance on SQL Server and Oracle databases (MongoDB as optional), covering design, deployment, optimization, and migration.
Cloud Strategy: Advise clients on leveraging GCP Compute Engine, Cloud SQL, and GCVE for enterprise workloads.
Observability & Monitoring: Recommend best practices for system observability, monitoring, and alerting to ensure system reliability.
Backup & DR: Develop and review strategies for backup, recovery, and disaster recovery (DR) aligned with business continuity goals.
Migration Planning: Guide stakeholders through migration roadmaps, including assessment, planning, and execution strategies for moving workloads to GCP.
Performance & Optimization: Identify and advise on performance tuning opportunities for databases and cloud-hosted applications.
Stakeholder Collaboration: Work closely with client executives, architects, and technical teams to align database and cloud strategies with business objectives.
Best Practices & Governance: Ensure compliance with cloud governance, security standards, and data management policies.
Knowledge Sharing: Provide thought leadership, documentation, and training to internal teams and client stakeholders.
Qualifications: Proven expertise in SQL Server and Oracle database environments. Hands-on or advisory experience with GCP services (Compute Engine, Cloud SQL, GCVE). Strong knowledge of observability tools, backup, and DR planning. Excellent communication and advisory skills for working with senior stakeholders.

Posted 5 days ago

Apply

5.0 - 8.0 years

25 - 40 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 25 to 40 LPA. Experience: 5 to 10 years. Location: Gurgaon/Pune/Bengaluru. Notice period: immediate to 30 days.
Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units.
As a Data Engineer, the candidate will work on assignments for one of our utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well-documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience in running end-to-end analytics for large-scale organizations.
Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs. Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions. Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes. Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments. Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics. Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud. Use scalable clusters for handling large datasets and complex computations in Databricks, optimizing performance and cost management.
Must have: client engagement experience and collaboration with cross-functional teams; a data engineering background in Databricks; the ability to work effectively as an individual contributor or in collaborative team environments; effective communication and thought leadership with a proven record.
Candidate Profile: Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas. 3+ years of experience in data engineering. Hands-on experience with SQL, Python, Databricks, and cloud platforms like Azure. Prior experience in managing and delivering end-to-end projects. Outstanding written and verbal communication skills. Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges. Able to understand cross-cultural differences and work with clients across the globe.
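
The validation, cleansing, and transformation work described above maps naturally onto PySpark DataFrame operations in Databricks. A small sketch with hypothetical paths, columns, and thresholds:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("quality-checks").getOrCreate()

# Hypothetical input path, columns, and anomaly threshold.
readings = spark.read.parquet("/mnt/raw/meter_readings/")

validated = (readings
             .dropDuplicates(["meter_id", "reading_ts"])      # remove duplicates
             .na.fill({"units_consumed": 0.0})                # fill missing values
             .withColumn("is_anomaly",                        # flag outliers
                         F.col("units_consumed") > 10000))

print(f"{validated.filter('is_anomaly').count()} anomalous readings flagged")
validated.write.mode("overwrite").parquet("/mnt/curated/meter_readings/")
```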

Posted 5 days ago

Apply


5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience. Proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work.
Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge by attending educational workshops and reviewing publications.
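
Work across GCS and the other listed services usually starts with the client libraries. A hedged sketch of listing and downloading objects with google-cloud-storage, using hypothetical bucket and object names:

```python
from google.cloud import storage

# Hypothetical bucket and object names; assumes default credentials.
client = storage.Client()
bucket = client.bucket("example-ingest-bucket")

# Inventory recent landing files.
for blob in client.list_blobs(bucket, prefix="landing/2024/"):
    print(blob.name, blob.size)

# Pull one object down for inspection.
bucket.blob("landing/2024/sample.csv").download_to_filename("/tmp/sample.csv")
```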

Posted 5 days ago

Apply

2.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Senior Data Engineer with 5-8 years of IT experience, including 2-3 years focused on GCP data services, you will be a valuable addition to our dynamic data and analytics team. Your primary responsibility will be to design, develop, and implement robust and insightful data-intensive solutions using GCP Cloud services. Your role will entail a deep understanding of data engineering, proficiency in SQL, and extensive experience with various GCP services such as BigQuery, Dataflow, Datastream, Pub/Sub, Dataproc, Cloud Storage, and other key GCP services for data pipeline orchestration. You will be instrumental in the construction of a GCP-native cloud data platform.
Key Responsibilities: Lead and contribute to the development, deployment, and lifecycle management of applications on GCP, utilizing services like Compute Engine, Kubernetes Engine (GKE), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud SQL, Cloud Storage, and more.
Required Skills and Qualifications: Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field. 5-8 years of overall IT experience, with hands-on experience in designing and developing data applications on GCP Cloud. In-depth expertise in GCP services and architectures, including Compute, Storage & Databases, Data & Analytics, and Operations & Monitoring. Proven ability to translate business requirements into technical solutions. Strong analytical, problem-solving, and critical thinking skills. Effective communication and interpersonal skills for collaboration with technical and non-technical stakeholders. Experience in Agile development methodology. Ability to work independently, manage multiple priorities, and meet deadlines.
Preferred Skills (Nice to Have): Experience with other hyperscalers. Proficiency in Python or other scripting languages for data manipulation and automation.
If you are a highly skilled and experienced Data Engineer with a passion for leveraging GCP data services to drive innovation, we invite you to apply for this exciting opportunity in Gurugram or Hyderabad.
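
Dataflow pipelines like those this role builds are written with Apache Beam. A minimal streaming sketch from Pub/Sub to BigQuery with hypothetical subscription, table, and schema names:

```python
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical subscription, table, and schema for illustration.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (p
     | "Read" >> beam.io.ReadFromPubSub(
           subscription="projects/example-project/subscriptions/events-sub")
     | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
     | "Write" >> beam.io.WriteToBigQuery(
           "example-project:analytics.events",
           schema="event_id:STRING,ts:TIMESTAMP",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
```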

Posted 6 days ago

Apply

1.0 - 3.0 years

2 - 6 Lacs

Chennai

Remote

This is a remote position. We are seeking an experienced Database & Cloud Advisory Specialist with strong expertise in SQL Server and Oracle, along with working knowledge of MongoDB (optional). The role is primarily advisory in nature, focusing on Google Cloud Platform (GCP) services such as Compute Engine, Cloud SQL, Observability, Backup & Disaster Recovery (DR), and GCVE (Google Cloud VMware Engine). The candidate will play a key role in guiding teams and clients through cloud migration planning and ensuring best practices in database management and cloud adoption.
Key Responsibilities
Database Advisory: Provide expert guidance on SQL Server and Oracle databases (MongoDB as optional), covering design, deployment, optimization, and migration.
Cloud Strategy: Advise clients on leveraging GCP Compute Engine, Cloud SQL, and GCVE for enterprise workloads.
Observability & Monitoring: Recommend best practices for system observability, monitoring, and alerting to ensure system reliability.
Backup & DR: Develop and review strategies for backup, recovery, and disaster recovery (DR) aligned with business continuity goals.
Migration Planning: Guide stakeholders through migration roadmaps, including assessment, planning, and execution strategies for moving workloads to GCP.
Performance & Optimization: Identify and advise on performance tuning opportunities for databases and cloud-hosted applications.
Stakeholder Collaboration: Work closely with client executives, architects, and technical teams to align database and cloud strategies with business objectives.
Best Practices & Governance: Ensure compliance with cloud governance, security standards, and data management policies.
Knowledge Sharing: Provide thought leadership, documentation, and training to internal teams and client stakeholders.
Qualifications: Proven expertise in SQL Server and Oracle database environments. Hands-on or advisory experience with GCP services (Compute Engine, Cloud SQL, GCVE). Strong knowledge of observability tools, backup, and DR planning. Excellent communication and advisory skills for working with senior stakeholders.

Posted 6 days ago

Apply

0.0 - 5.0 years

0 - 0 Lacs

Bengaluru

Work from Office

SUMMARY: Wissen Technology is Hiring for GCP Cloud Engineer
About Wissen Technology: At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset, ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia. Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don't just meet expectations, we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives. We have a proven track record of building mission-critical systems across industries, including financial services, healthcare, retail, manufacturing, and more. Wissen stands apart through its unique delivery models. Our outcome-based projects ensure predictable costs and timelines, while our agile pods provide clients the flexibility to adapt to their evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes. Our success is powered by top-tier talent. Our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact the first time, every time.
Job Summary: We are looking for an experienced GCP Cloud Engineer to design, implement, and manage cloud-based solutions on Google Cloud Platform (GCP). The ideal candidate should have expertise in GKE (Google Kubernetes Engine), Cloud Run, Cloud Load Balancer, Cloud Functions, Azure DevOps, and Terraform, with a strong focus on automation, security, and scalability. You will work closely with development, operations, and security teams to ensure robust cloud infrastructure and CI/CD pipelines while optimizing performance and cost.
Experience: 4-12 Years. Location: Pune. Mode of Work: Full Time.
Key Responsibilities:
1. Cloud Infrastructure Design & Management: Architect, deploy, and maintain GCP cloud resources via Terraform or other automation. Implement Google Cloud Storage, Cloud SQL, and Filestore for data storage and processing needs. Manage and configure Cloud Load Balancers (HTTP(S), TCP/UDP, and SSL Proxy) for high availability and scalability. Optimize resource allocation, monitoring, and cost efficiency across GCP environments.
2. Kubernetes & Container Orchestration: Deploy, manage, and optimize workloads on Google Kubernetes Engine (GKE). Work with Helm charts, Istio, and service meshes for microservices deployments. Automate scaling, rolling updates, and zero-downtime deployments.
3. Serverless & Compute Services: Deploy and manage applications on Cloud Run and Cloud Functions for scalable, serverless workloads. Optimize containerized applications running on Cloud Run for cost efficiency and performance.
4. CI/CD & DevOps Automation: Design, implement, and manage CI/CD pipelines using Azure DevOps. Automate infrastructure deployment using Terraform, Bash, and PowerShell scripting. Integrate security and compliance checks into the DevOps workflow (DevSecOps).
Requirements: 8+ years of experience in cloud engineering, with a strong focus on GCP. Hands-on experience with GKE, Compute Engine, IAM, VPC, Cloud Functions, and Cloud SQL.
Solid expertise in Docker, Kubernetes networking, and Helm charts. Proficient with Azure DevOps for building automated CI/CD pipelines. Strong experience in Terraform and Infrastructure as Code (IaC). Proficiency in scripting languages like Python, Bash, or PowerShell. Knowledge of cloud security, IAM, and compliance frameworks. Strong problem-solving skills and the ability to work independently in fast-paced environments.
Wissen Sites:
Website: www.wissen.com
LinkedIn: https://www.linkedin.com/company/wissen-technology
Wissen Leadership: https://www.wissen.com/company/leadership-team/
Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/feedView=All
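
Day-to-day GKE operations like those above are scriptable with the official Kubernetes Python client. A hedged sketch that reports deployment readiness, assuming a local kubeconfig already points at the (hypothetical) cluster:

```python
from kubernetes import client, config

# Assumes kubeconfig already points at the (hypothetical) GKE cluster,
# e.g. after `gcloud container clusters get-credentials`.
config.load_kube_config()
apps = client.AppsV1Api()

for dep in apps.list_namespaced_deployment(namespace="default").items:
    ready = dep.status.ready_replicas or 0
    print(f"{dep.metadata.name}: {ready}/{dep.spec.replicas} replicas ready")
```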

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Cloud Database Administrator specializing in MS-SQL, MySQL, and PostgreSQL, you will be an integral part of our team, supporting cloud database operations for our leading client in a hybrid role based in India (Chandigarh, Hyderabad, Bangalore, Pune) on a 1-year contract. Your responsibilities will include ensuring the reliability, scalability, and performance of mission-critical systems across various environments. To succeed in this role, you should have a minimum of 3 years of hands-on experience as a DBA, with a strong understanding of Cloud SQL and basic GCP concepts. Your tasks will involve database operations and monitoring; database creation, deletion, and upgrades; administration and configuration; backup and restore procedures; and performance tuning and optimization. Additionally, you will be responsible for instance whitelisting for GKE applications. We are looking for a professional with excellent communication skills who can effectively collaborate with clients and translate their requirements into technical solutions. If you are ready to take on this challenging position that involves strong client interaction, we encourage you to apply, as interviews are already in progress.

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Work from Office

Designing, deploying, and managing applications and infrastructure on Google Cloud. Responsible for maintaining solutions that leverage Google-managed or self-managed services, utilizing both the Google Cloud Console and the command-line interface.
Required candidate profile: designing and implementing cloud solutions; deploying and managing applications; monitoring and maintaining cloud infrastructure; utilizing cloud services; automation and DevOps.

Posted 1 week ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Chennai

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience. Proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work.
Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge by attending educational workshops and reviewing publications.

Posted 1 week ago

Apply

10.0 - 15.0 years

25 - 35 Lacs

Chennai

Work from Office

We are looking for an experienced Senior Software Engineer with strong expertise in full-stack development and cloud technologies. The ideal candidate will be responsible for designing, developing, testing, and maintaining scalable software applications and products. You will be deeply involved in the entire software development lifecycle, from architecture design to deployment, while collaborating with cross-functional teams and driving user-centric solutions.
Key Responsibilities: Engage with customers to understand use cases, pain points, and requirements, and advocate for user-focused solutions. Design, develop, and deliver software applications using various tools, frameworks, and methodologies (Agile). Assess business and technical requirements to determine the best technology stack, integration methods, and deployment strategy. Create high-level software architecture designs, including structure, components, and interfaces. Collaborate closely with product owners, designers, and architects to align on solutions. Define and implement software testing strategies, policies, and best practices. Continuously optimize application performance and adopt new technologies to enhance efficiency. Apply programming practices such as Test-Driven Development (TDD), Continuous Integration (CI), and Continuous Delivery (CD). Implement secure coding practices, including encryption and anonymization of user data. Develop user-friendly, interactive front-end interfaces and robust back-end services (APIs, microservices). Leverage cloud platforms and emerging technologies to build future-ready solutions.
Skills Required: Programming & Data Engineering: Python, PySpark, API development, SQL/Postgres. Cloud Platforms & Tools: Google Cloud Platform (BigQuery, Cloud Run, Dataflow, Dataproc, Data Fusion, Cloud SQL), IBM WebSphere Application Server. Infrastructure & DevOps: Terraform, Tekton, Airflow. Other Expertise: MDM (Master Data Management), application optimization, microservices.
Experience Required: 10+ years of experience in IT, with 8+ years in software development. Strong practical experience in at least two programming languages, or advanced expertise in one. Experience mentoring and guiding engineering teams.
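
Back-end services destined for Cloud Run, as in the stack above, follow a simple contract: listen on the PORT environment variable. A minimal Flask sketch of such a service; the route and payload are hypothetical:

```python
import os
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    # Hypothetical health endpoint; real services would check dependencies.
    return jsonify(status="ok")

if __name__ == "__main__":
    # Cloud Run injects the PORT environment variable at runtime.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```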

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Consumer and Community Banking - Banking and Wealth Management Team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
Job responsibilities: Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems. Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development. Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems. Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture. Contributes to software engineering communities of practice and events that explore new and emerging technologies. Adds to team culture of diversity, opportunity, inclusion, and respect.
Required qualifications, capabilities, and skills: Formal training or certification on software engineering concepts and 3+ years of applied experience. Hands-on experience with cloud-based applications, technologies and tools, deployment, monitoring, and operations, such as Kubernetes, Prometheus, FluentD, Slack, Elasticsearch, Grafana, Kibana, etc. Experience developing and managing operations with relational and NoSQL databases, leveraging key event streaming, messaging, and DB services such as Cassandra, MQ/JMS/Kafka, Aurora, RDS, Cloud SQL, Bigtable, DynamoDB, MongoDB, Cloud Spanner, Kinesis, Cloud Pub/Sub, etc. Experience with networking (security, load balancing, network routing protocols, etc.). Demonstrated experience in the fields of production engineering and automation. Strong understanding of cloud technology standards and practices. Proficiency in utilizing tools for monitoring, analysis, and troubleshooting, including Splunk, Dynatrace, Datadog, or equivalent.
Preferred qualifications, capabilities, and skills: Ability to conduct detailed analysis on incidents to identify patterns and trends, thereby enhancing operational stability and efficiency. Familiarity with digital certificate management and automation tools. Knowledge of frameworks such as CI/CD pipelines. Excellent communication and collaboration skills.
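
Event streaming with Kafka, one of the services named above, is commonly exercised from Python with the kafka-python package. A hedged sketch with a hypothetical broker address and topic:

```python
import json
from kafka import KafkaProducer

# Hypothetical broker address and topic name.
producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

producer.send("payment-events", {"event_id": 1, "status": "settled"})
producer.flush()  # block until buffered records are delivered
```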

Posted 1 week ago

Apply

12.0 - 15.0 years

12 - 20 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Your potential, unleashed. India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters. The team: Our EAD:Engg team focuses on enabling our clients' end-to-end journey from On-Premise to Cloud, with opportunities in the areas of Cloud Strategy, Op Model Transformation, Cloud Development, Cloud Integration & APIs, Cloud Migration, Cloud Infrastructure & Engineering, and Cloud Managed Services. We help our clients see the transformational capabilities of Cloud as an opportunity for business enablement and competitive advantage. EAD:Engg supports our clients as they improve agility and resilience, and identifies opportunities to reduce IT operations spend through automation by enabling Cloud. We accelerate our clients towards a technology-driven future, leveraging vendor solutions and Deloitte-developed software products, tools, and accelerators. Your work profile: Senior Google Cloud Infrastructure Engineer. Must-have certifications: GCP Professional Cloud Architect/GCP Professional DevOps Engineer/GCP Professional Network Engineer. Experience: 9 to 15 years. What this role is all about: This position will be responsible for working on client engagements and managing cloud platforms. We are primarily seeking a candidate who has great technical skills and the ability to lead a team of infrastructure engineers. Infrastructure admins from the past who have evolved into Cloud and DevOps engineers are a great fit for this role. What are the expectations? Technical Skills: Awareness of Google Cloud Foundation setup. Hands-on expertise with the following cloud services: GCP Compute Services: Google Compute Engine, App Engine (GAE), Google Kubernetes Engine (GKE), Google Cloud Functions. GCP Networking Services: VPC, Cloud CDN, Cloud Load Balancing, Cloud Interconnect, Cloud DNS, NAT, VPN, and Private Service Connect. Database Services: Cloud SQL, Cloud Storage. GCP Management Tools: Cloud Operations Suite, Cloud Monitoring, Cloud Logging. Other Services: OS Patch Management, Identity and Access Management (IAM), Deployment Manager, Billing, Marketplace, Google Cloud Directory Sync, Instance Groups, Instance Templates, Snapshots, Health Checks. Infrastructure automation: Expertise in infrastructure provisioning and automation using Terraform. Exposure to GitOps practices is a plus. Scripting: Proficiency in shell scripting or Python for automation and orchestration tasks. Containerization and Orchestration: Hands-on experience with Kubernetes (GKE preferred), including deployment strategies, Helm charts, autoscaling, and cluster operations. Familiarity with container security best practices and tools like kube-bench, kube-hunter, or Falco.
DevSecOps and CI/CD: Experience integrating security into CI/CD pipelines, using tools like Snyk, Trivy, Aqua Security, or Checkov. Knowledge of DevSecOps practices, policy enforcement, and compliance monitoring in cloud-native environments. Experience working with CI/CD tools like Cloud Build, Jenkins, GitHub Actions, or GitLab CI. Certifications: Any Google professional certification, such as Google Professional Cloud Architect, Google Professional Cloud Network Architect, or Google Professional DevOps Engineer. Environments/Tools: Source code management like GitHub / Bitbucket. Issue tracking: tools similar to Jira. Documentation: tools similar to Confluence. Behavioral Skills: At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and society, and make an impact that matters. In addition to living our purpose, managers across our organization: develop themselves by actively seeking opportunities for growth, share knowledge and experiences with others, and act as strong brand ambassadors; understand objectives for clients and Deloitte, align their own work to objectives, and set personal priorities; seek opportunities to challenge themselves; collaborate with others across businesses and borders to deliver and take accountability for their own and team results; identify and embrace our purpose and values and put these into practice in their professional life; build relationships and communicate effectively to positively influence peers and other stakeholders. Golden Behavior: Passionate, persuasive, articulate cloud professional capable of quickly establishing interest and credibility. Good business judgment, a comfortable, open communication style, and a willingness and ability to work with customers and teams. Strong service attitude and a commitment to quality. Highly organized and efficient. Confident in working with others to inspire a high-quality standard. How you'll grow: Connect for impact. Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report. Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership. Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters. Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte. Everyone's welcome; entrust your happiness to us. Our workspaces and initiatives are geared towards your 360-degree happiness.
This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you. Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.

Posted 1 week ago

Apply

5.0 - 10.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Experience: 5+ Years. Category: GCP + GKE. Main location: Bangalore / Chennai / Hyderabad / Pune / Mumbai. Position ID: J0425-1242. Employment Type: Full Time.
Job Description: We are seeking a skilled and proactive Google Cloud Engineer with strong experience in DevOps and hands-on expertise in Google Kubernetes Engine (GKE) to design, implement, and manage cloud-native infrastructure. You will play a key role in automating deployments, maintaining scalable systems, and ensuring the availability and performance of our cloud services on Google Cloud Platform (GCP).
Key Responsibilities and Required Skills: 5+ years of experience in DevOps / Cloud Engineering roles. Design and manage cloud infrastructure using Google Cloud services such as Compute Engine, Cloud Storage, VPC, IAM, Cloud SQL, GKE, and more. Proficient in writing Infrastructure-as-Code using Terraform, Deployment Manager, or similar tools. Automate CI/CD pipelines using tools like Cloud Build, Jenkins, GitHub Actions, etc. Manage and optimize Kubernetes clusters for high availability, performance, and security. Collaborate with developers to containerize applications and streamline their deployment. Monitor cloud environments and troubleshoot performance, availability, or security issues. Implement best practices for cloud governance, security, cost management, and compliance. Participate in cloud migration and modernization projects. Ensure system reliability and high availability through redundancy, backup strategies, and proactive monitoring. Contribute to cost optimization and cloud governance practices. Strong hands-on experience with core GCP services including Compute, Networking, IAM, Storage, and optionally Kubernetes (GKE). Proven expertise in Kubernetes (GKE): managing clusters, deployments, services, autoscaling, etc. Experience configuring Kubernetes resources (Deployments, Services, Ingress, Helm charts, etc.) to support application lifecycles. Solid scripting knowledge (e.g., Python, Bash, Go). Familiarity with GitOps and deployment tools like ArgoCD and Helm. Experience with CI/CD tools and setting up automated deployment pipelines. Should have Google Cloud certifications (e.g., Professional Cloud DevOps Engineer, Cloud Architect, or Cloud Engineer).
Behavioural Competencies: Proven experience of delivering process efficiencies and improvements. Clear and fluent English (both verbal and written). Ability to build and maintain efficient working relationships with remote teams. Demonstrated ability to take ownership of and accountability for relevant products and services. Ability to plan, prioritise and complete your own work, whilst remaining a team player. Willingness to engage with and work in other technologies.
Skills: DevOps, Google Cloud Platform, Kubernetes, Terraform, Helm

Posted 1 week ago

Apply

3.0 - 8.0 years

11 - 16 Lacs

Pune

Work from Office

Job Description: Job Title: Lead Engineer. Location: Pune. Corporate Title: Director.
As a lead engineer within the Transaction Monitoring department, you will lead and drive forward critical engineering initiatives and improvements to our application landscape while supporting and leading the engineering teams to excel in their roles. You will be closely aligned to the architecture function and delivery leads, ensuring alignment with planning and that correct design and architecture governance is followed for all implementation work. You will lead by example and drive and contribute to automation and innovation initiatives with the engineering teams. Join the fight against financial crime with us!
Your key responsibilities: Experienced hands-on cloud and on-premise engineer, leading by example with engineering squads. Thinking analytically, with a systematic and logical approach to solving complex problems, and high attention to detail. Design and document complex technical solutions at varying levels in an inclusive and participatory manner with a range of stakeholders. Liaise and interface directly with senior stakeholders in technology, business, and modelling areas. Collaborate with application development teams to design and prototype solutions (both on-premises and on-cloud), supporting and presenting these via the Design Authority forum for approval and providing good practice and guidelines to the teams. Ensure engineering and architecture compliance with bank-standard processes for deploying new applications, working directly with central functions such as Group Architecture, Chief Security Office, and Data Governance. Innovate and think creatively, showing willingness to apply new approaches to solving problems and to learn new methods, technologies, and potentially outside-of-the-box solutions.
Your skills and experience: Proven hands-on engineering and design experience in a delivery-focused (preferably agile) environment. Solid technical/engineering background, preferably with at least two high-level languages and multiple relational databases or big-data technologies. Proven experience with cloud technologies, preferably GCP (GKE / Dataproc / Cloud SQL / BigQuery), GitHub, and Terraform. Competence and expertise in technical skills across a wide range of technology platforms and the ability to use and learn new frameworks, libraries, and technologies. A deep understanding of the software development life cycle and the waterfall and agile methodologies. Experience leading complex engineering initiatives and engineering teams. Excellent communication skills, with demonstrable ability to interface and converse at both junior and senior levels and with non-IT staff. Line management experience, including working in a matrix management configuration.

Posted 1 week ago

Apply

3.0 - 7.0 years

8 - 13 Lacs

pune

Work from Office

Role Description
Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partnerdata is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective, we focus on the mainframe but also build on-premise and cloud solutions: RESTful services and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area.

Your key responsibilities
You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain.
You are responsible for supporting the migration of current functionalities to Google Cloud.
You are responsible for the stability of the application landscape and support software releases.
You also support L3 topics and application governance.
You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot).

Your skills and experience
You have experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably for Big Data and GCP technologies (a brief PySpark sketch follows below).
Strong understanding of the Data Mesh approach and integration patterns.
Understanding of party data and its integration with product data.
Your architectural skills for big data solutions, especially interface architecture, allow a fast start.
You have experience with at least: Spark, Java, Scala and Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting.
You have knowledge of customer reference data, customer opening processes and, preferably, regulatory topics around know-your-customer processes.
You work very well in teams but also independently, and are constructive and target-oriented.
Your English skills are good: you can communicate professionally, but also informally in small talk with the team.
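For illustration, here is a minimal sketch of the kind of PySpark job that would run on Dataproc in such a setup, reading from and writing to BigQuery via the spark-bigquery connector (assumed to be available on the cluster). Table and bucket names are hypothetical placeholders.

```python
# Minimal sketch: a Dataproc-style PySpark ETL step using the
# spark-bigquery connector. All table and bucket names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("partner-data-etl").getOrCreate()

# Read raw partner records from BigQuery.
partners = (
    spark.read.format("bigquery")
    .option("table", "my-project.partner.raw_partners")
    .load()
)

# Basic cleansing/enrichment: drop records without an id and
# normalize country codes.
cleaned = (
    partners.filter(F.col("partner_id").isNotNull())
    .withColumn("country", F.upper(F.col("country")))
)

# Write the curated table back; the connector stages data in GCS.
(
    cleaned.write.format("bigquery")
    .option("table", "my-project.partner.curated_partners")
    .option("temporaryGcsBucket", "my-staging-bucket")  # hypothetical bucket
    .mode("overwrite")
    .save()
)
```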

Posted 1 week ago

Apply

4.0 - 9.0 years

13 - 18 Lacs

pune

Work from Office

Role Description
Our team is part of the area Technology, Data and Innovation (TDI) Private Bank. Within TDI, Partnerdata is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective, we maintain critical functionality on the mainframe, but build new solutions (REST services, Angular frontend, analytics capabilities) in a public and private cloud environment. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate with a passion for cloud solutions (GCP) and Big Data.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy.

Your key responsibilities
You are responsible for the implementation of new project requirements on GCP (Cloud Run, Kubernetes, Cloud SQL, Terraform).
You are responsible for the design and translation of high-level business requirements into software.
You are responsible for extending the service architecture, designing enhancements, and establishing best practices.
You support the migration of existing functionalities to the Google Cloud Platform.
You are responsible for the stability of the application landscape and support software releases.
You provide L3 support in case of incidents and facilitate the application governance procedure.
You are willing to work and code as part of an agile team (Java, Scala, Spring Boot).

Your skills and experience
You have multiple years of experience in designing and developing REST services (REST, OpenAPI, API-first).
You have experience with Java and especially Spring and Spring Boot applications; Spring Cloud is a bonus.
You have experience in querying SQL databases and are familiar with relational databases in general (a brief Cloud SQL connection sketch follows below).
You have experience developing container-based applications and are familiar with container orchestration frameworks such as Kubernetes/OpenShift.
You are familiar with cloud technologies, especially Google Cloud (Cloud Run, Kubernetes Engine, Cloud SQL, BigQuery).
You are familiar with building and deploying code in an enterprise-grade environment utilizing CI/CD pipelines, especially Maven, JFrog Artifactory and GitHub Actions.
You have a good understanding of IaC concepts and tools such as Terraform.
Knowledge of customer reference data, customer opening processes and, preferably, regulatory topics around know-your-customer processes is a bonus.
You enjoy working in a team setting in an independent and target-oriented way.
You have very good English skills which allow you to communicate professionally, but also informally, with all team members.
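The role itself is Java/Spring Boot-centric (where the analogous piece would be the Cloud SQL JDBC socket factory and a Spring datasource), but as a compact illustration of the Cloud SQL connection model, here is a minimal sketch in Python using the Cloud SQL Python Connector. The instance connection name, credentials, and table are placeholders.

```python
# Minimal sketch: connecting to a Cloud SQL (PostgreSQL) instance via the
# Cloud SQL Python Connector (cloud-sql-python-connector package).
# Instance name, credentials, and schema are hypothetical.
from google.cloud.sql.connector import Connector

connector = Connector()

# The connector handles IAM-authorized, encrypted connectivity; no
# authorized networks or SSL certificates need to be managed by hand.
conn = connector.connect(
    "my-project:europe-west3:partner-db",  # hypothetical instance connection name
    "pg8000",
    user="app_user",
    password="change-me",
    db="partners",
)

cur = conn.cursor()
cur.execute("SELECT partner_id, status FROM partners LIMIT 5")
for partner_id, status in cur.fetchall():
    print(partner_id, status)

cur.close()
conn.close()
connector.close()
```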

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 19 Lacs

noida

Remote

Mandatory skills: Advanced Python & SQL. GCP services: BigQuery, Dataflow, Dataproc and Pub/Sub.

Key Responsibilities
Design, develop and optimize scalable data pipelines and ETL workflows using Google Cloud Platform (GCP), particularly leveraging BigQuery, Dataflow, Dataproc and Pub/Sub.
Design and manage secure, efficient data integrations involving Snowflake and BigQuery.
Write, test and maintain high-quality Python code for data extraction, transformation and loading (ETL), analytics and automation tasks.
Use Git for collaborative version control, code reviews and managing data engineering projects.
Implement infrastructure-as-code practices using Pulumi for cloud resource management and automation within GCP environments (see the Pulumi sketch below).
Apply clean-room techniques to design and maintain secure data sharing environments in alignment with privacy standards and client requirements.
Collaborate with cross-functional teams (data scientists, business analysts, product teams) to deliver data solutions, troubleshoot issues and ensure data integrity throughout the lifecycle.
Optimize performance of batch and streaming data pipelines, ensuring reliability and scalability.
Maintain documentation on processes, data flows and configurations for operational transparency.

Required Skills
5+ years of strong hands-on experience with GCP core data services: BigQuery, Dataflow, Dataproc and Pub/Sub.
Proficiency in data engineering development using Python.
Deep familiarity with Snowflake: data modeling, secure data sharing and advanced query optimization.
Proven experience with Git for source code management and collaborative development.
Demonstrated ability using Pulumi (or similar IaC tools) for deployment and support of cloud infrastructure.
Practical understanding of clean-room concepts in cloud data warehousing, including privacy/compliance considerations.
Solid skills in debugging complex issues within data pipelines and cloud environments.
Effective communication and documentation skills.

Great to Have
GCP certification (e.g., Professional Data Engineer).
Experience working in regulated environments (telecom/financial/healthcare) with a data privacy and compliance focus.
Exposure to additional GCP services such as Cloud Storage, Cloud Functions or Kubernetes.
Demonstrated success collaborating in agile, distributed teams.
Experience with data visualization tools (e.g., Tableau, Looker).
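As a small illustration of the Pulumi side of this role, here is a minimal Python sketch declaring a Pub/Sub topic and a BigQuery dataset. Resource names, the dataset id, and the stack configuration (project, credentials) are placeholders; the program would be deployed with `pulumi up` in a configured stack.

```python
# Minimal Pulumi (Python) sketch: provisioning a Pub/Sub topic and a
# BigQuery dataset as infrastructure-as-code. Names are placeholders.
import pulumi
import pulumi_gcp as gcp

# Topic feeding a streaming pipeline (e.g., a Dataflow job).
events_topic = gcp.pubsub.Topic("events-topic")

# Dataset that batch or streaming jobs would load into.
analytics = gcp.bigquery.Dataset(
    "analytics-dataset",
    dataset_id="analytics",
    location="US",
)

# Export resource identifiers for downstream stacks or pipelines.
pulumi.export("topic_name", events_topic.name)
pulumi.export("dataset_id", analytics.dataset_id)
```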

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies