
167 Cloud SQL Jobs - Page 4

Set up a job alert
JobPe aggregates listings for convenient browsing; applications are submitted directly on the original job portal.

3.0 - 7.0 years

37 - 40 Lacs

Bengaluru

Work from Office

Job Title: DevOps Engineer, AS
Location: Bangalore, India

Role Description
Deutsche Bank has set itself ambitious goals in Sustainable Finance, ESG risk mitigation, and corporate sustainability. As climate change brings new challenges and opportunities, the bank is investing in a Sustainability Technology Platform, sustainability data products, and sustainability applications that support these goals. As part of this initiative, we are building a global team of technologists who are passionate about climate change and want to contribute to the greater good by applying their technology skills in cloud/hybrid architecture. We are seeking a highly skilled and experienced DevOps Engineer to join our growing team. In this role, you will play a pivotal part in managing and optimizing cloud infrastructure, facilitating continuous integration and delivery, and ensuring system reliability.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Create, implement, and oversee scalable, secure, and cost-efficient cloud infrastructure on Google Cloud Platform (GCP).
- Apply Infrastructure as Code (IaC) methodologies with tools such as Terraform or Deployment Manager.
- Implement robust security measures for data access control and regulatory compliance; adopt security best practices, establish IAM policies, and ensure adherence to organizational and regulatory requirements.
- Set up and manage Virtual Private Clouds (VPCs), subnets, firewalls, VPNs, and interconnects for secure cloud networking.
- Establish continuous integration and continuous deployment (CI/CD) pipelines using Jenkins, GitHub Actions, or comparable tools for automated application deployments.
- Implement monitoring and alerting solutions through Cloud Operations (formerly Stackdriver), Prometheus, or other third-party tools.
- Evaluate and optimize cloud expenditure using committed use discounts, autoscaling, and resource rightsizing.
- Manage and deploy containerized applications on Google Kubernetes Engine (GKE) and Cloud Run.
- Deploy and manage GCP databases such as Cloud SQL and BigQuery (see the connection sketch below).

Your skills and experience
- 5+ years of experience in DevOps or similar roles, with hands-on GCP experience.
- In-depth knowledge of Google Cloud services (e.g., GCE, GKE, Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud Storage) and the ability to architect, deploy, and manage cloud-native applications.
- Proficiency with tools such as Jenkins, GitLab, Terraform, Ansible, Docker, and Kubernetes.
- Experience with IaC tools such as Terraform, CloudFormation, or GCP-native Deployment Manager.
- Solid understanding of security protocols, IAM, networking, and compliance requirements in cloud environments.
- Strong problem-solving skills and the ability to troubleshoot cloud-based infrastructure.
- Google Cloud certifications (e.g., Associate Cloud Engineer, Professional Cloud Architect, or Professional DevOps Engineer) are a plus.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
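The Cloud SQL responsibility above is concrete enough for a short illustration. Below is a minimal sketch of connecting an application to a Cloud SQL for PostgreSQL instance through the Cloud SQL Python Connector; the instance connection name, credentials, and database are hypothetical placeholders, not details from this posting.

```python
# A minimal sketch, assuming a Cloud SQL for PostgreSQL instance and the
# cloud-sql-python-connector, pg8000, and SQLAlchemy packages installed.
from google.cloud.sql.connector import Connector
import sqlalchemy

connector = Connector()

def getconn():
    # Opens an IAM-authorized, TLS-encrypted connection to the instance.
    return connector.connect(
        "my-project:asia-south1:partner-db",  # hypothetical instance name
        "pg8000",
        user="app_user",        # hypothetical credentials
        password="change-me",
        db="reporting",
    )

engine = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)

with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT 1")).scalar())
```

The connector handles encryption and authorization itself, so no authorized-network or client certificate setup is needed on the application side.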

Posted 1 month ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and interface directly with customers.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise
- 5 to 7 years of relevant experience as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
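Since the role centres on Python and SQL against BigQuery, here is a minimal sketch of running a query with the google-cloud-bigquery client; the project, dataset, and table names are hypothetical.

```python
# A minimal sketch, assuming google-cloud-bigquery is installed and
# application-default credentials are configured.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT status, COUNT(*) AS n
    FROM `my-project.ops.job_runs`   -- hypothetical table
    GROUP BY status
    ORDER BY n DESC
"""
for row in client.query(query).result():  # blocks until the job finishes
    print(row.status, row.n)
```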

Posted 1 month ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and interface directly with customers.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise
- 5 to 7 years of relevant experience as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 1 month ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Develop, optimize, and maintain scalable data pipelines using Python and PySpark. Design and implement data processing workflows leveraging GCP services such as BigQuery, Dataflow, Cloud Functions, and Cloud Storage.
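As an illustration of the pipeline work described above, here is a minimal PySpark sketch that reads raw CSVs from Cloud Storage and writes an aggregate to BigQuery. It assumes a Dataproc-style environment with the spark-bigquery connector available; all bucket and table names are hypothetical.

```python
# A minimal sketch of a Dataproc-style PySpark job; bucket and table
# names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

orders = spark.read.option("header", True).csv("gs://my-bucket/raw/orders/*.csv")

daily = (orders
         .withColumn("amount", F.col("amount").cast("double"))
         .groupBy("order_date")
         .agg(F.sum("amount").alias("revenue")))

(daily.write.format("bigquery")
      .option("table", "my-project.analytics.daily_revenue")
      .option("temporaryGcsBucket", "my-bucket-tmp")  # staging for the connector
      .mode("overwrite")
      .save())
```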

Posted 1 month ago

Apply

5.0 - 7.0 years

10 - 16 Lacs

Bengaluru

Work from Office

Role & responsibilities
- L2 and L3 support capabilities across multiple versions of Microsoft SQL Server.
- Installation of database software; database builds.
- Incident, change, and problem management.
- Database maintenance (index rebuilds, table reorgs) and database reorganization activities.
- User access management.
- Database startup/shutdown; DBCC checks (see the automation sketch below).
- Altering database/T-log files; analyzing database blocking and session wait events.
- Performing database backups/restores; database refresh/cloning.
- Migrating database objects from Dev/QA to production.
- Database upgrades and patching.
- Knowledge management: creation of SOPs for routine activities, KEDB.
- Knowledge of SOX/PCI compliance reporting; DR drill support.
- Always On availability groups, Azure SQL Database, and multi-node cluster setup; knowledge of Power BI is a plus.

Preferred candidate profile
- 5-7 years of experience in Microsoft SQL Server database administration.
- Strong knowledge of Microsoft SQL Server 2016, 2017, 2019, and 2022 administration.
- Ability to work in a fast-paced environment.
- Excellent communication, problem-solving, interpersonal, teamwork, and leadership skills.

Perks and benefits
Competitive salary commensurate with experience, good health insurance coverage, EPF, bonus and gratuity, an excellent work environment, and training and certifications.
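Routine checks like the DBCC task above are commonly scripted. Below is a minimal sketch that runs DBCC CHECKDB from Python via pyodbc; the server, database, and credentials are hypothetical, and it assumes the Microsoft ODBC driver is installed.

```python
# A minimal sketch of automating a routine integrity check; server,
# database, and credentials are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sql-prod-01;DATABASE=master;"
    "UID=dba_user;PWD=change-me;TrustServerCertificate=yes;",
    autocommit=True,  # DBCC maintenance commands run outside a user transaction
)
cursor = conn.cursor()
cursor.execute("DBCC CHECKDB ('SalesDB') WITH NO_INFOMSGS")
# With NO_INFOMSGS, corruption surfaces as errors; silence means clean.
while cursor.nextset():
    pass
print("DBCC CHECKDB completed")
```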

Posted 1 month ago

Apply

6.0 - 9.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile setup, based on business requirements from multiple stakeholders. By employing continuous development and deployment principles, the team aims to transition from project to product management and to support the bank with robust compliance archive solutions for the next decade.

For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of financial services/FIC would be great, but the primary skills we need are building, migrating, and deploying applications to GCP; Terraform module coding; Google infrastructure; and cloud-native services such as GCE, GKE, Cloud SQL/Postgres, and logging and monitoring. Good written and spoken English is essential, as these engineers will help with knowledge transfer to our existing development and support teams; we would like to place them alongside the engineers they'll be working with in the bank. You should have extensive experience with Google Cloud Platform (GCP), Kubernetes, and Docker. The role involves working closely with our development and operations teams to ensure seamless integration and deployment of applications.

Responsibilities
- Design, implement, and manage CI/CD pipelines on GCP.
- Automate infrastructure provisioning, configuration, and deployment using tools like Terraform and Ansible.
- Manage and optimize Kubernetes clusters for high availability and scalability.
- Containerize applications using Docker and manage container orchestration.
- Monitor system performance, troubleshoot issues, and ensure system reliability and security.
- Collaborate with development teams to ensure smooth and reliable operation of software and systems.
- Implement and manage logging, monitoring, and alerting solutions (see the sketch below).
- Stay current with industry trends and best practices in DevOps and cloud technologies.

Skills
Must have
- 6 to 9 years of experience as a DevOps Engineer, with a minimum of 4 years of relevant GCP experience.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong expertise in Kubernetes and Docker.
- Experience with infrastructure as code (IaC) tools such as Terraform and Ansible.
- Proficiency in scripting languages like Python, Bash, or Go.
- Familiarity with CI/CD tools such as Jenkins, GitLab CI, or CircleCI.
- Knowledge of networking, security, and database management.
- Excellent problem-solving skills and attention to detail.

Nice to have
- Strong communication and collaboration skills.

Other
Languages: English, C2 Proficient
Seniority: Senior
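For the logging and monitoring responsibility, here is a minimal sketch of emitting a structured log entry with the google-cloud-logging client, the kind of event a Cloud Operations log-based alert can match; the logger name and payload are hypothetical.

```python
# A minimal sketch, assuming google-cloud-logging is installed and
# credentials are configured; logger name and payload are hypothetical.
from google.cloud import logging as cloud_logging

client = cloud_logging.Client()
logger = client.logger("fic-migration")

logger.log_struct(
    {
        "event": "deployment_failed",
        "service": "trade-capture",
        "attempt": 3,
    },
    severity="ERROR",  # a log-based alerting policy can match severity>=ERROR
)
```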

Posted 1 month ago

Apply

5.0 - 8.0 years

12 - 16 Lacs

Mangaluru, Udupi

Hybrid

SRE Lead

Role Description:
We are seeking an experienced SRE strategist to lead the reliability and operational excellence agenda for our enterprise data platforms spanning GCP cloud-native systems. This strategic leadership role will help instill Google's SRE principles across diverse data engineering teams, uplift our platform reliability posture, and spearhead the creation of a Centre of Excellence (CoE) for SRE. The ideal candidate will possess a deep understanding of modern SRE practices, demonstrate a proven ability to scale SRE capabilities in large enterprises, and evangelise a data-driven approach to resilience engineering.

Key Responsibilities:
- Define and drive SRE strategy for enterprise data platforms on GCP, aligning with business goals and reliability needs.
- Act as a trusted advisor to platform teams, embedding the SRE mindset, best practices, and golden signals into their SDLC and operational processes.
- Set up and lead a Site Reliability Engineering CoE, delivering reusable tools, runbooks, blueprints, and platform accelerators to scale SRE adoption across the organisation.
- Partner with product and platform owners to prioritise and structure SRE backlogs, formulate roadmaps, and help teams move from reactive ops to proactive reliability engineering.
- Define and track SLIs, SLOs, and error budgets across critical data services, enabling data-driven decision-making around availability and performance (see the error-budget sketch below).
- Drive incident response maturity, including chaos engineering, incident retrospectives, and blameless postmortems.
- Foster a reliability culture through coaching, workshops, and cross-functional forums.
- Build strategic relationships across engineering, data governance, security, and architecture teams to ensure reliability is baked in, not bolted on.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline.
- 3+ years in SRE leadership or SRE strategy roles.
- Strong familiarity with Google SRE principles and practical experience applying them in complex enterprise settings.
- Proven track record in establishing and scaling SRE teams.
- Experience with GCP services like Cloud Build, GCS, Cloud SQL, Cloud Functions, and GCP logging and monitoring.
- Deep experience with observability stacks such as Prometheus, Grafana, Splunk, and GCP-native solutions.
- Skilled in Infrastructure as Code using tools like Terraform, and working knowledge of automation in CI/CD environments.

Key Competencies & Skills:
- Strong leadership, influence without authority, and mentoring capabilities.
- Hands-on scripting and automation skills in Python, with secondary languages like Go or Java a plus.
- Familiarity with incident and problem management frameworks in enterprise environments.
- Ability to define and execute a platform-wide reliability roadmap in alignment with architectural and business objectives.

Nice to Have:
- Exposure to secrets management tools (e.g., HashiCorp Vault).
- Experience with tracing and APM tools like Google Cloud Trace or Honeycomb.
- Background in data governance, data pipelines, and security standards for data products.
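To make the SLO and error-budget responsibility concrete, here is a minimal worked example of error-budget accounting under an assumed 99.9% availability SLO over a 30-day window; all numbers are illustrative.

```python
# A minimal sketch of error-budget accounting; the SLO target and the
# observed downtime figure are illustrative assumptions.
SLO = 0.999
WINDOW_MINUTES = 30 * 24 * 60              # 43,200 minutes in a 30-day window

error_budget = (1 - SLO) * WINDOW_MINUTES  # 43.2 minutes of allowed downtime

observed_bad_minutes = 12.0
budget_remaining = error_budget - observed_bad_minutes
consumed = observed_bad_minutes / error_budget  # fraction of budget spent

print(f"budget: {error_budget:.1f} min, remaining: {budget_remaining:.1f} min, "
      f"consumed: {consumed:.0%}")
```

The same arithmetic drives burn-rate alerting: if the budget is being consumed faster than the window elapses, the service is on track to miss its SLO.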

Posted 1 month ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

Job Title: Senior Engineer PD
Location: Pune, India

Role Description
Our team is part of Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective, we focus on the mainframe but also build solutions on on-premise cloud, RESTful services, and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain (a Dataflow-style pipeline is sketched below).
- Support for the migration of current functionalities to Google Cloud.
- Responsibility for the stability of the application landscape and support for software releases.
- Support for L3 topics and application governance.
- Coding in the CTM area as part of an agile team (Java, Scala, Spring Boot).

Your skills and experience
- Experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably with Big Data and GCP technologies.
- Strong understanding of the Data Mesh approach and integration patterns.
- Understanding of party data and integration with product data.
- Architectural skills for big data solutions, especially interface architecture, allowing a fast start.
- Experience in at least Spark, Java, Scala and Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting.
- Knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes.
- Ability to work well in teams as well as independently, constructively, and in a target-oriented way.
- Good English skills for both professional and informal communication with the team.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
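As a flavour of the GCP pipeline work above, here is a minimal Apache Beam sketch that could run on Dataflow, loading a CSV from Cloud Storage into BigQuery. It assumes apache-beam[gcp] is installed; the bucket, table, and schema are hypothetical.

```python
# A minimal sketch of a Dataflow-ready Apache Beam pipeline; bucket,
# table, and schema are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    partner_id, country = line.split(",")[:2]
    return {"partner_id": partner_id, "country": country}

options = PipelineOptions()  # pass --runner=DataflowRunner etc. on the CLI

with beam.Pipeline(options=options) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://my-bucket/partners.csv",
                                      skip_header_lines=1)
     | "Parse" >> beam.Map(parse_line)
     | "Write" >> beam.io.WriteToBigQuery(
           "my-project:partner_data.partners",
           schema="partner_id:STRING,country:STRING",
           write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
```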

Posted 1 month ago

Apply

10.0 - 15.0 years

19 - 25 Lacs

Pune

Work from Office

Job Title: IT Architect
Location: Pune

Role Description
The Business Architect defines the technical solution design of specific IT platforms and guides squad members in designing, building, testing, and delivering high-quality software solutions. A key element is the translation of functional and non-functional business requirements into an appropriate technical solution design, leveraging best practices and consistent design patterns. The Business Architect collaborates closely with Product Owners, Chapter Leads, and squad members to ensure consistent adherence to the agreed-upon application design, and is responsible for maintaining appropriate technical design documentation. The architect ensures that solution architectures and designs conform to the principles, blueprints, standards, and patterns established by Enterprise Architecture; in this context, the Business Architect collaborates closely with the respective Solution Architect to ensure architecture compliance. The Business Architect also actively contributes to the definition and enrichment of design patterns and standards, with the aim of leveraging them across squads and tribes.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Define the technical architecture of IT solutions in line with functional and non-functional requirements, following consistent design patterns and best practices.
- Ensure that the solution design is in sync with WM target architecture blueprints and principles, as well as overarching DB architecture and security standards.
- Create appropriate technical design documentation and keep it up to date.
- Guide squad members to design, build, test, and deliver high-quality software solutions in line with business requirements.
- Own all aspects of the solution architecture (maintainability, scalability, effective integration with other solutions, use of shared solutions and components where possible, optimization of resource consumption, etc.), balancing business needs against total cost of ownership.
- Collaborate closely with Enterprise Architecture to ensure architecture compliance and to discuss design options in a timely manner, allowing sufficient time for deliberate decision-making.
- Present architecture proposals to relevant forums together with the Enterprise Architect at different levels, and drive the process to gain the necessary architecture approvals.
- Collaborate with technology stakeholders in other squads and across tribes to ensure cross-squad and cross-tribe solution architecture synchronization and alignment.
- Contribute to the definition and enrichment of design patterns and standards that can be leveraged across WM squads/tribes.
- Serve as counsel to designers and developers, and review software designs and high-level and detailed design documentation produced by other squad members.
- Lead technical discussions with CCO, Data Factory, Central Data Quality and Compliance, and end-to-end and control functions for technical queries; contribute to peer-level solution architecture reviews, e.g., within a respective chapter.

Your skills and experience
- Ability and experience in defining high-level and low-level technical solution designs for complex initiatives; very good analytical skills and the ability to oversee and structure complex tasks.
- Hands-on skills with Google Cloud components such as storage buckets, BigQuery, Dataproc, Cloud Composer, and Cloud Functions, along with PySpark and Scala, are essential; experience with Cloud SQL, Dataflow, Java, and Unix is good to have.
- Experience implementing a Google Cloud-based solution is essential.
- Persuasive power and persistence in driving adherence to the solution design within the squad.
- Ability to apply appropriate architectural patterns considering the relevant functional and non-functional requirements.
- Proven ability to balance business demands and IT capabilities in terms of standardization, reducing risk and increasing IT flexibility.
- Comfortable working in an open, highly collaborative team; able to work in an agile and dynamic environment and to build up knowledge of new technologies and solutions in an effective and timely manner.
- Ability to communicate effectively with other technology stakeholders.
- Feedback: seeks feedback from others, provides feedback in support of others' development, and is open and honest while dealing constructively with criticism.
- Inclusive leadership: values individuals and embraces diversity by integrating differences and promoting diversity and inclusion across teams and functions.
- Coaching: understands and anticipates people's needs, skills, and abilities in order to coach, motivate, and empower them for success.
- Broad architecture knowledge and application design skills and, depending on the specific squad requirements, in-depth expertise in specific architecture domains (e.g., service and integration architecture, web and mobile front-end architecture, data architecture, security architecture, infrastructure architecture) and related technology stacks and design patterns.
- Experience establishing thought leadership in solution architecture practices; ability to lead design and development teams and to define, build, and deliver first-class software solutions.
- Familiarity with current and emerging technologies, tools, frameworks, and design patterns.
- Experience collaborating effectively across multiple teams and geographies.
- Ability to appropriately weigh other dimensions (e.g., financials, risk, time to market) on top of the architecture drivers in order to propose balanced architecture solutions.

Experience / Qualifications
- 10+ years of relevant experience as a technology manager within the IT support industry; experience in the financial/banking industry preferred.
- Minimum 8 years' experience supporting the Oracle platform in a mid-size to large corporate environment.
- Preferably from a banking/wealth management background.
- Must have experience working in an agile organization.

Posted 1 month ago

Apply

4.0 - 9.0 years

15 - 19 Lacs

Pune

Work from Office

Job Title: Technical Specialist, GCP Developer
Location: Pune, India

Role Description
This role is for an engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure that good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an agile development environment. The candidate should come from a strong technological background, have good working experience with Spark and GCP technology, be hands-on and able to work independently with minimal technical/tool guidance, and be able to technically guide and mentor junior resources in the team. As a developer, you will bring extensive design and development skills to reinforce the group of developers within the team. The candidate will make extensive use of Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Design and discuss your own solutions for addressing user stories and tasks.
- Develop, unit-test, integrate, deploy, maintain, and improve software.
- Perform peer code reviews.
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum, sprint planning, retrospectives.
- Apply continuous integration best practices (SCM, build automation, unit testing, dependency management).
- Collaborate with other team members to achieve the sprint objectives.
- Report progress and update agile team management tools (JIRA/Confluence).
- Manage individual task priorities and deliverables, and take responsibility for the quality of the solutions you provide.
- Contribute to planning and continuous improvement activities; support the PO, ITAO, developers, and Scrum Master.

Your skills and experience
- Engineer with good development experience on Google Cloud Platform for at least 4 years.
- Hands-on experience with BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions (a minimal Cloud Function is sketched below).
- Experience in the setup, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps; able to create and maintain fully automated CI build processes and write build and deployment scripts.
- Experience with development platforms (OpenShift/Kubernetes/Docker) configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR.
- Good knowledge of core SDLC processes and tools such as HP ALM, Jira, and ServiceNow.
- Knowledge of working with APIs and microservices, integrating external and internal web services including SOAP, XML, REST, and JSON.
- Strong analytical skills, proficient communication skills, and fluent English (written/verbal).
- Ability to work in virtual teams and matrixed organizations; excellent team player.
- Open-minded and willing to learn the business and the technology; keeps pace with technical innovation; understands the relevant business area; able to share information and transfer knowledge to team members.
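For the Cloud Functions item noted above, here is a minimal sketch of an HTTP-triggered function using the functions-framework package for Python; the handler logic is a hypothetical placeholder.

```python
# A minimal sketch of an HTTP-triggered Cloud Function; the endpoint
# logic is hypothetical.
import functions_framework

@functions_framework.http
def healthcheck(request):
    """Responds to HTTP requests routed to the function."""
    name = request.args.get("name", "world")
    return {"status": "ok", "hello": name}, 200
```

Locally this runs with the functions-framework CLI; deployed, the same handler serves the function's HTTPS endpoint.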

Posted 1 month ago

Apply

3.0 - 7.0 years

8 - 13 Lacs

Pune

Work from Office

Job Title: Senior Full Stack Engineer
Corporate Title: Assistant Vice President
Location: Pune, India

Role Description
The Enterprise SRE Team in CB is responsible for making production better by boosting observability and strengthening reliability across Corporate Banking. The team actively works on building common platforms, reference architectures, and tools for production engineering teams to standardize processes across CB. We work in an agile environment with a focus on customer centricity and an outstanding user experience, with high reliability and flexibility of technical solutions in mind. With our platform, we want to be an enabler for the highest-quality cloud-based software solutions and processes at Deutsche Bank.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team, bringing an innovative approach to software development and using the latest technologies and practices as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Work on the SLO Dashboard, an application owned by the CB SRE team, ensuring its design (a highly scalable and performant solution), development, and maintenance.
- Participate in requirement workshops, analyze requirements, perform technical design, and take ownership of the development process.
- Identify and implement appropriate tools to support engineering automation, including test automation and CI/CD pipelines.
- Understand technical needs, prioritize requirements, and manage technical debt based on stakeholder urgency.
- Collaborate with the UI/UX designer while being mindful of backend changes and their impact on architecture or endpoint modifications during discussions.
- Produce detailed design documents and guide junior developers to align with the priorities and deliverables of the SLO Dashboard.

Your skills and experience
- Several years of relevant experience in software architecture, design, development, and engineering, ideally in the banking/financial services industry.
- Strong engineering, solution, and domain architecture background and up-to-date knowledge of software engineering topics such as microservices, streaming architectures, high performance, horizontal scaling, API design, GraphQL, REST services, database systems, UI frameworks, distributed caching (e.g., Apache Ignite, Hazelcast, Redis), enterprise integration patterns, and modern SDLC practices.
- Good experience working in GCP (cloud-based technologies) using GKE, Cloud SQL (Postgres), Cloud Run, and Terraform.
- Good experience in DevOps using GitHub Actions for builds and Liquibase pipelines.
- Fluency in an application development stack such as Java/Spring Boot (3.0+), ReactJS, Python, JavaScript/TypeScript/NodeJS, and SQL (Postgres).
- Ability to work in a fast-paced environment with competing and alternating priorities and a constant focus on delivery, with strong interpersonal skills to manage relationships with a variety of partners and stakeholders and to facilitate group sessions.

AI integration and implementation (nice to have)
- Leverage AI tools like GitHub Copilot, Google Gemini, Llama, and other language models to optimize engineering analytics and workflows.
- Design and implement AI-driven dashboards and reporting tools for stakeholders.
- Apply AI tools to automate repetitive tasks, analyze complex engineering datasets, and derive trends and patterns.

Posted 1 month ago

Apply

4.0 - 9.0 years

13 - 18 Lacs

Pune

Work from Office

Job Title: Partner Data - Senior Java/API Engineer
Location: Pune

Role Description
Our team is part of Technology, Data and Innovation (TDI) Private Bank. Within TDI, Partnerdata is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective, we maintain critical functionality on the mainframe but build new solutions (REST services, Angular frontend, analytics capabilities) in a public and private cloud environment. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate with a passion for cloud solutions (GCP) and Big Data.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Implementation of new project requirements on GCP (Cloud Run, Kubernetes, Cloud SQL, Terraform).
- Design and translation of high-level business requirements into software.
- Extending the service architecture, designing enhancements, and establishing best practices.
- Supporting the migration of existing functionalities to the Google Cloud Platform.
- Responsibility for the stability of the application landscape and support for software releases.
- Providing L3 support in case of incidents and facilitating the application governance procedure.
- Willingness to work and code as part of an agile team (Java, Scala, Spring Boot).

Your skills and experience
- Multiple years of experience designing and developing REST services (REST, OpenAPI, API-first).
- Experience with Java, especially Spring and Spring Boot applications; Spring Cloud is a bonus.
- Experience querying SQL databases and familiarity with relational databases in general.
- Experience developing container-based applications and familiarity with container orchestration frameworks such as Kubernetes/OpenShift.
- Familiarity with cloud technologies, especially Google Cloud (Cloud Run, Kubernetes Engine, Cloud SQL, BigQuery).
- Familiarity with building and deploying code in an enterprise-grade environment using CI/CD pipelines, especially Maven, JFrog Artifactory, and GitHub Actions.
- Good understanding of IaC concepts and tools such as Terraform.
- Knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes is a bonus.
- Enjoys working in a team setting in an independent, target-oriented way.
- Very good English skills for professional and informal communication with all team members.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 month ago

Apply

3.0 - 7.0 years

8 - 13 Lacs

Pune

Work from Office

Job Title: Senior Engineer PD
Location: Pune, India

Role Description
Our team is part of Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective, we focus on the mainframe but also build solutions on on-premise cloud, RESTful services, and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain.
- Support for the migration of current functionalities to Google Cloud.
- Responsibility for the stability of the application landscape and support for software releases.
- Support for L3 topics and application governance.
- Coding in the CTM area as part of an agile team (Java, Scala, Spring Boot).

Your skills and experience
- Experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably with Big Data and GCP technologies.
- Strong understanding of the Data Mesh approach and integration patterns.
- Understanding of party data and integration with product data.
- Architectural skills for big data solutions, especially interface architecture, allowing a fast start.
- Experience in at least Spark, Java, Scala and Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting.
- Knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes.
- Ability to work well in teams as well as independently, constructively, and in a target-oriented way.
- Good English skills for both professional and informal communication with the team.

Posted 1 month ago

Apply

7.0 - 11.0 years

10 - 20 Lacs

Indore, Pune

Work from Office

Senior DevOps Engineer
Experience: 7+ years
Location: Indore / Pune (work from office)
Notice period: Immediate

Job Summary:
We are seeking an experienced and enthusiastic Senior DevOps Engineer with 7+ years of dedicated experience to join our growing team. In this pivotal role, you will be instrumental in designing, implementing, and maintaining our continuous integration and continuous delivery (CI/CD) pipelines and infrastructure automation. You will champion DevOps best practices, optimize our cloud-native environments, and ensure the reliability, scalability, and security of our systems. This role demands deep technical expertise, an initiative-taking mindset, and a strong commitment to operational excellence.

Key Responsibilities:
- CI/CD pipeline management: design, build, and maintain robust, automated CI/CD pipelines using GitHub Actions to ensure efficient and reliable software delivery from code commit to production deployment.
- Infrastructure automation: develop and manage infrastructure as code (IaC) using shell scripting and the gcloud CLI to provision, configure, and manage resources in Google Cloud Platform (GCP); a Python wrapper around the gcloud CLI is sketched below.
- Deployment orchestration: implement and optimize deployment strategies, using GitHub for version control of deployment scripts and configurations to ensure repeatable, consistent releases.
- Containerization and orchestration: work extensively with Docker to containerize applications, including building, optimizing, and managing Docker images.
- Artifact management: administer and optimize artifact repositories, specifically Artifactory on GCP, to manage dependencies and build artifacts efficiently.
- System reliability and performance: monitor, troubleshoot, and optimize the performance, scalability, and reliability of our cloud infrastructure and applications.
- Collaboration and documentation: work closely with development, QA, and operations teams; use Jira for task tracking and Confluence for comprehensive documentation of systems, processes, and best practices.
- Security and compliance: implement and enforce security best practices within the CI/CD pipelines and cloud infrastructure, ensuring compliance with relevant standards.
- Mentorship and leadership: provide technical guidance and mentorship to junior engineers, fostering a culture of learning and continuous improvement within the team.
- Incident response: participate in on-call rotations, respond rapidly to production incidents, perform root cause analysis, and implement preventative measures.

Required Skills & Experience (mandatory, 7+ years):
- Proven experience (7+ years) in a DevOps, Site Reliability Engineering (SRE), or similar role.
- Expert-level proficiency with Git and GitHub, including advanced branching strategies, pull requests, and code reviews.
- Experience designing and implementing CI/CD pipelines using GitHub Actions.
- Deep expertise in Google Cloud Platform (GCP), including compute, networking, storage, and identity services.
- Advanced proficiency in shell scripting for automation, system administration, and deployment tasks.
- Strong firsthand experience with Docker for containerization, image optimization, and container lifecycle management.
- Solid understanding and practical experience with Artifactory (or similar artifact management tools) in a cloud environment.
- Expertise in using the gcloud CLI to automate GCP resource management and deployments.
- Demonstrable experience with continuous integration (CI) principles and practices.
- Proficiency with Jira for agile project management and Confluence for knowledge sharing.
- Strong understanding of networking concepts, security best practices, and system monitoring.
- Excellent critical-thinking skills and an initiative-taking approach to identifying and resolving issues.

Nice-to-Have Skills:
- Experience with Kubernetes (GKE) for container orchestration.
- Familiarity with other Infrastructure as Code (IaC) tools like Terraform.
- Experience with monitoring and logging tools such as Prometheus, Grafana, or GCP's Cloud Monitoring/Logging.
- Proficiency in other scripting or programming languages (e.g., Python, Go) for automation and tool development.
- Experience with database management in a cloud environment (e.g., Cloud SQL, Firestore).
- Knowledge of DevSecOps principles and tools for integrating security into the CI/CD pipeline.
- GCP Professional Cloud DevOps Engineer or other relevant GCP certifications.
- Experience with large-scale distributed systems and microservices architectures.
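As referenced in the infrastructure-automation bullet, here is a minimal sketch of wrapping the gcloud CLI from Python for routine resource inventory; it assumes an authenticated Google Cloud SDK, and the project name is hypothetical.

```python
# A minimal sketch of driving the gcloud CLI from Python; assumes the
# Google Cloud SDK is installed and authenticated. Project name is a
# hypothetical placeholder.
import json
import subprocess

def list_sql_instances(project: str) -> list[dict]:
    result = subprocess.run(
        ["gcloud", "sql", "instances", "list",
         f"--project={project}", "--format=json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

for inst in list_sql_instances("my-project"):
    print(inst["name"], inst["state"])
```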

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

Responsibilities
- Lead and mentor a team of data engineers, providing technical guidance, setting best practices, and overseeing task execution for the migration project.
- Design, develop, and architect scalable ETL processes to extract, transform, and load petabytes of data from on-premises SQL Server to GCP Cloud SQL for PostgreSQL.
- Oversee the comprehensive analysis of existing SQL Server schemas, data types, stored procedures, and complex data models, defining strategies for their optimal conversion and refactoring for PostgreSQL.
- Establish and enforce rigorous data validation, quality, and integrity frameworks throughout the migration lifecycle, ensuring accuracy and consistency.
- Collaborate strategically with database administrators, application architects, business stakeholders, and security teams to define migration scope, requirements, and cutover plans.
- Lead the development and maintenance of advanced scripts (primarily Python) for automating large-scale migration tasks, complex data transformations, and reconciliation processes.
- Proactively identify, troubleshoot, and lead the resolution of complex data discrepancies, performance bottlenecks, and technical challenges during migration.
- Define and maintain comprehensive documentation standards for migration strategies, data mapping, transformation rules, and post-migration validation procedures.
- Ensure data governance, security, and compliance standards are meticulously applied throughout the migration process, including data encryption and access controls within GCP.
- Implement a schema conversion or custom schema-mapping strategy for the SQL Server to PostgreSQL shift.
- Refactor and translate complex stored procedures and T-SQL logic into PostgreSQL-compatible constructs while preserving functional equivalence.
- Develop and execute comprehensive data reconciliation strategies to ensure consistency and parity between legacy and migrated datasets post-cutover (a minimal reconciliation pass is sketched below).
- Design fallback procedures and lead post-migration verification and support to ensure business continuity.
- Ensure metadata cataloging and data lineage tracking using GCP-native or integrated tools.

Must-Have Skills
- Expertise in data engineering, specifically for Google Cloud Platform (GCP).
- Deep understanding of relational database architecture, advanced schema design, data modeling, and performance tuning.
- Expert-level SQL proficiency, with extensive hands-on experience in both T-SQL (SQL Server) and PostgreSQL.
- Hands-on experience with data migration processes, including moving datasets from on-premises databases to cloud storage solutions.
- Proficiency in designing, implementing, and optimizing complex ETL/ELT pipelines for high-volume data movement, leveraging tools and custom scripting.
- Strong knowledge of GCP services: Cloud SQL, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Cloud Composer, Cloud Functions, and BigQuery.
- Solid understanding of data governance, security, and compliance practices in the cloud, including the management of sensitive data during migration.
- Strong programming skills in Python or Java for building data pipelines and automating processes.
- Experience with real-time data processing using Pub/Sub, Dataflow, or similar GCP services.
- Experience with CI/CD practices and tools like Jenkins, GitLab, or Cloud Build for automating the data engineering pipeline.
- Knowledge of data modeling and best practices for structuring cloud data storage for optimal query performance and analytics in GCP.
- Familiarity with observability and monitoring tools in GCP (e.g., Stackdriver/Cloud Monitoring, Prometheus) for real-time data pipeline visibility and alerting.

Good-to-Have Skills
- Direct experience with GCP Database Migration Service, Storage Transfer Service, or similar cloud-native migration tools.
- Familiarity with data orchestration using tools like Cloud Composer (based on Apache Airflow) for managing workflows.
- Experience with containerization tools like Docker and Kubernetes for deploying data pipelines in a scalable manner.
- Exposure to DataOps tools and methodologies for managing data workflows.
- Experience with machine learning platforms like AI Platform in GCP to integrate with data pipelines.
- Familiarity with data lake architecture and the integration of BigQuery with Google Cloud Storage or Dataproc.
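For the reconciliation bullet above, here is a minimal sketch of a row-count reconciliation pass between the legacy SQL Server source and the migrated Cloud SQL PostgreSQL target; the connection details and table list are hypothetical, and a real migration would add checksums and column-level comparisons on top of raw counts.

```python
# A minimal sketch, assuming pyodbc and psycopg2 are installed;
# connection strings and table names are hypothetical placeholders.
import pyodbc
import psycopg2

TABLES = ["customers", "orders", "payments"]

mssql = pyodbc.connect("DSN=legacy_sqlserver;UID=ro_user;PWD=change-me")
pg = psycopg2.connect("host=10.0.0.5 dbname=sales user=ro_user password=change-me")

for table in TABLES:
    src = mssql.cursor().execute(f"SELECT COUNT(*) FROM dbo.{table}").fetchone()[0]
    with pg.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM public.{table}")
        dst = cur.fetchone()[0]
    status = "OK" if src == dst else "MISMATCH"
    print(f"{table}: source={src} target={dst} [{status}]")
```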

Posted 1 month ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Noida

Work from Office

About Us:
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications, and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources, and exceptional customer service, all backed by TELUS, our multi-billion-dollar telecommunications parent.

Required Skills:
- Minimum 6 years of experience in architecting, designing, and building data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
- Perform application impact assessments, requirements reviews, and work estimates.
- Develop test strategies and site reliability engineering measures for data products and solutions.
- Lead agile development "scrums" and solution reviews.
- Mentor junior Data Engineering Specialists.
- Lead the resolution of critical operations issues, including post-implementation reviews.
- Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
- Demonstrate expertise in SQL and database proficiency across various data engineering tasks.
- Automate complex data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (a minimal Airflow DAG is sketched below).
- Develop and manage Unix scripts for data engineering tasks.
- Intermediate proficiency in infrastructure-as-code tools like Terraform, Puppet, and Ansible to automate infrastructure deployment.
- Proficiency in data modeling to support analytics and business intelligence.
- Working knowledge of MLOps to integrate machine learning workflows with data pipelines.
- Extensive expertise in GCP technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, Dataproc (good to have), and Bigtable.
- Advanced proficiency in programming languages (Python).

Qualifications:
- Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
- Analytics certification in BI or AI/ML.
- 6+ years of data engineering experience.
- 4 years of data platform solution architecture and design experience.
- GCP Certified Data Engineer (preferred).
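For the DAG bullet above, here is a minimal sketch of an Airflow DAG wiring an extract step to a load step; it assumes Apache Airflow 2.x (2.4+ for the `schedule` parameter), and the task bodies are hypothetical placeholders.

```python
# A minimal sketch of an Airflow 2.x DAG; task bodies are hypothetical
# placeholders standing in for real extract/load logic.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")

def load():
    print("load into Cloud SQL")

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2  # load runs only after extract succeeds
```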

Posted 1 month ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Noida

Work from Office

About Us:
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications, and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources, and exceptional customer service, all backed by TELUS, our multi-billion-dollar telecommunications parent.

Required Skills:
- Design, develop, and support data pipelines and related data products and platforms.
- Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
- Perform application impact assessments, requirements reviews, and work estimates.
- Develop test strategies and site reliability engineering measures for data products and solutions.
- Participate in agile development "scrums" and solution reviews.
- Mentor junior Data Engineers.
- Lead the resolution of critical operations issues, including post-implementation reviews.
- Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
- Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
- Demonstrate SQL and database proficiency across various data engineering tasks.
- Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
- Develop Unix scripts to support various data operations.
- Model data to support business intelligence and analytics initiatives.
- Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
- Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Qualifications:
- Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
- 4+ years of data engineering experience.
- 2 years of data solution architecture and design experience.
- GCP Certified Data Engineer (preferred).

Posted 1 month ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Noida

Work from Office

About Us:
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications, and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources, and exceptional customer service, all backed by TELUS, our multi-billion-dollar telecommunications parent.

Required Skills:
- 5+ years of industry experience in data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
- Expertise in architecting, designing, building, and deploying internal applications to support technology lifecycle management, service delivery management, data, and business intelligence.
- Experience in developing modular code for versatile pipelines or complex ingestion frameworks aimed at loading data into Cloud SQL and managing data migration from multiple on-premises sources.
- Strong collaboration with analysts and business process owners to translate business requirements into technical solutions.
- Proficiency in coding with scripting languages (shell scripting, Python, SQL).
- Deep understanding and hands-on experience with Google Cloud Platform (GCP) technologies, especially in data migration and warehousing, including Database Migration Service (DMS), Cloud SQL, BigQuery, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage (GCS), IAM, Compute Engine, Cloud Data Fusion, and optionally Dataproc.
- Adherence to best development practices, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code.
- Familiarity with CI/CD processes using GitHub, Cloud Build, and the Google Cloud SDK.

Qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- GCP Certified Data Engineer (preferred).
- Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions to research scientists, engineering teams, and business audiences.

Posted 1 month ago

Apply

4.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Who we are

About Stripe
Stripe is a financial infrastructure platform for businesses. Millions of companies, from the world's largest enterprises to the most ambitious startups, use Stripe to accept payments, grow their revenue, and accelerate new business opportunities. Our mission is to increase the GDP of the internet, and we have a staggering amount of work ahead. That means you have an unprecedented opportunity to put the global economy within everyone's reach while doing the most important work of your career.

About the team
The Reporting Platform Data Foundations group maintains and evolves the core systems that power reporting data for Stripe's users. We're responsible for Aqueduct, the data ingestion and processing platform that powers core reporting data for millions of businesses on Stripe. We integrate with the latest Data Platform tooling, such as Falcon for real-time data. Our goal is to provide a robust, scalable, and efficient data infrastructure that enables clear and timely insights for Stripe's users.

What you'll do
As a Software Engineer on the Reporting Platform Data Foundations group, you will lead efforts to improve and redesign the core data ingestion and processing systems that power reporting for millions of Stripe users. You'll tackle complex challenges in data management, scalability, and system architecture.

Responsibilities
Design and implement a new backfill model for reporting data that can handle hundreds of millions of row additions and updates efficiently
Revamp the end-to-end experience for product teams adding or changing API-backed datasets, improving ergonomics and clarity
Enhance the Aqueduct Dependency Resolver system, which determines what critical data to update for Stripe's users based on events; areas include error management, observability, and delegation of issue resolution to product teams
Lead integration with the latest Data Platform tooling, such as Falcon for real-time data, while managing deprecation of older systems
Implement and improve data warehouse management practices, ensuring data freshness and reliability
Collaborate with product teams to understand their reporting needs and data requirements
Design and implement scalable solutions for data ingestion, processing, and storage
Onboard and mentor engineers, and set the group's technical direction and strategy

Who you are
We're looking for someone who meets the minimum requirements to be considered for the role. If you meet these requirements, you are encouraged to apply. The preferred qualifications are a bonus, not a requirement.

Minimum requirements
8+ years of professional experience writing high-quality, production-level code or software programs
Extensive experience in designing and implementing large-scale data processing systems
Strong background in distributed systems and data pipeline architectures
Proficiency in at least one modern programming language (e.g., Go, Java, Python, Scala)
Experience with big data technologies (e.g., Hadoop, Flink, Spark, Kafka, Pinot, Trino, Iceberg)
Solid understanding of data modeling and database systems
Excellent problem-solving skills and ability to tackle complex technical challenges
Strong communication skills and ability to work effectively with cross-functional teams
Experience mentoring other engineers and driving technical initiatives

Preferred qualifications
Experience with real-time data processing and streaming systems
Knowledge of data warehouse technologies and best practices
Experience in migrating legacy systems to modern architectures
Contributions to open-source projects or technical communities

In-office expectations
Office-assigned Stripes in most of our locations are currently expected to spend at least 50% of the time in a given month in their local office or with users. This expectation may vary depending on role, team, and location. For example, Stripes in Stripe Delivery Center roles in Mexico City, Mexico and Bengaluru, India work 100% from the office. Also, some teams have greater in-office attendance requirements, to appropriately support our users and workflows, which the hiring manager will discuss. This approach helps strike a balance between bringing people together for in-person collaboration and learning from each other, while supporting flexibility when possible.

Pay and benefits
Stripe does not yet include pay ranges in job postings in every country. Stripe strongly values pay transparency and is working toward pay transparency globally.
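For context on the backfill responsibility described above, here is a minimal, hypothetical PySpark sketch of an idempotent, partitioned backfill over a large reporting dataset. The bucket paths, column names, and aggregation are invented for illustration; this is not Stripe's actual Aqueduct implementation.

# Hypothetical sketch of an idempotent, partitioned backfill for a large
# reporting dataset, assuming Parquet storage and a Spark batch job.
# Paths, columns, and the aggregation are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("reporting-backfill").getOrCreate()

# Read the raw event data for the date range being backfilled.
events = (
    spark.read.parquet("gs://example-bucket/raw/events/")
    .where(F.col("event_date").between("2024-01-01", "2024-01-31"))
)

# Recompute the derived reporting rows deterministically so reruns are safe.
report_rows = (
    events.groupBy("merchant_id", "event_date")
    .agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("gross_volume"),
    )
)

# Overwrite only the affected date partitions (dynamic partition overwrite),
# leaving the rest of the dataset untouched.
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
(
    report_rows.write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("gs://example-bucket/reporting/merchant_daily/")
)

Dynamic partition overwrite keeps reruns safe: recomputing a date range replaces only those partitions, so a failed backfill can simply be retried.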

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Cloud Application Architect to design cloud-native applications that are scalable, secure, and resilient. Ideal for professionals experienced in modern architectures and distributed systems.

Key Responsibilities:
Architect cloud-based applications using microservices and serverless patterns
Guide teams on best practices in scalability, observability, and fault tolerance
Ensure integration with cloud services, APIs, and third-party systems
Collaborate with stakeholders on application modernization and cloud-native transitions

Required Skills & Qualifications:
Proficiency with AWS, Azure, or GCP services for application development
Experience in containerized and serverless architectures (Kubernetes, Lambda, Functions)
Strong background in backend development (Java, Go, Node.js, or Python)
Bonus: Familiarity with API gateways, service mesh, and CI/CD pipelines

Note: If interested, please share your updated resume and your preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
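As a concrete illustration of the fault-tolerance guidance in the responsibilities above, here is a small, self-contained Python sketch of retrying a third-party API call with exponential backoff and jitter. The URL and tuning values are placeholders, not part of the role.

# Illustrative fault-tolerance pattern: retrying a flaky third-party API call
# with exponential backoff and jitter. Endpoint and limits are hypothetical.
import random
import time
import urllib.error
import urllib.request

def call_with_backoff(url: str, max_attempts: int = 5) -> bytes:
    for attempt in range(1, max_attempts + 1):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.read()
        except urllib.error.URLError:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            # Exponential backoff with jitter avoids thundering-herd retries.
            time.sleep(min(2 ** attempt, 30) + random.random())

payload = call_with_backoff("https://api.example.com/v1/status")

Capping the delay and adding jitter are the two design choices that keep retries from synchronizing across many clients during an outage.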

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Pune, Chennai, Bengaluru

Hybrid

Project Role: Cloud Platform Architect
Project Role Description: Oversee application architecture and deployment in cloud platform environments, including public cloud, private cloud, and hybrid cloud. This can include cloud adoption plans, cloud application design, and cloud management and monitoring.
Must-have skills: Google Cloud Platform Architecture

Summary:
As a Cloud Platform Architect, you will be responsible for overseeing application architecture and deployment in cloud platform environments, including public cloud, private cloud, and hybrid cloud. Your typical day will involve designing cloud adoption plans, managing and monitoring cloud applications, and ensuring cloud application design meets business requirements.

Roles & Responsibilities:
- Design and implement cloud adoption plans, including public cloud, private cloud, and hybrid cloud environments.
- Oversee cloud application design, ensuring it meets business requirements and aligns with industry best practices.
- Manage and monitor cloud applications, ensuring they are secure, scalable, and highly available.
- Collaborate with cross-functional teams to ensure cloud applications are integrated with other systems and services.
- Stay up to date with the latest advancements in cloud technology, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have skills: Strong experience in Google Cloud Platform architecture.
- Good-to-have skills: Experience with other cloud platforms such as AWS or Azure.
- Experience in designing and implementing cloud adoption plans.
- Strong understanding of cloud application design and architecture.
- Experience in managing and monitoring cloud applications.
- Solid grasp of cloud security, scalability, and availability best practices.
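One small worked example of the "highly available" requirement above: translating an availability SLO into a monthly downtime budget. The SLO targets below are illustrative, not from the posting.

# Back-of-the-envelope availability math: how much downtime a given SLO
# leaves per 30-day month. The targets are examples only.
SECONDS_PER_MONTH = 30 * 24 * 3600

for slo in (0.999, 0.9995, 0.9999):
    budget = (1 - slo) * SECONDS_PER_MONTH
    print(f"SLO {slo:.2%}: ~{budget / 60:.1f} minutes of downtime budget/month")

For instance, a 99.9% SLO leaves roughly 43 minutes of allowable downtime per month, which drives decisions about redundancy and maintenance windows.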

Posted 1 month ago

Apply

4.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Comprehensive understanding of BGP: configure and troubleshoot in large global environments.
Comprehensive understanding of routing concepts: inter-VR routing, policy-based routing, routing protocols, route filtering, path selection, access lists, etc.
Comprehensive understanding of switching concepts: VLANs, Layer 2, MAC forwarding, VLAN trunking, VRRP, gratuitous ARP.
Comprehensive understanding of firewall/security concepts: L2-L7, all versions of NAT, failover scenarios, zonal concepts, IPsec, L7 encryption concepts, URL filtering, DNS, security profiles and rules, proxying.
Comprehensive understanding of load-balancing concepts: cloud and conventional load balancers and their differences in functionality.
Good understanding of public cloud platforms, preferably GCP; specifically networking, firewalling, and IAM, and how they relate to cloud-native services (PSA, Cloud SQL, GCVE, Cloud Interconnect, BMS, Filestore, NetApp, etc.).
Good understanding of Infrastructure as Code (IaC) to provision resources; must be able to customize and optimize the codebase to simplify deliveries.
Good understanding of Linux administration: using Linux to bridge technical gaps in Windows and understanding the tools available to troubleshoot network connectivity.
Understanding of APIs, in order to expedite data collection and configuration and eliminate human error.
Understanding of DevOps and how it can improve delivery and operations.

Primary Skills

Products:
Juniper: MX, SRX, QFX
Palo Alto: physical and virtual firewalls, Panorama
Google Cloud Platform
Squid Proxy

Tools:
Terraform
AlgoSec or a similar tool for traffic flow governance
Ansible
M2VM (GCP migration tool)
Azure DevOps pipelines
Azure DevOps Git

Mandatory languages:
Python
HCL (HashiCorp Configuration Language)
YAML
JSON
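To make the path-selection concept above concrete, here is a toy longest-prefix-match lookup using only Python's standard library. The route table and addresses are invented for the example; real routers implement this in hardware forwarding tables.

# Toy illustration of path selection: longest-prefix match over a route
# table. Routes, next hops, and lookup addresses are made up.
import ipaddress

ROUTES = {
    ipaddress.ip_network("10.0.0.0/8"): "core-router",
    ipaddress.ip_network("10.1.0.0/16"): "regional-hub",
    ipaddress.ip_network("10.1.2.0/24"): "local-gateway",
}

def next_hop(addr: str) -> str:
    ip = ipaddress.ip_address(addr)
    matches = [net for net in ROUTES if ip in net]
    if not matches:
        return "default-route"
    # The most specific (longest prefix) matching route wins.
    best = max(matches, key=lambda net: net.prefixlen)
    return ROUTES[best]

print(next_hop("10.1.2.42"))  # -> local-gateway (the /24 beats the /16 and /8)
print(next_hop("10.9.9.9"))   # -> core-router (only the /8 matches)

The same "most specific wins" rule underlies route filtering and access lists, which is why prefix lengths matter as much as the prefixes themselves.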

Posted 1 month ago

Apply

3.0 - 5.0 years

7 - 11 Lacs

Pune

Work from Office

Role Description
An Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
Planning and developing entire engineering solutions to accomplish business goals
Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
Ensuring maintainability and reusability of engineering solutions
Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
Reviewing engineering plans and quality to drive re-use and improve engineering capability
Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best-in-class leave policy
Gender-neutral parental leave
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for ages 35 and above

Your key responsibilities
Designing and implementing GCP infrastructure using IaC
Automating recurring processes
Working closely with development teams and implementing CI/CD pipelines for building, testing, and deploying applications
Containerizing applications and orchestrating containers
Designing and implementing application environments to ease development, testing, and release processes
Monitoring the infrastructure and applications for improvements
Maintaining and upgrading current processes
Cost-cutting analysis

Your skills and experience
Experience in working with Google Cloud Platform
Experience in containerization and orchestration (Docker, GKE, Artifact Registry, Cloud Run, Cloud SQL)
Experience with IaC (Terraform)
Experience in writing CI/CD for applications and infrastructure (GitHub workflows, Jenkins, etc.)
Experience in using monitoring tools (Cloud Monitoring)
Knowledge of at least one scripting language
Basic DevSecOps skills
Experience in user and permissions management (IAM)
3-5 years of experience as a DevOps engineer
GCP certification preferred

How we'll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs
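As a hedged sketch of the cost-cutting analysis listed in the responsibilities above, the following Python snippet flags under-utilised instances as rightsizing candidates and estimates monthly savings. The fleet data and per-vCPU rate are placeholders, not real GCP pricing.

# Hypothetical rightsizing analysis: flag instances running well under
# capacity and estimate monthly savings. All figures are invented.
fleet = [
    # (name, vCPUs, average CPU utilisation, $/vCPU-hour -- placeholder rate)
    ("api-server", 16, 0.12, 0.031),
    ("batch-worker", 32, 0.78, 0.031),
    ("reporting-db-proxy", 8, 0.05, 0.031),
]

HOURS_PER_MONTH = 730
for name, vcpus, util, rate in fleet:
    # Treat anything averaging under 25% CPU as a rightsizing candidate.
    if util < 0.25:
        target = max(2, round(vcpus * util * 2))  # keep ~50% headroom
        saving = (vcpus - target) * rate * HOURS_PER_MONTH
        print(f"{name}: {vcpus} -> {target} vCPUs, ~${saving:,.0f}/month saved")

In practice the utilisation figures would come from a monitoring system such as Cloud Monitoring rather than a hard-coded list.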

Posted 1 month ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
Working with multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
Hands-on work with Python and SQL; being proactive, collaborative, and able to respond to critical situations
Analysing data against functional business requirements and engaging directly with customers

Required education
Bachelor's degree

Preferred education
Master's degree

Required technical and professional expertise
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
End-to-end functional knowledge of the data pipeline/transformation implementations you have delivered, including the purpose and KPIs the transformations served

Preferred technical and professional experience
Experience with AEM core technologies: OSGi services, Apache Sling, the Granite framework, the Java Content Repository API, Java 8+, and localization
Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices for designing and developing quality, clean code; knowledge of HTML, CSS, JavaScript, and jQuery
Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence
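A minimal sketch of the Python-plus-BigQuery work this role describes, using the google-cloud-bigquery client with a parameterised query. The project, dataset, and table names are invented; running it requires the library to be installed and application-default credentials configured.

# Hedged example: running a parameterised analytical query on BigQuery and
# iterating over the results. Table and project names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT merchant_id, COUNT(*) AS txn_count
    FROM `example-project.analytics.transactions`
    WHERE event_date = @day
    GROUP BY merchant_id
    ORDER BY txn_count DESC
    LIMIT 10
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("day", "DATE", "2024-01-01"),
    ]
)

# result() blocks until the query job completes, then yields rows.
for row in client.query(query, job_config=job_config).result():
    print(row.merchant_id, row.txn_count)

Query parameters, rather than string interpolation, keep ad-hoc analysis like this safe when the date or merchant filter comes from user input.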

Posted 1 month ago

Apply
