Jobs
Interviews

279 Cloud SQL Jobs - Page 7

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

10 - 20 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 8 to 24 LPA. Experience: 3 to 7 years. Location: Gurgaon/Pune/Bengaluru. Notice: Immediate to 30 days.

Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. A collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning them with business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, data scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.

Responsibilities: Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs. Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions. Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes. Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments. Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics. Build and deploy end-to-end data pipelines with seamless Azure integration, and use scalable Databricks clusters for large datasets and complex computations while optimizing performance and cost.

Must have: Client engagement experience and collaboration with cross-functional teams. A data engineering background in Databricks. Capable of working effectively as an individual contributor or in collaborative team environments. Effective communication and thought leadership with a proven record.

Candidate Profile: Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or a related analytics area. 3+ years of experience, which must be in data engineering. Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure. Prior experience managing and delivering end-to-end projects. Outstanding written and verbal communication skills. Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges. Able to understand cross-cultural differences and work with clients across the globe.
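As a hypothetical illustration of the validation, cleansing, and transformation duties this listing describes, the sketch below shows the shape of one small pipeline step in plain Python. The field names (a utilities-style meter reading) and the rules are invented for the example, not taken from any real client pipeline.

```python
from datetime import datetime

# Invented schema for illustration: a raw utilities meter reading.
REQUIRED_FIELDS = {"meter_id", "reading_kwh", "read_at"}

def validate(record: dict) -> bool:
    """A record is usable only if every required field is present and non-empty."""
    return all(record.get(f) not in (None, "") for f in REQUIRED_FIELDS)

def cleanse(record: dict) -> dict:
    """Normalize types: strip whitespace, parse numbers and ISO timestamps."""
    return {
        "meter_id": str(record["meter_id"]).strip(),
        "reading_kwh": float(record["reading_kwh"]),
        "read_at": datetime.fromisoformat(record["read_at"]),
    }

def transform(raw_records: list[dict]) -> list[dict]:
    """Drop invalid rows, cleanse the rest: the core of a tiny ELT step."""
    return [cleanse(r) for r in raw_records if validate(r)]

raw = [
    {"meter_id": " M-001 ", "reading_kwh": "42.5", "read_at": "2024-01-01T00:00:00"},
    {"meter_id": "M-002", "reading_kwh": None, "read_at": "2024-01-01T00:00:00"},
]
clean = transform(raw)  # only the first record survives validation
```

In a Databricks/Azure setting the same validate-cleanse-transform logic would typically run over DataFrames rather than Python dicts, but the structure of the step is the same.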

Posted 2 months ago

Apply

5.0 - 7.0 years

4 - 8 Lacs

Pune

Work from Office

We are looking for a skilled PostgreSQL Expert with 5 to 7 years of experience. The ideal candidate should have expertise in GCP Cloud SQL, database DDL and DML, and production support. This position is located in Pune.

Roles and Responsibilities: Design, develop, and implement database architectures using PostgreSQL. Develop and maintain databases on GCP Cloud SQL. Ensure high availability and performance of database systems. Troubleshoot and resolve database-related issues. Collaborate with cross-functional teams to identify and prioritize database requirements. Implement data security and access controls.

Requirements: Strong knowledge of PostgreSQL and GCP Cloud SQL. Experience with database DDL, DML, and production support. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills. Familiarity with database design principles and best practices.
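For readers unfamiliar with the DDL/DML terminology in this listing, the sketch below shows the distinction: DDL defines schema objects, DML manipulates rows. The table and column names are invented for illustration, and the `%s` placeholder style assumed here is the one used by PostgreSQL drivers such as psycopg2.

```python
# DDL (Data Definition Language): defines the schema. PostgreSQL syntax,
# with an invented example table.
DDL_CREATE_ACCOUNTS = """
CREATE TABLE IF NOT EXISTS accounts (
    id         BIGSERIAL PRIMARY KEY,
    email      TEXT NOT NULL UNIQUE,
    created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
"""

def build_insert(table: str, columns: list[str]) -> str:
    """DML (Data Manipulation Language): build a parameterized INSERT so
    values are bound by the driver rather than interpolated into the SQL."""
    cols = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    return f"INSERT INTO {table} ({cols}) VALUES ({placeholders})"

sql = build_insert("accounts", ["email"])
# sql == "INSERT INTO accounts (email) VALUES (%s)"
```

In production support work the same statements would be executed against the Cloud SQL instance through a driver connection; parameter binding (rather than string formatting of values) is the standard guard against SQL injection.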

Posted 2 months ago

Apply

10.0 - 20.0 years

12 - 22 Lacs

Pune

Work from Office

Your key responsibilities: Ensures that the Service Operations team provides an optimum service level to the business lines it supports. Takes overall responsibility for the resolution of incidents and problems within the team and oversees the resolution of complex incidents. Ensures that Analysts apply the right problem-solving techniques and processes. Assists in managing business stakeholder relationships and in defining and managing OLAs with relevant stakeholders. Ensures that the team understands OLAs, is resourced appropriately, and is aligned to business SLAs. Ensures relevant Client Service teams are informed of progress on incidents, where necessary. Ensures that defined divisional Production Management service operations and support processes are adhered to by the team, and makes improvement recommendations where appropriate. Prepares for and, if requested, manages team review meetings. Makes suggestions for continual service improvement. Manages escalations by working with Client Services, other Service Operations Specialists, and relevant functions to resolve escalated issues accurately and quickly. Observes areas requiring monitoring, reporting, and improvement; identifies required metrics and ensures they are established, monitored, and improved where appropriate. Continuously seeks to improve team performance and participates in team training events, where appropriate. Works with team members to identify areas of focus where training may improve team performance and incident resolution. Mentors and coaches Production Management Analysts within the team, providing career development and counselling as needed. Assists Production Management Analysts in setting performance targets and manages performance against them. Identifies team bottlenecks and takes appropriate actions to eliminate them.

Provides Level 3 (advanced) support for technical infrastructure components. Evaluates new products, including prototyping, and recommends new products including automation. Specifies and selects tools to enhance operational support. Champions activities and establishes best practices in the specialist area, working to implement best-of-breed practices and processes. Defines and implements best practices, solutions, and standards related to the area of expertise. Builds, captures, and manages the transfer of knowledge across the Service Operations organization. Fulfils service requests addressed to L2 Support and communicates with the Service Desk function and other L2 and L3 units. Handles Incident, Change, and Problem Management and Service Request Fulfilment: solving customer incidents on time; log file analysis and root cause analysis; participating in major incident calls for high-priority incidents; resolving inconsistencies in data replication; supporting Problem Management to solve application issues; creating and executing service requests for customers and providing reports and statistics; escalating and communicating incidents in a timely manner. Documents tasks, incidents, problems, and changes in ServiceNow and in knowledge bases. Improves monitoring of the application: adding monitoring requests, adding alerts and thresholds for recurring issues, and implementing automation of tasks.

Your skills and experience: Service Operations Specialist experience within a global operations context. Extensive experience supporting complex application and infrastructure domains. Experience managing and mentoring Service Operations teams. Broad ITIL/best-practice service context within a real-time distributed environment. Experience managing relationships across multiple disciplines and time zones. Ability to converse clearly with internal and external staff via telephone and written communication. Good knowledge of interface technologies and communication protocols. Willingness to work DE business hours. Clear and concise documentation in general, and especially proper documentation of the current status of incidents, problems, and service requests in the Service Management tool. Thorough and precise work style with a focus on high quality. Distinct service orientation and a high degree of self-initiative. Bachelor's degree from an accredited college or university with a concentration in IT or a Computer Science-related discipline (or equivalent diploma or technical qualification). ITIL certification and experience with the ITSM tool ServiceNow (preferred). Know-how in the banking domain, preferably including regulatory topics around know-your-customer processes. Experience with databases like BigQuery and a good understanding of Big Data and GCP technologies. Experience with at least: GitHub, Terraform, Cloud SQL, Cloud Storage, Dataproc, Dataflow. Architectural skills for big data solutions, especially interface architecture. Works well in teams but also independently; constructive and target-oriented. Very good English skills, able to communicate both professionally and informally with the team.

Area-specific tasks and responsibilities: Handling Incident and Problem Management and Service Request Fulfilment. Analyzing incidents escalated from 1st Level Support. Analyzing errors arising from batch processing and the interfaces of related systems. Determining and implementing resolutions or workarounds. Supporting the resolution of high-impact incidents on our services, including attendance at incident bridge calls. Escalating incident tickets and working with team members and developers. Handling service requests, e.g. reports for business and projects. Providing resolutions for open problems, or ensuring that the appropriate parties have been tasked with doing so. Supporting the handover of new projects/applications into Production Services, with service transition before the go-live phase. Supporting on-call support activities.
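The monitoring duty described above, adding alerts and thresholds for recurring issues, boils down to mapping observed metric values to severities. As a minimal sketch, with metric names and threshold values invented for illustration (real alerting would live in the monitoring tool itself):

```python
# Invented thresholds: (warning, critical) per metric, for illustration only.
THRESHOLDS = {
    "replication_lag_seconds": (60, 300),
    "error_rate_percent": (1.0, 5.0),
}

def evaluate(metric: str, value: float) -> str:
    """Map an observed metric value to an alert severity."""
    warning, critical = THRESHOLDS[metric]
    if value >= critical:
        return "CRITICAL"
    if value >= warning:
        return "WARNING"
    return "OK"

severity = evaluate("replication_lag_seconds", 120)  # exceeds warning only
```

In practice the same two-tier threshold idea is configured declaratively in the monitoring platform rather than coded by hand; the sketch only shows the evaluation logic.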

Posted 2 months ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Bengaluru

Work from Office

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications: Bachelor's degree in Computer Science, Engineering, Mathematics, a related field, or equivalent practical experience. Experience in distributed data processing frameworks and modern Google Cloud Platform (GCP) analytical and transactional data stores such as BigQuery, Cloud SQL, and AlloyDB, with experience writing SQL in at least one database type. Experience with GCP.

Preferred qualifications: Experience working with/on data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools, environments, and data structures. Experience with encryption techniques (symmetric, asymmetric, HSMs, and envelope encryption), and the ability to implement secure key storage using a Key Management System. Experience building multi-tier, high-availability applications with modern technologies such as NoSQL, MongoDB, SparkML, and TensorFlow. Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments. Experience in Big Data, information retrieval, data mining, or Machine Learning. Experience with IaC and CI/CD tools like Terraform, Ansible, Jenkins, etc.

About the job: The Google Cloud Platform team helps customers transform and build what's next for their business, all with technology built in the cloud. Our products are developed for security, reliability, and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers, including developers, small and large businesses, educational institutions, and government agencies, see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees, and partners.

As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migration and modernization projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product issues. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work with Product Management and Product Engineering teams to build and drive excellence in our products.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities: Interact with stakeholders to translate customer requirements into recommendations for appropriate solution architectures and advisory services. Engage with technical leads and partners to lead high-velocity migration and modernization to Google Cloud Platform (GCP). Design, migrate/build, and operationalize data storage and processing infrastructure using cloud-native products. Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data. Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 months ago

Apply

5.0 - 8.0 years

6 - 12 Lacs

Chennai

Work from Office

Design and develop scalable cloud-based data solutions on Google Cloud Platform (GCP). Build and optimize Python-based ETL pipelines and data workflows. Work with NoSQL databases (Bigtable, Firestore, MongoDB) for high-performance data management.
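One design decision that recurs in the Bigtable-style NoSQL work this listing mentions is row-key design: keys sort lexicographically, so a common pattern is to prefix by entity and append a reverse timestamp so the newest rows sort first in a scan. A minimal sketch, with the device-ID naming and the far-future sentinel invented for illustration:

```python
# Assumed far-future sentinel (epoch seconds) used to reverse timestamps.
MAX_TS = 10_000_000_000

def row_key(device_id: str, epoch_seconds: int) -> str:
    """Newest-first key: rows for one device cluster together, and a
    zero-padded reverse timestamp makes recent readings sort lowest."""
    reverse_ts = MAX_TS - epoch_seconds
    return f"{device_id}#{reverse_ts:011d}"

# Lexicographic (i.e. Bigtable scan) order now means newest reading first.
keys = sorted(row_key("dev-1", t) for t in (100, 200, 300))
```

The zero-padding matters: without a fixed width, string comparison would not match numeric order. The same key-design idea applies regardless of which NoSQL store backs it.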

Posted 2 months ago

Apply

11.0 - 16.0 years

40 - 45 Lacs

Pune

Work from Office

Role Description: This role is for a Senior Business Functional Analyst for Group Architecture. The role will be instrumental in establishing and maintaining bank-wide data policies, principles, standards, and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers, aligning the target data architecture with enterprise data architecture principles and applying agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards.

Your key responsibilities

Data Architecture: Work closely with stakeholders to understand their data needs, break business requirements into implementable building blocks, and design the solution's target architecture.

AI/ML: Identify and support the creation of AI use cases focused on delivering the data architecture strategy and data governance tooling. Identify AI/ML use cases and architect pipelines that integrate data flows, data lineage, and data quality. Embed AI-powered data quality, detection, and metadata enrichment to accelerate data discoverability. Assist in defining and driving the data architecture standards and requirements for AI that need to be enabled and used.

GCP Data Architecture & Migration: Strong working experience in GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, etc.), along with an appropriate GCP architecture-level certification. Experience handling hybrid architectures and patterns addressing non-functional requirements such as data residency, compliance (e.g., GDPR), and security and access control. Experience developing reusable components and reference architectures using IaC (Infrastructure as Code) platforms such as Terraform.

Data Mesh: Proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership. Good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value.

Data Management Tooling: Assess various tools and solutions comprising data governance capabilities such as data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in developing the medium- to long-term target state of the technologies within the data governance domain.

Collaboration: Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions.

Your skills and experience: Demonstrable experience in designing and deploying AI tooling architectures and use cases. Extensive experience in data architecture within Financial Services. Strong technical knowledge of data integration patterns, batch and stream processing, data lake/data lakehouse/data warehouse/data mart architectures, caching patterns, and policy-based fine-grained data access. Proven experience with data management principles, data governance, data quality, data lineage, and data integration, with a focus on Data Mesh. Knowledge of data modelling concepts such as dimensional modelling and 3NF, and experience with systematic, structured review of data models to enforce conformance to standards. High-level understanding of data management solutions, e.g. Collibra, Informatica Data Governance. Proficiency in data modeling and experience with different data modelling tools. Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingestion. Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions.

Posted 2 months ago

Apply

18.0 - 22.0 years

25 - 30 Lacs

Pune

Work from Office

Treasury Technology is responsible for the design, build, and operation of Deutsche Bank's Treasury trading, balance-sheet management, and liquidity reporting ecosystem. In partnership with the Treasury business, we deliver innovative technology solutions that enable the business to gain competitive edge and operational efficiency. This is a global role to lead the Engineering function for the Treasury Engineering product portfolio. The aim is to develop a best-in-class portfolio consisting of the following products: Liquidity Measurement and Management; Issuance and Securitization; Risk in the Banking Book; Funds Transfer Pricing.

Treasury is about managing the money and financial risks in a business. This involves making sure the business has the capital it needs to manage its day-to-day obligations, while helping develop its long-term financial strategy and policies. Economic factors such as interest rate rises, changes in regulations, and volatile foreign exchange rates can have a serious impact on any business. Treasury is responsible for monitoring and assessing market conditions and putting strategies in place to mitigate any potential financial risks to the business.

As a senior leader in Software Engineering, you will lead a highly inspired and inquisitive team of technologists to develop applications to the highest standards. You will be expected to solve complex business and technical challenges while managing a large group of senior business stakeholders. You will build an effective and trusted global engineering capability that can deliver consistently against the business's ambitions. You are expected to take ownership of the quality of the platform, dev automation, agile processes, and production resiliency.

Position-specific responsibilities and accountabilities: Lead the global Engineering function across our strategic locations in Pune, Bucharest, London, and New York. Communicate with senior business stakeholders regarding the vision and business goals. Provide transparency on program status, and manage risks and issues. Lead a culture of innovation and experimentation, supporting a full software development lifecycle that incorporates the best of technology approaches and delivery methodologies. Ensure on-time product releases of high quality, enabling the core vision of next-generation trade processing systems compliant with regulatory requirements. Lead development of the next generation of cloud-enabled platforms, including modern web frameworks and complex transaction processing systems leveraging a broad set of technology stacks. Bring experience building fault-tolerant, low-latency, scalable solutions that perform at global enterprise scale. Implement the talent strategy for engineering, aligned to the broader Treasury Technology talent strategy and operating model. Develop applications using industry best practices, including DevOps and automated deployment and testing frameworks.

Skills Matrix

Education qualifications: Degree from an accredited college or university (or equivalent certification and/or relevant work experience).

Business analysis and SME experience: 18+ years of experience in the following areas: well-developed requirements analysis skills, including good communication abilities (both speaking and listening) and stakeholder management (all levels up to Managing Director). Experience working with Front Office business teams is highly desirable. Experience in IT delivery or architecture, including experience as an application developer and people manager. Strong object-oriented design skills. Previous experience hiring, motivating, and managing internal and vendor teams.

Technical experience, mandatory skills: Java, ideally with Spark and Scala. Oracle, Postgres, and other database technologies. Experience developing microservices-based architectures. UI design and implementation. Business process management tools (e.g. JBPM, IBM BPM). Experience with a range of BI technologies, including Tableau. Experience with DevOps best practices (DORA) and CI/CD. Experience in application security, scalability, performance tuning, and optimization (NFRs). Experience in API design, sound knowledge of microservices, containerization (Docker), and exposure to federated and NoSQL databases. Experience in database query tuning and optimization. Experience implementing DevOps best practices, including CI/CD and API testing automation. Experience working in an Agile team, ideally Scrum.

Desirable skills: Experience with cloud services platforms, in particular Google Cloud, and cloud-based development (Cloud Run, Cloud Composer, Cloud SQL, Docker, K8s).

Industry domain experience: Hands-on knowledge of enterprise technology platforms supporting Front Office, Finance, and/or Risk domains would be a significant advantage, as would experience or interest in Sustainable Finance. For example: knowledge of the Finance/Controlling domain and end-to-end workflow for banking and trading businesses; a high-level understanding of financial products across Investment, Corporate, and Private/Retail banking, in particular loans. Knowledge of investment banking, sales and trading, asset management, and similar industries is a strong advantage.

Clear thought and leadership: A mindset built on simplicity. A clear understanding of the concept of re-use in software development, and the drive to apply it relentlessly. Proficiency in talking in functional and data terms to clients, embedded architects, and senior managers. Technical leadership skills. Ability to work in a fast-paced environment with competing and alternating priorities while maintaining a constant focus on delivery. Proven ability to balance business demands and IT fulfilment in terms of standardisation, reducing risk, and increasing IT flexibility. A logical and structured approach to problem-solving in both near-term (tactical) and mid-to-long-term (strategic) horizons.

Communication: Good verbal and written communication and presentation capabilities. A good team player, facilitator, negotiator, and networker. Able to lead senior managers towards common goals and build consensus across a diverse group. Able to lead and influence a diverse team from a range of technical and non-technical backgrounds.

Posted 2 months ago

Apply

3.0 - 8.0 years

11 - 16 Lacs

Pune

Work from Office

Job Title: Lead Engineer. Location: Pune. Corporate Title: Director.

As a lead engineer within the Transaction Monitoring department, you will lead and drive forward critical engineering initiatives and improvements to our application landscape while supporting and leading the engineering teams to excel in their roles. You will be closely aligned with the architecture function and delivery leads, ensuring alignment with planning and that correct design and architecture governance is followed for all implementation work. You will lead by example and drive and contribute to automation and innovation initiatives with the engineering teams. Join the fight against financial crime with us!

Your key responsibilities: An experienced hands-on cloud and on-premises engineer, leading by example with engineering squads. Thinking analytically, with a systematic and logical approach to solving complex problems, and high attention to detail. Designing and documenting complex technical solutions at varying levels in an inclusive and participatory manner with a range of stakeholders. Liaising directly with stakeholders in technology, business, and modelling areas. Collaborating with application development teams to design and prototype solutions (both on-premises and on-cloud), supporting and presenting these via the Design Authority forum for approval, and providing good practice and guidelines to the teams. Ensuring engineering and architecture compliance with bank-standard processes for deploying new applications, working directly with central functions such as Group Architecture, Chief Security Office, and Data Governance. Innovating and thinking creatively, showing willingness to apply new approaches to solving problems and to learn new methods, technologies, and potentially out-of-the-box solutions.

Your skills and experience: Proven hands-on engineering and design experience in a delivery-focused (preferably agile) environment. Solid technical/engineering background, preferably with at least two high-level languages and multiple relational databases or big-data technologies. Proven experience with cloud technologies, preferably GCP (GKE/Dataproc/Cloud SQL/BigQuery), GitHub, and Terraform. Competence and expertise in technical skills across a wide range of technology platforms, and the ability to use and learn new frameworks, libraries, and technologies. A deep understanding of the software development life cycle and the waterfall and agile methodologies. Experience leading complex engineering initiatives and engineering teams. Excellent communication skills, with demonstrable ability to interface and converse at both junior and senior levels and with non-IT staff. Line management experience, including working in a matrix management configuration.

How we'll support you: Training and development to help you excel in your career. Flexible working to help you balance your personal priorities. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.

Posted 2 months ago

Apply

3.0 - 7.0 years

37 - 40 Lacs

Bengaluru

Work from Office

Job Title: DevOps Engineer, AS. Location: Bangalore, India.

Role Description: Deutsche Bank has set itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation, and Corporate Sustainability. As climate change brings new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products, and various sustainability applications which will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good by leveraging their technology skill set in cloud/hybrid architecture. We are seeking a highly skilled and experienced DevOps Engineer to join our growing team. In this role, you will play a pivotal part in managing and optimizing cloud infrastructure, facilitating continuous integration and delivery, and ensuring system reliability.

What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.

Your key responsibilities: Create, implement, and oversee scalable, secure, and cost-efficient cloud infrastructure on Google Cloud Platform (GCP). Utilize Infrastructure as Code (IaC) methodologies with tools such as Terraform, Deployment Manager, or alternatives. Implement robust security measures to ensure data access control and compliance with regulations; adopt security best practices, establish IAM policies, and ensure adherence to both organizational and regulatory requirements. Set up and manage Virtual Private Clouds (VPCs), subnets, firewalls, VPNs, and interconnects to facilitate secure cloud networking. Establish continuous integration and continuous deployment (CI/CD) pipelines using Jenkins, GitHub Actions, or comparable tools for automated application deployments. Implement monitoring and alerting solutions through Cloud Operations (formerly Stackdriver), Prometheus, or other third-party applications. Evaluate and optimize cloud expenditure by utilizing committed use discounts, autoscaling features, and resource rightsizing. Manage and deploy containerized applications through Google Kubernetes Engine (GKE) and Cloud Run. Deploy and manage GCP databases such as Cloud SQL and BigQuery.

Your skills and experience: A minimum of 5+ years of experience in DevOps or similar roles, with hands-on experience in GCP. In-depth knowledge of Google Cloud services (e.g., GCE, GKE, Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud Storage) and the ability to architect, deploy, and manage cloud-native applications. Proficiency with tools like Jenkins, GitLab, Terraform, Ansible, Docker, and Kubernetes. Experience with Infrastructure as Code (IaC) tools like Terraform, CloudFormation, or GCP-native Deployment Manager. A solid understanding of security protocols, IAM, networking, and compliance requirements within cloud environments. Strong problem-solving skills and the ability to troubleshoot cloud-based infrastructure. Google Cloud certifications (e.g., Associate Cloud Engineer, Professional Cloud Architect, or Professional DevOps Engineer) are a plus.

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm. We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
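The resource-rightsizing responsibility mentioned in this listing reduces to a simple idea: pick the smallest machine size whose capacity covers observed peak usage with some headroom. A minimal sketch of that selection logic; the size names, vCPU figures, and headroom factor are illustrative inventions, not actual GCP machine-type specs.

```python
# Illustrative size catalogue: name -> vCPU capacity (invented figures).
SIZES = {"e2-small": 2, "e2-medium": 4, "e2-standard-8": 8}

def recommend(peak_vcpus_used: float, headroom: float = 1.3) -> str:
    """Smallest size whose capacity covers peak usage times a headroom factor."""
    needed = peak_vcpus_used * headroom
    for name, vcpus in sorted(SIZES.items(), key=lambda kv: kv[1]):
        if vcpus >= needed:
            return name
    return max(SIZES, key=SIZES.get)  # nothing fits: fall back to the largest

choice = recommend(2.5)  # needs 3.25 vCPUs of capacity
```

Real rightsizing tooling works from utilization time series rather than a single peak value, but the fit-with-headroom comparison at the core is the same.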

Posted 2 months ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: An intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge from attending educational workshops and reviewing publications.

Posted 2 months ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: An intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge from attending educational workshops and reviewing publications.

Posted 2 months ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Develop, optimize, and maintain scalable data pipelines using Python and PySpark. Design and implement data processing workflows leveraging GCP services such as BigQuery, Dataflow, Cloud Functions, and Cloud Storage.
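A typical stage of the kind of pipeline described above cleanses raw records and aggregates them before loading. A minimal plain-Python sketch (in a real PySpark or Dataflow job the same logic would run distributed; the record fields "region" and "amount" are hypothetical examples):

```python
# Minimal cleanse-and-aggregate pipeline stage sketch in plain Python.
# Field names ("region", "amount") are illustrative, not from any real schema.
from collections import defaultdict

def clean(records):
    """Drop malformed rows and coerce the amount field to float."""
    for rec in records:
        try:
            yield {"region": rec["region"].strip().lower(),
                   "amount": float(rec["amount"])}
        except (KeyError, TypeError, ValueError):
            continue  # skip rows that fail validation

def aggregate(records):
    """Sum amounts per region, as a BigQuery-style GROUP BY would."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["region"]] += rec["amount"]
    return dict(totals)

raw = [{"region": " EU ", "amount": "10.5"},
       {"region": "eu", "amount": "4.5"},
       {"region": "us", "amount": "bad"}]   # malformed row is dropped
```

The same two steps map directly onto `DataFrame.filter`/`groupBy` in PySpark or `ParDo`/`CombinePerKey` in a Dataflow (Apache Beam) pipeline.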

Posted 2 months ago

Apply

5.0 - 7.0 years

10 - 16 Lacs

Bengaluru

Work from Office

Role & responsibilities: The associate should have L2 and L3 capabilities across multiple versions of Microsoft SQL Server. Installation of database software. Database builds. Incident management. Change management. Problem management. Database maintenance (index rebuild, table reorg). User access management. Database startup/shutdown. DBCC checks. Database reorg activities. Altering database/T-log files. Analyzing database blocking. Analyzing session wait events. Performing database backups/restores. Migrating database objects from Dev/QA to Production. Database refresh/cloning. Database upgrades. Database patching. Knowledge management - creation of SOPs for regular activities, KEDB. Knowledge of SOX/PCI compliance reporting. DR drill support. Always On availability groups. Azure databases. Multi-node cluster configuration. Knowledge of Power BI is a plus. Preferred candidate profile: Minimum of 5-7 years of experience in Microsoft SQL Server database administration. Strong knowledge of Microsoft SQL Server 2016, 2017, 2019, and 2022 administration. Ability to work in a fast-paced environment. Excellent communication, problem-solving, interpersonal, teamwork, and leadership skills. Perks and benefits: Competitive salary commensurate with experience, good health insurance coverage, EPF, bonus and gratuity, an excellent work environment, and training and certifications.
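The index-maintenance duty listed above (rebuild vs. reorganize) is usually driven by fragmentation levels. A small sketch that picks the action and emits the corresponding T-SQL, using the commonly cited 5% / 30% fragmentation rule of thumb (table and index names are hypothetical):

```python
# Sketch: choose an index maintenance action from a fragmentation percentage,
# following the commonly cited 5% / 30% rule of thumb, and emit the T-SQL.
from typing import Optional

def index_maintenance_sql(table: str, index: str, frag_pct: float) -> Optional[str]:
    """Return an ALTER INDEX statement, or None if maintenance isn't warranted."""
    if frag_pct > 30:
        return f"ALTER INDEX [{index}] ON [{table}] REBUILD;"
    if frag_pct > 5:
        return f"ALTER INDEX [{index}] ON [{table}] REORGANIZE;"
    return None  # fragmentation too low to justify maintenance
```

In a real maintenance job the fragmentation figures would come from `sys.dm_db_index_physical_stats`, and the generated statements would be reviewed before execution.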

Posted 2 months ago

Apply

6.0 - 9.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Project description: Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team aims to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP; Terraform module coding; Google infrastructure; cloud-native services such as GCE, GKE, and Cloud SQL/Postgres; and logging and monitoring. Good written and spoken English is required, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank. You should have extensive experience with Google Cloud Platform (GCP), Kubernetes, and Docker. The role involves working closely with our development and operations teams to ensure seamless integration and deployment of applications. Responsibilities: Design, implement, and manage CI/CD pipelines on GCP. Automate infrastructure provisioning, configuration, and deployment using tools like Terraform and Ansible. Manage and optimize Kubernetes clusters for high availability and scalability. Containerize applications using Docker and manage container orchestration. Monitor system performance, troubleshoot issues, and ensure system reliability and security. Collaborate with development teams to ensure smooth and reliable operation of software and systems.
Implement and manage logging, monitoring, and alerting solutions. Stay updated with the latest industry trends and best practices in DevOps and cloud technologies. Skills (must have): 6 to 9 years of experience as a DevOps Engineer, with a minimum of 4 years of relevant experience in GCP. Bachelor's degree in Computer Science, Engineering, or a related field. Strong expertise in Kubernetes and Docker. Experience with infrastructure as code (IaC) tools such as Terraform and Ansible. Proficiency in scripting languages like Python, Bash, or Go. Familiarity with CI/CD tools such as Jenkins, GitLab CI, or CircleCI. Knowledge of networking, security, and database management. Excellent problem-solving skills and attention to detail. Nice to have: Strong communication and collaboration skills. Other languages: English (C2, Proficient). Seniority: Senior

Posted 2 months ago

Apply

5.0 - 8.0 years

12 - 16 Lacs

Mangaluru, Udupi

Hybrid

SRE Lead Role Description: We are seeking an experienced SRE Strategist to lead the reliability and operational excellence agenda for our Enterprise Data Platforms spanning GCP cloud-native systems. This strategic leadership role will help instill Google’s SRE principles across diverse data engineering teams, uplift our platform reliability posture, and spearhead the creation of a Centre-of-Excellence (CoE) for SRE. The ideal candidate will possess a deep understanding of modern SRE practices, demonstrate a proven ability to scale SRE capabilities in large enterprises, and evangelise a data-driven approach to resilience engineering. Key Responsibilities: Define and drive SRE strategy for enterprise data platforms on GCP, aligning with business goals and reliability needs. Act as a trusted advisor to platform teams, embedding SRE mindset, best practices, and golden signals into their SDLC and operational processes. Set up and lead a Site Reliability Engineering CoE, delivering reusable tools, runbooks, blueprints, and platform accelerators to scale SRE adoption across the organisation. Partner with product and platform owners to prioritise and structure SRE backlogs, formulate roadmaps, and help teams move from reactive ops to proactive reliability engineering. Define and track SLIs, SLOs, and error budgets across critical data services, enabling data-driven decision making around availability and performance. Drive incident response maturity, including chaos engineering, incident retrospectives, and blameless postmortems. Foster a reliability culture through coaching, workshops, and cross-functional forums. Build strategic relationships across engineering, data governance, security, and architecture teams to ensure reliability is baked in, not bolted on. Required Qualifications: Bachelor's or Master’s degree in Computer Science, Engineering, or related discipline. 3+ years in SRE leadership or SRE strategy roles. 
Strong familiarity with Google SRE principles and practical experience applying them in complex enterprise settings. Proven track record in establishing and scaling SRE teams. Experience with GCP services like Cloud Build, GCS, CloudSQL, Cloud Functions, and GCP logging & monitoring. Deep experience with observability stacks such as Prometheus, Grafana, Splunk, and GCP native solutions. Skilled in Infrastructure as Code using tools like Terraform, and working knowledge of automation in CI/CD environments. Key Competencies & Skills: Strong leadership, influence without authority, and mentoring capabilities. Hands-on scripting and automation skills in Python, with secondary languages like Go or Java a plus. Familiarity with incident and problem management frameworks in enterprise environments. Ability to define and execute a platform-wide reliability roadmap in alignment with architectural and business objectives. Nice to Have: Exposure to secrets management tools (e.g., HashiCorp Vault). Experience with tracing and APM tools like Google Cloud Trace or Honeycomb. Background in data governance, data pipelines, and security standards for data products.
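The SLI/SLO and error-budget responsibilities described in this role reduce to simple arithmetic: an availability SLI is the fraction of good requests, and the error budget is the failure fraction the SLO allows. A minimal sketch (the SLO target and request counts are illustrative):

```python
# Sketch: compute an availability SLI and the remaining error budget for a
# service with a 99.9% SLO. The target and counts below are illustrative.
def error_budget_remaining(good: int, total: int, slo: float = 0.999) -> float:
    """Fraction of the error budget still unspent (negative means overspent)."""
    if total == 0:
        return 1.0  # no traffic yet, so the budget is untouched
    sli = good / total              # observed availability
    budget = 1.0 - slo              # allowed failure fraction, e.g. 0.001
    burned = (1.0 - sli) / budget   # share of the budget already consumed
    return 1.0 - burned
```

For example, 500 failed requests out of 1,000,000 against a 99.9% SLO burns half the budget; teams typically alert on the burn *rate* rather than the raw remainder.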

Posted 2 months ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

Job Title: Senior Engineer PD Location: Pune, India Role Description Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via >2k interfaces. From a technical perspective, we focus on the mainframe but also build solutions on on-premise cloud, RESTful services, and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain. You are responsible for supporting the migration of current functionalities to Google Cloud. You are responsible for the stability of the application landscape and support software releases. You also support L3 topics and application governance. You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot). Your skills and experience: You have experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.)
and development, preferably with Big Data and GCP technologies. Strong understanding of the Data Mesh approach and integration patterns. Understanding of party data and integration with product data. Your architectural skills for big data solutions, especially interface architecture, allow a fast start. You have experience in at least Spark, Java, Scala and Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting. You have knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes. You work very well in teams but also independently, and are constructive and target-oriented. Your English skills are good and you can communicate professionally, but also informally in small talk with the team. How we'll support you: About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

10.0 - 15.0 years

19 - 25 Lacs

Pune

Work from Office

Job Title: IT Architect Location: Pune Role Description The Business Architect defines the technical solution design of specific IT platforms and provides guidance to the squad members in order to design, build, test, and deliver high-quality software solutions. A key element in this context is the translation of functional and non-functional business requirements into an appropriate technical solution design, leveraging best practices and consistent design patterns. The Business Architect collaborates closely with Product Owners, Chapter Leads, and Squad members to ensure consistent adherence to the agreed-upon application design and is responsible for maintaining appropriate technical design documentation. The Solution Architect ensures that the architectures and designs of solutions conform to the principles, blueprints, standards, patterns, etc., that have been established by Enterprise Architecture; in this context, the Business Architect collaborates closely with the respective Solution Architect to ensure architecture compliance. The Business Architect also actively contributes to the definition and enrichment of design patterns and standards with the aim of leveraging them across squads and tribes. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Define the technical architecture of IT solutions in line with functional and non-functional requirements, following consistent design patterns and best practices. Ensure that the solution design is in sync with WM target architecture blueprints and principles, as well as with overarching DB architecture and security standards. Create appropriate technical design documentation and ensure it is kept up to date.
Provide guidance to the squad members to design, build, test, and deliver high-quality software solutions in line with business requirements. Be responsible for all aspects of the solution architecture (i.e., maintainability, scalability, effective integration with other solutions, usage of shared solutions and components where possible, optimization of resource consumption, etc.) with the objective of striking the appropriate balance between business needs and total cost of ownership. Closely collaborate with Enterprise Architecture to ensure architecture compliance and make sure that any design options are discussed in a timely manner to allow sufficient time for deliberate decision making. Present architecture proposals to relevant forums along with the Enterprise Architect at different levels and drive the process to gain the necessary architecture approvals. Collaborate with relevant technology stakeholders within other squads and across tribes to ensure cross-squad and cross-tribe solution architecture synchronization and alignment. Contribute to the definition and enrichment of appropriate design patterns and standards that can be leveraged across WM squads/tribes. Serve as a counsel to designers and developers and carry out reviews of software designs and high-level and detailed-level design documentation provided by other squad members. Lead technical discussions with CCO, Data Factory, Central Data Quality and Compliance, and end-to-end and control functions for technical queries. Contribute to peer-level solution architecture reviews, e.g., within a respective chapter. Your skills and experience: Ability/experience in defining high-level and low-level technical solution designs for complex initiatives. Very good analytical skills and the ability to oversee and structure complex tasks. Hands-on skills with various Google Cloud components like storage buckets, BigQuery, Dataproc, Cloud Composer, and Cloud Functions, along with PySpark and Scala, are essential.
Good to have experience in Cloud SQL, Dataflow, Java, and Unix. Experience with implementing a Google Cloud-based solution is essential. Persuasive power and persistence in driving adherence to the solution design within the squad. Ability to apply the appropriate architectural patterns considering the relevant functional and non-functional requirements. Proven ability to balance business demands and IT capabilities in terms of standardization, reducing risk and increasing IT flexibility. Comfortable working in an open, highly collaborative team. Ability to work in an agile and dynamic environment and to build up knowledge of new technology/solutions in an effective and timely manner. Ability to communicate effectively with other technology stakeholders. Feedback: seeks feedback from others, provides feedback to others in support of their development, and is open and honest while dealing constructively with criticism. Inclusive leadership: values individuals and embraces diversity by integrating differences and promoting diversity and inclusion across teams and functions. Coaching: understands and anticipates people's needs, skills, and abilities in order to coach, motivate, and empower them for success. A broad set of architecture knowledge and application design skills and - depending on the specific squad requirements - in-depth expertise with regard to specific architecture domains (e.g.,
service and integration architecture, web and mobile front-end architecture, data architecture, security architecture, infrastructure architecture) and related technology stacks and design patterns. Experience in establishing thought leadership in solution architecture practices and the ability to lead design and development teams in defining, building, and delivering first-class software solutions. Familiar with current and emerging technologies, tools, frameworks, and design patterns. Experience in effectively collaborating across multiple teams and geographies. Ability to appropriately consider other dimensions (e.g., financials, risk, time to market) on top of the architecture drivers in order to propose balanced and practical architecture solutions. Experience / Qualifications: 10+ years of relevant experience as a technology manager within the IT industry; experience in the financial/banking industry preferred. Minimum 8 years' experience supporting the Oracle platform in a mid-size to large corporate environment. Preferably from a Banking/Wealth Management background. Must have experience working in an agile organization. How we'll support you

Posted 2 months ago

Apply

4.0 - 9.0 years

15 - 19 Lacs

Pune

Work from Office

Job Title: Technical Specialist - GCP Developer Location: Pune, India Role Description This role is for an Engineer who is responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure that good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. The candidate should come from a strong technological background and have good working experience in Spark and GCP technology. Should be hands-on and able to work independently, requiring minimal technical/tool guidance. Should be able to technically guide and mentor junior resources in the team. As a developer, you will bring extensive design and development skills to reinforce the group of developers within the team. The candidate will extensively use and apply Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Design and discuss your own solutions for addressing user stories and tasks. Develop, unit-test, integrate, deploy, maintain, and improve software. Perform peer code review. Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meetings, sprint planning, retrospectives, etc. Apply continuous integration best practices in general (SCM, build automation, unit testing, dependency management). Collaborate with other team members to achieve the sprint objectives. Report progress and update Agile team management tools (JIRA/Confluence). Manage individual task priorities and deliverables. Be responsible for the quality of the solutions you provide. Contribute to planning and continuous improvement activities, and support the PO, ITAO, developers, and Scrum Master.
Your skills and experience: Engineer with good development experience on Google Cloud Platform for at least 4 years. Hands-on experience in BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions. Experience in the set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps. Can create and maintain fully automated CI build processes and write build and deployment scripts. Has experience with development platforms (OpenShift/Kubernetes/Docker configuration and deployment) and DevOps tools, e.g., Git, TeamCity, Maven, SONAR. Good knowledge of the core SDLC processes and tools such as HP ALM, Jira, and ServiceNow. Knowledge of working with APIs and microservices, integrating external and internal web services including SOAP, XML, REST, and JSON. Strong analytical skills. Proficient communication skills. Fluent in English (written/verbal). Ability to work in virtual teams and in matrixed organizations. Excellent team player. Open-minded and willing to learn the business and technology. Keeps pace with technical innovation. Understands the relevant business area. Ability to share information and transfer knowledge to team members. How we'll support you

Posted 2 months ago

Apply

3.0 - 7.0 years

8 - 13 Lacs

Pune

Work from Office

Job Title: Senior Full Stack Engineer Corporate Title: Assistant Vice President Location: Pune, India Role Description The Enterprise SRE Team in CB is responsible for making Production Better by boosting observability and strengthening reliability across Corporate Banking. The team actively works on building common platforms, reference architectures, and tools for production engineering teams to standardize processes across CB. We work in an agile environment with a focus on customer centricity and outstanding user experience, with high reliability and flexibility of technical solutions in mind. With our platform we want to be an enabler for the highest-quality cloud-based software solutions and processes at Deutsche Bank. Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities - What You'll Do: Work on the SLO Dashboard, an application owned by the CB SRE team, ensuring its design (a highly scalable and performant solution), development, and maintenance. Participate in requirement workshops, analyze requirements, perform technical design, and take ownership of the development process. Identify and implement appropriate tools to support engineering automation, including test automation and CI/CD pipelines. Understand technical needs, prioritize requirements, and manage technical debt based on stakeholder urgency. Collaborate with the UI/UX designer while being mindful of backend changes and their impact on architecture or endpoint modifications during discussions. Produce detailed design documents and guide junior developers to align with the priorities and deliverables of the SLO Dashboard. Your skills and experience: Several years of relevant experience in software architecture, design, development, and engineering, ideally in the banking/financial services industry. Strong engineering, solution, and domain architecture background and up-to-date knowledge of software engineering topics such as microservices, streaming architectures, high performance, horizontal scaling, API design, GraphQL, REST services, database systems, UI frameworks, distributed caching (e.g., Apache Ignite, Hazelcast, Redis), enterprise integration patterns, and modern SDLC practices. Good experience working in GCP (cloud-based technologies) using GKE, Cloud SQL (Postgres), Cloud Run, and Terraform. Good experience in DevOps using GitHub Actions for builds and Liquibase pipelines. Fluent in an application development stack such as Java/Spring Boot (3.0+), ReactJS, Python, JavaScript/TypeScript/NodeJS, and SQL on a Postgres DB.
Ability to work in a fast-paced environment with competing and alternating priorities and a constant focus on delivery, with strong interpersonal skills to manage relationships with a variety of partners and stakeholders as well as facilitate group sessions. AI integration and implementation (nice to have): Leverage AI tools like GitHub Copilot, Google Gemini, Llama, and other language models to optimize engineering analytics and workflows. Design and implement AI-driven dashboards and reporting tools for stakeholders. Apply AI tools to automate repetitive tasks, analyze complex engineering datasets, and derive trends and patterns. How we'll support you

Posted 2 months ago

Apply

4.0 - 9.0 years

13 - 18 Lacs

Pune

Work from Office

Job Title: Partner Data - Senior Java/API Engineer Location: Pune Role Description Our team is part of the area Technology, Data and Innovation (TDI) Private Bank. Within TDI, Partnerdata is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via >2k interfaces. From a technical perspective, we maintain critical functionality on the mainframe, but build new solutions (REST services, Angular frontend, analytics capabilities) in a public and private cloud environment. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate with a passion for cloud solutions (GCP) and Big Data. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: You are responsible for the implementation of new project requirements on GCP (Cloud Run, Kubernetes, Cloud SQL, Terraform). You are responsible for the design and translation of high-level business requirements into software. You are responsible for extending the service architecture, designing enhancements, and establishing best practices. You support the migration of existing functionalities to the Google Cloud Platform. You are responsible for the stability of the application landscape and support software releases. You provide L3 support in case of incidents and facilitate the application governance procedure. You are willing to work and code as part of an agile team (Java, Scala, Spring Boot). Your skills and experience: You have multiple years of experience in designing and developing REST services (REST, OpenAPI, API-first). You have experience with
Java and especially Spring and Spring Boot applications; Spring Cloud is a bonus. You have experience in querying SQL databases and are familiar with relational databases in general. You have experience with developing container-based applications and are familiar with container orchestration frameworks such as Kubernetes/OpenShift. You are familiar with cloud technologies, especially Google Cloud (Cloud Run, Kubernetes Engine, Cloud SQL, BigQuery). You are familiar with building and deploying code in an enterprise-grade environment utilizing CI/CD pipelines, especially Maven, JFrog Artifactory, and GitHub Actions. You have a good understanding of IaC concepts and tools such as Terraform. Knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes is a bonus. You enjoy working in a team setting in an independent and target-oriented way. You have very good English skills which allow you to communicate professionally, but also informally, with all team members. How we'll support you: About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

3.0 - 7.0 years

8 - 13 Lacs

Pune

Work from Office

Job Title: Senior Engineer PD Location: Pune, India Role Description Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via >2k interfaces. From a technical perspective, we focus on the mainframe but also build solutions on on-premise cloud, RESTful services, and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain. You are responsible for supporting the migration of current functionalities to Google Cloud. You are responsible for the stability of the application landscape and support software releases. You also support L3 topics and application governance. You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot). Your skills and experience: You have experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.)
and development preferably for Big Data and GCP technologies Strong understanding of Data Mesh Approach and integration patterns Understanding of Party data and integration with Product data Your architectural skills for big data solutions, especially interface architecture allows a fast start You have experience in at leastSpark, Java ,Scala and Python, Maven, Artifactory, Hadoop Ecosystem, Github Actions, GitHub, Terraform scripting You have knowledge in customer reference data, customer opening processes and preferably regulatory topics around know your customer processes You can work very well in teams but also independent and are constructive and target oriented Your English skills are good and you can both communicate professionally but also informally in small talks with the team How well support you
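Database work like the Cloud SQL experience asked for above usually goes through Python's DB-API; as a minimal, illustrative sketch, sqlite3 stands in here for a real Cloud SQL driver (the table and data are invented for the example):

```python
import sqlite3

# sqlite3 stands in for a Cloud SQL driver here; the DB-API pattern
# (connect -> cursor -> execute -> fetch) has the same shape either way.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE partner (id INTEGER PRIMARY KEY, name TEXT)")
cur.executemany("INSERT INTO partner (id, name) VALUES (?, ?)",
                [(1, "Alice"), (2, "Bob")])
conn.commit()
cur.execute("SELECT name FROM partner ORDER BY id")
rows = [r[0] for r in cur.fetchall()]
print(rows)  # -> ['Alice', 'Bob']
```

Against a real Cloud SQL for PostgreSQL instance, only the `connect` call and driver would differ; the query pattern is identical.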

Posted 2 months ago

Apply

7.0 - 11.0 years

10 - 20 Lacs

Indore, Pune

Work from Office

Exp Range 7+ Years Location Pune/Indore Notice – Immediate Senior DevOps Engineer Location: Indore, Pune – work from office. Job Summary: We are seeking an experienced and enthusiastic Senior DevOps Engineer with 7+ years of dedicated experience to join our growing team. In this pivotal role, you will be instrumental in designing, implementing, and maintaining our continuous integration and continuous delivery (CI/CD) pipelines and infrastructure automation. You will champion DevOps best practices, optimize our cloud-native environments, and ensure the reliability, scalability, and security of our systems. This role demands deep technical expertise, an initiative-taking mindset, and a strong commitment to operational excellence. Key Responsibilities: CI/CD Pipeline Management: Design, build, and maintain robust and automated CI/CD pipelines using GitHub Actions to ensure efficient and reliable software delivery from code commit to production deployment. Infrastructure Automation: Develop and manage infrastructure as code (IaC) using shell scripting and the gcloud CLI to provision, configure, and manage resources within Google Cloud Platform (GCP). Deployment Orchestration: Implement and optimize deployment strategies, leveraging GitHub for version control of deployment scripts and configurations, ensuring repeatable and consistent releases. Containerization & Orchestration: Work extensively with Docker for containerizing applications, including building, optimizing, and managing Docker images. Artifact Management: Administer and optimize artifact repositories, specifically Artifactory in GCP, to manage dependencies and build artifacts efficiently. System Reliability & Performance: Monitor, troubleshoot, and optimize the performance, scalability, and reliability of our cloud infrastructure and applications. Collaboration & Documentation: Work closely with development, QA, and operations teams. Utilize Jira for task tracking and Confluence for comprehensive documentation of systems, processes, and best practices. Security & Compliance: Implement and enforce security best practices within the CI/CD pipelines and cloud infrastructure, ensuring compliance with relevant standards. Mentorship & Leadership: Provide technical guidance and mentorship to junior engineers, fostering a culture of learning and continuous improvement within the team. Incident Response: Participate in on-call rotations and provide rapid response to production incidents, perform root cause analysis, and implement preventative measures. Required Skills & Experience (Mandatory – 7+ Years): Proven experience (7+ years) in a DevOps, Site Reliability Engineering (SRE), or similar role. Expert-level proficiency with Git and GitHub, including advanced branching strategies, pull requests, and code reviews. Experience designing and implementing CI/CD pipelines using GitHub Actions. Deep expertise in Google Cloud Platform (GCP), including compute, networking, storage, and identity services. Advanced proficiency in shell scripting for automation, system administration, and deployment tasks. Strong firsthand experience with Docker for containerization, image optimization, and container lifecycle management. Solid understanding and practical experience with Artifactory (or similar artifact management tools) in a cloud environment. Expertise in using the gcloud CLI for automating GCP resource management and deployments. Demonstrable experience with continuous integration (CI) principles and practices. Proficiency with Jira for agile project management and Confluence for knowledge sharing. Strong understanding of networking concepts, security best practices, and system monitoring. Excellent critical thinking skills and an initiative-taking approach to identifying and resolving issues. Nice-to-Have Skills: Experience with Kubernetes (GKE) for container orchestration. Familiarity with other Infrastructure as Code (IaC) tools like Terraform. Experience with monitoring and logging tools such as Prometheus, Grafana, or GCP's Cloud Monitoring/Logging. Proficiency in other scripting or programming languages (e.g., Python, Go) for automation and tool development. Experience with database management in a cloud environment (e.g., Cloud SQL, Firestore). Knowledge of DevSecOps principles and tools for integrating security into the CI/CD pipeline. GCP Professional Cloud DevOps Engineer or other relevant GCP certifications. Experience with large-scale distributed systems and microservices architectures.
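The pipeline this role describes (GitHub Actions building a Docker image, publishing it to Artifactory, and rolling out via the gcloud CLI) might be sketched roughly as below; the repository layout, registry host, cluster, region, and secret names are all placeholders, not details taken from the posting:

```yaml
# Hypothetical workflow sketch; adjust registry host, cluster, and secrets to your environment.
name: build-and-deploy
on:
  push:
    branches: [main]
jobs:
  build-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build image
        run: docker build -t myapp:${{ github.sha }} .
      - name: Push to Artifactory
        run: |
          echo "${{ secrets.ARTIFACTORY_TOKEN }}" | docker login artifactory.example.com -u ci --password-stdin
          docker tag myapp:${{ github.sha }} artifactory.example.com/docker-local/myapp:${{ github.sha }}
          docker push artifactory.example.com/docker-local/myapp:${{ github.sha }}
      - name: Authenticate to GCP
        uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}
      - name: Roll out to GKE
        run: |
          gcloud container clusters get-credentials my-cluster --region europe-west1
          kubectl set image deployment/myapp myapp=artifactory.example.com/docker-local/myapp:${{ github.sha }}
```

Tagging images with the commit SHA, as here, keeps every deployment traceable back to the exact commit that produced it.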

Posted 2 months ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

Responsibilities Lead and mentor a team of data engineers, providing technical guidance, setting best practices, and overseeing task execution for the migration project. Design, develop, and architect scalable ETL processes to extract, transform, and load petabytes of data from on-premises SQL Server to GCP Cloud SQL for PostgreSQL. Oversee the comprehensive analysis of existing SQL Server schemas, data types, stored procedures, and complex data models, defining strategies for their optimal conversion and refactoring for PostgreSQL. Establish and enforce rigorous data validation, quality, and integrity frameworks throughout the migration lifecycle, ensuring accuracy and consistency. Collaborate strategically with database administrators, application architects, business stakeholders, and security teams to define migration scope, requirements, and cutover plans. Lead the development and maintenance of advanced scripts (primarily Python) for automating large-scale migration tasks, complex data transformations, and reconciliation processes. Proactively identify, troubleshoot, and lead the resolution of complex data discrepancies, performance bottlenecks, and technical challenges during migration. Define and maintain comprehensive documentation standards for migration strategies, data mapping, transformation rules, and post-migration validation procedures. Ensure data governance, security, and compliance standards are meticulously applied throughout the migration process, including data encryption and access controls within GCP. Implement a schema conversion or custom schema mapping strategy for the SQL Server to PostgreSQL shift. Refactor and translate complex stored procedures and T-SQL logic to PostgreSQL-compatible constructs while preserving functional equivalence. Develop and execute comprehensive data reconciliation strategies to ensure consistency and parity between legacy and migrated datasets post-cutover.
Design fallback procedures and lead post-migration verification and support to ensure business continuity. Ensure metadata cataloging and data lineage tracking using GCP-native or integrated tools. Must-Have Skills Expertise in data engineering, specifically for Google Cloud Platform (GCP). Deep understanding of relational database architecture, advanced schema design, data modeling, and performance tuning. Expert-level SQL proficiency, with extensive hands-on experience in both T-SQL (SQL Server) and PostgreSQL. Hands-on experience with data migration processes, including moving datasets from on-premises databases to cloud storage solutions. Proficiency in designing, implementing, and optimizing complex ETL/ELT pipelines for high-volume data movement, leveraging tools and custom scripting. Strong knowledge of GCP services: Cloud SQL, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Cloud Composer, Cloud Functions, and BigQuery. Solid understanding of data governance, security, and compliance practices in the cloud, including the management of sensitive data during migration. Strong programming skills in Python or Java for building data pipelines and automating processes. Experience with real-time data processing using Pub/Sub, Dataflow, or similar GCP services. Experience with CI/CD practices and tools like Jenkins, GitLab, or Cloud Build for automating the data engineering pipeline. Knowledge of data modeling and best practices for structuring cloud data storage for optimal query performance and analytics in GCP. Familiarity with observability and monitoring tools in GCP (e.g., Stackdriver, Prometheus) for real-time data pipeline visibility and alerting. Good-to-Have Skills Direct experience with GCP Database Migration Service, Storage Transfer Service, or similar cloud-native migration tools. Familiarity with data orchestration using tools like Cloud Composer (based on Apache Airflow) for managing workflows.
Experience with containerization tools like Docker and Kubernetes for deploying data pipelines in a scalable manner. Exposure to DataOps tools and methodologies for managing data workflows. Experience with machine learning platforms like AI Platform in GCP to integrate with data pipelines. Familiarity with data lake architecture and the integration of BigQuery with Google Cloud Storage or Dataproc.
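A first pass at the T-SQL-to-PostgreSQL refactoring described in this role is often a mechanical rewrite of dialect-specific functions; here is a minimal, illustrative Python sketch (a real migration needs a proper SQL parser, and this mapping is only a sample, not a complete rule set):

```python
import re

# Sample of common T-SQL -> PostgreSQL rewrites; illustrative only, not exhaustive.
REWRITES = [
    (re.compile(r"\bGETDATE\(\)", re.IGNORECASE), "NOW()"),
    (re.compile(r"\bISNULL\(", re.IGNORECASE), "COALESCE("),
    (re.compile(r"\bLEN\(", re.IGNORECASE), "LENGTH("),
    # "SELECT TOP n ..." becomes "SELECT ... LIMIT n".
    (re.compile(r"\bSELECT\s+TOP\s+(\d+)\s+(.*)$", re.IGNORECASE | re.DOTALL),
     lambda m: f"SELECT {m.group(2).rstrip().rstrip(';')} LIMIT {m.group(1)}"),
]

def translate(tsql: str) -> str:
    """Apply each mechanical rewrite in order."""
    out = tsql
    for pattern, repl in REWRITES:
        out = pattern.sub(repl, out)
    return out

print(translate("SELECT TOP 10 name, ISNULL(city, 'n/a') FROM customers;"))
# -> SELECT name, COALESCE(city, 'n/a') FROM customers LIMIT 10
```

Constructs with no one-to-one equivalent (cursors, `MERGE` edge cases, procedural T-SQL) are exactly the part that needs the manual refactoring the role calls out.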

Posted 2 months ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Noida

Work from Office

About Us: At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service, all backed by TELUS, our multi-billion dollar telecommunications parent. Required Skills: Minimum 6 years of experience in architecting, designing, and building data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms. Perform application impact assessments, requirements reviews, and develop work estimates. Develop test strategies and site reliability engineering measures for data products and solutions. Lead agile development "scrums" and solution reviews. Mentor junior Data Engineering Specialists. Lead the resolution of critical operations issues, including post-implementation reviews. Perform technical data stewardship tasks, including metadata management, security, and privacy by design. Demonstrate expertise in SQL and database proficiency in various data engineering tasks. Automate complex data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect. Develop and manage Unix scripts for data engineering tasks. Intermediate proficiency in infrastructure-as-code tools like Terraform, Puppet, and Ansible to automate infrastructure deployment. Proficiency in data modeling to support analytics and business intelligence.
Working knowledge of MLOps to integrate machine learning workflows with data pipelines. Extensive expertise in GCP technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, Dataproc (good to have), and Bigtable. Advanced proficiency in programming languages (Python). Qualifications: Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field. Analytics certification in BI or AI/ML. 6+ years of data engineering experience. 4 years of data platform solution architecture and design experience. GCP Certified Data Engineer (preferred).
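Setting up DAGs in tools like Airflow, Control-M, or Prefect ultimately means running tasks in dependency order; the core idea can be sketched with nothing but the standard library (the task names are invented for illustration, and Airflow adds scheduling, retries, and operators on top of this):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the tasks it depends on.
dag = {
    "extract": [],
    "validate": ["extract"],
    "transform": ["validate"],
    "load_bigquery": ["transform"],
    "refresh_dashboard": ["load_bigquery"],
}

def run(dag):
    """Execute tasks in an order that respects every dependency."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")
    return order

order = run(dag)
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, the same class of mistake an orchestrator rejects when a DAG is registered.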

Posted 2 months ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Noida

Work from Office

About Us: At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service, all backed by TELUS, our multi-billion dollar telecommunications parent. Required Skills: Design, develop, and support data pipelines and related data products and platforms. Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms. Perform application impact assessments, requirements reviews, and develop work estimates. Develop test strategies and site reliability engineering measures for data products and solutions. Participate in agile development "scrums" and solution reviews. Mentor junior Data Engineers. Lead the resolution of critical operations issues, including post-implementation reviews. Perform technical data stewardship tasks, including metadata management, security, and privacy by design. Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies. Demonstrate SQL and database proficiency in various data engineering tasks. Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect. Develop Unix scripts to support various data operations. Model data to support business intelligence and analytics initiatives. Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have). Qualifications: Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field. 4+ years of data engineering experience. 2 years of data solution architecture and design experience. GCP Certified Data Engineer (preferred).

Posted 2 months ago

Apply