3.0 - 8.0 years
6 - 11 Lacs
Bengaluru, Karnataka, India
On-site
7+ years of experience in SQL development and database management. Own and drive the end-to-end data migration process from Mainframe DB2 to GCP. Analyze existing DB2 data structures, stored procedures, and ETL processes. Design and implement scalable, secure, and efficient data models (BigQuery, Cloud SQL, etc.). Develop and optimize SQL scripts for data extraction, transformation, and loading (ETL). Develop, test, and optimize SQL queries for performance, scalability, and maintainability. Collaborate with infrastructure, cloud engineering, and business teams to ensure data integrity and performance. Monitor and troubleshoot data migration pipelines and resolve data quality issues. Proven experience in cloud data migration projects. Document data mappings and technical specifications. Strong understanding of query execution plans, indexing, and optimization techniques. Ensure compliance with data governance, security, and privacy standards. Excellent analytical, communication, and documentation skills.
Posted 4 weeks ago
3.0 - 8.0 years
6 - 11 Lacs
Chennai, Tamil Nadu, India
On-site
7+ years of experience in SQL development and database management. Own and drive the end-to-end data migration process from Mainframe DB2 to GCP. Analyze existing DB2 data structures, stored procedures, and ETL processes. Design and implement scalable, secure, and efficient data models (BigQuery, Cloud SQL, etc.). Develop and optimize SQL scripts for data extraction, transformation, and loading (ETL). Develop, test, and optimize SQL queries for performance, scalability, and maintainability. Collaborate with infrastructure, cloud engineering, and business teams to ensure data integrity and performance. Monitor and troubleshoot data migration pipelines and resolve data quality issues. Proven experience in cloud data migration projects. Document data mappings and technical specifications. Strong understanding of query execution plans, indexing, and optimization techniques. Ensure compliance with data governance, security, and privacy standards. Excellent analytical, communication, and documentation skills.
Posted 4 weeks ago
3.0 - 8.0 years
6 - 11 Lacs
Hyderabad, Telangana, India
On-site
7+ years of experience in SQL development and database management. Own and drive the end-to-end data migration process from Mainframe DB2 to GCP. Analyze existing DB2 data structures, stored procedures, and ETL processes. Design and implement scalable, secure, and efficient data models (BigQuery, Cloud SQL, etc.). Develop and optimize SQL scripts for data extraction, transformation, and loading (ETL). Develop, test, and optimize SQL queries for performance, scalability, and maintainability. Collaborate with infrastructure, cloud engineering, and business teams to ensure data integrity and performance. Monitor and troubleshoot data migration pipelines and resolve data quality issues. Proven experience in cloud data migration projects. Document data mappings and technical specifications. Strong understanding of query execution plans, indexing, and optimization techniques. Ensure compliance with data governance, security, and privacy standards. Excellent analytical, communication, and documentation skills.
Posted 4 weeks ago
4.0 - 9.0 years
20 - 35 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 20 to 35 LPA. Exp: 5 to 8 years. Location: Gurgaon (Hybrid). Notice period: Immediate to 30 days. Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases. Troubleshoot issues related to data processing workflows and provide timely resolutions. Desired Candidate Profile: 5-9 years of experience in Data Engineering with expertise in GCP and BigQuery data engineering. Strong understanding of GCP Cloud Platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc. Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.
Posted 4 weeks ago
10.0 - 20.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Description: Cloud Infrastructure & Deployment: Design and implement secure, scalable, and highly available cloud infrastructure on GCP. Provision and manage compute, storage, network, and database services. Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager. Architecture & Design: Translate business requirements into scalable cloud solutions. Recommend GCP services aligned with application needs and cost optimization. Participate in high-level architecture and solution design discussions. DevOps & Automation: Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI). Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite). Enable autoscaling, load balancing, and zero-downtime deployments. Security & Compliance: Ensure compliance with security standards and best practices. Migration & Optimization: Support cloud migration projects from on-premise or other cloud providers to GCP. Optimize performance, reliability, and cost of GCP workloads. Documentation & Support: Maintain technical documentation and architecture diagrams. Provide L2/L3 support for GCP-based services and incidents. Required Skills and Qualifications: Google Cloud Certification (Associate Cloud Engineer or Professional Cloud Architect/Engineer). Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.). Strong command of Linux, shell scripting, and networking fundamentals. Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools. Experience with containers and orchestration (Docker, Kubernetes/GKE). Familiarity with monitoring/logging (Cloud Monitoring, Prometheus, Grafana). Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity.
Posted 4 weeks ago
5.0 - 7.0 years
4 - 9 Lacs
Bengaluru
Work from Office
Educational qualifications: Bachelor of Engineering, BTech, Bachelor of Technology, BCA, BSc, MTech, MCA. Service Line: Application Development and Maintenance. Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs to solution design based on your areas of expertise. You will plan the configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. Actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Preferred Locations: Bangalore, Hyderabad, Chennai, Pune. Experience Required: 3 to 5 years of experience - pure hands-on expertise in the skill, able to deliver without any support. 5 - 9 years of experience - design knowledge, estimation techniques, leading and guiding the team on the technical solution. 9 - 13 years of experience - architecture, solutioning, and (optionally) proposals. Containerization and microservice development on AWS/Azure/GCP is preferred. In-depth knowledge of design issues and best practices. Solid understanding of object-oriented programming. Familiarity with various design and architectural patterns and software development processes.
Implementing automated testing platforms and unit tests. Strong experience in building and developing applications using technologies like Python. Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs and microservices. Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP. Good understanding of application development design patterns. Technical and Professional: Primary Skill: Python. Secondary Skills: AWS/Azure/GCP. Preferred Skills: Technology-Machine Learning-Python. Generic Skills: Technology-Cloud Platform-AWS App Development; Technology-Cloud Platform-Azure Development & Solution Architecting; Technology-Cloud Platform-GCP DevOps.
Posted 4 weeks ago
9.0 - 14.0 years
22 - 37 Lacs
Pune
Hybrid
We're Hiring: Senior GCP Data Engineer (L4) for a client || Immediate joiners only. Location: Pune | Walk-in Drive: 5th July 2025. Are you a seasoned Data Engineer with 9-12 years of experience and a passion for building scalable data solutions on Google Cloud Platform? Join us for an exciting walk-in opportunity! Key Skills Required: GCP Data Engineering, BigQuery, SQL, Python (Cloud Composer, Cloud Functions, Python Injection), Dataproc + PySpark, Dataflow + Pub/Sub, Apache Beam, Spark, Hadoop. What You'll Do: Architect and implement end-to-end data pipelines on GCP. Work with BigQuery, Bigtable, Cloud Storage, Spanner, and more. Automate data ingestion, transformation, and augmentation. Ensure data quality and compliance across systems. Collaborate in a fast-paced, dynamic environment. Bonus Points: Google Professional Data Engineer or Solution Architect certification. Experience with SnapLogic, Cloud Dataprep. Strong SQL and data integration expertise. If interested, please share your CV at Raveena.kalra@in.ey.com
Posted 4 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
About the Role: We are seeking a skilled and detail-oriented Data Engineer with deep expertise in PostgreSQL and SQL to design, maintain, and optimize our database systems. As a key member of our data infrastructure team, you will work closely with developers, DevOps, and analysts to ensure data integrity, performance, and scalability of our applications. Key Responsibilities: Design, implement, and maintain PostgreSQL database systems for high availability and performance. Write efficient, well-documented SQL queries, stored procedures, and database functions. Analyze and optimize slow-performing queries and database structures. Collaborate with software engineers to support schema design, indexing, and query optimization. Perform database migrations, backup strategies, and disaster recovery planning. Ensure data security and compliance with internal and regulatory standards. Monitor database performance and proactively address bottlenecks and anomalies. Automate routine database tasks using scripts and monitoring tools. Contribute to data modeling and architecture discussions for new and existing systems. Support ETL pipelines and data integration processes as needed. Required Qualifications: Bachelor's degree in Computer Science, Information Systems, or related field. 5 years of professional experience in a database engineering role. Proven expertise with PostgreSQL (version 12+ preferred). Strong SQL skills with the ability to write complex queries and optimize them. Experience with performance tuning, indexing, query plans, and execution analysis. Familiarity with database design best practices and normalization techniques. Solid understanding of ACID principles and transaction management. Preferred Qualifications: Experience with cloud platforms (e.g., AWS RDS, GCP Cloud SQL, or Azure PostgreSQL). Familiarity with other database technologies (e.g., MySQL, NoSQL, MongoDB, Redis). Knowledge of scripting languages (e.g., Python, Bash) for automation. 
Experience with monitoring tools (e.g., pgBadger, pg_stat_statements, Prometheus/Grafana). Understanding of CI/CD processes and infrastructure as code (e.g., Terraform). Exposure to data warehousing or analytics platforms (e.g., Redshift, BigQuery).
Posted 1 month ago
7.0 - 11.0 years
7 - 11 Lacs
Chennai
Work from Office
Strong DB setup and implementation experience with SQL Server. Strong SQL Server knowledge: architecture and internals. Working with version management (Bitbucket / Stash etc.) and DevOps for DB. Experience in data migration. Experience in ETL; SSIS preferred; knowledge of other tools an added advantage. Primary skills: AWS Cloud, SQL Server, performance tuning, data migration. Secondary skills: Splunk monitoring, Foglight tool.
Posted 1 month ago
0.0 years
9 - 14 Lacs
Noida
Work from Office
Required Skills: GCP Proficiency: Strong expertise in Google Cloud Platform (GCP) services and tools, including Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, IAM, Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging, and Error Reporting. Cloud-Native Applications: Experience in designing and implementing cloud-native applications, preferably on GCP. Workload Migration: Proven expertise in migrating workloads to GCP. CI/CD Tools and Practices: Experience with CI/CD tools and practices. Python and IaC: Proficiency in Python and Infrastructure as Code (IaC) tools such as Terraform. Responsibilities: Cloud Architecture and Design: Design and implement scalable, secure, and highly available cloud infrastructure solutions using GCP services and tools such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Load Balancing. Cloud-Native Application Design: Develop high-level architecture designs and guidelines for the development, deployment, and lifecycle management of cloud-native applications on GCP, ensuring they are optimized for security, performance, and scalability using services like App Engine, Cloud Functions, and Cloud Run. API Management: Develop and implement guidelines for securely exposing interfaces of the workloads running on GCP, along with granular access control using the IAM platform, RBAC platforms, and API Gateway. Workload Migration: Lead the design and migration of on-premises workloads to GCP, ensuring minimal downtime and data integrity.
Posted 1 month ago
6.0 - 10.0 years
6 - 11 Lacs
Mumbai
Work from Office
Primary Skills: Google Cloud Platform (GCP): Expertise in Compute (VMs, GKE, Cloud Run), Networking (VPC, Load Balancers, Firewall Rules), IAM (Service Accounts, Workload Identity, Policies), Storage (Cloud Storage, Cloud SQL, BigQuery), and Serverless (Cloud Functions, Eventarc, Pub/Sub). Strong experience in Cloud Build for CI/CD, automating deployments and managing artifacts efficiently. Terraform: Skilled in Infrastructure as Code (IaC) with Terraform for provisioning and managing GCP resources. Proficient in modules for reusable infrastructure, state management (remote state, locking), and provider configuration. Experience in CI/CD integration with Terraform Cloud and automation pipelines. YAML: Proficient in writing Kubernetes manifests for deployments, services, and configurations. Experience in Cloud Build pipelines, automating builds and deployments. Strong understanding of configuration management using YAML in GitOps workflows. PowerShell: Expert in scripting for automation, managing GCP resources, and interacting with APIs. Skilled in cloud resource management, automating deployments, and optimizing cloud operations. Secondary Skills: CI/CD Pipelines: GitHub Actions, GitLab CI/CD, Jenkins, Cloud Build. Kubernetes (K8s): Helm, Ingress, RBAC, cluster administration. Monitoring & Logging: Stackdriver (Cloud Logging & Monitoring), Prometheus, Grafana. Security & IAM: GCP IAM policies, service accounts, Workload Identity. Networking: VPC, firewall rules, load balancers, Cloud DNS. Linux & Shell Scripting: Bash scripting, system administration. Version Control: Git, GitHub, GitLab, Bitbucket.
Posted 1 month ago
4.0 - 8.0 years
5 - 15 Lacs
Chennai
Hybrid
GCP Data Engineer (ETL Migration) We're looking for a GCP Data Engineer with strong ETL migration experience to join our team and lead the transition of legacy data pipelines to Google Cloud Platform . Important Note: This opportunity is open only to candidates currently residing in Tamil Nadu, including those from other states who are currently working in Tamil Nadu. Please note that this is a client-specific requirement, and profiles not meeting this preference may not be considered for further evaluation. Key Responsibilities: Migrate ETL processes to GCP-native services (BigQuery, Dataflow, DataFusion, etc.) Design, develop & deploy scalable data pipelines Analyze existing JCL/ETL jobs and dependencies Troubleshoot production issues and support job performance tuning Lead estimation, testing, deployment, and monitoring Must-Have Skills: GCP (BigQuery, Dataflow, Dataproc, DataFusion, Terraform, Airflow), PySpark, Python, SQL, API Integration, Cloud SQL, Postgres Experience: 5+ years in complex SQL and ETL development 2+ years in GCP at production scale Experience with DataStage, Autosys/Astronomer, GitHub Ready to drive real-time and batch data solutions at scale? Apply now. #GCPJobs #DataEngineer #ETLMigration #BigQuery #Dataflow #HiringNow
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Gurugram, Coimbatore
Work from Office
Role: DevOps Engineer Experience : 5+ years Location : Gurgaon, Hyderabad & Coimbatore Key Responsibilities: Ensure zero-downtime deployments across production. Implement custom Helm deployment and rollback strategies. Refactor Terraform modules for simplicity and efficiency. Enforce secure CI/CD practices with tools like GitHub Actions. Migrate secret management to GCP Secret Manager and Kubernetes Secrets. Standardize drift detection and config audits. Lead GKE workload IAM scoping using workload identity. Maintain infrastructure documentation, SOPs, and disaster recovery playbooks. Mentor team members and contribute to DevOps metrics and postmortems. Requirements: 3+ years in DevOps, SRE, or Infrastructure Engineering. Strong experience with Terraform and reusable modules. Hands-on with Kubernetes (GKE preferred). Familiarity with GitHub Actions, Helm, and CI/CD workflows. Knowledge of GCP services like CloudSQL, VPC, IAM. Experience with observability tools, especially Datadog. Strong attention to deployment quality and operational details. Desirable Experience: Exposure to GitOps (ArgoCD/FluxCD). Experience with Kubernetes operators. Understanding of SLIs, SLOs, and structured alerting. Tools & Expectations: Terraform / HCP Terraform Infrastructure as code, state management, and drift detection. GitHub / GitLab / GitHub Actions Secure CI/CD pipeline setup and governance. Helm Application deployment and lifecycle management. Kubernetes / GKE Cluster and workload management. GCP Services – VPC, IAM, CloudSQL integration. Secret Management – Kubernetes Secrets, CSI Driver, GCP Secret Manager. Datadog – Observability and alerting. Cloudflare – DNS, WAF, and exposure configuration. Snyk / SonarQube / Wiz – Code and container security in CI/CD. Interested candidates can share their resume at Neesha1@damcogroup.com
Posted 1 month ago
5.0 - 10.0 years
11 - 16 Lacs
Bengaluru
Work from Office
Job Title: Senior GCP Data Engineer. Corporate Title: Associate. Location: Bangalore, India. Role Description: Deutsche Bank has set itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation, and Corporate Sustainability. As climate change throws up new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products, and various sustainability applications which will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good by leveraging their technology skillset in cloud / hybrid architecture. As part of this role, we are seeking a highly motivated and experienced Senior GCP Data Engineer to join our team. In this role, you will play a critical part in designing, developing, and maintaining robust data pipelines that transform raw data into valuable insights for our organization. What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance. Your key responsibilities: Design, develop, and maintain data pipelines using GCP services like Dataflow, Dataproc, and Pub/Sub. Develop and implement data ingestion and transformation processes using tools like Apache Beam and Apache Spark. Manage and optimize data storage solutions on GCP, including BigQuery, Cloud Storage, and Cloud SQL. Implement data security and access controls using GCP's Identity and Access Management (IAM) and Cloud Security Command Center. Monitor and troubleshoot data pipelines and storage solutions using GCP's Stackdriver and Cloud Monitoring tools. Collaborate with data experts, analysts, and product teams to understand data needs and deliver effective solutions. Automate data processing tasks using scripting languages like Python.
Participate in code reviews and contribute to establishing best practices for data engineering on GCP. Stay up to date on the latest advancements and innovations in GCP services and technologies. Your skills and experience: 5+ years of experience as a Data Engineer or in a similar role. Proven expertise in designing, developing, and deploying data pipelines. In-depth knowledge of Google Cloud Platform (GCP) and its core data services (GCS, BigQuery, Cloud Storage, Dataflow, etc.). Strong proficiency in Python and SQL for data manipulation and querying. Experience with distributed data processing frameworks like Apache Beam or Apache Spark (a plus). Familiarity with data security and access control principles. Excellent communication, collaboration, and problem-solving skills. Ability to work independently, manage multiple projects, and meet deadlines. Knowledge of Sustainable Finance / ESG Risk / CSRD / Regulatory Reporting will be a plus. Knowledge of cloud infrastructure and data governance best practices will be a plus. Knowledge of Terraform will be a plus. About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
Posted 1 month ago
10.0 - 15.0 years
30 - 45 Lacs
Bengaluru
Work from Office
Position Overview: As an Engineering Manager, you will lead a team of software engineers in building scalable, reliable, and efficient web applications and microservices for the "News and Competitive Data Analysis Platform". You will drive technical excellence, system architecture, and best practices while fostering a high-performance engineering culture. You'll be responsible for managing engineering execution, mentoring engineers, and ensuring the timely delivery of high-quality solutions. Your expertise in Python, Django, React, Apache Solr, RabbitMQ, Postgres, and other NoSQL cloud databases will help shape the technical strategy of our SaaS platform. You will collaborate closely with Product, Design, and DevOps teams to align engineering efforts with business goals. Key Responsibilities: Demonstrate strong technical leadership, typically across multiple teams, by delivering high-quality software solutions and ensuring scalability, reliability, and performance. Collaborate with cross-functional teams, product managers, designers, and stakeholders to define project requirements and deliver solutions that meet business goals. Able to spot the biggest pain points of the systems you're working with and propose solutions to improve them. Able to influence the engineering culture and practices of the teams; works with self-confidence with stakeholders outside of your own team as well. Experienced mentor providing technical guidance and inspiring the team to achieve goals, helping them grow and develop their skills. Conduct constructive code reviews, ensure code quality, and promote best practices in software development so everyone can learn alongside you. What you need to succeed: Technical Leadership: Provide architectural guidance, ensuring best practices in Django-based backend development and React-based frontends.
Engineering Execution: Own the technical roadmap, working with teams on key components like the Python framework for task management, RabbitMQ for messaging, and Google Cloud SQL/PostgreSQL for data storage. System Scalability & Performance: Optimize and redesign backend worker node efficiency for tasks like crawling and notifications, ensuring smooth distributed task execution. Database & Search Optimization: Drive performance improvements in PostgreSQL (self-hosted and GCP-managed Cloud SQL) and Apache Solr for indexed content retrieval. Cloud & Infrastructure: Oversee the transition to GCP-managed Cloud SQL, replacing Google Datastore and PostgreSQL for metadata. Cloud Migration: Strategise and drive product migration from GCP to AWS. Cross-Team Collaboration: Partner with DevOps and infrastructure teams to ensure reliable deployments, CI/CD, and production monitoring. Mentorship & Growth: Coach and mentor engineers, fostering a culture of ownership, continuous learning, and technical excellence. Preferred Experience/Skills: 8+ years of experience in software development, with at least 2+ years in an engineering leadership role. Strong expertise in Python (Django, Celery) and JavaScript (React, frontend development). Deep understanding of message brokers (RabbitMQ) and task orchestration. Proficiency in SQL and NoSQL databases, including PostgreSQL, Apache Solr, and Google Cloud SQL. Experience with scalable architectures, distributed systems, and event-driven designs. Knowledge of NLP and semantic search. Knowledge of DevOps, CI/CD pipelines, and cloud platforms (GCP, AWS, or Azure). Strong background in code quality, security best practices, and performance tuning. Ability to influence stakeholders, manage engineering teams, and drive technical innovation.
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
Job Title: Technology Service Specialist. Corporate Title: Associate. Location: Pune, India. Role Description: Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. TDI PB Germany Service Operations provides 2nd-level application support for business applications used in branches, by mobile sales, or via the internet. The department is responsible overall for the stability of the applications. Incident Management and Problem Management are the main processes that account for the required stability. In-depth application knowledge and understanding of the business processes that the applications support are our main assets. Within TDI, Partnerdata is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. Through the partnership with Google Cloud (GCP), a number of applications and functionalities were migrated to GCP, from where they will be operated and further developed. Besides maintenance and the implementation of new requirements, the content focus also lies on the regulatory topics surrounding a partner/client. We are looking for reinforcements for this contemporary and emerging cloud area of application. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy. Your key responsibilities: Ensures that the Service Operations team provides an optimum service level to the business lines it supports. Takes overall responsibility for the resolution of incidents and problems within the team. Oversees the resolution of complex incidents. Ensures that Analysts apply the right problem-solving techniques and processes. Assists in managing business stakeholder relationships. Assists in defining and managing OLAs with relevant stakeholders. Ensures that the team understands OLAs, resources appropriately, and is aligned to business SLAs.
Ensures relevant Client Service teams are informed of progress on incidents, where necessary. Ensures that defined divisional Production Management service operations and support processes are adhered to by the team, and makes improvement recommendations where appropriate. Prepares for and, if requested, manages team review meetings. Makes suggestions for continual service improvement. Manages escalations by working with Client Services, other Service Operations Specialists, and relevant functions to resolve escalated issues accurately and quickly. Observes areas requiring monitoring, reporting, and improvement. Identifies required metrics and ensures they are established, monitored, and improved where appropriate. Continuously seeks to improve team performance. Participates in team training events, where appropriate. Works with team members to identify areas of focus where training may improve team performance and incident resolution. Mentors and coaches Production Management Analysts within the team by providing career development and counselling, as needed. Assists Production Management Analysts in setting performance targets, and manages performance against them. Identifies team bottlenecks (obstacles) and takes appropriate actions to eliminate them. Provides Level 3 or advanced support for technical infrastructure components. Evaluates new products, including prototyping, and recommends new products, including automation. Specifies/selects tools to enhance operational support. Champions activities and establishes best practices in their specialist area, working to implement best-of-breed test practices and processes in their area of profession.
Defines and implements best practices, solutions, and standards related to their area of expertise. Builds, captures, and manages the transfer of knowledge across the Service Operations organization. Fulfils Service Requests addressed to L2 Support. Communicates with the Service Desk function and other L2 and L3 units. Incident, Change, and Problem Management and Service Request fulfilment. Solving customer incidents on time. Log file analysis and root cause analysis. Participating in major incident calls for high-priority incidents. Resolving inconsistencies in data replication. Supporting Problem Management to solve application issues. Creating/executing Service Requests for customers; providing reports and statistics. Escalating and informing about incidents in a timely manner. Documentation of tasks, incidents, problems, and changes. Documentation in ServiceNow. Documentation in knowledge bases. Improving monitoring of the application. Adding requests for monitoring. Adding alerts and thresholds for occurring issues. Implementing automation of tasks. Your skills and experience: Service Operations Specialist experience within a global operations context. Extensive experience of supporting complex application and infrastructure domains. Experience managing and mentoring Service Operations teams. Broad ITIL/best-practice service context within a real-time distributed environment. Experience managing relationships across multiple disciplines and time zones. Ability to converse clearly with internal and external staff via telephone and written communication. Good knowledge of interface technologies and communication protocols. Willingness to work in DE business hours. Clear and concise documentation in general, and especially proper documentation of the current status of incidents, problems, and service requests in the Service Management tool. Thorough and precise work style with a focus on high quality. Distinct service orientation. High degree of self-initiative. Bachelor's degree from an accredited college
or university with a concentration in IT or Computer Science related discipline (equivalent diploma or technical faculty) ITIL certification and experience with ITSM tool ServiceNow (preferred) Know How on Banking domain and preferably regulatory topics around know your customer processes Experience with databases like BigQuery and good understanding of Big Data and GCP technologies Experience in at least: GitHub, Terraform, Cloud SQL, Cloud Storage, Dataproc, Dataflow Architectural skills for big data solutions, especially interface architecture You can work very well in teams but also independent and you are constructive and target oriented Your English skills are very good and you can both communicate professionally but also informally in small talks with the team Area specific tasks / responsibilities: Handling Incident- /Problem Management und Service Request Fulfilment Analyze Incidents, which are addressed from 1st Level Support Analyze occurred errors out of the batch processing and interfaces of related systems Resolution or Workaround determination and implementation Supporting the resolution of high impact incidents on our services, including attendance at incident bridge calls Escalate incident tickets and working with members of the team and Developers Handling Service Request eg. Reports for Business and Projects Providing resolution for open problems, or ensuring that the appropriate parties have been tasked with doing so Supporting the handover from new Projects / Applications into Production Services with Service Transition before Go Life Phase Supporting Oncall-Support activities
Posted 1 month ago
10.0 - 15.0 years
30 - 40 Lacs
Noida, Pune, Bengaluru
Hybrid
Strong experience in Big Data: data modelling, design, architecting and solutioning. Understands programming languages such as SQL, Python, R and Scala. Good Python skills. Experience with data visualisation tools such as Google Data Studio or Power BI. Knowledge in A/B testing, statistics, Google Cloud Platform, Google BigQuery, Agile development, DevOps, data engineering and ETL data processing. Strong migration experience of production Hadoop clusters to Google Cloud. Good to have: expertise in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc.
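One routine step in the Hadoop-to-GCP migrations this posting describes is translating Hive table schemas into BigQuery Standard SQL types. A minimal sketch of such a translator is shown below; the type mapping and function names are illustrative assumptions, not part of the posting or of any Google tool.

```python
# Illustrative sketch: map Hive column types to BigQuery Standard SQL types.
# The mapping table is an assumption for demonstration; a real migration
# would cover complex types (ARRAY, STRUCT, MAP, DECIMAL precision) as well.

HIVE_TO_BIGQUERY = {
    "tinyint": "INT64", "smallint": "INT64", "int": "INT64", "bigint": "INT64",
    "float": "FLOAT64", "double": "FLOAT64",
    "boolean": "BOOL", "string": "STRING", "varchar": "STRING", "char": "STRING",
    "timestamp": "TIMESTAMP", "date": "DATE", "binary": "BYTES",
}

def to_bigquery_schema(hive_columns):
    """Translate [(name, hive_type), ...] into a BigQuery DDL column list."""
    cols = []
    for name, hive_type in hive_columns:
        base = hive_type.split("(")[0].strip().lower()  # drop length, e.g. varchar(50)
        bq_type = HIVE_TO_BIGQUERY.get(base)
        if bq_type is None:
            raise ValueError(f"No mapping for Hive type: {hive_type}")
        cols.append(f"{name} {bq_type}")
    return ", ".join(cols)
```

The resulting column list can be dropped into a `CREATE TABLE` statement executed against BigQuery.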
Posted 1 month ago
3.0 - 7.0 years
10 - 20 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 8 to 24 LPA. Exp: 3 to 7 years. Location: Gurgaon/Pune/Bengaluru. Notice: Immediate to 30 days. Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units. As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well-documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience in running end-to-end analytics for large-scale organizations. Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs. Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions. Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes. Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics. Databricks provides seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud, with scalable clusters for handling large datasets and complex computations while optimizing performance and cost. Must have: client engagement experience and collaboration with cross-functional teams; a data engineering background in Databricks; the ability to work effectively as an individual contributor or in collaborative team environments; effective communication and thought leadership with a proven record. Candidate Profile: Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research or related analytics areas. 3+ years of experience, which must be in data engineering. Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure. Prior experience in managing and delivering end-to-end projects. Outstanding written and verbal communication skills. Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges. Able to understand cross-cultural differences and work with clients across the globe.
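The "validation, cleansing, and transformation" step mentioned above can be illustrated with a small, self-contained sketch. Everything here (record layout, field names like `meter_id`) is an invented example for a utilities-style dataset, not taken from the posting.

```python
# Minimal sketch of an ETL data-quality step: partition raw records into
# clean and rejected sets, coercing numeric readings along the way.

def cleanse_readings(records, required=("meter_id", "reading")):
    """Return (clean, rejected) partitions of raw utility meter records."""
    clean, rejected = [], []
    for rec in records:
        # Reject records missing any required field.
        if any(rec.get(field) in (None, "") for field in required):
            rejected.append(rec)
            continue
        try:
            # Coerce the reading to float; non-numeric values are rejected.
            rec = {**rec, "reading": float(rec["reading"])}
        except (TypeError, ValueError):
            rejected.append(rec)
            continue
        clean.append(rec)
    return clean, rejected
```

In a Databricks pipeline the same logic would typically be expressed as DataFrame filters, but the partition-and-report pattern is the same.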
Posted 1 month ago
5.0 - 7.0 years
4 - 8 Lacs
Pune
Work from Office
We are looking for a skilled PostgreSQL Expert with 5 to 7 years of experience in the field. The ideal candidate should have expertise in GCP Cloud SQL, database DDL and DML, and production support. This position is located in Pune. Roles and Responsibilities: Design, develop, and implement database architectures using PostgreSQL. Develop and maintain databases on GCP Cloud SQL. Ensure high availability and performance of database systems. Troubleshoot and resolve database-related issues. Collaborate with cross-functional teams to identify and prioritize database requirements. Implement data security and access controls. Job requirements: Strong knowledge of PostgreSQL and GCP Cloud SQL. Experience with database DDL, DML, and production support. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills. Familiarity with database design principles and best practices.
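The DDL/DML skills listed above can be sketched in a few statements. The table, index, and values below are invented for illustration; the SQL is written in a portable subset so the example runs locally against SQLite, while the same pattern (schema DDL, an index for frequent lookups, transactional DML) applies to PostgreSQL on Cloud SQL.

```python
# Hedged sketch of basic DDL/DML: create a table, add an index, then
# insert and update inside a transaction. SQLite stands in for PostgreSQL
# so the example is runnable anywhere.
import sqlite3

DDL = """
CREATE TABLE accounts (
    id      INTEGER PRIMARY KEY,
    email   TEXT NOT NULL UNIQUE,
    balance NUMERIC NOT NULL DEFAULT 0
);
"""

def run_demo():
    conn = sqlite3.connect(":memory:")
    conn.execute(DDL)
    # DDL: an index supporting frequent lookups by email.
    conn.execute("CREATE INDEX idx_accounts_email ON accounts (email)")
    # DML: insert, then update, committed together.
    conn.execute("INSERT INTO accounts (email, balance) VALUES (?, ?)",
                 ("a@example.com", 100))
    conn.execute("UPDATE accounts SET balance = balance - 25 WHERE email = ?",
                 ("a@example.com",))
    conn.commit()
    (balance,) = conn.execute(
        "SELECT balance FROM accounts WHERE email = ?", ("a@example.com",)
    ).fetchone()
    return balance
```

On PostgreSQL the natural differences would be `SERIAL`/`IDENTITY` keys and driver-level connection handling (e.g. via psycopg), but the DDL/DML shape is unchanged.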
Posted 1 month ago
10.0 - 20.0 years
12 - 22 Lacs
Pune
Work from Office
Your key responsibilities: Ensures that the Service Operations team provides an optimum service level to the business lines it supports. Takes overall responsibility for the resolution of incidents and problems within the team. Oversees the resolution of complex incidents. Ensures that Analysts apply the right problem-solving techniques and processes. Assists in managing business stakeholder relationships. Assists in defining and managing OLAs with relevant stakeholders. Ensures that the team understands OLAs, resources appropriately, and is aligned to business SLAs. Ensures relevant Client Service teams are informed of progress on incidents, where necessary. Ensures that defined divisional Production Management service operations and support processes are adhered to by the team, and makes improvement recommendations where appropriate. Prepares for and, if requested, manages team review meetings. Makes suggestions for continual service improvement. Manages escalations by working with Client Services, other Service Operations Specialists and relevant functions to accurately resolve escalated issues quickly. Observes areas requiring monitoring, reporting and improvement. Identifies required metrics and ensures they are established, monitored and improved where appropriate. Continuously seeks to improve team performance. Participates in team training events, where appropriate. Works with team members to identify areas of focus where training may improve team performance and incident resolution. Mentors and coaches Production Management Analysts within the team by providing career development and counselling, as needed. Assists Production Management Analysts in setting performance targets, and manages performance against them. Identifies team bottlenecks (obstacles) and takes appropriate actions to eliminate them.
Level 3 or advanced support for technical infrastructure components. Evaluation of new products, including prototyping, and recommending new products including automation. Specify/select tools to enhance operational support. Champions activities and establishes best practices in the specialist area, working to implement best-of-breed test practices and processes in the area of profession. Defines and implements best practices, solutions and standards related to their area of expertise. Builds, captures and manages the transfer of knowledge across the Service Operations organization. Fulfils Service Requests addressed to L2 Support. Communicates with the Service Desk function and other L2 and L3 units. Incident, Change and Problem Management and Service Request Fulfilment: resolving customer incidents on time; log file analysis and root cause analysis; participating in major incident calls for high-priority incidents; resolving inconsistencies of data replication; supporting Problem Management to solve application issues; creating/executing Service Requests for customers and providing reports and statistics; escalating and informing about incidents in a timely manner. Documentation of tasks, incidents, problems and changes: documentation in ServiceNow; documentation in knowledge bases. Improving monitoring of the application: adding requests for monitoring; adding alerts and thresholds for occurring issues; implementing automation of tasks. Your skills and experience: Service Operations Specialist experience within a global operations context. Extensive experience of supporting complex application and infrastructure domains. Experience managing and mentoring Service Operations teams. Broad ITIL/best-practice service context within a real-time distributed environment. Experience managing relationships across multiple disciplines and time zones. Ability to converse clearly with internal and external staff via telephone and written communication. Good knowledge of interface technologies and communication protocols. Willingness to work in German (DE) business hours. Clear and concise documentation in general, and especially proper documentation of the current status of incidents, problems and service requests in the Service Management tool. Thorough and precise work style with a focus on high quality. Distinct service orientation. High degree of self-initiative. Bachelor's degree from an accredited college or university with a concentration in IT or a Computer Science related discipline (or an equivalent diploma or technical faculty). ITIL certification and experience with the ITSM tool ServiceNow (preferred). Know-how in the banking domain, preferably including regulatory topics around know-your-customer (KYC) processes. Experience with databases like BigQuery and a good understanding of Big Data and GCP technologies. Experience in at least: GitHub, Terraform, Cloud SQL, Cloud Storage, Dataproc, Dataflow. Architectural skills for big data solutions, especially interface architecture. You can work very well in teams but also independently, and you are constructive and target-oriented. Your English skills are very good and you can communicate professionally but also informally in small talk with the team. Area-specific tasks / responsibilities: handling Incident/Problem Management and Service Request Fulfilment; analyzing incidents escalated from 1st Level Support; analyzing errors arising from batch processing and interfaces of related systems; determining and implementing resolutions or workarounds; supporting the resolution of high-impact incidents on our services, including attendance at incident bridge calls; escalating incident tickets and working with members of the team and developers; handling Service Requests, e.g. reports for business and projects; providing resolution for open problems, or ensuring that the appropriate parties have been tasked with doing so; supporting the handover of new projects/applications into Production Services with Service Transition before the go-live phase; supporting on-call support activities.
Posted 1 month ago
8.0 - 13.0 years
2 - 30 Lacs
Bengaluru
Work from Office
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Bengaluru, Karnataka, India; Hyderabad, Telangana, India. Minimum qualifications: Bachelor's degree in Computer Science, Engineering, Mathematics, a related field, or equivalent practical experience. Experience in distributed data processing frameworks and modern Google Cloud Platform (GCP) analytical and transactional data stores like BigQuery, Cloud SQL, AlloyDB etc., and experience in one database type to write SQL. Experience in GCP. Preferred qualifications: Experience in working with/on data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools, environments, and data structures. Experience with encryption techniques like symmetric, asymmetric, HSMs, and envelope encryption, and the ability to implement secure key storage using a Key Management System. Experience in building multi-tier, high-availability applications with modern technologies such as NoSQL, MongoDB, SparkML, and TensorFlow. Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments. Experience in Big Data, information retrieval, data mining, or Machine Learning. Experience with IaC and CI/CD tools like Terraform, Ansible, Jenkins etc. About the job: The Google Cloud Platform team helps customers transform and build what's next for their business, all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers (developers, small and large businesses, educational institutions and government agencies) see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and modernization projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product issues. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work with Product Management and Product Engineering teams to build and drive excellence in our products. Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems. Responsibilities: Interact with stakeholders to translate customer requirements into recommendations for appropriate solution architectures and advisory services. Engage with technical leads and partners to lead high-velocity migration and modernization to Google Cloud Platform (GCP). Design, migrate/build and operationalize data storage and processing infrastructure using cloud-native products. Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data. Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders. Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 1 month ago
5.0 - 8.0 years
6 - 12 Lacs
Chennai
Work from Office
Design and develop scalable cloud-based data solutions on Google Cloud Platform (GCP). Build and optimize Python-based ETL pipelines and data workflows. Work with NoSQL databases (Bigtable, Firestore, MongoDB) for high-performance data management.
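A Python-based ETL pipeline of the kind described above can be sketched in a few generator stages. The record layout and field names below are invented for illustration, and the "load" target is an in-memory dict standing in for a document store such as Firestore or MongoDB.

```python
# Illustrative extract -> transform -> load workflow using generator stages,
# so records stream through the pipeline without materializing intermediates.

def extract(rows):
    """Extract: parse raw CSV-like rows into dicts."""
    for row in rows:
        device_id, value = row.split(",")
        yield {"device_id": device_id, "value": value}

def transform(records):
    """Transform: cast values to float and drop non-positive readings."""
    for rec in records:
        value = float(rec["value"])
        if value > 0:
            yield {"device_id": rec["device_id"], "value": value}

def load(records, store):
    """Load: upsert each record into the (stand-in) document store."""
    for rec in records:
        store[rec["device_id"]] = rec
    return store

def run_pipeline(rows):
    return load(transform(extract(rows)), {})
```

Swapping the dict for a real client (e.g. a Firestore document write per record) changes only the `load` stage; the extract and transform stages stay the same.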
Posted 1 month ago
11.0 - 16.0 years
40 - 45 Lacs
Pune
Work from Office
Role Description: This role is for a Senior Business Functional Analyst for Group Architecture. This role will be instrumental in establishing and maintaining bank-wide data policies, principles, standards and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers to align the target data architecture with the enterprise data architecture principles, applying agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards. Your key responsibilities: Data Architecture: The candidate will work closely with stakeholders to understand their data needs, break out business requirements into implementable building blocks and design the solution's target architecture. AI/ML: Identify and support the creation of AI use cases focused on delivering the data architecture strategy and data governance tooling. Identify AI/ML use cases and architect pipelines that integrate data flows, data lineage and data quality. Embed AI-powered data quality, detection and metadata enrichment to accelerate data discoverability. Assist in defining and driving the data architecture standards and requirements for AI that need to be enabled and used. GCP Data Architecture & Migration: Strong working experience in GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), along with an appropriate GCP architecture-level certification. Experience in handling hybrid architectures and patterns addressing non-functional requirements like data residency, compliance (e.g. GDPR) and security and access control. Experience in developing reusable components and reference architectures using IaC (Infrastructure as Code) platforms such as Terraform.
Data Mesh: The candidate is expected to have proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership. The candidate must have good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value. Data Management Tooling: Assess various tools and solutions comprising data governance capabilities such as data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in the development of the medium- to long-term target state of the technologies within the data governance domain. Collaboration: Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions. Your skills and experience: Demonstrable experience in designing and deploying AI tooling architectures and use cases. Extensive experience in data architecture within Financial Services. Strong technical knowledge of data integration patterns, batch and stream processing, data lake/data lakehouse/data warehouse/data mart, caching patterns and policy-based fine-grained data access. Proven experience in working on data management principles, data governance, data quality, data lineage and data integration with a focus on Data Mesh. Knowledge of data modelling concepts like dimensional modelling and 3NF. Experience of systematic, structured review of data models to enforce conformance to standards. High-level understanding of data management solutions, e.g. Collibra, Informatica Data Governance etc. Proficiency in data modeling and experience with different data modelling tools. Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingestion. Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions.
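One of the dimensional-modelling concepts referenced above, the Type 2 slowly changing dimension, can be sketched briefly: when a tracked attribute changes, the current row is closed out and a new current row is appended, preserving history. The record layout below is an invented illustration, not the bank's design.

```python
# Hedged sketch of a Type 2 slowly changing dimension (SCD2) update.
# Rows are plain dicts; a warehouse implementation would use MERGE/UPSERT.

def scd2_upsert(dimension, key, attrs, effective_date):
    """Apply a Type 2 change for `key` in a list-of-dicts dimension table."""
    for row in dimension:
        if row["key"] == key and row["is_current"]:
            if all(row.get(k) == v for k, v in attrs.items()):
                return dimension  # attributes unchanged, nothing to do
            # Close out the current row instead of overwriting it.
            row["is_current"] = False
            row["end_date"] = effective_date
            break
    dimension.append({"key": key, **attrs,
                      "start_date": effective_date,
                      "end_date": None, "is_current": True})
    return dimension
```

A point-in-time query then filters on `start_date`/`end_date` rather than reading a single mutable row, which is what makes the history auditable.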
Posted 1 month ago
18.0 - 22.0 years
25 - 30 Lacs
Pune
Work from Office
Treasury Technology is responsible for the design, build and operation of Deutsche Bank's Treasury trading, balance-sheet management and liquidity reporting ecosystem. In partnership with the Treasury business we look to deliver innovative technology solutions that will enable the business to gain competitive edge and operational efficiency. This is a global role leading the Engineering function for the Treasury Engineering product portfolio. The aim is to develop a best-in-class portfolio consisting of the following products: Liquidity Measurement and Management; Issuance and Securitization; Risk in Banking Book; Funds Transfer Pricing. Treasury is about managing the money and financial risks in a business. This involves making sure the business has the capital it needs to manage its day-to-day business obligations, while helping develop its long-term financial strategy and policies. Economic factors such as interest rate rises, changes in regulations and volatile foreign exchange rates can have a serious impact on any business. Treasury is responsible for monitoring and assessing market conditions and putting strategies in place to mitigate any potential financial risks to the business. As a senior leader in Software Engineering, you will lead a highly inspired and inquisitive team of technologists to develop applications to the highest standards. You will be expected to solve complex business and technical challenges while managing a large group of senior business stakeholders. You will build an effective and trusted global engineering capability that can deliver consistently against the business ambitions. You are expected to take ownership of the quality of the platform, dev automation, agile processes and production resiliency. Position Specific Responsibilities and Accountabilities: Lead the Global Engineering function across our strategic locations in Pune, Bucharest, London and New York. Communicate with senior business stakeholders with regards to the vision and business goals.
Provide transparency into program status, and manage risks and issues. Lead a culture of innovation and experimentation; support a full software development lifecycle that incorporates the best of technology approaches and delivery methodologies. Ensure on-time product releases that are of high quality, enabling the core vision of next-generation trade processing systems compliant with regulatory requirements. Lead development of the next generation of cloud-enabled platforms, which includes modern web frameworks and complex transaction processing systems leveraging a broad set of technology stacks. Experience in building fault-tolerant, low-latency, scalable solutions that perform at global enterprise scale. Implement the talent strategy for engineering aligned to the broader Treasury Technology talent strategy and operating model. Develop applications with industry best practices using DevOps and automated deployment and testing frameworks. Skills Matrix: Education Qualifications: Degree from an accredited college or university (or equivalent certification and/or relevant work experience). Business Analysis and SME Experience: 18+ years of experience in the following areas: Well-developed requirements analysis skills, including good communication abilities (both speaking and listening) and stakeholder management (all levels up to Managing Director). Experience working with Front Office business teams highly desirable. Experience in IT delivery or architecture, including experience as an Application Developer and people manager. Strong object-oriented design skills. Previous experience hiring, motivating and managing internal and vendor teams.
Technical Experience Mandatory Skills: Java, ideally Spark and Scala. Oracle, Postgres and other database technologies. Experience developing microservices-based architectures. UI design and implementation. Business process management tools (e.g. jBPM, IBM BPM). Experience with a range of BI technologies including Tableau. Experience with DevOps best practices (DORA), CI/CD. Experience in application security, scalability, performance tuning and optimization (NFRs). Experience in API design; sound knowledge of microservices, containerization (Docker), exposure to federated and NoSQL DBs. Experience in database query tuning and optimization. Experience in implementing DevOps best practices including CI/CD and API testing automation. Experience working in an Agile-based team, ideally Scrum. Desirable skills: Experience with cloud services platforms, in particular Google Cloud, and internal cloud-based development (Cloud Run, Cloud Composer, Cloud SQL, Docker, K8s). Industry Domain Experience: Hands-on knowledge of enterprise technology platforms supporting Front Office, Finance and/or Risk domains would be a significant advantage, as would experience or interest in Sustainable Finance. For example: Knowledge of the Finance/Controlling domain and end-to-end workflow for banking and trading businesses. High-level understanding of financial products across Investment, Corporate and Private/Retail banking, in particular Loans. Knowledge of the investment banking, sales and trading, asset management and similar industries is a strong advantage. Clear Thought & Leadership: A mindset built on simplicity. A clear understanding of the concept of re-use in software development, and the drive to apply it relentlessly. Proficiency to talk in functional and data terms to clients, embedded architects and senior managers. Technical leadership skills. Ability to work in a fast-paced environment with competing and alternating priorities with a constant focus on delivery.
Proven ability to balance business demands and IT fulfilment in terms of standardisation, reducing risk and increasing IT flexibility. Logical and structured approach to problem-solving in both near-term (tactical) and mid- to long-term (strategic) horizons. Communication: Good verbal as well as written communication and presentation capabilities. Good team player, facilitator, negotiator and networker. Able to lead senior managers towards common goals and build consensus across a diverse group. Able to lead and influence a diverse team from a range of technical and non-technical backgrounds.
Posted 1 month ago
3.0 - 8.0 years
11 - 16 Lacs
Pune
Work from Office
Job Title: Lead Engineer. Location: Pune. Corporate Title: Director. As a lead engineer within the Transaction Monitoring department, you will lead and drive forward critical engineering initiatives and improvements to our application landscape, while supporting and leading the engineering teams to excel in their roles. You will be closely aligned to the architecture function and delivery leads, ensuring alignment with planning and that correct design and architecture governance is followed for all implementation work. You will lead by example and drive and contribute to automation and innovation initiatives with the engineering teams. Join the fight against financial crime with us! Your key responsibilities: Experienced hands-on cloud and on-premise engineer, leading by example with engineering squads. Thinking analytically, with a systematic and logical approach to solving complex problems, and high attention to detail. Design and document complex technical solutions at varying levels in an inclusive and participatory manner with a range of stakeholders. Liaise and interface directly with stakeholders in technology, business and modelling areas. Collaborating with application development teams to design and prototype solutions (both on-premises and on-cloud), supporting/presenting these via the Design Authority forum for approval, and providing good practice and guidelines to the teams. Ensuring engineering and architecture compliance with bank-standard processes for deploying new applications, working directly with central functions such as Group Architecture, Chief Security Office and Data Governance. Innovate and think creatively, showing willingness to apply new approaches to solving problems and to learn new methods, technologies and potentially outside-of-the-box solutions. Your skills and experience: Proven hands-on engineering and design experience in a delivery-focused (preferably agile) environment. Solid technical/engineering background, preferably with at least two high-level languages and multiple relational databases or big-data technologies. Proven experience with cloud technologies, preferably GCP (GKE / Dataproc / Cloud SQL / BigQuery), GitHub and Terraform. Competence/expertise in technical skills across a wide range of technology platforms and the ability to use and learn new frameworks, libraries and technologies. A deep understanding of the software development lifecycle and the waterfall and agile methodologies. Experience leading complex engineering initiatives and engineering teams. Excellent communication skills, with demonstrable ability to interface and converse at both junior and senior levels and with non-IT staff. Line management experience, including working in a matrix management configuration. How we'll support you: Training and development to help you excel in your career. Flexible working to assist you in balancing your personal priorities. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
Posted 1 month ago