Jobs
Interviews

167 Cloud SQL Jobs - Page 7

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

6.0 - 8.0 years

8 - 10 Lacs

Pune

Work from Office

Job Summary

We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform, primarily BigQuery. You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads.

Key Responsibilities
- Gather and analyze business requirements and translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.).
- Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost.
- Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency.
- Establish and maintain data modeling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers.
- Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt).
- Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies.
- Conduct impact assessments for schema changes and guide version-control processes for data models.
- Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews.
- Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations.

Must-Have Skills
- 6-8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery.
- Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault.
- Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies).
- Strong understanding of ETL/ELT concepts and experience with tools such as Dataflow, Cloud Composer, or dbt.
- Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance.
- Experience with data governance practices: data cataloging, lineage, and metadata management (e.g., Data Catalog).
- Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions.

Good to Have
- Experience working on Azure Cloud (Fabric, Synapse, Delta Lake)

Education
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
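The partitioning and clustering work this role describes boils down to BigQuery DDL like the following. This is a minimal sketch that assembles such a statement as a string; the table and column names are hypothetical examples, not from the posting.

```python
# Build a BigQuery CREATE TABLE statement with time partitioning and
# clustering -- the query-performance/cost pattern the posting mentions.
# Table and column names here are hypothetical.

def build_partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Return BigQuery DDL with PARTITION BY and CLUSTER BY clauses."""
    col_list = ",\n  ".join(f"{name} {type_}" for name, type_ in columns)
    return (
        f"CREATE TABLE `{table}` (\n  {col_list}\n)\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)};"
    )

ddl = build_partitioned_table_ddl(
    "analytics.fact_sales",
    [("order_id", "STRING"), ("order_ts", "TIMESTAMP"),
     ("region", "STRING"), ("amount", "NUMERIC")],
    partition_col="order_ts",
    cluster_cols=["region", "order_id"],
)
print(ddl)
```

Partitioning on the timestamp column limits how much data a date-filtered query scans, and clustering on the most common filter columns reduces scan cost further.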

Posted 2 months ago

Apply

2.0 - 7.0 years

2 - 7 Lacs

Chennai, Tamil Nadu, India

On-site

About The Role

Job Summary: We're seeking an experienced Lead Application Developer with a strong background in PostgreSQL, database management, and PL/SQL. The ideal candidate will have a proven track record of leading teams, designing and developing complex applications, and migrating databases to cloud-based platforms. The successful candidate will be responsible for leading the development team, ensuring the delivery of high-quality applications, and providing technical guidance and expertise.

Key Responsibilities:
1. Technical Leadership: Provide technical guidance and leadership to the development team, ensuring the delivery of high-quality applications.
2. Application Development: Design, develop, test, and deploy complex applications using PostgreSQL, PL/SQL, and other relevant technologies.
3. Database Management: Manage and maintain large-scale databases, ensuring data integrity, security, and performance.
4. Database Migration: Lead database migration projects, migrating on-premises databases to cloud-based platforms such as Google Cloud.
5. Cloud Exposure: Design and develop cloud-based applications, leveraging cloud services such as Google Cloud SQL.
6. Team Management: Lead, mentor, and coach junior developers, providing guidance and support to ensure their growth and development.
7. Code Reviews: Perform code reviews, ensuring adherence to coding standards, best practices, and security guidelines.
8. Troubleshooting: Troubleshoot complex technical issues, providing timely and effective solutions.
9. Communication: Collaborate with cross-functional teams, communicating technical information and project status to both technical and non-technical stakeholders.
10. Staying Up-to-Date: Stay current with industry trends, emerging technologies, and new tools, applying this knowledge to improve the development team's skills and processes.

Requirements:
1. Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
2. Experience: 12+ years of experience in application development, with a focus on PostgreSQL, database management, and PL/SQL.
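A common first step in the kind of Postgres-to-Cloud migration this role describes is a dump-and-restore. The sketch below just assembles the commands; hostnames, database names, and file paths are hypothetical, and a real Cloud SQL import would typically go through the Cloud SQL Auth Proxy or a Cloud Storage import.

```python
# Sketch: assemble pg_dump / pg_restore commands for a Postgres-to-Cloud-SQL
# migration. All names here are illustrative placeholders.
import shlex

def dump_command(host, db, out_file):
    """pg_dump in custom format (-Fc), suitable for pg_restore."""
    return ["pg_dump", "-h", host, "-d", db, "-Fc", "-f", out_file]

def restore_command(host, db, dump_file):
    """pg_restore into the target; --no-owner avoids role mismatches."""
    return ["pg_restore", "-h", host, "-d", db, "--no-owner", dump_file]

dump = dump_command("onprem-db.internal", "orders", "orders.dump")
restore = restore_command("127.0.0.1", "orders", "orders.dump")  # via Cloud SQL proxy
print(shlex.join(dump))
print(shlex.join(restore))
```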

Posted 2 months ago

Apply

11.0 - 16.0 years

27 - 32 Lacs

Noida

Work from Office

Responsibilities:
- Collaborate with the sales team to understand customer challenges and business objectives and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI, GenAI solutions.
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions.
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions.
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel.
- Stay up to date on the latest GCP offerings, trends, and best practices.

Experience:
- Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premises databases to scalable and cost-effective solutions on Google Cloud Platform (GCP).
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP.
- Strong experience in BI reporting tools (Looker, Power BI, and Tableau).
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI, and Gemini (GenAI).
- Strong knowledge and experience in providing solutions to process massive datasets in real time and batch using cloud-native/open-source orchestration techniques.
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data.
- Strong knowledge and experience in best practices for data governance, security, and compliance.
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs.
- Strong analytical and problem-solving skills.
- Ability to work independently and as part of a team.

Posted 2 months ago

Apply

4.0 - 8.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Job Overview:

We are seeking a Site Reliability Engineer (SRE) with expertise in Autosys and Google Cloud Platform (GCP) to join our dynamic team. The ideal candidate will have strong hands-on experience in job scheduling and automation using Autosys, as well as a deep understanding of cloud infrastructure and operations on Google Cloud. You will be responsible for ensuring the reliability, scalability, and performance of cloud-based applications and infrastructure, while managing complex workflows and automating critical operations. This is a great opportunity for a highly motivated individual to work in a collaborative environment where you'll apply your skills to build and maintain highly reliable cloud infrastructure solutions.

Key Responsibilities:
- Work with Google Cloud Platform (GCP) to design, deploy, and maintain cloud infrastructure. Manage GCP services such as Compute Engine, Cloud Functions, Kubernetes Engine (GKE), and Cloud Storage.
- Manage and automate job scheduling using Autosys to ensure that critical workflows run smoothly, are optimized for performance, and have minimal downtime. Troubleshoot, monitor, and resolve issues related to Autosys jobs.
- Implement best practices for monitoring, alerting, and incident management to maintain high system uptime and service reliability. Develop automated solutions for routine tasks to ensure consistency and prevent downtime.
- Collaborate with development teams to integrate and maintain CI/CD pipelines for continuous delivery of applications, ensuring seamless and efficient deployments across GCP environments.
- Ensure the security, integrity, and compliance of all cloud-based systems within the GCP environment. Work with security teams to implement security best practices such as identity and access management (IAM), firewalls, and data encryption.
- Set up and maintain monitoring solutions (e.g., Prometheus, Grafana, Stackdriver for GCP) to track system health and performance. Respond promptly to incidents, troubleshoot issues, and ensure effective resolution.
- Analyze system performance and provide recommendations for improvements. Optimize resources to ensure applications are running cost-effectively, with good resource utilization in GCP.
- Work closely with development, QA, and operations teams to ensure the smooth deployment and operation of applications. Participate in on-call rotations and incident management processes to maintain application uptime.
- Document processes, troubleshooting guides, architecture diagrams, and standard operating procedures for system reliability. Conduct knowledge-sharing sessions and help build a knowledge base within the team.

Requirements:
- 4 to 8 years of experience in Site Reliability Engineering (SRE) or Operations Engineering with hands-on experience in cloud environments (specifically GCP).
- Strong experience with Autosys, including job scheduling, monitoring, and automation of workflows. Familiarity with Autosys configuration, job dependencies, and troubleshooting is essential.
- Experience with Google Cloud Platform (GCP) services such as Compute Engine, Cloud Functions, GKE, Cloud Storage, and Cloud SQL.
- Strong experience with Linux/Unix systems and system administration.
- Proficiency in scripting languages such as Python, Bash, or shell scripting to automate workflows, manage cloud resources, and handle repetitive tasks.
- Familiarity with tools like Prometheus, Grafana, and Google Stackdriver for cloud monitoring, logging, and alerting.
- Hands-on experience with Docker and Kubernetes, especially Google Kubernetes Engine (GKE), for container orchestration and deployment.
- Knowledge of continuous integration and deployment tools (e.g., Jenkins, GitLab CI, Terraform) for automated deployments and infrastructure management.
- Experience with Git for version control, code reviews, and managing automation scripts.
- Strong troubleshooting, debugging, and analytical skills, with the ability to identify and resolve system failures or performance issues.
- Familiarity with IAM roles, security best practices, and compliance standards within the GCP ecosystem.
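The Autosys scheduling and job-dependency work described above is expressed in JIL (Job Information Language) definitions. Below is a minimal sketch that renders one command job with a success dependency; the job, script, and machine names are hypothetical.

```python
# Sketch: render an Autosys JIL definition for a command job (job_type: c)
# that runs only after a predecessor succeeds (condition: s(...)).
# Job and machine names are hypothetical examples.
def render_jil(job, command, machine, depends_on=None):
    lines = [
        f"insert_job: {job}",
        "job_type: c",
        f"command: {command}",
        f"machine: {machine}",
    ]
    if depends_on:
        lines.append(f"condition: s({depends_on})")
    return "\n".join(lines)

jil = render_jil("load_sales", "/opt/etl/load_sales.sh", "etl-host-01",
                 depends_on="extract_sales")
print(jil)
```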

Posted 2 months ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

About The Role

Job Title: DevOps Engineer, AS
Location: Bangalore, India

Role Description
Deutsche Bank has set itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation, and Corporate Sustainability. As climate change throws up new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products, and various sustainability applications which will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good by leveraging their technology skill set in cloud/hybrid architecture. As part of this role, we are seeking a highly skilled and experienced DevOps Engineer to join our growing team. In this role, you will play a pivotal part in managing and optimizing cloud infrastructure, facilitating continuous integration and delivery, and ensuring system reliability.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for 35 yrs. and above

Your key responsibilities
- Create, implement, and oversee scalable, secure, and cost-efficient cloud infrastructure on Google Cloud Platform (GCP).
- Utilize Infrastructure as Code (IaC) methodologies with tools such as Terraform, Deployment Manager, or alternatives.
- Implement robust security measures to ensure data access control and compliance with regulations. Adopt security best practices, establish IAM policies, and ensure adherence to both organizational and regulatory requirements.
- Set up and manage Virtual Private Clouds (VPCs), subnets, firewalls, VPNs, and interconnects to facilitate secure cloud networking.
- Establish continuous integration and continuous deployment (CI/CD) pipelines using Jenkins, GitHub Actions, or comparable tools for automated application deployments.
- Implement monitoring and alerting solutions through Stackdriver (Cloud Operations), Prometheus, or other third-party applications.
- Evaluate and optimize cloud expenditures by utilizing committed use discounts, autoscaling features, and resource rightsizing.
- Manage and deploy containerized applications through Google Kubernetes Engine (GKE) and Cloud Run.
- Deploy and manage GCP databases like Cloud SQL and BigQuery.

Your skills and experience
- Minimum of 5+ years of experience in DevOps or similar roles with hands-on experience in GCP.
- In-depth knowledge of Google Cloud services (e.g., GCE, GKE, Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud Storage) and the ability to architect, deploy, and manage cloud-native applications.
- Proficient in using tools like Jenkins, GitLab, Terraform, Ansible, Docker, and Kubernetes.
- Experience with Infrastructure as Code (IaC) tools like Terraform, CloudFormation, or GCP-native Deployment Manager.
- Solid understanding of security protocols, IAM, networking, and compliance requirements within cloud environments.
- Strong problem-solving skills and ability to troubleshoot cloud-based infrastructure.
- Google Cloud certifications (e.g., Associate Cloud Engineer, Professional Cloud Architect, or Professional DevOps Engineer) are a plus.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
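One of the cost levers listed in the posting above, committed use discounts, comes down to simple arithmetic. The sketch below illustrates it; the 37% discount rate and hourly price are illustrative assumptions, not published GCP rates.

```python
# Sketch: estimate monthly savings from a committed use discount versus
# on-demand pricing. The discount rate and price are example figures only.
def cud_savings(on_demand_hourly, hours_per_month=730, discount=0.37):
    """Return the estimated monthly saving in the same currency unit."""
    on_demand = on_demand_hourly * hours_per_month
    committed = on_demand * (1 - discount)
    return round(on_demand - committed, 2)

saving = cud_savings(on_demand_hourly=0.10)
print(f"Estimated monthly saving: ${saving}")
```

The same comparison generalizes to rightsizing: replace the discount with the ratio between the current and the correctly sized machine's hourly price.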

Posted 2 months ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Salary: 15 to 30 LPA
Exp: 3 to 8 years
Location: Gurgaon (Hybrid)
Notice: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 2 months ago

Apply

7 - 12 years

5 - 15 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

We are hiring for PostgreSQL DBA

Experience range & skills (mandatory, with no exceptions): 7+ years as a PostgreSQL DBA, with cloud & SQL experience.
Education: BE/B.Tech/MCA/M.Tech/MSc./MS

Responsibilities
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
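Day-to-day diagnosis of the kind a PostgreSQL DBA does often starts from the `pg_stat_activity` view. The sketch below builds such a diagnostic query for sessions active longer than a threshold; the threshold value is an example.

```python
# Sketch: build a pg_stat_activity query to find long-running active
# sessions -- a routine PostgreSQL DBA health check. The interval
# threshold is an example value.
def long_running_queries_sql(min_minutes=5):
    return (
        "SELECT pid, now() - query_start AS runtime, state, query\n"
        "FROM pg_stat_activity\n"
        "WHERE state = 'active'\n"
        f"  AND now() - query_start > interval '{min_minutes} minutes'\n"
        "ORDER BY runtime DESC;"
    )

print(long_running_queries_sql(10))
```

A DBA would typically follow up on the reported `pid` values with `pg_cancel_backend` or `pg_terminate_backend` if a query is genuinely stuck.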

Posted 2 months ago

Apply

4 - 9 years

15 - 19 Lacs

Pune

Work from Office

About The Role

Job Title: Technical-Specialist GCP Developer
Location: Pune, India

Role Description
This role is for an Engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure that good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. The candidate should come from a strong technological background and have good working experience in Spark and GCP technology. They should be hands-on and able to work independently, requiring minimal technical/tool guidance, and be able to technically guide and mentor junior resources in the team. As a developer you will bring extensive design and development skills to reinforce the group of developers within the team. The candidate will extensively use and apply Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for 35 yrs. and above

Your key responsibilities
- Design and discuss your own solution for addressing user stories and tasks.
- Develop, unit-test, integrate, deploy, maintain, and improve software.
- Perform peer code review.
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meeting, sprint planning, retrospectives, etc.
- Apply continuous integration best practices in general (SCM, build automation, unit testing, dependency management).
- Collaborate with other team members to achieve the sprint objectives.
- Report progress and update Agile team management tools (JIRA/Confluence).
- Manage individual task priorities and deliverables.
- Take responsibility for the quality of the solutions you provide.
- Contribute to planning and continuous improvement activities, and support the PO, ITAO, developers, and Scrum Master.

Your skills and experience
- Engineer with good development experience on Google Cloud Platform for at least 4 years.
- Hands-on experience in BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions.
- Experience in set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps.
- Create and maintain fully automated CI build processes and write build and deployment scripts.
- Experience with development platforms: OpenShift/Kubernetes/Docker configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR.
- Good knowledge of the core SDLC processes and tools such as HP ALM, Jira, and ServiceNow.
- Knowledge of working with APIs and microservices, integrating external and internal web services including SOAP, XML, REST, and JSON.
- Strong analytical skills.
- Proficient communication skills; fluent in English (written/verbal).
- Ability to work in virtual teams and in matrixed organizations.
- Excellent team player; open-minded and willing to learn business and technology.
- Keeps pace with technical innovation and understands the relevant business area.
- Ability to share information and transfer knowledge to team members.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
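For the REST/JSON integration skills listed above, the core of consuming a service response looks like the following. The payload shape is a hypothetical example; a real call would use an HTTP client against the actual service endpoint.

```python
# Sketch: parse a JSON payload of the kind a REST web service returns.
# The payload structure and field names are illustrative only.
import json

raw = '{"status": "ok", "items": [{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]}'
payload = json.loads(raw)

# Pull out one field per item, as a downstream integration step might.
names = [item["name"] for item in payload["items"]]
print(names)  # ['alpha', 'beta']
```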

Posted 2 months ago

Apply

5 - 10 years

20 - 35 Lacs

Bengaluru

Hybrid

GCP Data Engineer
- 5+ years of experience
- GCP (all services needed for Big Data pipelines, e.g. BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine), Spark, Scala, Hadoop
- Python, PySpark, orchestration (Airflow), SQL
- CI/CD (experience with deployment pipelines)
- Architecture and design of cloud-based Big Data pipelines and exposure to any ETL tools

Nice to have:
- GCP certifications
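At its core, the Airflow/Cloud Composer orchestration listed above is dependency resolution over pipeline tasks: run each task only after its upstream tasks finish. A minimal standard-library sketch, with hypothetical task names:

```python
# Sketch: resolve a valid execution order for pipeline tasks from their
# dependencies -- the scheduling core of orchestrators like Airflow or
# Cloud Composer. Task names are hypothetical.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on (its predecessors).
deps = {
    "load_bq": {"transform"},
    "transform": {"extract_pubsub", "extract_gcs"},
    "extract_pubsub": set(),
    "extract_gcs": set(),
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # extracts first, then transform, then load_bq
```

Real orchestrators add scheduling, retries, and parallelism on top, but any DAG run must respect exactly this ordering.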

Posted 2 months ago

Apply

4 - 7 years

10 - 19 Lacs

Indore, Gurugram, Bengaluru

Work from Office

We need GCP engineers for capacity building:
- The candidate should have extensive production experience (1-2 years) in GCP; other cloud experience would be a strong bonus.
- Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to enterprise application development is a must.

Roles and Responsibilities
- 4-7 years of IT experience range is preferred.
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, and Cloud Functions.
- Strong experience in Big Data technologies (Hadoop, Sqoop, Hive, and Spark), including DevOps.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
- Ability to drive the deployment of customers' workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical road-maps for GCP cloud implementations.
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
- Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams.
- Technical ability to become certified in required GCP technical certifications.
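Pub/Sub, listed among the managed services above, delivers messages at least once, so consumers are usually written idempotently. An illustrative in-memory sketch of deduplicating by message ID; a production system would keep the seen-set in a durable store rather than process memory.

```python
# Sketch: idempotent message handling for an at-least-once delivery system
# like Pub/Sub. The in-memory 'seen' set is illustrative only.
def process_once(message_id, payload, seen, results):
    """Process a message unless its ID was already handled; return whether it ran."""
    if message_id in seen:
        return False  # duplicate redelivery, skip
    seen.add(message_id)
    results.append(payload)
    return True

seen, results = set(), []
deliveries = [("m1", "a"), ("m2", "b"), ("m1", "a")]  # m1 redelivered
for mid, body in deliveries:
    process_once(mid, body, seen, results)
print(results)  # ['a', 'b']
```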

Posted 2 months ago

Apply

5 - 7 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Database Administrator
Project Role Description: Administer, develop, test, or demonstrate databases. Perform many related database functions across one or more teams or clients, including designing, implementing and maintaining new databases, backup/recovery and configuration management. Install database management systems (DBMS) and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.
Must have skills: GCP Dataflow
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: BTech

Summary: As a Database Administrator, you will administer, develop, test, or demonstrate databases. You will perform many related database functions across one or more teams or clients, including designing, implementing and maintaining new databases, backup/recovery and configuration management. Your typical day will involve installing database management systems (DBMS) and providing input for modification of procedures and documentation used for problem resolution and day-to-day maintenance. You will play a crucial role in ensuring the smooth functioning of databases and contributing to the overall success of the team.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, develop, and implement database systems based on project requirements.
- Perform database administration tasks, including installation, configuration, and maintenance of database management systems.
- Ensure data integrity and security by implementing appropriate access controls and backup/recovery procedures.
- Optimize database performance by monitoring and tuning database parameters and queries.
- Collaborate with cross-functional teams to identify and resolve database-related issues.
- Conduct regular database performance analysis and capacity planning to ensure scalability and efficiency.
- Stay updated with the latest database technologies and trends to recommend improvements and enhancements.
- Train and mentor junior database administrators to enhance their skills and knowledge.

Professional & Technical Skills:
Must-Have Skills:
- Proficiency in GCP Dataflow.
- Strong understanding of database management systems and concepts.
- Experience in designing, implementing, and maintaining databases.
- Knowledge of backup/recovery and configuration management.
- Hands-on experience with SQL and scripting languages for database administration tasks.
- Familiarity with database security and access control mechanisms.
Good-to-Have Skills:
- Experience with cloud-based database platforms such as Google Cloud SQL or Amazon RDS.
- Experience with database performance tuning and optimization techniques.
- Knowledge of data warehousing and ETL processes.
- Understanding of data modeling and database design principles.

Additional Information: The candidate should have a minimum of 5 years of experience in GCP Dataflow. This position is based at our Bengaluru office. A BTech degree is required.
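The backup/recovery responsibilities above imply a retention policy. The sketch below implements one possible rule (keep the last N daily backups plus every Sunday backup as weeklies); the rule and dates are illustrative, and real DBA tooling would drive this from the backup catalog.

```python
# Sketch: decide which backups to keep under a simple retention rule --
# recent dailies plus Sunday weeklies. Dates are example data.
from datetime import date, timedelta

def backups_to_keep(backup_dates, keep_daily=7):
    """Return the subset of backup dates the policy retains."""
    latest = max(backup_dates)
    keep = set()
    for d in backup_dates:
        if (latest - d).days < keep_daily:   # recent daily backups
            keep.add(d)
        elif d.weekday() == 6:               # Sunday weeklies
            keep.add(d)
    return keep

dates = [date(2024, 1, 1) + timedelta(days=i) for i in range(30)]
kept = backups_to_keep(dates)
print(sorted(kept))
```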

Posted 2 months ago

Apply

7 - 12 years

13 - 17 Lacs

Gurugram

Work from Office

- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and to interface directly with customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge from attending educational workshops and reviewing publications.

Posted 2 months ago

Apply

1 - 3 years

3 - 6 Lacs

Bengaluru

Work from Office

Skill required: Record To Report - Invoice Processing
Designation: Record to Report Ops Associate
Qualifications: BCom/MCom
Years of Experience: 1 to 3 years
Language Ability: English (Domestic) - Expert

What would you do?
You will be aligned with our Finance Operations vertical and will be helping us determine financial outcomes by collecting operational data/reports, while conducting analysis and reconciling transactions. This includes posting journal entries, preparing balance sheet reconciliations, reviewing entries and reconciliations, preparing cash forecasting statements, supporting month-end closing, preparing reports, and supporting audits. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice processing workflow, improving efficiency and reducing the likelihood of errors.

What are we looking for?
- Google Cloud SQL
- Adaptable and flexible
- Ability to perform under pressure
- Problem-solving skills
- Agility for quick learning
- Commitment to quality

Roles and Responsibilities:
- In this role you are required to solve routine problems, largely through precedent and referral to general guidelines.
- Your expected interactions are within your own team and with your direct supervisor.
- You will be provided detailed to moderate levels of instruction on daily work tasks and detailed instruction on new assignments.
- The decisions that you make impact your own work.
- You will be an individual contributor as part of a team, with a predetermined, focused scope of work.
- Please note that this role may require you to work in rotational shifts.
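The matching step described above, comparing an invoice against the purchase order and the delivery receipt before releasing payment, is the classic three-way match. A minimal sketch; the field names and price tolerance are illustrative assumptions.

```python
# Sketch: three-way match in invoice processing -- quantities must agree
# across invoice, purchase order, and goods receipt, and the invoiced unit
# price must match the PO price within a tolerance. Field names are examples.
def three_way_match(invoice, po, receipt, tolerance=0.01):
    """Return True when the three documents agree and payment can proceed."""
    qty_ok = invoice["qty"] == po["qty"] == receipt["qty"]
    price_ok = abs(invoice["unit_price"] - po["unit_price"]) <= tolerance
    return qty_ok and price_ok

inv = {"qty": 10, "unit_price": 25.00}
po = {"qty": 10, "unit_price": 25.00}
grn = {"qty": 10}  # goods receipt note
print(three_way_match(inv, po, grn))  # True
```

Automated invoice-processing systems run exactly this check and route mismatches to an exception queue for manual review.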

Posted 2 months ago

Apply

11 - 20 years

45 - 75 Lacs

Gurugram

Work from Office

Role & responsibilities
- Proven success architecting and scaling complex software solutions; familiarity with interface design.
- Experience and ability to drive a project/module independently from an execution standpoint.
- Prior experience with scalable architecture and distributed processing.
- Strong programming expertise in Python, SQL, and Scala.
- Hands-on experience with any major big data solutions like Spark, Kafka, or Hive.
- Strong data management skills with ETL, DWH, data quality, and data governance.
- Hands-on experience with microservices architecture, Docker, and Kubernetes as orchestration.
- Experience with cloud-based data stores like Redshift and BigQuery.
- Experience in cloud solution architecture.
- Experience architecting Spark jobs on k8s and optimizing Spark jobs.
- Experience with MLOps architecture/tools/orchestrators like Kubeflow and MLflow.
- Experience with logging, metrics, and distributed tracing systems (e.g. Prometheus/Grafana/Kibana).
- Experience in CI/CD using Octopus/TeamCity/Jenkins.

Interested candidates can share their updated resume at surinder.kaur@mounttalent.com

Posted 2 months ago

Apply

3 - 7 years

8 - 13 Lacs

Pune

Work from Office

About The Role : J ob Title Senior Full Stack Engineer Corporate TitleAssistant Vice President LocationPune, India Role Description Enterprise SRE Team in CB is responsible for making Production Better by boosting Observability and strengthening reliability across Corporate Banking. The team actively works on building common platforms, reference architectures, tools for production engineering teams to standardize processes across CB. We work in agile environment with focus on Customer centricity and outstanding user experience with high reliability and flexibility of technical solutions in mind. With our platform we want to be an enabler for highest quality cloud-based software solutions and processes at Deustche Bank. Deutsche Banks Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. 
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities (What You'll Do)
- Work on the SLO Dashboard, an application owned by the CB SRE team, ensuring its design (a highly scalable and performant solution), development, and maintenance.
- Participate in requirement workshops, analyze requirements, perform technical design, and take ownership of the development process.
- Identify and implement appropriate tools to support engineering automation, including test automation and CI/CD pipelines.
- Understand technical needs, prioritize requirements, and manage technical debt based on stakeholder urgency.
- Collaborate with the UI/UX designer while being mindful of backend changes and their impact on architecture or endpoint modifications during discussions.
- Produce detailed design documents and guide junior developers to align with the priorities and deliverables of the SLO Dashboard.

Your skills and experience
- Several years of relevant experience in software architecture, design, development, and engineering, ideally in the banking/financial services industry.
- Strong engineering, solution, and domain architecture background and up-to-date knowledge of software engineering topics such as microservices, streaming architectures, high performance, horizontal scaling, API design, GraphQL, REST services, database systems, UI frameworks, distributed caching (e.g. Apache Ignite, Hazelcast, Redis), enterprise integration patterns, and modern SDLC practices.
- Good experience working in GCP (cloud-based technologies) using GKE, Cloud SQL (Postgres), Cloud Run, and Terraform.
- Good experience in DevOps using GitHub Actions for builds and Liquibase pipelines.
- Fluent in an application development stack such as Java/Spring Boot (3.0+), ReactJS, Python, JavaScript/TypeScript/NodeJS, and SQL (Postgres).
- Ability to work in a fast-paced environment with competing and alternating priorities and a constant focus on delivery, with strong interpersonal skills to manage relationships with a variety of partners and stakeholders as well as facilitate group sessions.

AI Integration and Implementation (Nice to have)
- Leverage AI tools like GitHub Copilot, Google Gemini, Llama, and other language models to optimize engineering analytics and workflows.
- Design and implement AI-driven dashboards and reporting tools for stakeholders.
- Apply AI tools to automate repetitive tasks, analyze complex engineering datasets, and derive trends and patterns.

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
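As a rough illustration of the SLO work this role centers on, here is a minimal sketch of computing how much of an SLO's error budget remains, given request counts (pure Python; the SLO target and request numbers are invented for the example, not taken from the posting):

```python
# Minimal availability-SLO error-budget calculation (illustrative numbers).

def error_budget_remaining(target: float, total: int, failed: int) -> float:
    """Fraction of the error budget left for an availability SLO.

    target: SLO target, e.g. 0.999 for 99.9% availability.
    Returns 1.0 with no errors, 0.0 when the budget is exactly spent,
    and a negative value when the SLO is breached.
    """
    if total == 0:
        return 1.0
    allowed_failures = (1.0 - target) * total
    if allowed_failures == 0:
        return 1.0 if failed == 0 else float("-inf")
    return 1.0 - failed / allowed_failures

if __name__ == "__main__":
    # A 99.9% SLO over 1,000,000 requests allows 1,000 failures;
    # 250 failures leaves 75% of the budget.
    print(error_budget_remaining(0.999, 1_000_000, 250))
```

Dashboards like the one described typically plot this remaining fraction (or its burn rate) per service over a rolling window.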

Posted 2 months ago

Apply

7 - 12 years

15 - 20 Lacs

Navi Mumbai, Bengaluru, Mumbai (All Areas)

Work from Office

Key Responsibilities:
- Design, implement, and maintain GCP cloud infrastructure using Infrastructure as Code (IaC) tools
- Manage and optimize Kubernetes clusters on GKE (Google Kubernetes Engine)
- Build and maintain CI/CD pipelines for efficient application delivery
- Monitor GCP infrastructure costs and drive optimization strategies
- Develop observability solutions using GCP-native and third-party tools
- Collaborate with engineering teams to streamline deployment and operations workflows
- Enforce security best practices and ensure compliance with internal and industry standards
- Design and implement high availability (HA) and disaster recovery (DR) architectures

Mandatory Technical Skills:
- GCP Services: Compute Engine, VPC, Cloud Storage, Cloud SQL, IAM, Cloud DNS, Cloud Monitoring
- Infrastructure as Code: Terraform (preferred), Deployment Manager
- Containerization: Docker, Kubernetes (GKE expertise required)
- CI/CD Tools: GitHub Actions, Cloud Build, Jenkins, or similar
- Version Control: Git
- Scripting Languages: Python, Bash
- Monitoring & Logging: Stackdriver, Prometheus, Grafana, ELK Stack
- Strong experience with automation and configuration management (Terraform, Ansible, etc.)
- Solid understanding of cloud security best practices
- Experience designing fault-tolerant, resilient cloud-native architectures

Experience:
- 4-7 years in DevOps/Cloud Engineering roles
- Minimum 2+ years hands-on with GCP infrastructure and services
- Proven experience managing CI/CD pipelines and container-based deployments
- Strong background in modern DevOps tools and cloud-native architectures

Preferred candidate profile
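One building block of the fault-tolerant automation called for above is retrying transient failures with exponential backoff and jitter. A minimal sketch (pure Python; the base delay and cap are assumed values, not taken from the posting):

```python
# Exponential backoff with "full jitter", a common retry pattern in
# fault-tolerant cloud automation. base/cap values are illustrative.
import random

def backoff_delays(retries: int, base: float = 1.0, cap: float = 60.0,
                   rng=random.random):
    """Yield one sleep duration per retry: uniform in [0, min(cap, base * 2**n)]."""
    for attempt in range(retries):
        ceiling = min(cap, base * (2 ** attempt))
        yield rng() * ceiling

if __name__ == "__main__":
    random.seed(0)
    for i, delay in enumerate(backoff_delays(5)):
        print(f"retry {i}: sleep {delay:.2f}s")
```

The jitter (random scaling of each ceiling) spreads retries from many clients over time, avoiding the synchronized "thundering herd" that fixed delays produce after an outage.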

Posted 2 months ago

Apply

8 - 13 years

12 - 20 Lacs

Bengaluru

Hybrid

Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments and creates a proof of architecture to test architecture viability, security, and performance.
Must-have skills: Google Cloud Platform Architecture
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Any bachelor's degree

Summary: As a Cloud Platform Engineer, you will be responsible for designing, building, testing, and deploying cloud application solutions that integrate cloud and non-cloud infrastructure. You will deploy infrastructure and platform environments and create proofs of architecture to test architecture viability, security, and performance.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the implementation of cloud solutions
- Optimize cloud infrastructure for performance and cost-efficiency
- Troubleshoot and resolve technical issues

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Google Cloud Platform Architecture
- Strong understanding of cloud architecture principles
- Experience with DevOps practices
- Experience with Google Cloud SQL
- Hands-on experience in cloud deployment and management
- Knowledge of security best practices in cloud environments

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google Cloud Platform Architecture
- This position is based at our Bengaluru office
- A bachelor's degree is required

Posted 2 months ago

Apply