
215 GCP Cloud Jobs - Page 5

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 7.0 years

7 - 11 Lacs

Noida

Hybrid


Job Description

Test Planning and Strategy:
• Develop and implement comprehensive test plans and strategies based on project requirements and specifications.
• Collaborate with cross-functional teams to identify test scenarios and prioritize testing efforts.
• Define test objectives, scope, and deliverables for each project.

Test Execution:
• Design and execute test plans to verify software functionality, performance, and usability.
• Introduce test automation for application workflows (see the sketch after this listing).
• Monitor and analyze test results, identify defects, and track them using bug-tracking systems.
• Collaborate with developers to troubleshoot and resolve identified issues.

Continuous Improvement:
• Stay up to date with industry trends, tools, and best practices in software testing and quality assurance.
• Propose and implement process improvements to enhance the efficiency and effectiveness of testing efforts.
• Participate in code reviews and provide feedback on software design and architecture to improve testability and maintainability.

Documentation and Reporting:
• Create and maintain detailed test documentation, including test plans, test cases, and test scripts.
• Generate regular reports on testing progress, test coverage, and defect metrics.
• Communicate testing results, issues, and risks to stakeholders in a clear and concise manner.

Job Qualifications
• Bachelor's degree in Computer Science, Software Engineering, or a related field.
• Proven experience as a QA Engineer or Software Tester, preferably in a software development environment.
• Strong understanding of software testing methodologies, tools, and processes.
• Strong experience in automated testing for Java Swing GUIs.
• Proficient in Postman for API testing.
• 2+ years of experience writing SQL statements for database-level testing; solid knowledge of SQL and relational databases.
• Experience with cloud platforms such as Google Cloud Platform.
• Familiarity with version control systems.
• Excellent analytical and problem-solving skills.
• Strong attention to detail and ability to effectively prioritize and manage multiple tasks.
• Excellent written and verbal communication skills.
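The posting emphasizes API testing (Postman) and workflow test automation. As a minimal, hypothetical sketch in Python, an automated API smoke test might look like the following; the base URL, endpoint, payload, and field names are illustrative assumptions, not details from the posting:

```python
# Hypothetical API smoke test: endpoint, payload, and expected fields are illustrative.
import requests

BASE_URL = "https://example.com/api"  # placeholder; replace with the system under test

def test_create_and_fetch_order():
    # Create a resource via POST and verify the response.
    payload = {"item": "widget", "quantity": 2}
    created = requests.post(f"{BASE_URL}/orders", json=payload, timeout=10)
    assert created.status_code == 201
    order_id = created.json()["id"]

    # Fetch it back via GET and verify the round trip.
    fetched = requests.get(f"{BASE_URL}/orders/{order_id}", timeout=10)
    assert fetched.status_code == 200
    assert fetched.json()["quantity"] == payload["quantity"]

if __name__ == "__main__":
    test_create_and_fetch_order()
    print("API smoke test passed")
```

A check like this can run standalone or be collected by a test runner as part of a CI pipeline.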

Posted 2 weeks ago

Apply

6.0 - 11.0 years

20 - 25 Lacs

Hyderabad, Ahmedabad

Hybrid


Hi Aspirant, greetings from TechBlocks - IT Software, Global Digital Product Development, Hyderabad!

About us: TechBlocks is a global digital product engineering company with 16+ years of experience helping Fortune 500 enterprises and high-growth brands accelerate innovation, modernize technology, and drive digital transformation. From cloud solutions and data engineering to experience design and platform modernization, we help businesses solve complex challenges and unlock new growth opportunities.

Job Title: Senior DevOps Site Reliability Engineer (SRE)
Location: Hyderabad & Ahmedabad
Employment Type: Full-Time
Work Model: 3 days from office

Job Overview: We are looking for dynamic, motivated individuals who deliver exceptional solutions for the production resiliency of our systems. The role combines software engineering and operations, applying DevOps skills to find efficient ways of managing and operating applications, and it requires a high level of responsibility and accountability to deliver technical solutions.

Summary: As a Senior SRE, you will ensure platform reliability, incident management, and performance optimization. You'll define SLIs/SLOs, contribute to robust observability practices, and drive proactive reliability engineering across services.

Experience Required: 6-10 years of SRE or infrastructure engineering experience in cloud-native environments.

Mandatory:
• Cloud: GCP (GKE, Load Balancing, VPN, IAM)
• Observability: Prometheus, Grafana, ELK, Datadog
• Containers & Orchestration: Kubernetes, Docker
• Incident Management: on-call, RCA, SLIs/SLOs
• IaC: Terraform, Helm
• Incident Tools: PagerDuty, OpsGenie

Nice to Have:
• GCP Monitoring, SkyWalking
• Service Mesh, API Gateway
• GCP Spanner

Scope:
• Drive operational excellence and platform resilience
• Reduce MTTR, increase service availability
• Own incident and RCA processes

Roles and Responsibilities:
• Define and measure Service Level Indicators (SLIs) and Service Level Objectives (SLOs), and manage error budgets across services (a worked sketch follows this listing).
• Lead incident management for critical production issues; drive Root Cause Analysis (RCA) and postmortems.
• Create and maintain runbooks and standard operating procedures for high-availability services.
• Design and implement observability frameworks using ELK, Prometheus, and Grafana; drive telemetry adoption.
• Coordinate cross-functional war-room sessions during major incidents and maintain response logs.
• Develop and improve automated system recovery, alert suppression, and escalation logic.
• Use GCP tools like GKE, Cloud Monitoring, and Cloud Armor to improve performance and security posture.
• Collaborate with DevOps and Infrastructure teams to build highly available and scalable systems.
• Analyze performance metrics and conduct regular reliability reviews with engineering leads.
• Participate in capacity planning, failover testing, and resilience architecture reviews.

If you are interested, please share your updated resume to kranthikt@tblocks.com

Warm Regards,
Kranthi Kumar | kranthikt@tblocks.com | Contact: 8522804902
Senior Talent Acquisition Specialist
Toronto | Ahmedabad | Hyderabad | Pune
www.tblocks.com
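To illustrate the SLO/error-budget responsibility mentioned above, here is a minimal, self-contained Python sketch; the SLO target, window size, and request counts are made-up numbers, not figures from the posting:

```python
# Minimal error-budget sketch: all numbers are illustrative.

def error_budget_report(slo_target: float, total_requests: int, failed_requests: int) -> dict:
    """Compute availability against an SLO and how much error budget remains."""
    availability = 1 - failed_requests / total_requests
    budget = 1 - slo_target                      # allowed failure ratio, e.g. 0.001 for a 99.9% SLO
    budget_consumed = (failed_requests / total_requests) / budget
    return {
        "availability": round(availability, 5),
        "error_budget_consumed": round(budget_consumed, 3),    # 1.0 means the budget is exhausted
        "error_budget_remaining": round(1 - budget_consumed, 3),
    }

if __name__ == "__main__":
    # Example: a 99.9% SLO over a window with 2,000,000 requests and 1,200 failures.
    print(error_budget_report(slo_target=0.999, total_requests=2_000_000, failed_requests=1_200))
```

In this example the service is at 99.94% availability and has consumed 60% of its error budget for the window, which is the kind of signal used to gate risky releases.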

Posted 2 weeks ago

Apply

4.0 - 9.0 years

0 - 1 Lacs

Hyderabad

Work from Office


Job Title: Software Engineer - Data Engineer
Position: Software Engineer
Experience: 4-9 years
Category: Software Development / Engineering
Shift Timings: 1:00 pm to 10:00 pm
Main location: Hyderabad
Work Type: Work from office
Notice Period: 0-30 days
Skills: Python, PySpark, Databricks
Employment Type: Full Time

• Bachelor's in Computer Science, Computer Engineering or a related field

Required qualifications to be successful in this role

Must-have skills:
• 3+ years of development experience with Spark (PySpark), Python and SQL.
• Extensive knowledge of building data pipelines.
• Hands-on experience with Databricks development.
• Strong experience with
• Strong experience developing on Linux OS.
• Experience with scheduling and orchestration (e.g. Databricks Workflows, Airflow, Prefect, Control-M); a minimal pipeline sketch follows this listing.

Good-to-have skills:
• Solid understanding of distributed systems, data structures, and design principles.
• Agile development methodologies (e.g. SAFe, Kanban, Scrum).
• Comfortable communicating with teams via showcases/demos.
• Play a key role in establishing and implementing migration patterns for the Data Lake Modernization project.
• Actively migrate use cases from our on-premises Data Lake to Databricks on GCP.
• Collaborate with Product Management and business partners to understand use case requirements and reporting.
• Adhere to internal development best practices/lifecycle (e.g. testing, code reviews, CI/CD, documentation).
• Document and showcase feature designs/workflows.
• Participate in team meetings and discussions around product development.
• Stay up to date on the latest industry trends and design patterns.
• 3+ years of experience with Git.
• 3+ years of experience with CI/CD (e.g. Azure Pipelines).
• Experience with streaming technologies, such as Kafka and Spark.
• Experience building applications on Docker and Kubernetes.
• Cloud experience (e.g. Azure, Google).

Interested candidates can send your resume to: kalyan.v@talent21.in
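As a minimal sketch of the kind of PySpark batch transformation this posting describes, the following reads raw events, cleans them, and writes a daily aggregate; the file paths, column names, and schema are hypothetical, not details from the posting:

```python
# Hypothetical PySpark batch job: paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_aggregation").getOrCreate()

# Read raw events landed in cloud storage, drop malformed rows, derive a date column.
orders = (
    spark.read.json("gs://example-bucket/raw/orders/")          # placeholder path
    .where(F.col("status").isNotNull())
    .withColumn("order_date", F.to_date("created_at"))
)

# Aggregate revenue and order counts per day.
daily_revenue = (
    orders.groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Write the result as a partitioned Parquet table for downstream reporting.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-bucket/curated/daily_revenue/"                # placeholder path
)
spark.stop()
```

A job like this would typically be scheduled by one of the orchestrators named above (Databricks Workflows, Airflow, Prefect, or Control-M).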

Posted 2 weeks ago

Apply

1.0 - 2.0 years

0 - 0 Lacs

Chennai

Work from Office


The ideal candidate should have a strong background in SQL, BigQuery, and Google Cloud Platform (GCP), with hands-on experience in developing reports and dashboards using Looker Studio, Looker Standard, and LookML. Excellent communication skills and the ability to work collaboratively with cross-functional teams are essential for success in this role.

Key Responsibilities:
• Design, develop, and maintain dashboards and reports using Looker Studio and Looker Standard.
• Develop and maintain LookML models, explores, and views to support business reporting requirements.
• Optimize and write advanced SQL queries for data extraction, transformation, and analysis.
• Work with BigQuery as the primary data warehouse for managing and analyzing large datasets (a minimal query sketch follows this listing).
• Collaborate with business stakeholders to understand data requirements and translate them into scalable reporting solutions.
• Implement data governance, access controls, and performance optimizations within the Looker environment.
• Perform root-cause analysis and troubleshooting for reporting and data issues.
• Maintain documentation for Looker projects, data models, and data dictionaries.
• Stay updated with the latest Looker and GCP features and best practices.
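For the BigQuery part of the role, a minimal sketch using the google-cloud-bigquery Python client might look like the following; the project, dataset, table, and column names are hypothetical:

```python
# Hypothetical BigQuery query via the official Python client; table and columns are illustrative.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `example-project.sales.orders`        -- placeholder table
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

# Run the query and print one row per day; results stream back as named rows.
for row in client.query(query).result():
    print(f"{row.order_date}: {row.total_amount}")
```

The same query shape is what a LookML explore or Looker Studio chart would generate behind the scenes for a date-grouped measure.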

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 - 1 Lacs

Hyderabad

Work from Office


Job Title: Software Engineer - Data Engineer
Position: Software Engineer
Experience: 4-6 years (profiles with less experience will be rejected)
Category: Software Development / Engineering
Shift Timings: 1:00 pm to 10:00 pm
Main location: Hyderabad
Work Type: Work from office
Notice Period: 0-30 days
Skills: Python, PySpark, Databricks
Employment Type: Full Time

• Bachelor's in Computer Science, Computer Engineering or a related field

Required qualifications to be successful in this role

Must-have skills:
• 3+ years of development experience with Spark (PySpark), Python and SQL.
• Extensive knowledge of building data pipelines.
• Hands-on experience with Databricks development.
• Strong experience with
• Strong experience developing on Linux OS.
• Experience with scheduling and orchestration (e.g. Databricks Workflows, Airflow, Prefect, Control-M); an orchestration sketch follows this listing.

Good-to-have skills:
• Solid understanding of distributed systems, data structures, and design principles.
• Agile development methodologies (e.g. SAFe, Kanban, Scrum).
• Comfortable communicating with teams via showcases/demos.
• Play a key role in establishing and implementing migration patterns for the Data Lake Modernization project.
• Actively migrate use cases from our on-premises Data Lake to Databricks on GCP.
• Collaborate with Product Management and business partners to understand use case requirements and reporting.
• Adhere to internal development best practices/lifecycle (e.g. testing, code reviews, CI/CD, documentation).
• Document and showcase feature designs/workflows.
• Participate in team meetings and discussions around product development.
• Stay up to date on the latest industry trends and design patterns.
• 3+ years of experience with Git.
• 3+ years of experience with CI/CD (e.g. Azure Pipelines).
• Experience with streaming technologies, such as Kafka and Spark.
• Experience building applications on Docker and Kubernetes.
• Cloud experience (e.g. Azure, Google).

Interested candidates can send your resume to: tarun.k@talent21.in
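The orchestration requirement (Airflow and similar schedulers) can be illustrated with a minimal, hypothetical Airflow DAG; the DAG id, schedule, and task bodies are assumptions, not details from the posting:

```python
# Hypothetical Airflow DAG: DAG id, schedule, and task logic are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extract raw data from the source system")

def transform():
    print("run the Spark/Databricks transformation job")

def load():
    print("publish curated tables for reporting")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Simple linear dependency: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```

In practice the Python callables would be replaced by operators that submit Databricks or Spark jobs, but the dependency wiring stays the same.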

Posted 2 weeks ago

Apply

12.0 - 15.0 years

13 - 17 Lacs

Pune

Work from Office


Title: Data & AI Technical Solution Architect
Req ID: 323749

We are currently seeking a Data & AI Technical Solution Architect to join our team in Pune, Maharashtra (IN-MH), India (IN).

Job Duties: The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
• Ability and experience to hold conversations with the CEO, business owners and CTO/CDO.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.

Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to contribute to the improvement of internal effectiveness by contributing to the improvement of current methodologies, processes and tools.

Minimum Skills Required: Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.

Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, Gen AI and Agentic AI.
• Must have Data Architecture and Solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as Solution Architect.
• 10+ years of experience in solution architecting of Data & Analytics, AI/ML and Gen AI as a Technical Architect.
• Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions meeting the requirements for a successful proposal. Create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs, and contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable level of expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management.

Additional Career Level Description:
Knowledge and application:
• Seasoned, experienced professional; has complete knowledge and understanding of area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
• Works

Posted 2 weeks ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Req ID: 324871

We are currently seeking a Senior Backend (Java) Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Once you are here, you will:
• Deliver code on time, independently, meeting the Definition of Done.
• Have a good understanding of the platform and deployment model.
• Have depth in two or more technologies or platforms and deep knowledge of your primary language.
• Mentor more junior developers in pair-programming sessions.
• Provide design guidance within the established architecture.
• Demonstrate proficiency in advanced debugging and troubleshooting tools and techniques.
• Understand secure coding, common vulnerabilities, and secure architectural practices.

Basic Qualifications:
• 4+ years of experience developing Java code.
• 4+ years of experience in object-oriented design and development.
• 4+ years of experience developing with the Spring Framework.
• 3+ years of experience working in software and on GCP technologies and cloud-native services.
• 2+ years of experience with GCP Cloud Run, designing, scheduling and executing batch jobs.

Preferred Skills:
• GCP Solutions Certification; management of a cloud environment.
• Understanding of IT workflows in large enterprises.
• Experience with middleware packages like WebLogic, WebSphere and/or JBoss.
• Experience in either REST or SOAP API development.
• Experience with other frameworks like Spring Boot, MVC, or Hibernate.
• Strong written and verbal communication skills.
• Knowledge of common databases.
• Understanding of modern programming languages.
• We need a tech-lead-level person who can understand an architecture design and break it down into stories for the team to work on.
• Bachelor's degree preferred, Master's degree desired.

Ideal Mindset:
• Lifelong Learner. You are always seeking to improve your technical and nontechnical skills.
• Team Player. You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
• Communicator. You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.

Posted 2 weeks ago

Apply

12.0 - 15.0 years

12 - 16 Lacs

Bengaluru

Work from Office


Req ID: 323777

We are currently seeking a Data Architect Sr. Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties: The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
• Ability and experience to hold conversations with the CEO, business owners and CTO/CDO.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.

Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to contribute to the improvement of internal effectiveness by contributing to the improvement of current methodologies, processes and tools.

Minimum Skills Required: Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.

Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, Gen AI and Agentic AI.
• Must have Data Architecture and Solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as Solution Architect.
• 10+ years of experience in solution architecting of Data & Analytics, AI/ML and Gen AI as a Technical Architect.
• Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions meeting the requirements for a successful proposal. Create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs, and contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable level of expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management.

Additional Career Level Description:
Knowledge and application:
• Seasoned, experienced professional; has complete knowledge and understanding of area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
• Works

Posted 2 weeks ago

Apply

7.0 - 12.0 years

6 - 10 Lacs

Noida

Work from Office


Req ID: 327205

We are currently seeking a Lead Engineer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Grade 8.

At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence and our ability to help our clients stay a step ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here.

Preferred Experience:
• The ideal candidate has been supporting traditional server-based relational databases (PostgreSQL and MongoDB) for over 7 years, of which the last 4+ years have been in public cloud environments (GCP).
• Hands-on experience with PostgreSQL/MongoDB, including installation, configuration, performance tuning, and troubleshooting.
• Demonstrated expertise in managing PostgreSQL databases on Azure, GCP and AWS RDS.
• Experience with features such as automated backups, maintenance, and scaling (PostgreSQL).
• Ability to analyze and optimize complex SQL queries for performance improvement.
• Proficiency in setting up and managing monitoring tools for PostgreSQL on GCP (a monitoring sketch follows this listing).
• Experience with configuring alerts based on performance metrics.
• Experience in implementing and testing backup and recovery strategies for PostgreSQL databases on AWS RDS/Azure SQL/GCP Cloud SQL.
• Knowledge and experience in designing and implementing disaster recovery plans for PostgreSQL databases on AWS RDS/Azure SQL/GCP Cloud SQL.
• Good understanding of database security principles and best practices.
• Proven ability to identify and resolve performance bottlenecks in PostgreSQL databases.
• Experience in optimizing database configurations for better performance.
• Able to provide 24x7 shift-hours support at L2/L3 level.
• Experience in updating KB articles, Problem Management articles, and SOPs/runbooks.
• Passion for delivering timely and outstanding customer service.
• Great written and oral communication skills with internal and external customers.
• Strong ITIL foundation experience.
• Ability to work independently with no direct supervision.
• Share domain and technical expertise, providing technical mentorship and cross-training to other peers and team members.
• Work directly with end customers and business stakeholders as well as technical resources.

Basic Qualifications:
• 7+ years of overall operational experience.
• 4+ years of GCP experience as a cloud DBA (PostgreSQL/MongoDB).
• 3+ years of experience working in diverse cloud-support database environments in a 24x7 production support model.
• Query fine tuning: MongoDB.
• Shell scripts for monitoring, e.g. slow queries, replication lag, node failures, disk usage, etc.
• Backups and restores (backups should be automated with shell scripts/Ops Manager).
• Database health checks (complete review of database slow queries, fragmentation, index usage, etc.).
• Upgrades (Java version, Mongo version, etc.).
• Maintenance (data centre outages, etc.).
• Architecture design as per the application requirement.
• Writing best-practice documents on sharding and replication for Dev/App teams.
• Log rotation/maintenance (mongos, mongodb, config, etc.).
• Segregation of duties (user management: designing user roles and responsibilities).
• Designing DR (Disaster Recovery)/COB (Continuity of Business) plans as applicable.
• Database profiling, locks, memory usage, number of connections, page faults, etc.
• Export and import of data to and from MongoDB; runtime configuration of MongoDB.
• Data management in MongoDB: capped collections, expiring data via TTL.
• Monitoring of various issues related to the database.
• Monitoring at server, database, and collection level, and various monitoring tools related to MongoDB.
• Database software installation and configuration in accordance with client-defined standards.
• Database migrations and updates.
• Capacity management: MongoDB.
• Hands-on experience in server performance tuning and recommendations.
• High-availability solutions and recommendations.
• Hands-on experience in root cause analysis for business-impacting issues.
• Experience with SQL, SQL Developer, TOAD, pgAdmin, MongoDB Atlas.
• Experience with Python/PowerShell scripting - preferred.
• Secondary skill in MySQL/Oracle - preferred.
• Installation, configuration and upgrading of PostgreSQL server software and related products.

Preferred Certifications:
• Azure Fundamentals certification (AZ-900) - REQUIRED
• Google Cloud Associate Engineer - REQUIRED
• Azure Database Certification (DP-300) - preferred
• AWS Certified Database Specialty - preferred
• PostgreSQL certification a plus
• MongoDB certification a plus

B.Tech/BE/MCA degree in Information Technology, or equivalent experience.
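The monitoring-script requirement above (slow queries, replication lag, disk usage) could be sketched in Python roughly as follows; the connection string, threshold, and use of psycopg2 are assumptions for illustration, not prescribed by the posting:

```python
# Hypothetical PostgreSQL slow-query check using psycopg2; DSN and threshold are illustrative.
import psycopg2

DSN = "host=localhost dbname=appdb user=monitor password=secret"  # placeholder credentials
SLOW_QUERY_SECONDS = 30

def report_slow_queries() -> None:
    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            # List active statements that have been running longer than the threshold.
            cur.execute(
                """
                SELECT pid, now() - query_start AS runtime, left(query, 80)
                FROM pg_stat_activity
                WHERE state = 'active'
                  AND now() - query_start > %s * interval '1 second'
                ORDER BY runtime DESC
                """,
                (SLOW_QUERY_SECONDS,),
            )
            for pid, runtime, query in cur.fetchall():
                print(f"pid={pid} runtime={runtime} query={query!r}")

if __name__ == "__main__":
    report_slow_queries()
```

A script like this would normally run from cron or a monitoring agent and push its findings into the alerting pipeline rather than printing them.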

Posted 2 weeks ago

Apply

12.0 - 15.0 years

13 - 17 Lacs

Bengaluru

Work from Office


Req ID: 323775

We are currently seeking a Data & AI Technical Solution Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties: The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
• Ability and experience to hold conversations with the CEO, business owners and CTO/CDO.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.

Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to contribute to the improvement of internal effectiveness by contributing to the improvement of current methodologies, processes and tools.

Minimum Skills Required: Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.

Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, Gen AI and Agentic AI.
• Must have Data Architecture and Solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as Solution Architect.
• 10+ years of experience in solution architecting of Data & Analytics, AI/ML and Gen AI as a Technical Architect.
• Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions meeting the requirements for a successful proposal. Create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs, and contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable level of expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management.

Additional Career Level Description:
Knowledge and application:
• Seasoned, experienced professional; has complete knowledge and understanding of area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
• Works

Posted 2 weeks ago

Apply

12.0 - 15.0 years

13 - 17 Lacs

Pune

Work from Office


Req ID: 323754

We are currently seeking a Data & AI Technical Solution Architect to join our team in Pune, Maharashtra (IN-MH), India (IN).

Job Duties: The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
• Ability and experience to hold conversations with the CEO, business owners and CTO/CDO.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.

Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to contribute to the improvement of internal effectiveness by contributing to the improvement of current methodologies, processes and tools.

Minimum Skills Required: Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.

Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, Gen AI and Agentic AI.
• Must have Data Architecture and Solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as Solution Architect.
• 10+ years of experience in solution architecting of Data & Analytics, AI/ML and Gen AI as a Technical Architect.
• Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions meeting the requirements for a successful proposal. Create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs, and contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable level of expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management.

Additional Career Level Description:
Knowledge and application:
• Seasoned, experienced professional; has complete knowledge and understanding of area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
• Works

Posted 2 weeks ago

Apply

10.0 - 15.0 years

6 - 9 Lacs

Noida

Work from Office


Req ID: 327207

We are currently seeking a Staff Engineer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Grade 10.

At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence and our ability to help our clients stay a step ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here.

Preferred Experience:
• The ideal candidate has been supporting traditional server-based relational databases (PostgreSQL and MongoDB) for over 10 years, of which the last 5+ years have been in public cloud environments (GCP).
• Hands-on experience with PostgreSQL/MongoDB, including installation, configuration, performance tuning, and troubleshooting.
• Demonstrated expertise in managing PostgreSQL databases on Azure, GCP and AWS RDS.
• Experience with features such as automated backups, maintenance, and scaling (PostgreSQL).
• Ability to analyze and optimize complex SQL queries for performance improvement.
• Proficiency in setting up and managing monitoring tools for PostgreSQL on GCP.
• Experience with configuring alerts based on performance metrics.
• Experience in implementing and testing backup and recovery strategies for PostgreSQL databases on AWS RDS/Azure SQL/GCP Cloud SQL.
• Knowledge and experience in designing and implementing disaster recovery plans for PostgreSQL databases on AWS RDS/Azure SQL/GCP Cloud SQL.
• Good understanding of database security principles and best practices.
• Proven ability to identify and resolve performance bottlenecks in PostgreSQL databases.
• Experience in optimizing database configurations for better performance.
• Able to provide 24x7 shift-hours support at L2/L3 level.
• Experience in updating KB articles, Problem Management articles, and SOPs/runbooks.
• Passion for delivering timely and outstanding customer service.
• Great written and oral communication skills with internal and external customers.
• Strong ITIL foundation experience.
• Ability to work independently with no direct supervision.
• Share domain and technical expertise, providing technical mentorship and cross-training to other peers and team members.
• Work directly with end customers and business stakeholders as well as technical resources.

Basic Qualifications:
• 10+ years of overall operational experience.
• 5+ years of GCP experience as a cloud DBA (PostgreSQL/MongoDB).
• 3+ years of experience working in diverse cloud-support database environments in a 24x7 production support model.
• Query fine tuning: MongoDB.
• Shell scripts for monitoring, e.g. slow queries, replication lag, node failures, disk usage, etc. (a replication-lag sketch follows this listing).
• Backups and restores (backups should be automated with shell scripts/Ops Manager).
• Database health checks (complete review of database slow queries, fragmentation, index usage, etc.).
• Upgrades (Java version, Mongo version, etc.).
• Maintenance (data centre outages, etc.).
• Architecture design as per the application requirement.
• Writing best-practice documents on sharding and replication for Dev/App teams.
• Log rotation/maintenance (mongos, mongodb, config, etc.).
• Segregation of duties (user management: designing user roles and responsibilities).
• Designing DR (Disaster Recovery)/COB (Continuity of Business) plans as applicable.
• Database profiling, locks, memory usage, number of connections, page faults, etc.
• Export and import of data to and from MongoDB; runtime configuration of MongoDB.
• Data management in MongoDB: capped collections, expiring data via TTL.
• Monitoring of various issues related to the database.
• Monitoring at server, database, and collection level, and various monitoring tools related to MongoDB.
• Database software installation and configuration in accordance with client-defined standards.
• Database migrations and updates.
• Capacity management: MongoDB.
• Hands-on experience in server performance tuning and recommendations.
• High-availability solutions and recommendations.
• Hands-on experience in root cause analysis for business-impacting issues.
• Experience with SQL, SQL Developer, TOAD, pgAdmin, MongoDB Atlas.
• Experience with Python/PowerShell scripting - preferred.
• Secondary skill in MySQL/Oracle - preferred.
• Installation, configuration and upgrading of PostgreSQL server software and related products.
• Secondary skill: DB2 is a plus.

Preferred Certifications:
• Azure Fundamentals certification (AZ-900) - REQUIRED
• Google Cloud Associate Engineer - REQUIRED
• Azure Database Certification (DP-300) - preferred
• AWS Certified Database Specialty - preferred
• PostgreSQL certification a plus
• MongoDB certification a plus

B.Tech/BE/MCA degree in Information Technology, or equivalent experience.
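To complement the PostgreSQL sketch earlier on this page, the MongoDB replication-lag monitoring mentioned here could be sketched with pymongo roughly as follows; the connection string is a placeholder, and replSetGetStatus only works against a replica-set deployment:

```python
# Hypothetical MongoDB replication-lag check using pymongo; the URI is a placeholder.
from pymongo import MongoClient

client = MongoClient("mongodb://monitor:secret@localhost:27017/?replicaSet=rs0")  # placeholder

# replSetGetStatus reports the optime each member has applied.
status = client.admin.command("replSetGetStatus")

# Find the primary's last applied optime, then report each secondary's lag in seconds.
primary_optime = next(
    m["optimeDate"] for m in status["members"] if m["stateStr"] == "PRIMARY"
)
for member in status["members"]:
    if member["stateStr"] == "SECONDARY":
        lag = (primary_optime - member["optimeDate"]).total_seconds()
        print(f"{member['name']}: replication lag {lag:.0f}s")
```

In production the printed values would be shipped to the alerting stack and thresholded, rather than inspected by hand.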

Posted 2 weeks ago

Apply

12.0 - 15.0 years

13 - 17 Lacs

Hyderabad

Work from Office


Req ID: 323774

We are currently seeking a Data & AI Technical Solution Architect to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Job Duties: The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
• Ability and experience to hold conversations with the CEO, business owners and CTO/CDO.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.

Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to contribute to the improvement of internal effectiveness by contributing to the improvement of current methodologies, processes and tools.

Minimum Skills Required: Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.

Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, Gen AI and Agentic AI.
• Must have Data Architecture and Solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as Solution Architect.
• 10+ years of experience in solution architecting of Data & Analytics, AI/ML and Gen AI as a Technical Architect.
• Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions meeting the requirements for a successful proposal. Create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs, and contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable level of expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management.

Additional Career Level Description:
Knowledge and application:
• Seasoned, experienced professional; has complete knowledge and understanding of area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
• Works

Posted 2 weeks ago

Apply

6.0 - 11.0 years

25 - 35 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid


Job Title: GCP DevOps Engineer / Lead / Architect
Experience: 6 to 12 years
Location: Bangalore / Chennai / Gurgaon / Pune / Kolkata

Job Description: We are looking for a skilled DevOps Engineer with 5 to 12 years of experience to join our dynamic team. The ideal candidate will have a strong background in DevOps practices, CI/CD pipeline creation, and experience with GCP services. You will play a crucial role in ensuring smooth development, deployment, and integration processes.

Key Responsibilities:
• CI/CD Pipeline Creation: Design, implement, and manage CI/CD pipelines using GitHub, ensuring seamless integration and delivery of software.
• Version Control: Manage and maintain code repositories using GitHub, ensuring best practices for version control and collaboration.
• Infrastructure as Code: Write and maintain infrastructure as code (IaC) using Terraform/YAML, ensuring consistent and reliable deployment processes.
• GCP Services Management: Utilize Google Cloud Platform (GCP) services to build, deploy, and scale applications. Manage and optimize cloud resources to ensure cost-effective operations.
• Automation & Monitoring: Implement automation scripts and monitoring tools to enhance the efficiency, reliability, and performance of our systems (a small monitoring sketch follows this listing).
• Collaboration: Work closely with development, QA, and operations teams to ensure smooth workflows and resolve issues efficiently.
• Security & Compliance: Ensure that all systems and processes comply with security and regulatory standards.

Required Skills:
• DevOps Practices: Strong understanding of DevOps principles, including continuous integration, continuous delivery, and continuous deployment.
• GitHub: Extensive experience with GitHub for version control, collaboration, and pipeline integration.
• CI/CD: Hands-on experience in creating and managing CI/CD pipelines.
• GCP Services: Solid experience with GCP services, including compute, storage, networking, and security.

Preferred Qualifications:
• GCP Certification: Google Cloud Platform certification is highly desirable and will be an added advantage.
• Scripting Languages: Proficiency in scripting languages such as Python, Bash, or similar.
• Monitoring Tools: Experience with monitoring and logging tools like Prometheus, Grafana, or Stackdriver.

Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.
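As a small, hypothetical example of the automation/monitoring scripting this posting mentions, the following Python sketch queries a Prometheus server's HTTP API for a service's current error rate, for instance as a post-deploy check in a pipeline; the Prometheus URL, metric, and label names are assumptions:

```python
# Hypothetical monitoring check against the Prometheus HTTP API; URL and metric names are illustrative.
import requests

PROMETHEUS_URL = "http://prometheus.example.internal:9090"   # placeholder
QUERY = (
    'sum(rate(http_requests_total{job="checkout",code=~"5.."}[5m]))'
    ' / sum(rate(http_requests_total{job="checkout"}[5m]))'
)
ERROR_RATE_THRESHOLD = 0.01

# Instant query: returns a vector of {metric, value: [timestamp, "value"]} samples.
resp = requests.get(f"{PROMETHEUS_URL}/api/v1/query", params={"query": QUERY}, timeout=10)
resp.raise_for_status()
results = resp.json()["data"]["result"]

error_rate = float(results[0]["value"][1]) if results else 0.0
if error_rate > ERROR_RATE_THRESHOLD:
    print(f"ALERT: error rate {error_rate:.2%} exceeds {ERROR_RATE_THRESHOLD:.2%}")
else:
    print(f"OK: error rate {error_rate:.2%}")
```

Wired into a CI/CD pipeline, a non-zero exit on the alert branch can gate or roll back a release automatically.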

Posted 2 weeks ago

Apply

2.0 - 7.0 years

2 - 6 Lacs

New Delhi, Gurugram

Work from Office


CUSTOMER SUPPORT ROLE FOR INTERNATIONAL VOICE PROCESS
Contact: PRIYA - 8744857809
Domains: Travel / Banking / Technical
Eligibility: Graduate / UG / Fresher / Experienced
Salary: depending on last take-home (up to 7 LPA)
Location: Gurugram (WFO, 5 days, 24x7 shifts)
Cab + incentives. Immediate joiners.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

1 - 2 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


We are looking for candidates with 6+ years of experience in Java development and strong expertise in a GCP environment.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Remote


Job Requirement for Offshore Data Engineer (with ML expertise)
Work Mode: Remote
Base Location: Bengaluru
Experience: 5+ years

Technical Skills & Expertise:

PySpark & Apache Spark:
• Extensive experience with PySpark and Spark for big data processing and transformation.
• Strong understanding of Spark architecture, optimization techniques, and performance tuning.
• Ability to work with Spark jobs in distributed computing environments like Databricks.

Data Mining & Transformation:
• Hands-on experience in designing and implementing data mining workflows.
• Expertise in data transformation processes, including ETL (Extract, Transform, Load) pipelines.
• Experience in large-scale data ingestion, aggregation, and cleaning.

Programming Languages:
• Python & Scala: proficient in Python for data engineering tasks, including libraries like Pandas and NumPy; Scala proficiency is preferred for Spark job development.

Big Data Concepts:
• In-depth knowledge of big data frameworks and paradigms, such as distributed file systems, parallel computing, and data partitioning.

Big Data Technologies:
• Cassandra & Hadoop: experience with NoSQL databases like Cassandra and distributed storage systems like Hadoop.
• Data Warehousing Tools: proficiency with Hive for data warehousing solutions and querying.
• ETL Tools: experience with the Beam architecture and other ETL tools for large-scale data workflows.

Cloud Technologies (GCP):
• Expertise in Google Cloud Platform (GCP), including core services like Cloud Storage, BigQuery, and Dataflow.
• Experience with Dataflow jobs for batch and stream processing (a minimal Beam sketch follows this listing).
• Familiarity with managing workflows using Airflow for task scheduling and orchestration in GCP.

Machine Learning & AI:
• GenAI Experience: familiarity with Generative AI and its applications in ML pipelines.
• ML Model Development: knowledge of basic ML model building using tools like Pandas and NumPy, and visualization with Matplotlib.
• ML Ops Pipeline: experience in managing end-to-end MLOps pipelines for deploying models in production, particularly LLM (Large Language Model) deployments.
• RAG Architecture: understanding and experience in building pipelines using Retrieval-Augmented Generation (RAG) architecture to enhance model performance and output.

Tech stack: Spark, PySpark, Python, Scala, GCP Dataflow, Data Composer (Airflow), ETL, Databricks, Hadoop, Hive, GenAI, basic ML modeling knowledge, MLOps experience, LLM deployment, RAG.
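Here is a minimal Apache Beam (Python SDK) sketch of the kind of batch Dataflow job described above; the input path, parsing logic, and output location are hypothetical, and the pipeline runs locally unless a Dataflow runner and GCP options are supplied:

```python
# Hypothetical Apache Beam batch pipeline; paths and field names are illustrative.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(line: str) -> dict:
    event = json.loads(line)
    return {"user_id": event["user_id"], "amount": float(event.get("amount", 0))}

def run() -> None:
    options = PipelineOptions()  # add --runner=DataflowRunner and GCP options to run on Dataflow
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")  # placeholder
            | "Parse" >> beam.Map(parse_event)
            | "KeyByUser" >> beam.Map(lambda e: (e["user_id"], e["amount"]))
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/user_totals")   # placeholder
        )

if __name__ == "__main__":
    run()
```

The same pipeline code runs unchanged on the DirectRunner for local testing and on Dataflow for production scale, which is the main appeal of the Beam model.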

Posted 2 weeks ago

Apply

0.0 - 5.0 years

2 - 6 Lacs

New Delhi, Gurugram, Delhi / NCR

Work from Office


CUSTOMER SERVICE ROLE FOR INTERNATIONAL PROCESS
Contact: SHRUTI - 9958371382
Domains: Travel / Banking / Technical
Eligibility: Graduate / UG / Fresher / Experienced
Salary: depending on last take-home (up to 7 LPA)
Location: Gurugram / Noida (WFO, 5 days, 24x7 shifts)
Cab + incentives.
Required candidate profile: good communication skills; immediate joiners; should be willing to do 24x7 shifts.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

10 - 17 Lacs

Hyderabad, Chennai

Work from Office


Job Summary / Purpose
The Network Engineer job family is responsible for infrastructure/technical planning, implementation and support activities for systems owned by the HA IT Operations Team. Specific responsibilities include installing and supporting system hardware and software, performing system upgrades, and evaluating and installing patches and software updates. Responsibilities also include operational support activities such as resolving software- and hardware-related problems, managing backup and recovery activity, administering technology layers, managing monitoring and alerting functions, performing capacity planning and conducting version management. Network Engineers work closely with architects, infrastructure support, database administrators and application support teams to ensure seamless, quality IT support for HA customers and alignment with HA IT standards, controls and governance.

Essential Key Job Responsibilities
• Provides evaluation, engineering/design and implementation services for new products, technologies and solutions to address corporate business requirements.
• Provides escalation support to Tier 1 and 2 engineers; demonstrates creativity and takes initiative in problem solving; resolves or facilitates resolution of complex problems for the assigned program.
• Has a thorough and comprehensive mastery of supported platforms/products and environments.
• Focuses the majority of time on complex engineering, architectural and implementation tasks; project assignments are large in scope and highly complex.
• Provides exceptional customer service to HA end users, business stakeholders and other members of HA IT.
• Performs daily production support activities.
• Participates in team on-call rotations.
• Performs patching and code deployments across all environments.
• Leads projects associated with the enhancement, upgrade/patching, or implementation of new or existing technology solutions.
• Coordinates implementation and support efforts between IT Operations and other HA IT teams.
• Conducts performance tuning and troubleshooting.
• Provides oversight and facilitation of the HA IT change management process as it applies to the IT Operations team.
• Reviews, recommends and monitors the source code/versioning management function, adhering to technical management guidelines.
• Provides technical leadership and ownership of issues across multiple disciplines and technologies.
• Designs, implements and maintains a comprehensive monitoring and alerting process across all IT Operations platforms.
• With oversight from the System Engineering Lead and Operations Manager(s), initiates and facilitates strategic planning activities (capacity planning, process improvement, maintenance, upgrade and end-of-life planning, roadmap development).
• Builds new test and production environments on existing or new hardware as required.
• Identifies automation opportunities and implements scripted solutions.
• Identifies technical innovation and process improvement opportunities.
• Designs and implements a comprehensive, ongoing release management and planning program.
• Utilizes standard tools and methodology to develop system and support performance metrics.
• Demonstrates potential leadership qualities through team motivation, coaching, and mentoring.
• Demonstrates comprehensive knowledge and expertise with HA business processes and routines.
• Researches and recommends appropriate system administration best practices and tools.
• Performs crisis management during high-severity operational incidents.

The job summary and responsibilities listed above are designed to indicate the general nature of the work performed within this job. They are not designed to contain or be interpreted as a comprehensive inventory of all job responsibilities required of employees assigned to this job. Employees may be required to perform other duties as assigned.

Minimum Qualifications

Required Education and Experience
• BA or BS in Computer Science, Technology, or a Business discipline, or equivalent experience, is required.
• 6-10 years of professional experience in an IT technical or infrastructure field is required.
• 5-7 years of experience in Google Cloud large, complex project oversight.
• Established understanding of infrastructure and related technologies.
• Established understanding of network connectivity and protocols.

Required Licensure and Certifications
• CCNA or CCNP; Google Cloud Network Engineer certification preferred.

Required Minimum Knowledge, Skills, Abilities and Training
• 7+ years of experience with Cisco route/switch and wireless infrastructure.
• Experience with datacenter and MDF/IDF closet construction design (cabling, racks, power, cooling).
• Experience with or understanding of the Cisco Nexus platform.
• 5-7 years of experience with vendor management.
• Advanced experience with Cisco wireless and a strong understanding of wireless design and coverage.
• Understanding of and experience with WAN design: MPLS, QoS, BGP, Metro Optical Ethernet, MultiPoint (MOE).
• Healthcare industry experience a plus.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

8 - 15 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid


We are hiring Java / Full Stack Developers who are passionate about designing scalable enterprise applications, modernizing legacy systems, and delivering high-performance backend and frontend solutions. This role involves hands-on coding, architectural input, mentoring junior developers, and working in Agile environments using DevOps and microservices best practices.

Key Responsibilities:
• Design and develop enterprise-grade web applications based on client business requirements.
• Lead design sessions, contribute to architectural decisions, and perform code reviews.
• Collaborate with application, security, data, and infrastructure teams for scalable and secure system design.
• Execute modernization projects using microservices, APIs, containers, and cloud-native tools.
• Build and enhance backend services using Java 8+, Spring Boot, Hibernate.
• Build and integrate frontend components using React, JavaScript, HTML, CSS.
• Ensure clean code quality, performance tuning, and CI/CD pipeline integration.
• Mentor junior developers and contribute to agile delivery practices.
• Deploy applications on Kubernetes/OpenShift with SQL/NoSQL backends.

Required Experience & Skills:
• 5-8 years of hands-on experience in Java-based application development.
• Strong understanding of Spring Boot, Hibernate, and REST/SOAP API development.
• Proficiency with frontend technologies like React, JavaScript, HTML, CSS.
• Microservices architecture and cloud-native development (12-factor apps, containerization).
• Knowledge of tools like Git, Maven/Gradle, Docker, Kubernetes.
• Experience with relational (PostgreSQL) and NoSQL databases.
• Familiarity with DevOps concepts including CI/CD, shell scripting, Linux CLI.
• Exposure to Agile methodologies and working in fast-paced cross-functional teams.
• Good understanding of security, scalability, and performance tuning.

Preferred Qualifications:
• Bachelor's degree in Computer Science or a related technical field.
• Experience in full-stack development within Agile/Scrum setups.
• Certifications in Java, Cloud (AWS/Azure/GCP), or Kubernetes/Docker.
• Experience in large-scale application deployments and modernization programs.
• Knowledge of modern engineering practices like TDD, CI/CD, Observability.

Top 10 Must-Have Skills:
1. Java 8+ & Spring Boot
2. Microservices Architecture & REST APIs
3. Hibernate / JPA
4. Cloud Platforms: AWS / Azure / GCP
5. Frontend Development: React, JavaScript, HTML, CSS
6. SQL / NoSQL Databases (PostgreSQL, MongoDB)
7. DevOps Tools: Docker, Kubernetes, CI/CD pipelines
8. Git & Build Tools: Maven / Gradle
9. Test-Driven Development, Agile Practices
10. Linux CLI, Shell Scripting & Container Orchestration

Posted 2 weeks ago

Apply

6.0 - 9.0 years

7 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Role Overview: We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs based in Hyderabad. Responsibilities include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.

Technical Requirements:
1. Proficiency in ETL, batch, and streaming processing
2. Experience with BigQuery, Cloud Storage, and Cloud SQL
3. Strong programming skills in Python, SQL, and Apache Beam for data processing (a minimal pipeline sketch follows this listing)
4. Understanding of data modeling and schema design for analytics
5. Knowledge of data governance, security, and compliance in GCP
6. Familiarity with machine learning workflows and integration with GCP ML tools
7. Ability to optimize performance within data pipelines

Functional Requirements:
1. Ability to collaborate with Data Operations, Software Engineers, Data Scientists, and Business SMEs to develop data product features
2. Experience in leading and mentoring peers within an existing development team
3. Strong communication skills to craft and communicate robust solutions
4. Proficiency in working with Engineering Leads, Enterprise and Data Architects, and Business Architects to build appropriate data foundations
5. Willingness to work on contemporary data architecture in public and private cloud environments

This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, leveraging deep experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification: Engineering Graduate / Postgraduate

Criteria:
1. Proficient in ETL, Python, and Apache Beam for data processing efficiency.
2. Demonstrated expertise in BigQuery, Cloud Storage, and Cloud SQL.
3. Strong collaboration skills with cross-functional teams for data product development.
4. Comprehensive knowledge of data governance, security, and compliance in GCP.
5. Experienced in optimizing performance within data pipelines for efficiency.
6. Relevant experience: 6-9 years

Connect at 9993809253
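For context on the Apache Beam requirement above, a minimal, hedged sketch of a batch ETL pipeline using the Beam Python SDK is shown below. The project ID, bucket, table, and field names are illustrative placeholders, not details from the role.

```python
# Minimal, illustrative Apache Beam batch pipeline (Python SDK).
# Bucket, dataset, table, and field names below are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_record(line: str) -> dict:
    """Parse one JSON line into the row shape expected by BigQuery."""
    record = json.loads(line)
    return {"id": record["id"], "amount": float(record["amount"])}


def run() -> None:
    options = PipelineOptions(
        runner="DataflowRunner",           # or "DirectRunner" for local testing
        project="my-gcp-project",           # placeholder project id
        region="asia-south1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/raw/*.json")
            | "Parse" >> beam.Map(parse_record)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.transactions",
                schema="id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline runs unchanged on Dataflow or locally via the DirectRunner, which is typically how such pipelines are unit-tested before deployment.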

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Gurugram

Work from Office

Naukri logo

As a Technical Lead, you will be responsible for leading the development and delivery of the platforms. This includes overseeing the entire product lifecycle, from solution design through execution and launch, building the right team, and collaborating closely with business and product teams.

Primary Responsibilities:
• Design end-to-end solutions that meet business requirements and align with the enterprise architecture.
• Define the architecture blueprint, including integration, data flow, application, and infrastructure components.
• Evaluate and select appropriate technology stacks, tools, and frameworks.
• Ensure proposed solutions are scalable, maintainable, and secure.
• Collaborate with business and technical stakeholders to gather requirements and clarify objectives.
• Act as a bridge between business problems and technology solutions.
• Guide development teams during the execution phase to ensure solutions are implemented according to design.
• Identify and mitigate architectural risks and issues.
• Ensure compliance with architecture principles, standards, policies, and best practices.
• Document architectures, designs, and implementation decisions clearly and thoroughly.
• Identify opportunities for innovation and efficiency within existing and upcoming solutions.
• Conduct regular performance and code reviews, and provide feedback to development team members to support their professional development.
• Lead proof-of-concept initiatives to evaluate new technologies.

Functional Responsibilities:
• Facilitate daily stand-up meetings, sprint planning, sprint reviews, and retrospectives.
• Work closely with the product owner to prioritize the product backlog and ensure that user stories are well defined and ready for development.
• Identify and address issues or conflicts that may impact project delivery or team morale.
• Experience with Agile project management tools such as Jira and Trello.

Required Skills:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 7+ years of experience in software engineering, with at least 3 years in a solution architecture or technical leadership role.
• Proficiency with the AWS or GCP cloud platform.
• Strong implementation knowledge of the JavaScript stack: Node.js and React.js.
• Experience with database engines (MySQL and PostgreSQL), with proven knowledge of database migrations and high-throughput, low-latency use cases.
• Experience with key-value and document stores such as Redis, MongoDB, and similar.
• Preferred knowledge of distributed technologies (Kafka, Spark, Trino, or similar) with proven experience in event-driven data pipelines (a minimal consumer sketch follows this listing).
• Proven experience setting up big data pipelines to handle high-volume transactions and transformations.
• Experience with BI tools such as Looker, Power BI, Metabase, or similar.
• Experience with data warehouses such as BigQuery, Redshift, or similar.
• Familiarity with CI/CD pipelines, containerization (Docker/Kubernetes), and IaC (Terraform/CloudFormation).

Good to Have:
• Certifications such as AWS Certified Solutions Architect, Azure Solutions Architect Expert, TOGAF, etc.
• Experience setting up analytical pipelines using BI tools (Looker, Power BI, Metabase, or similar) and low-level Python tools such as Pandas, NumPy, and PyArrow.
• Experience with data transformation tools such as dbt, SQLMesh, or similar.
• Experience with data orchestration tools such as Apache Airflow, Kestra, or similar.
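As a reference point for the event-driven pipeline requirement above, the sketch below shows a minimal Kafka consumer loop using the confluent-kafka Python client. The broker address, topic, and consumer-group names are illustrative assumptions, not details from the role.

```python
# Minimal, illustrative event-driven consumer using the confluent-kafka client.
# Broker address, topic, and group id are placeholders.
import json

from confluent_kafka import Consumer, KafkaError


def main() -> None:
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",   # placeholder broker
        "group.id": "orders-pipeline",            # placeholder consumer group
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["orders"])                # placeholder topic

    try:
        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                # Ignore benign partition-EOF events; surface real errors.
                if msg.error().code() != KafkaError._PARTITION_EOF:
                    raise RuntimeError(msg.error())
                continue
            event = json.loads(msg.value().decode("utf-8"))
            # A downstream step (enrich, transform, load to a warehouse) would go here.
            print(f"processed order {event.get('order_id')}")
    finally:
        consumer.close()


if __name__ == "__main__":
    main()
```

In production such a consumer would typically commit offsets explicitly and hand records to a processing framework rather than print them, but the loop above is the core of the pattern.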

Posted 2 weeks ago

Apply

3.0 - 5.0 years

3 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Cloud Admin: Core Responsibilities

Cloud Resource Provisioning & Management
• Execute provisioning and decommissioning activities for cloud resources (compute, storage, network, security groups) per approved service requests or automation triggers, ensuring adherence to approved configurations and naming conventions.
• Maintain accurate tagging and metadata for all provisioned resources to enable visibility, policy enforcement, and proper cost attribution across business units or projects (a scripted sketch follows this listing).
• Implement and maintain resource policies, such as auto-shutdown schedules, retention rules, and rightsizing recommendations, to ensure optimal usage and compliance with organizational guardrails.

Operational Maintenance & Monitoring
• Monitor the health and completion of backup jobs, including VM snapshots, database dumps, and object storage versioning, escalating failures to relevant teams or L3 support.
• Apply and track lifecycle rules for storage buckets, archives, and temporary data to avoid unnecessary storage cost accumulation.
• Perform routine cloud maintenance tasks, including patch scheduling, instance resizing, log file rotation and cleanup, and temporary volume management, per defined maintenance windows and SOPs.

Deliverables: Provisioning & Lifecycle Records
• Cloud provisioning logs, including timestamps, resource details, request origin, and tagging validation for all resources created or modified.
• Backup job summaries highlighting success/failure status, size, timestamp, and target recovery point, with retention validation.
• Resource deallocation and cost recovery reports listing terminated resources, associated cost savings, and confirmation of associated tag/policy cleanup.

Please share your updated resume at netra.chaubal@bootlabstech.com
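Two of the recurring tasks above, label/tag enforcement and auto-shutdown schedules, are usually scripted. The sketch below shows one possible approach with the google-cloud-compute Python client; the project, zone, and label keys are illustrative assumptions rather than policies defined by the employer.

```python
# Minimal, illustrative sketch: audit GCE instances for required labels and stop
# instances that opted into an auto-shutdown label. Project, zone, and label
# names are placeholders.
from google.cloud import compute_v1

PROJECT = "my-gcp-project"        # placeholder
ZONE = "asia-south1-a"            # placeholder
REQUIRED_LABELS = {"cost-center", "owner"}


def audit_and_shutdown() -> None:
    client = compute_v1.InstancesClient()
    for instance in client.list(project=PROJECT, zone=ZONE):
        labels = dict(instance.labels)

        # Flag instances missing mandatory cost-attribution labels.
        missing = REQUIRED_LABELS - labels.keys()
        if missing:
            print(f"{instance.name}: missing labels {sorted(missing)}")

        # Stop running instances that carry an auto-shutdown label.
        if labels.get("auto-shutdown") == "true" and instance.status == "RUNNING":
            print(f"stopping {instance.name} per auto-shutdown label")
            client.stop(project=PROJECT, zone=ZONE, instance=instance.name)


if __name__ == "__main__":
    audit_and_shutdown()
```

Run from Cloud Scheduler or a cron job, a script like this gives the auditable provisioning and deallocation records the deliverables section calls for.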

Posted 2 weeks ago

Apply

8.0 - 12.0 years

20 - 30 Lacs

Hyderabad

Work from Office

Naukri logo

The ideal candidate will have extensive experience with Google Cloud Platform's data services, building scalable data pipelines, and implementing modern data architecture solutions.

Key Responsibilities
• Design and implement data lake solutions using Cloud Storage and Storage Transfer Service
• Develop and maintain ETL/ELT pipelines for data processing and transformation
• Orchestrate complex data workflows using Cloud Composer (managed Apache Airflow); a sample DAG sketch follows this listing
• Build and optimize BigQuery data models and implement data governance practices
• Configure and maintain Dataplex for unified data management across the organization
• Implement monitoring solutions using Cloud Monitoring to ensure data pipeline reliability
• Create and maintain data visualization solutions using Looker for business stakeholders
• Collaborate with data scientists and analysts to deliver high-quality data products

Required Skills & Experience
• 8+ years of hands-on experience with GCP data services, including:
  - Cloud Storage and Storage Transfer Service for data lake implementation
  - BigQuery for data warehousing and analytics
  - Cloud Composer for workflow orchestration
  - Dataplex for data management and governance
  - Cloud Monitoring for observability and alerting
• Strong experience with ETL/ELT processes and data pipeline development
• Proficiency in SQL and at least one programming language (Python preferred)
• Experience with Looker or similar BI/visualization tools
• Knowledge of data modeling and dimensional design principles
• Experience implementing data quality monitoring and validation

Preferred Qualifications
• Google Cloud Professional Data Engineer certification
• Experience with streaming data processing using Dataflow or Pub/Sub
• Knowledge of data mesh or data fabric architectures
• Experience with dbt or similar transformation tools
• Familiarity with CI/CD practices for data pipelines
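To make the Cloud Composer responsibility above concrete, here is a minimal, hedged Airflow DAG sketch that loads raw files from Cloud Storage into BigQuery and then builds a derived table. The DAG id, bucket, dataset, table, and SQL are placeholders, not details from the listing.

```python
# Minimal, illustrative Cloud Composer (Apache Airflow) DAG: load raw files from
# GCS into BigQuery, then build a derived model. All names and SQL are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_sales_pipeline",            # placeholder
    schedule="0 2 * * *",                      # run daily at 02:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_sales",
        bucket="my-raw-bucket",                 # placeholder
        source_objects=["sales/*.csv"],
        destination_project_dataset_table="my-project.staging.sales_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    build_model = BigQueryInsertJobOperator(
        task_id="build_daily_sales_model",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.analytics.daily_sales` AS
                    SELECT sale_date, SUM(amount) AS total_amount
                    FROM `my-project.staging.sales_raw`
                    GROUP BY sale_date
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_model
```

The operator-per-step structure keeps each task idempotent and retryable, which is what Cloud Monitoring alerting and the reliability goals in the listing depend on.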

Posted 3 weeks ago

Apply

13.0 - 25.0 years

15 - 30 Lacs

Navi Mumbai, Bengaluru

Work from Office

Naukri logo

GCP with Docker

Posted 3 weeks ago

Apply