
220 GCP Cloud Jobs - Page 4

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 8.0 years

15 - 25 Lacs

Hyderabad, Pune

Work from Office

Source: Naukri

DevOps Engineer | 5-10 Years | Pune & Hyderabad

We are looking for a skilled DevOps Engineer with 5 to 12 years of experience to join our dynamic team. The ideal candidate will have a strong background in DevOps practices, CI/CD pipeline creation, and experience with GCP services. You will play a crucial role in ensuring smooth development, deployment, and integration processes.

Key Responsibilities:
- CI/CD Pipeline Creation: Design, implement, and manage CI/CD pipelines using GitHub, ensuring seamless integration and delivery of software.
- Version Control: Manage and maintain code repositories using GitHub, following best practices for version control and collaboration.
- Infrastructure as Code: Write and maintain infrastructure as code (IaC) using Terraform/YAML, ensuring consistent and reliable deployment processes.
- GCP Services Management: Utilize Google Cloud Platform (GCP) services to build, deploy, and scale applications; manage and optimize cloud resources.

Posted 1 week ago

Apply

3.0 - 4.0 years

1 - 6 Lacs

Pune

Work from Office


About Invezza Technologies
Invezza is a technology consulting and outsourced product development company. We believe in growing together and creating long-term relationships with the common purpose of delivering innovative solutions and cutting-edge digital experiences. We are technically creative innovators with a deep passion for technology.

Work Location: Baner, Pune

Responsibilities:
- Collaborate with software engineering teams to understand application requirements and architecture, and integrate DevOps practices into the development lifecycle.
- Manage and maintain cloud infrastructure, ensuring scalability, performance, and security.
- Design, implement, and maintain automated CI/CD pipelines to facilitate rapid and reliable software delivery.
- Monitor system health, performance, and security using tools like Prometheus, Grafana, and other monitoring solutions.
- Implement and manage containerization technologies like Docker and orchestration tools like Kubernetes for deploying and managing applications.
- Collaborate with security teams to implement and maintain best practices for security and compliance in the DevOps processes.
- Troubleshoot and resolve infrastructure and application-related issues in development, testing, and production environments.
- Stay up to date with the latest trends and technologies in DevOps, cloud computing, and automation.

Qualifications:
- 2-4 years of professional experience in DevOps administration or software development roles.
- Strong experience with Linux commands, shell scripting, Docker, MySQL configuration, and cloud platforms such as AWS, Azure, or GCP.
- Proficiency in scripting languages such as Bash and Python for automation tasks.
- Hands-on experience with CI/CD tools such as Jenkins, GitLab CI/CD, or CircleCI.
- Solid understanding of networking concepts and security best practices.
- Experience with version control systems such as Git.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and collaboration skills to work effectively in cross-functional teams.
- Relevant certifications such as AWS Certified DevOps Engineer are a plus.

Job Type: Full-time.

Posted 1 week ago

Apply

8.0 - 10.0 years

12 - 15 Lacs

Chennai

Work from Office


Experienced in Java, JAX-WS, REST APIs, SQL, Spring Boot, cloud (AWS/GCP), Agile, Tomcat, scalable web apps & performance tuning.

Posted 1 week ago

Apply

2.0 - 7.0 years

2 - 6 Lacs

New Delhi, Gurugram

Work from Office


Customer Support Role (International Voice Process)
Contact: Kajal - 8860800235
Domains: Travel / Banking / Technical
Eligibility: Graduates/undergraduates, freshers and experienced candidates
Salary: Depending on last take-home (up to 7 LPA)
Location: Gurugram, work from office, 5 days a week, 24x7 shifts, cab provided + incentives
Immediate joiners preferred.

Posted 1 week ago

Apply

5.0 - 7.0 years

10 - 20 Lacs

Chennai

Work from Office


We're Hiring – Encryption Specialist
Location: Chennai (On-site) | Open Positions: 1 | Apply at: Shrihari.r@camsonline.com

Join CAMS (Computer Age Management Services Limited), a leading fintech organization, and be part of a high-impact security team focused on protecting sensitive data through strong encryption and key management practices.

Role: Encryption Specialist
We are looking for a skilled and experienced professional with a strong foundation in encryption technologies, cryptographic key management, and regulatory compliance within enterprise environments.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Security, Mathematics, Engineering, or a related field.
- Relevant professional certifications are highly desirable, such as CISSP (Certified Information Systems Security Professional), CEH (Certified Ethical Hacker), ECES (EC-Council Certified Encryption Specialist), Google Professional Cloud Security Engineer, or Microsoft security certifications.

Required Experience:
- 5-7 years of experience in Information Technology or Information Security.
- 3-5 years of dedicated, hands-on experience designing, implementing, and managing encryption technologies and cryptographic key management solutions.
- Proven experience with data-at-rest encryption techniques for databases, file systems, and storage across on-premise and cloud environments.
- Hands-on experience with Google Cloud Platform (GCP) Cloud KMS, including key concepts like CMEK (Customer-Managed Encryption Keys) and CSEK (Customer-Supplied Encryption Keys).
- Experience implementing and managing data-in-transit encryption using TLS/SSL and VPNs.
- Familiarity with O365 encryption capabilities, including Service Encryption and Purview Information Protection.
- Proficiency in key lifecycle management and working with KMS/HSM platforms; experience with specific on-prem HSM/KMS vendors is a strong plus.
- Exposure to regulated environments, especially within the Financial Services or BFSI sectors, is preferred.

Skills & Competencies:
- Deep understanding of cryptographic concepts: symmetric/asymmetric encryption, hashing algorithms, digital signatures, and key exchange methods.
- Strong knowledge of TLS/SSL protocols and secure configurations.
- Expertise in key management technologies: KMS, HSM.
- Hands-on experience with GCP Cloud KMS and related GCP security services.
- Familiarity with Microsoft O365/Purview encryption features and Azure Key Vault.
- Understanding of encryption in common operating systems, databases, and storage platforms.
- Strong analytical and troubleshooting skills related to cryptographic implementations.
- Excellent documentation and communication skills.
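The cryptographic fundamentals listed above — hashing, keyed message authentication, and constant-time comparison — can be sketched with Python's standard library alone. This is purely illustrative: the message and key here are made up, and in a production setup the key would be generated and held in a KMS/HSM rather than in application code.

```python
import hashlib
import hmac
import secrets

# Integrity: a SHA-256 digest detects any change to the data
message = b"customer-record-42"
digest = hashlib.sha256(message).hexdigest()
print(len(digest))  # 64 hex characters (256 bits)

# Authenticity: an HMAC ties the digest to a secret key,
# so only a key holder can produce a valid tag
key = secrets.token_bytes(32)  # stand-in for a KMS/HSM-managed key
tag = hmac.new(key, message, hashlib.sha256).digest()

# Verification must use a constant-time comparison to avoid timing leaks
expected = hmac.new(key, message, hashlib.sha256).digest()
print(hmac.compare_digest(tag, expected))  # True
```

The same pattern (hash for integrity, HMAC for authenticity, constant-time verify) underlies the TLS and digital-signature items in the skills list, just with asymmetric keys instead of a shared secret.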

Posted 1 week ago

Apply

2.0 - 6.0 years

10 - 20 Lacs

Gurugram

Hybrid


Responsibilities:
- Own and be responsible for testing and delivery of the product or core modules.
- Assess the quality, usability, and functionality of each release.
- Review software requirements and prepare test scenarios for complex business rules.
- Interact with stakeholders to understand detailed requirements and expectations.
- Gain technical knowledge and aim to be a quality SME in core functional components.
- Develop and organize QA processes for assigned projects to align with overall QA goals.
- Design and implement a test automation strategy supporting multiple product development teams.
- Lead efforts for related automation projects, design reviews, and code reviews.
- Produce regular reports on the status and quality of software releases, and be prepared to present findings in an informative way to all levels of audiences.

What We're Looking For:
- Participate in and improve the whole lifecycle of services, from inception and design through deployment, operation, and refinement.
- Participate in the release planning process to review functional specifications and create release plans.
- Collaborate with software engineers to design verification test plans.
- Design regression test suites and review them with engineering, applications, and the field organization.
- Lead and train a dynamically changing team of colleagues who participate in testing processes.
- Exhibit expertise in handling large-scale programs/projects involving multiple stakeholders (Product, Dev, DevOps).
- Maintain a leading-edge understanding of QA best practices as related to interactive technologies.
- Design and implement a test automation strategy for multiple product development teams at the onset of the project.
- Work closely with leadership and IT to provide input into the design and implementation of the automation framework.
- Work with Architecture, Engineering, Quality Engineering, IT, and Product Operations leaders to create and implement processes that accelerate the delivery of new features and products with high quality and at scale.
- Develop and contribute to a culture of high performance, transparency, and continuous improvement as it relates to infrastructure services and streamlining of the development pipeline.
- Participate in a diverse global team of talented engineers, providing guidance, support, and clear priorities.

Who You Are:
- Total experience: 2 to 6 years.
- Hands-on experience with at least two leading testing tools/frameworks such as Playwright, Robot Framework, K6, or JMeter.
- Hands-on experience working with Python.
- Experience with SQL/NoSQL databases.
- Experience working on cloud-native applications.
- Hands-on experience with Google Cloud services such as Kubernetes, Composer, Dataplex, Pub/Sub, BigQuery, AlloyDB, Cloud SQL, Looker Studio, etc.
- Strong analytical skills and ability to solve complex technical problems.
- API testing: must understand RESTful design and best practices; hands-on experience testing APIs and test tools.
- Experience with load/stress/performance testing and related tools.
- Experience with Azure DevOps (or other similar issue/bug tracking systems) is required.
- Ability to think abstractly, since conforming to the norm rarely finds bugs quickly.
- Experience working in an Agile software development organization.
- Experience supporting development and product teams.
- Excellent verbal, written, and interpersonal communication skills; ability to interact with all levels of an organization.
- Ability to work in an advisory capacity to identify key technical and business problems, and to develop and evaluate solutions.

Posted 1 week ago

Apply

20.0 - 25.0 years

22 - 27 Lacs

Mumbai

Work from Office


As an Enterprise Architect in IBM Consulting, you'll serve as a leader in defining solutions for clients. You'll be the advocate for the client while guiding the technical team to implementation. You'll collaborate with client stakeholders and internal partners to understand the business problem and requirements, the constraints of the system, and the concerns of the various stakeholders in order to systematically transform these into detailed solutions (architectures) for the client.

Your primary responsibilities include:
- Strategic Analysis and Modernized Solution Design: Analyse and translate IT requirements using modernization frameworks into components of a modernized solution.
- Optimize Modernization Assets and Methods: Reuse and enhance digital modernization assets, methods, and collateral.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong skills expected in cloud platforms.
- Bachelor's or higher degree in Computer Science, Information Technology, or a related field.
- 15-20 years of experience in the technology architecture domain.
- 10+ years of experience in architecture and design, with a focus on enterprise-level solutions.
- Proven hands-on expertise in Azure/AWS/GCP cloud.
- Strong understanding of cloud-native technologies, microservices, APIs, containers, Kubernetes, and serverless computing.
- Experience using at least two low-code platforms for development projects.
- Experience in DevSecOps practices and tools.
- Experience with GenAI-based code generators and Microsoft Copilot.
- Demonstrated experience in creating and maintaining cloud strategies and architectural standards.
- Experience architecting medium and large complex systems in the India market, with a value greater than USD 2 million.
- Must have expertise working with at least one industry: financial services or manufacturing.
- Earlier career experience in the design, development, and architecture of Java/JEE/.NET/integration-based systems.
- Experience in infrastructure architecture: sizing, estimation, provisioning, and automation of compute/storage/network.
- Excellent interpersonal and communication skills to collaborate effectively with cross-functional teams.
- Azure/AWS Architect-level certification.
- India market experience is mandatory.

Preferred technical and professional experience:
- Experience in Agile ways of working.
- Creative problem-solving skills.
- Excellent written and oral communication skills.

Posted 1 week ago

Apply

4.0 - 6.0 years

5 - 15 Lacs

Bengaluru

Work from Office


Immediate joiners only.
- 5+ years of software development experience.
- 5+ years of experience with Python or Java.
- Hands-on experience writing and understanding complex SQL (Hive/PySpark data frames), including optimizing joins while processing huge amounts of data.
- 3+ years of hands-on experience working with MapReduce, Hive, and Spark (core, SQL, and PySpark).
- Hands-on experience with Google Cloud Platform (BigQuery, Dataproc, Cloud Composer).
- 3+ years of experience in UNIX shell scripting.
- Experience in analysis, design, development, testing, and implementation of system applications.
- Ability to effectively communicate with internal and external business partners.

Additional good-to-have requirements:
- Understanding of distributed ecosystems.
- Experience designing and building solutions using Kafka streams or queues.
- Experience with NoSQL databases, i.e., HBase, Cassandra, Couchbase, or MongoDB.
- Experience with data visualization tools like Tableau, Sisense, or Looker.
- Ability to learn and apply new programming concepts.
- Knowledge of the financial reporting ecosystem is a plus.
- Experience leading teams of engineers and scrum teams.
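The "optimizing joins while processing huge amounts of data" requirement above usually boils down to one pattern: reduce row counts (filter or pre-aggregate) before the join rather than after it. A minimal sketch using Python's built-in sqlite3 as a stand-in for Hive/BigQuery — table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders(id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
CREATE TABLE customers(id INTEGER PRIMARY KEY, region TEXT);
INSERT INTO customers VALUES (1, 'APAC'), (2, 'EMEA');
INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 250.0), (3, 2, 75.0);
""")

# Aggregate-then-join: collapse the large fact table to one row per
# customer BEFORE joining, so the join touches far fewer rows
rows = conn.execute("""
    SELECT c.region, o.total
    FROM (SELECT customer_id, SUM(amount) AS total
          FROM orders
          GROUP BY customer_id) AS o
    JOIN customers AS c ON c.id = o.customer_id
    ORDER BY c.region
""").fetchall()
print(rows)  # [('APAC', 350.0), ('EMEA', 75.0)]
```

On distributed engines the same rewrite also reduces shuffle volume, which is typically where the real cost of a big join lives.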

Posted 1 week ago

Apply

5.0 - 10.0 years

27 - 42 Lacs

Pune, Mumbai (All Areas)

Work from Office


Are you passionate about building cutting-edge AI applications that make a real-world impact? We're looking for a skilled AI Engineer to join our team at a fast-growing SaaS-based company in Pune.

Who We're Looking For: An experienced AI Engineer with 5+ years of hands-on experience developing and deploying AI/ML solutions using the Python stack, with deep knowledge of modern AI/ML frameworks and agentic technologies like LangChain, LangGraph, AutoGen, CrewAI, Hugging Face, OpenAI APIs, and more.

Educational Background: B.Tech/M.Tech in Computer Science from Tier 1 institutions only (IITs, NITs, IIITs, BITS Pilani).

Key Responsibilities:
- Develop and deploy scalable, AI-powered applications using Python.
- Build intelligent systems using advanced frameworks (LangChain, Hugging Face, PyTorch, TensorFlow, etc.).
- Create robust APIs for integrating AI into web and cloud applications.
- Apply MLOps to streamline model deployment, CI/CD, and monitoring.
- Collaborate with cross-functional teams to align AI capabilities with business needs.
- Manage deployments on AWS, Azure, or GCP, ensuring high performance and security.
- Troubleshoot production issues and deliver timely resolutions.
- Continuously explore and experiment with emerging AI technologies.

Tech Stack & Skills:
- Python, REST APIs, AI/ML agentic frameworks
- PyTorch, TensorFlow, Hugging Face, OpenAI APIs
- Cloud: AWS, GCP, or Azure | Docker, Kubernetes
- MLOps, CI/CD, Git, automation frameworks
- Strong grasp of data processing and machine learning fundamentals

Soft Skills: Curious, self-driven, and collaborative mindset; passion for innovation in AI/ML; excellent problem-solving and communication skills.

Location: Pune | Mode: Full-time, 5-day work from office | Client: Leading SaaS-based product company

Posted 1 week ago

Apply

5.0 - 7.0 years

8 - 9 Lacs

Hyderabad, Coimbatore

Work from Office


Job Title: Process Specialist – Cloud & Technical Support (SME Voice)
Location: Hyderabad / Coimbatore (willing to relocate)
Account: EGS
Work Mode: Work from office | Rotational shifts (24x7)
Experience: 2.5 to 5 years
Education: Any graduate (preferred: B.E./B.Tech in CS/IT or a relevant field)
CTC: 8 to 9 LPA

About the Opportunity: This is a one-of-a-kind opportunity to dive into multiple cloud-based technologies, learn new features, and transform traditional IaaS layers into more advanced and automated environments. Join a team driven by technology, innovation, and collaboration, delivering best-in-class cloud solutions to enterprise clients.

Key Responsibilities:
- Provide expert-level voice-based technical support (B2B) to enterprise clients.
- Troubleshoot complex issues involving networking, web protocols, system administration, scripting, and cloud tools.
- Analyze system and trace logs, read error logs, and identify root causes.
- Use CLI tools on Windows and Linux systems for advanced troubleshooting.
- Manage escalated customer tickets with a focus on timely resolution and customer satisfaction.
- Maintain thorough case documentation, follow quality standards, and adhere to Service Level Objectives (SLOs).
- Collaborate with cross-functional teams for complex issue resolution.
- Contribute to knowledge base articles and internal resources.
- Stay updated with emerging technologies and cloud solutions.
- Participate in rotational on-call schedules for critical incident handling.

Mandatory Skills:
- Technical customer support / server support / B2B support (voice)
- Networking, DNS, VPN, web protocols (HTTP/S, SSL/TLS)
- System administration (Windows & Linux CLI), Active Directory
- Cloud support: Azure / Google Cloud Platform (GCP) (intermediate level)
- Log analysis and troubleshooting (advanced level)
- Working knowledge of APIs and SQL
- Moderate coding/scripting knowledge: Python, JavaScript, HTML
- Strong analytical and problem-solving abilities
- Excellent communication skills (English B2-level proficiency)

Preferred Skills / Nice to Have:
- Familiarity with Google Workspace (GWS)
- Knowledge of GCP project setup and migration tools
- Experience with BigQuery
- Technical certifications (e.g., CompTIA Network+, Security+, Linux+, Azure Administrator, GCP Associate Cloud Engineer)

Soft Skills:
- Detail-oriented in support case management and documentation
- Effective problem-solving and logical reasoning
- Strong verbal and written communication
- Excellent time management and multitasking in a fast-paced environment
- Team player with the ability to work independently

Selection Process: Written assessment (BU customized technical test), followed by interview rounds (technical test + technical ops).

Interested candidates can connect at: Email: jinal@careerguideline.net | Phone: 7758825565
Note: Immediate joiners are preferred. On-paper designation as an SME is mandatory. We are hiring for MNCs.

Posted 1 week ago

Apply

5.0 - 9.0 years

20 - 35 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid


Salary: 20 to 35 LPA | Experience: 5 to 8 years | Location: Gurgaon (Hybrid) | Notice: Immediate to 30 days

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
- Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases.
- Troubleshoot issues related to data processing workflows and provide timely resolutions.

Desired Candidate Profile:
- 5-9 years of experience in data engineering, with expertise in GCP and BigQuery data engineering.
- Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc.
- Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.

Posted 1 week ago

Apply

4.0 - 8.0 years

20 - 35 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid


Salary: 20 to 35 LPA | Experience: 4 to 8 years | Location: Gurgaon | Notice: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 1 week ago

Apply

8.0 - 10.0 years

20 - 35 Lacs

Bengaluru

Work from Office


About Us: ValueLabs is a leading provider of technology solutions and services to businesses and organizations around the world. We're passionate about delivering exceptional service and support to our clients, and we're committed to building long-term relationships based on trust, respect, and mutual benefit.

Position: Senior/Principal Full Stack Engineer – AI & Cloud Solutions
Location: Bangalore
Work Mode: Hybrid (3 days WFO; 2 days WFH)
Experience: 8-10 years
Employment Type: Full-time

Job Description: We are seeking a highly skilled and experienced Full Stack Engineer with a strong background in software architecture and AI-based solution development. The ideal candidate will have 8-10 years of experience in full stack development and 3-4 years in building and deploying AI-driven applications. This role requires deep expertise in modern web technologies, cloud platforms, and scalable microservices architecture.

Roles and Responsibilities:
- Design and develop scalable, secure, and high-performance full stack applications.
- Architect and implement microservices using FastAPI, Python, and PostgreSQL.
- Build dynamic and responsive front-end interfaces using ReactJS, Redux, and Context API.
- Integrate AI/ML models into production systems, leveraging tools for NLP, computer vision, and deep learning.
- Optimize application performance and ensure high code quality through unit testing and TDD practices.
- Collaborate with cross-functional teams to deliver end-to-end solutions.
- Implement CI/CD pipelines and containerized deployments using Docker, Kubernetes, and cloud platforms like AWS, Azure, or GCP.
- Follow secure coding practices and contribute to architectural decisions.

Required Skills:
- Frontend: ReactJS, Redux, Hooks, Context API, JavaScript, HTML, CSS
- Backend: Python, FastAPI, RESTful APIs, GraphQL, asynchronous programming
- Database: PostgreSQL, SQLAlchemy (ORM)
- Testing: Jest, Mocha, PyTest, unittest, TDD
- DevOps & Cloud: Docker, Kubernetes, CI/CD, AWS, Azure, GCP
- AI/ML: Experience with AI tools, NLP, computer vision, machine learning, and deep learning frameworks (e.g., TensorFlow, PyTorch)
- Soft Skills: Strong problem-solving, communication, and collaboration skills
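The "asynchronous programming" item in the backend skills above refers to the pattern FastAPI is built on: issuing independent I/O calls concurrently instead of one after another. A minimal stdlib-only asyncio sketch — the function names and delays are invented stand-ins for real database or HTTP calls:

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # Stand-in for a real async call (database query, HTTP request, ...)
    await asyncio.sleep(delay)
    return f"{name}:done"

async def main() -> list[str]:
    # Run all three "calls" concurrently; wall time is roughly the
    # slowest delay, not the sum of all delays
    return list(await asyncio.gather(
        fetch("users", 0.02),
        fetch("orders", 0.01),
        fetch("stock", 0.015),
    ))

results = asyncio.run(main())
print(results)  # ['users:done', 'orders:done', 'stock:done']
```

asyncio.gather preserves argument order in its results, which is why the output order is deterministic even though the coroutines finish at different times.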

Posted 1 week ago

Apply

4.0 - 6.0 years

14 - 15 Lacs

Mumbai

Work from Office


4+ years of experience in cloud pre-sales, consulting, or architecture roles, specifically with GCP. Working with enterprise clients across different verticals. Prepare technical presentations, demos, proof-of-concept solution documentation, proposals, and responses to RFPs/RFIs.

Required Candidate Profile: Solid understanding of GCP services such as Compute Engine, App Engine, GKE, BigQuery, Cloud Storage, Pub/Sub, IAM, VPC, etc. Ability to perform TCO/ROI analysis and cost optimization for GCP solutions.

Posted 1 week ago

Apply

6.0 - 11.0 years

17 - 30 Lacs

Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR

Hybrid


Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of GCP Senior Data Engineer.

We are seeking a highly experienced and visionary Senior Google Cloud Data Engineer to spearhead the design, development, and optimization of our data infrastructure and pipelines on the Google Cloud Platform (GCP). With over 10 years of hands-on experience in data engineering, you will be instrumental in building scalable, reliable, and performant data solutions that power our advanced analytics, machine learning initiatives, and real-time reporting. You will provide technical leadership, mentor team members, and champion best practices for data engineering within a GCP environment.

Responsibilities:
- Architect, design, and implement end-to-end data pipelines on GCP using services like Dataflow, Cloud Composer (Airflow), Pub/Sub, and BigQuery.
- Build and optimize data warehousing solutions leveraging BigQuery's capabilities for large-scale data analysis.
- Design and implement data lakes on Google Cloud Storage, ensuring efficient data organization and accessibility.
- Develop and maintain scalable ETL/ELT processes to ingest, transform, and load data from diverse sources into GCP.
- Implement robust data quality checks, monitoring, and alerting mechanisms within the GCP data ecosystem.
- Collaborate closely with data scientists, analysts, and business stakeholders to understand their data requirements and deliver high-impact solutions on GCP.
- Lead the evaluation and adoption of new GCP data engineering services and technologies.
- Implement and enforce data governance policies, security best practices, and compliance requirements within the Google Cloud environment.
- Provide technical guidance and mentorship to other data engineers on the team, promoting knowledge sharing and skill development within the GCP context.
- Troubleshoot and resolve complex data-related issues within the GCP infrastructure.
- Contribute to the development of data engineering standards, best practices, and comprehensive documentation specific to GCP.

Minimum Qualifications / Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10+ years of progressive experience in data engineering roles, with a strong focus on cloud technologies.
- Deep and demonstrable expertise with the Google Cloud Platform (GCP) and its core data engineering services (e.g., BigQuery, Dataflow, Cloud Composer, Cloud Storage, Pub/Sub, Cloud Functions).
- Extensive experience designing, building, and managing large-scale data pipelines and ETL/ELT workflows specifically on GCP.
- Strong proficiency in SQL and at least one programming language relevant to data engineering on GCP (e.g., Python).
- Comprehensive understanding of data warehousing concepts, data modeling techniques optimized for BigQuery, and NoSQL database options on GCP (e.g., Cloud Bigtable, Firestore).
- Solid grasp of data governance principles, data security best practices within GCP (IAM, KMS), and compliance frameworks.
- Excellent problem-solving, analytical, and debugging skills within a cloud environment.
- Exceptional communication, collaboration, and presentation skills, with the ability to articulate technical concepts clearly to various audiences.

Preferred Qualifications / Skills:
- Google Cloud certifications relevant to data engineering (e.g., Professional Data Engineer).
- Experience with infrastructure-as-code tools for GCP (e.g., Terraform, Deployment Manager).
- Familiarity with data streaming technologies on GCP (e.g., Dataflow, Pub/Sub).
- Experience with machine learning workflows and MLOps on GCP (e.g., Vertex AI).
- Knowledge of containerization technologies (Docker, Kubernetes) and their application within GCP data pipelines (e.g., Dataflow FlexRS).
- Experience with data visualization tools that integrate well with GCP (e.g., Looker).
- Familiarity with data cataloging and data lineage tools on GCP (e.g., Data Catalog).
- Experience in [mention specific industry or domain relevant to your company].
- Proven experience leading technical teams and mentoring junior engineers in a GCP environment.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
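The ETL/ELT responsibilities above all follow the same extract-transform-load shape. Here is a deliberately toy, stdlib-only Python sketch in which plain dicts stand in for a Pub/Sub source and a BigQuery sink — every record field and function name is invented for illustration:

```python
from datetime import date, datetime

# Toy "source" records, standing in for rows ingested from Pub/Sub or GCS
raw = [
    {"user": "a", "ts": "2024-05-01T10:00:00", "amount": "12.50"},
    {"user": "b", "ts": "2024-05-01T11:30:00", "amount": "7.25"},
    {"user": "a", "ts": "2024-05-02T09:15:00", "amount": "3.00"},
]

def transform(row: dict) -> dict:
    # Parse and type the raw fields; the data quality checks mentioned
    # above (schema, null, range validation) would live at this stage
    return {
        "user": row["user"],
        "day": datetime.fromisoformat(row["ts"]).date(),
        "amount": float(row["amount"]),
    }

def load(rows) -> dict:
    # Aggregate per (user, day) — the shape a BigQuery sink table might take
    table: dict = {}
    for r in rows:
        key = (r["user"], r["day"])
        table[key] = table.get(key, 0.0) + r["amount"]
    return table

warehouse = load(transform(r) for r in raw)
print(warehouse[("a", date(2024, 5, 1))])  # 12.5
```

In a real GCP pipeline each stage maps onto a managed service (Pub/Sub → Dataflow → BigQuery, orchestrated by Cloud Composer), but the transform/load separation and per-key aggregation are the same idea at scale.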

Posted 2 weeks ago

Apply

7.0 - 12.0 years

7 - 17 Lacs

Hyderabad, Coimbatore

Hybrid


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: Microsoft Azure Analytics Services
Experience: Minimum 7 years required
Educational Qualification: BE/B.Tech/M.Tech

Summary: As an Application Lead, you will be responsible for leading the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve working with Microsoft Azure Analytics Services and collaborating with cross-functional teams to deliver high-quality solutions.

Roles & Responsibilities:
- Lead the design, development, and deployment of applications using Microsoft Azure Analytics Services.
- Collaborate with cross-functional teams to ensure the timely delivery of high-quality solutions.
- Act as the primary point of contact for all application-related issues, providing technical guidance and support to team members.
- Ensure adherence to best practices and standards for application development, testing, and deployment.
- Provide technical leadership and mentorship to team members, fostering a culture of innovation and continuous improvement.

Professional & Technical Skills:
- Must have: Strong experience in Microsoft Azure Analytics Services.
- Good to have: Experience with other cloud platforms such as AWS or Google Cloud Platform.
- Experience designing, developing, and deploying applications using Microsoft Azure Analytics Services.
- Strong understanding of cloud computing concepts and architectures.
- Experience with agile development methodologies and DevOps practices.
- Excellent problem-solving and analytical skills.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Pune

Work from Office


Job Description Job Overview We are looking for a skilled and proactive SRE (Site Reliability Engineer) to manage, maintain, and troubleshoot cloud data pipelines across our infrastructure. The ideal candidate is a data engineering expert with deep knowledge of cloud services, data pipeline architecture, and a software engineering mindset to optimize performance, reliability, and cost-efficiency. This role demands strong problem-solving abilities, hands-on experience with any cloud platform (preferably GCP), and the capability to work independently in a fast-paced environment. Key Responsibilities Manage and support cloud data pipelines and associated infrastructure Monitor the performance and reliability of pipelines, including Informatica ETL workflows, MDM, and Control-M jobs Troubleshoot and resolve complex issues related to data pipelines and data processing systems Optimize data pipeline efficiency to reduce operational costs and failure rates Automate repetitive tasks and streamline data pipeline management processes Conduct post-incident reviews and implement improvements for future reliability Perform SLA-oriented monitoring and recommend enhancements to ensure compliance Collaborate with cross-functional teams to improve and document systems and workflows Support real-time monitoring and alerting for mission-critical data processes Continuously improve systems based on proactive testing and performance insights Required Skills and Qualifications 5+ years of experience in Data Engineering support and enhancement Proficiency in Python for data processing and automation Strong SQL skills and experience working with relational databases Solid understanding of data pipeline architectures and ETL processes Hands-on experience with any cloud platform (GCP, Azure, or AWS; GCP preferred) Familiarity with version control systems like Git.
Experience in monitoring and alerting solutions for data systems Skilled in conducting post-incident analysis and reliability improvements Exposure to data visualization tools such as Google Looker Studio, Tableau, Domo, or Power BI is a plus Strong analytical and problem-solving skills Excellent verbal and written communication abilities Ability to work in a 24x7 shift environment Preferred Qualifications Bachelor’s degree in Computer Science, Engineering, or a related technical field. Professional Cloud Certification (e.g., GCP Professional Data Engineer) is a plus.
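The retry-and-SLA-monitoring duties described in this SRE role can be sketched in plain Python. This is a minimal illustration, not any company's actual tooling; the function names, retry counts, and SLA thresholds are assumptions made for the example.

```python
import time


def run_with_retry(task, retries=3, backoff_seconds=0):
    """Run a pipeline task, retrying on failure and recording each attempt.

    Returns the task result plus an attempt log that a post-incident
    review could inspect. Raises after the final failed attempt.
    """
    attempts = []
    for attempt in range(1, retries + 1):
        try:
            result = task()
            attempts.append(("success", attempt))
            return result, attempts
        except Exception as exc:
            attempts.append(("failure", attempt))
            if attempt == retries:
                raise RuntimeError(f"task failed after {retries} attempts") from exc
            time.sleep(backoff_seconds)


def sla_breached(started_at, finished_at, sla_seconds):
    """Flag a run whose wall-clock duration exceeded its SLA window."""
    return (finished_at - started_at) > sla_seconds
```

In practice the attempt log would feed an alerting tool rather than a return value, but the shape of the logic (bounded retries, backoff, SLA comparison) is the same.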

Posted 2 weeks ago

Apply

6.0 - 10.0 years

15 - 30 Lacs

Pune

Work from Office


Job description Location: Pune Exp: 6-10 Years We are looking to hire a Data Engineer with strong hands-on experience in SQL, ETL, PySpark, and GCP. Required Past Experience: 3+ years of experience with ETL pipelines, along with GCP cloud experience and PySpark. Experience in Shell/Python scripting. Worked on at least one development project from an ETL perspective. File processing using Shell/Python scripting. Hands-on experience writing business logic in SQL or PL/SQL. ETL testing and troubleshooting. Hands-on experience with code versioning tools like Git and SVN. Good knowledge of the code deployment process and documentation. Required Skills and Abilities: Mandatory Skills - Hands-on and deep experience working in ETL, GCP cloud, and PySpark. Secondary Skills - Strong in SQL queries and shell scripting. Good communication skills to understand business requirements from SMEs. Basic knowledge of data modeling. Good understanding of E2E data pipelines and code optimization. Hands-on experience developing ETL pipelines for heterogeneous sources. Good to have experience building a cloud ETL pipeline.
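The "file processing using Shell/Python scripting" and "business logic" items above can be illustrated with a small Python transform. This is a generic sketch, not this employer's pipeline; the column names (`order_id`, `amount`) and the threshold rule are hypothetical.

```python
import csv
import io


def transform_orders(raw_csv, min_amount=0.0):
    """Apply simple business logic to a CSV extract: drop rows below a
    threshold and keep only the normalized columns of interest."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    rows = []
    for row in reader:
        amount = float(row["amount"])
        if amount >= min_amount:
            rows.append({"order_id": row["order_id"], "amount": round(amount, 2)})
    return rows
```

In a real pipeline the same function would read from and write to files or cloud storage; using an in-memory string here keeps the example self-contained.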

Posted 2 weeks ago

Apply

6.0 - 9.0 years

11 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Job Title : Java Backend Developer Location : Bangalore, Chennai, Hyderabad Job Type: Full time Job Summary: We are looking for a skilled Java Developer to join our dynamic team. The successful candidate will be responsible for developing high-performance, scalable web applications and services using Java and Node.js technologies. You will work closely with cross-functional teams to deliver innovative solutions. Key Responsibilities: Design, develop, and maintain server-side applications in Java and Node.js. Collaborate with front-end developers to integrate user-facing elements with server-side logic. Optimize applications for maximum speed and scalability. Qualifications: 5+ years of experience in Java backend applications. 2+ years in GCP/Google Cloud Platform. What We Offer: Competitive salary and benefits package. Opportunities for professional growth and development. A dynamic work environment with a supportive team. Flexible working hours and potential for remote work. How to Apply: Interested candidates should submit their resume and a cover letter to [contact information or application link].

Posted 2 weeks ago

Apply

1.0 - 3.0 years

10 - 15 Lacs

Bengaluru

Work from Office


SRE 1 (Cloud Ops) Locations: Bangalore & Pune Exp - 1 to 3 yrs Candidates only from B2C product companies Exp - GCP, Prometheus, Grafana, ELK, New Relic, Pingdom, or PagerDuty, Kubernetes Experience with CI/CD tools 5-day week Rotational shift

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Bengaluru

Hybrid


As an Advanced Analytics & Reporting Analyst 3 in the Mobile Analytics domain, you will be responsible for analyzing and interpreting data related to the entire mobile user journey. Your insights will be crucial in understanding user behavior, optimizing app performance, and supporting our GTM team. Key Responsibilities: Mobile User Journey Analysis: Analyze stages such as downloads (organic/paid), app ranking, category penetration, and user transitions from freemium to paid models. App Store Analytics: Track and interpret metrics from Google and Apple consoles, focusing on key indicators like app ratings, reviews, and crash rates. App Launch Metrics: Monitor and analyze the frequency of app launches by users. Feature Usage Analysis: Evaluate the usage of specific app features. 3rd Party Tools Expertise: Utilize tools like Sensor Tower, Data.AI, Google App Console, App Store Console, AppTweak, and Branch to track performance and understand paid media effectiveness. Regular Reporting: Compile and present mobile DDOM scorecards and paid media performance reports every Monday. Deep-Dive Analysis: Conduct in-depth analyses and support monthly category mobile RTBs (e.g., DC RTB, LrM RTB). Project Support: Assist in unlocking growth and driving various analytical projects. Qualifications: Proven experience in the mobile analytics domain. Strong understanding of the overall mobile user journey and key metrics. Proficiency with 3rd party tools such as Sensor Tower/Data.AI, Google App Console, App Store Console, AppTweak, and Branch. Excellent analytical and problem-solving skills. Ability to compile and present detailed reports. Strong communication and collaboration skills. Experience working in an app agency or similar environment is highly valuable.

Posted 2 weeks ago

Apply

4.0 - 5.0 years

8 - 18 Lacs

Gurugram

Work from Office


4–5 yrs total dev exp, incl. 2+ yrs in Golang Prior Java, JS, or Node.js background preferred Build scalable backend services & REST APIs Strong in microservices & concurrent programming Work with SQL/NoSQL DBs (PostgreSQL, MongoDB) Required Candidate profile Familiar with Git, CI/CD, Docker, Kubernetes Bonus: Kafka/RabbitMQ, cloud (AWS/GCP), Agile Computer Science /Engineering degree preferred

Posted 2 weeks ago

Apply

4.0 - 8.0 years

12 - 18 Lacs

Hyderabad, Chennai, Coimbatore

Hybrid


We are seeking a skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have experience in designing, developing, and maintaining scalable data pipelines and architectures using Hadoop, PySpark, ETL processes, and Cloud technologies. Responsibilities: Design, develop, and maintain data pipelines for processing large-scale datasets. Build efficient ETL workflows to transform and integrate data from multiple sources. Develop and optimize Hadoop and PySpark applications for data processing. Ensure data quality, governance, and security standards are met across systems. Implement and manage Cloud-based data solutions (AWS, Azure, or GCP). Collaborate with data scientists and analysts to support business intelligence initiatives. Troubleshoot performance issues and optimize query executions in big data environments. Stay updated with industry trends and advancements in big data and cloud technologies. Required Skills: Strong programming skills in Python, Scala, or Java. Hands-on experience with the Hadoop ecosystem (HDFS, Hive, Spark, etc.). Expertise in PySpark for distributed data processing. Proficiency in ETL tools and workflows (SSIS, Apache NiFi, or custom pipelines). Experience with Cloud platforms (AWS, Azure, GCP) and their data-related services. Knowledge of SQL and NoSQL databases. Familiarity with data warehousing concepts and data modeling techniques. Strong analytical and problem-solving skills. Interested candidates can reach us at +91 7305206696/ saranyadevib@talentien.com

Posted 2 weeks ago

Apply

10.0 - 16.0 years

27 - 42 Lacs

Bengaluru

Work from Office


Skill: Oracle PL/SQL with PostgreSQL Experience: 10 to 15 years Location: Bangalore, Hyderabad, Pune & Kolkata Oracle JD: 10+ years of experience in Oracle databases. Should have hands-on development and support experience in Oracle DB. Well versed with PL/SQL coding; implement queries, packages, stored procedures, cursors, functions & triggers. Ability to debug and document SQL and PL/SQL code. Good understanding of database designs and data modelling techniques. Well versed with data loading and validations. Exposure and knowledge of PostgreSQL DB and PL/pgSQL coding is an added advantage. Experience in DB migrations from Oracle to Postgres is a plus. Knowledge or exposure to Oracle database administration tasks. Knowledge and exposure to GCP cloud infrastructure and tools. Ability to communicate with onshore clients and dependent teams. Should have good problem-solving and communication skills with a positive can-do attitude. Should be proactive, require minimal supervision, and be able to work as a team player.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies