Jobs
Interviews

20 Compute Engine Jobs

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

1 - 1 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider operating throughout India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place, one that benefits lives, communities and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description: At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate it into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of current-state Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit business. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform, Terraform, Tekton, PostgreSQL, PySpark, Python, APIs, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, Kubernetes.

Experience Required: GCP Data Engineer Certified. 5+ years successfully designing and implementing data warehouses and ETL processes, delivering high-quality data solutions. 5+ years of complex SQL development experience. 2+ years of experience with programming languages such as Python, Java, or Apache Beam. 3+ years of GCP expertise as a cloud engineer, specializing in taking cloud infrastructure and applications into production-scale solutions.

Experience Preferred: In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to data processing (batch and real time), leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage services such as Cloud Storage and DevOps tools such as Tekton, GitHub, Terraform, and Docker. Expert in designing, optimizing, and troubleshooting complex data pipelines. Experience developing with microservice architecture on a container orchestration framework. Experience designing pipelines and architectures for data processing. Passion and self-motivation to develop, experiment with, and implement state-of-the-art data engineering methods and techniques. Self-directed, works independently with minimal supervision, and adapts to ambiguous environments. Evidence of a proactive problem-solving mindset and willingness to take the initiative. Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas with cross-functional teams and all levels of management. Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity. Data engineering or development experience gained in a regulated financial environment. Experience coaching and mentoring data engineers. Experience with project management tools such as Atlassian JIRA. Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment. Experience with data security, governance, and compliance best practices in the cloud. Experience with AI solutions or platforms that support AI solutions. Experience using data science concepts on production datasets to generate insights.

Experience Range: 5+ years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
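For illustration only, here is a minimal sketch of the kind of streaming pipeline this role describes: an Apache Beam job that reads JSON events from Pub/Sub and lands them in BigQuery via Dataflow. The project, subscription, bucket, table, and schema names are placeholders, not the client's actual resources.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

# Placeholder project, region, and staging bucket.
options = PipelineOptions(
    project="my-gcp-project",
    region="us-central1",
    runner="DataflowRunner",
    temp_location="gs://my-staging-bucket/tmp",
)
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (
        p
        # Read raw bytes from a Pub/Sub subscription (placeholder name).
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-gcp-project/subscriptions/receivables-events")
        # Decode and parse each message as JSON.
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        # Append rows to a BigQuery table, creating it if needed.
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.receivables_events",
            schema="account_id:STRING,amount:NUMERIC,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```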

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). Your collaboration with development, operations, and security teams will ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and overcoming intricate infrastructure challenges, we welcome you to apply.

Your responsibilities will include:
- Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP using Infrastructure as Code (IaC) tools like Terraform.
- Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components.
- Developing and maintaining CI/CD pipelines for automated deployment and release management using various tools.
- Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence.
- Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimization solutions.
- Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments.
- Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications.
- Participating in on-call rotations for critical incident response and timely resolution of production issues.
- Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures.
- Keeping up to date with new GCP services, features, and industry best practices to propose and implement improvements.
- Contributing to cost optimization efforts by identifying and implementing efficiencies in cloud resource utilization.

We require you to have:
- A Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server.
- 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications.
- 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code.
- Strong experience with JavaScript and a modern JavaScript framework, with Vue.js preferred.
- Proven leadership and mentoring skills with development teams.
- Strong understanding of microservices architecture and serverless computing.
- Experience with relational databases like SQL Server and PostgreSQL.
- Excellent problem-solving, analytical, and communication skills, along with Agile/Scrum environment experience.

What can make you stand out:
- GCP Cloud Certification.
- UI development experience with HTML, JavaScript, Angular, and Bootstrap.
- Agile environment experience with Scrum and XP.
- Relational database experience with SQL Server and PostgreSQL.
- Proficiency in Atlassian tools like JIRA, Confluence, and GitHub.
- Working knowledge of Python, exceptional problem-solving and analytical abilities, and strong teamwork skills.

Posted 1 week ago

Apply

5.0 - 13.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a highly skilled and experienced Cloud Architect/Engineer with deep expertise in Google Cloud Platform (GCP). Your primary responsibility is to design, build, and manage scalable and reliable cloud infrastructure on GCP. You will leverage various GCP services such as Compute Engine, Cloud Run, BigQuery, Pub/Sub, Cloud Functions, Dataflow, Dataproc, IAM, and Cloud Storage to ensure high-performance cloud solutions. Your role also includes developing and maintaining CI/CD pipelines, automating infrastructure deployment using Infrastructure as Code (IaC) principles, and implementing best practices in cloud security, monitoring, performance tuning, and logging. Collaboration with cross-functional teams to deliver cloud solutions aligned with business objectives is essential.

You should have 5+ years of hands-on experience in cloud architecture and engineering, with at least 3 years of practical experience on Google Cloud Platform. In-depth expertise in the GCP services mentioned above is required, along with a strong understanding of networking, security, containerization (Docker, Kubernetes), and CI/CD pipelines. Experience with monitoring, performance tuning, and logging in cloud environments is preferred, and familiarity with DevSecOps practices and tools such as HashiCorp Vault is a plus.

As a GCP Cloud Architect/Engineer, you will also contribute to system reliability, backup, and disaster recovery strategies. This hybrid role is based out of Pune and requires 10 to 13 years of total relevant experience.
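As a small, hedged illustration of the Pub/Sub work this role mentions, the sketch below publishes a JSON event to a topic that a Dataflow or Cloud Functions consumer could pick up. The project name, topic, and message fields are invented placeholders.

```python
import json

from google.cloud import pubsub_v1

# Placeholder project and topic.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "orders")

# Example event payload; fields are illustrative only.
event = {"order_id": "A-1001", "status": "CREATED"}

# Publish the event as UTF-8 bytes with one custom attribute.
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    source="checkout-service",
)
print("Published message id:", future.result())
```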

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

You would be working on:
- Developing and implementing Generative AI / AI solutions on Google Cloud Platform
- Working with cross-functional teams to design and deliver AI-powered products and services
- Developing, versioning, and executing Python code
- Deploying models as endpoints in the Dev environment
- Building on a solid understanding of Python
- Utilizing deep learning frameworks such as TensorFlow, PyTorch, or JAX
- Working on natural language processing (NLP) and machine learning (ML)
- Utilizing Cloud Storage, Compute Engine, Vertex AI, Cloud Functions, Pub/Sub, etc.
- Providing Generative AI support in Vertex AI, including hands-on experience with Generative AI models such as Gemini and Vertex AI Search

Your profile should include:
- Experience in Generative AI development with Google Cloud Platform
- Experience in delivering an AI solution on the Vertex AI platform
- Experience in developing and deploying AI solutions with ML

What you'll love about working here:
- You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders.
- You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new parent support via flexible work.
- You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
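A minimal sketch of the Vertex AI Generative AI work described above, assuming the Vertex AI Python SDK (google-cloud-aiplatform) and access to a Gemini model. The project, region, model name, and prompt are placeholders and may differ from what the team actually uses.

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Placeholder project and region.
vertexai.init(project="my-gcp-project", location="us-central1")

# Example model name; available Gemini versions change over time.
model = GenerativeModel("gemini-1.5-flash")

# A single-turn text generation call.
response = model.generate_content(
    "Summarise the key risks in this loan application in three bullet points."
)
print(response.text)
```

In practice this kind of model would typically be deployed or consumed behind an endpoint in a dev environment first, as the responsibilities above note.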

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

About GlobalLogic: GlobalLogic, a leader in digital product engineering with over 30,000 employees, helps brands worldwide design and develop innovative products, platforms, and digital experiences. By integrating experience design, complex engineering, and data expertise, GlobalLogic assists clients in envisioning possibilities and accelerating their transition into the digital businesses of tomorrow. Operating design studios and engineering centers globally, GlobalLogic extends its deep expertise to customers in industries such as communications, financial services, automotive, healthcare, technology, media, manufacturing, and semiconductor. GlobalLogic is a Hitachi Group Company.

Leadership & Strategy: As part of GlobalLogic, you will lead and mentor a team of cloud engineers, providing technical guidance and support for career development. You will define cloud architecture standards and best practices across the organization and collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives. Your responsibilities will include driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.

Leadership Experience: With a minimum of 3 years in technical leadership roles managing engineering teams, you should have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management and resource planning, and strong presentation and communication skills for executive-level reporting, are essential for this role.

Certifications (Preferred): Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.

Technical Excellence: You should have over 10 years of experience designing and implementing enterprise-scale cloud solutions using GCP services. As a technical expert, you will architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services. Your role will involve leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services. Additionally, you will design complex integrations with multiple data sources and systems, implement security best practices, and troubleshoot and resolve technical issues while establishing preventive measures.

Technical Skills: Expert-level proficiency in Python and experience in additional languages such as Java, Go, or Scala. Deep knowledge of GCP services like Dataflow, Compute Engine, BigQuery, Cloud Functions, and others is required, along with advanced knowledge of Docker, Kubernetes, and container orchestration patterns, and experience in cloud security, infrastructure as code, and CI/CD practices.

Cross-functional Collaboration: Collaborating with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions, leading cross-functional project teams, presenting technical recommendations to executive leadership, and establishing relationships with GCP technical account managers are key aspects of this role.

What We Offer: At GlobalLogic, we prioritize a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us to experience an inclusive culture, opportunities for growth and advancement, impactful projects, work-life balance, and a safe, reliable, and ethical global company.

About GlobalLogic: GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences since 2000. Collaborating with forward-thinking companies globally, GlobalLogic continues to transform businesses and redefine industries through intelligent products, platforms, and services.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Cloud Engineer at AVP level in Bangalore, India, you will be responsible for designing, implementing, and managing cloud infrastructure and services on Google Cloud Platform (GCP). Your key responsibilities will include designing, deploying, and managing scalable, secure, and cost-effective cloud environments on GCP; developing Infrastructure as Code (IaC) using tools like Terraform; ensuring security best practices, IAM policies, and compliance with organizational and regulatory standards; configuring and managing VPCs, subnets, firewalls, VPNs, and interconnects for secure cloud networking; setting up CI/CD pipelines for automated deployments; implementing monitoring and alerting using tools like Stackdriver; optimizing cloud spending; designing disaster recovery and backup strategies; deploying and managing GCP databases; and managing containerized applications using GKE and Cloud Run.

You will be part of the Platform Engineering Team, which is responsible for building and maintaining foundational infrastructure, tooling, and automation to enable efficient, secure, and scalable software development and deployment. The team focuses on creating a self-service platform for developers and operational teams, ensuring reliability, security, and compliance while improving developer productivity.

To excel in this role, you should have strong experience with GCP services, proficiency in scripting and Infrastructure as Code, knowledge of DevOps practices and CI/CD tools, an understanding of security, IAM, networking, and compliance in cloud environments, experience with monitoring tools, and strong problem-solving skills. Google Cloud certifications would be a plus.

You will receive training, development, coaching, and support to help you excel in your career, along with a culture of continuous learning and a range of flexible benefits tailored to suit your needs. The company strives for a positive, fair, and inclusive work environment where employees are empowered to excel together every day. For further information about the company and its teams, please visit the company website: https://www.db.com/company/company.htm. The Deutsche Bank Group welcomes applications from all individuals and promotes a culture of shared successes and collaboration.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

Beinex is seeking a skilled and motivated Google Cloud Consultant to join our dynamic team. As a Google Cloud Consultant, you will play a pivotal role in assisting our clients in harnessing the power of Google Cloud technologies to drive innovation and transformation. If you are passionate about cloud solutions, client collaboration, and cutting-edge technology, we invite you to join our journey.

Responsibilities:
- Collaborate with clients to understand their business objectives and technology needs, translating them into effective Google Cloud solutions
- Design, implement, and manage Google Cloud Platform (GCP) architectures, ensuring scalability, security, and performance
- Provide technical expertise and guidance to clients on GCP services, best practices, and cloud-native solutions, and adopt an Infrastructure as Code (IaC) approach to establish an advanced infrastructure for both internal and external stakeholders
- Conduct cloud assessments and create migration strategies for clients looking to transition their applications and workloads to GCP
- Work with cross-functional teams to plan, execute, and optimise cloud migrations, deployments, and upgrades
- Assist clients in optimising their GCP usage by analysing resource utilisation, recommending cost-saving measures, and enhancing overall efficiency
- Collaborate with development teams to integrate cloud-native technologies and solutions into application design and development processes
- Stay updated with the latest trends, features, and updates in the Google Cloud ecosystem and provide thought leadership to clients
- Troubleshoot and resolve technical issues related to GCP services and configurations
- Create and maintain documentation for GCP architectures, solutions, and best practices
- Conduct training sessions and workshops for clients to enhance their understanding of GCP technologies and usage

Key Skills Requirements:
- Profound expertise in Google Cloud Platform services, including but not limited to Compute Engine, App Engine, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, Cloud Functions, VPC, IAM, and Cloud Security
- Strong understanding of GCP networking concepts, including VPC peering, firewall rules, VPN, and hybrid cloud configurations
- Experience with Infrastructure as Code (IaC) tools such as Terraform or Google Cloud Deployment Manager
- Hands-on experience with containerisation technologies like Docker and Kubernetes
- Proficiency in scripting languages such as Python and Bash
- Familiarity with cloud monitoring, logging, and observability tools and practices
- Knowledge of DevOps principles and practices, including CI/CD pipelines and automation
- Strong problem-solving skills and the ability to troubleshoot complex technical issues
- Excellent communication skills to interact effectively with clients, team members, and stakeholders
- Previous consulting or client-facing experience is a plus
- Relevant Google Cloud certifications are highly desirable

Perks: Careers at Beinex
- Comprehensive health plans
- Learning and development
- Workation and outdoor training
- Hybrid working environment
- On-site travel opportunity
- Beinex-branded merchandise
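As a hedged illustration of the Python scripting and Cloud Storage skills listed above, the sketch below uploads a local file to a bucket with the google-cloud-storage client. Bucket, file, and object names are placeholders.

```python
from google.cloud import storage


def upload_report(bucket_name: str, local_path: str, destination_blob: str) -> None:
    """Upload a local file to gs://<bucket_name>/<destination_blob>."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(destination_blob)
    blob.upload_from_filename(local_path)
    print(f"Uploaded {local_path} to gs://{bucket_name}/{destination_blob}")


if __name__ == "__main__":
    # Placeholder bucket and paths.
    upload_report("client-reports-bucket", "daily_report.csv",
                  "reports/2024/daily_report.csv")
```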

Posted 1 week ago

Apply

8.0 - 13.0 years

20 - 30 Lacs

Gurugram

Work from Office

Hi, wishes from GSN!!! Pleasure connecting with you!!!

We have been in corporate search services, identifying and bringing in stellar, talented professionals for our reputed IT and non-IT clients in India, and have been successfully meeting the varied needs of our clients for the last 20 years.

At present, GSN is hiring a GCP ENGINEER for one of our leading MNC clients. Please find below the details for your better understanding:
1. Work Location: Gurugram
2. Job Role: GCP Engineer
3. Experience: 8+ years
4. CTC Range: Rs. 20 LPA to Rs. 30 LPA
5. Work Type: WFO (Hybrid)

****** Looking for IMMEDIATE JOINER ******

Who are we looking for? An MLOps Engineer with AWS experience.

Required Skills: GCP Architect certification, Terraform, GitLab, shell scripting, and GCP services including Compute Engine, Cloud Storage, Dataflow, BigQuery, and IAM.

****** Looking for IMMEDIATE JOINER ******

Best regards,
Kaviya | GSN | Kaviya@gsnhr.net | 9150016092 | Google review: https://g.co/kgs/UAsF9W

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Cloud Engineering Team Leader at GlobalLogic, you will be responsible for providing technical guidance and career development support to a team of cloud engineers. You will define cloud architecture standards and best practices across the organization, collaborating with senior leadership to develop a cloud strategy aligned with business objectives. Your role will involve driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.

With a background in technical leadership roles managing engineering teams, you will have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management and resource planning, and strong presentation and communication skills for executive-level reporting, are essential. Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.

You will leverage your 10+ years of experience in designing and implementing enterprise-scale cloud solutions using GCP services to architect sophisticated cloud solutions using Python and advanced GCP services. Leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services will be part of your responsibilities. Ensuring optimal performance and scalability of complex integrations with multiple data sources and systems, implementing security best practices and compliance frameworks, and troubleshooting and resolving technical issues will be key aspects of your role.

Your technical skills will include expert-level proficiency in Python with experience in additional languages, deep expertise with GCP services such as Dataflow, Compute Engine, BigQuery, and Cloud Functions, advanced knowledge of Docker, Kubernetes, and container orchestration patterns, extensive experience in cloud security, proficiency in Infrastructure as Code tools like Terraform and Cloud Deployment Manager, and CI/CD experience with advanced deployment pipelines and GitOps practices.

As part of the GlobalLogic team, you will benefit from a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility in work arrangements, and being part of a high-trust organization. You will have the chance to work on impactful projects, engage with collaborative teammates and supportive leaders, and contribute to shaping cutting-edge solutions in the digital engineering domain.
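To illustrate the Cloud Functions work mentioned above, here is a minimal sketch of an HTTP-triggered function in the Python runtime using the functions-framework library. The function name and payload fields are invented for the example, and the downstream forwarding step is only indicated in a comment.

```python
import functions_framework
from flask import jsonify


@functions_framework.http
def ingest_event(request):
    """HTTP-triggered function that validates a JSON payload and acknowledges it."""
    payload = request.get_json(silent=True)
    if not payload or "event_id" not in payload:
        return jsonify({"error": "missing event_id"}), 400
    # In a real deployment, the validated event would be forwarded here to
    # Pub/Sub, BigQuery, or another downstream service.
    return jsonify({"status": "accepted", "event_id": payload["event_id"]}), 200
```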

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

As a GCP Cloud Engineer at Ascentt, you will play a crucial role in designing, deploying, and managing cloud infrastructure on Google Cloud Platform to provide scalable solutions for our development teams. Your expertise will contribute to turning enterprise data into real-time decisions using advanced machine learning and GenAI, with a focus on solving hard engineering problems with real-world industry impact.

Your key responsibilities will include designing and managing GCP infrastructure such as Compute Engine, GKE, Cloud Run, and networking components. You will be expected to implement CI/CD pipelines and infrastructure as code, preferably using Terraform, and configure monitoring, logging, and security using the Cloud Operations Suite. Automating deployments and maintaining disaster recovery procedures will be essential aspects of your role, along with collaborating closely with development teams on architecture and troubleshooting.

To excel in this role, you should possess at least 5 years of GCP experience with core services such as Compute Engine, Cloud Storage, Cloud SQL, and BigQuery. Strong knowledge of Kubernetes, Docker, and networking is essential, along with proficiency in Terraform and scripting languages such as Python and Bash. Experience with CI/CD tools, cloud migrations, and GitHub is required, and holding a GCP Associate or Professional certification would be advantageous. A Bachelor's degree or equivalent experience is also necessary to succeed in this position.

Preferred skills for this role include experience with multi-cloud environments such as AWS or Azure, familiarity with configuration management tools such as Ansible and Puppet, database administration knowledge, and expertise in cost optimization strategies. If you are a passionate builder looking to shape the future of industrial intelligence through cutting-edge data analytics and AI/ML solutions, Ascentt welcomes your application to join our team and make a significant impact in the automotive and manufacturing industries.
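As a small, hedged example of the Python scripting and Compute Engine administration mentioned above, this sketch inventories the VMs in one zone with the google-cloud-compute client. Project and zone values are placeholders.

```python
from google.cloud import compute_v1


def list_instances(project_id: str, zone: str) -> None:
    """Print name, machine type, and status for each VM in the given zone."""
    client = compute_v1.InstancesClient()
    for instance in client.list(project=project_id, zone=zone):
        # machine_type comes back as a full URL; keep only the short name.
        machine_type = instance.machine_type.rsplit("/", 1)[-1]
        print(f"{instance.name}\t{machine_type}\t{instance.status}")


if __name__ == "__main__":
    # Placeholder project and zone.
    list_instances("my-gcp-project", "asia-south1-a")
```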

Posted 2 weeks ago

Apply

10.0 - 20.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description:

Cloud Infrastructure & Deployment: Design and implement secure, scalable, and highly available cloud infrastructure on GCP. Provision and manage compute, storage, network, and database services. Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.

Architecture & Design: Translate business requirements into scalable cloud solutions. Recommend GCP services aligned with application needs and cost optimization. Participate in high-level architecture and solution design discussions.

DevOps & Automation: Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI). Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite). Enable autoscaling, load balancing, and zero-downtime deployments.

Security & Compliance: Ensure compliance with security standards and best practices.

Migration & Optimization: Support cloud migration projects from on-premise or other cloud providers to GCP. Optimize performance, reliability, and cost of GCP workloads.

Documentation & Support: Maintain technical documentation and architecture diagrams. Provide L2/L3 support for GCP-based services and incidents.

Required Skills and Qualifications:
- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity

Posted 3 weeks ago

Apply

10.0 - 20.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description:

Cloud Infrastructure & Deployment: Design and implement secure, scalable, and highly available cloud infrastructure on GCP. Provision and manage compute, storage, network, and database services. Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.

Architecture & Design: Translate business requirements into scalable cloud solutions. Recommend GCP services aligned with application needs and cost optimization. Participate in high-level architecture and solution design discussions.

DevOps & Automation: Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI). Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite). Enable autoscaling, load balancing, and zero-downtime deployments.

Security & Compliance: Ensure compliance with security standards and best practices.

Migration & Optimization: Support cloud migration projects from on-premise or other cloud providers to GCP. Optimize performance, reliability, and cost of GCP workloads.

Documentation & Support: Maintain technical documentation and architecture diagrams. Provide L2/L3 support for GCP-based services and incidents.

Required Skills and Qualifications:
- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity

Posted 4 weeks ago

Apply

6.0 - 10.0 years

25 - 35 Lacs

Hyderabad

Work from Office

About Client: Hiring for one of the topmost MNCs!!

Job Title: Sr. DevOps Engineer - GCP
Qualification: Any Graduate or above
Relevant Experience: 6 to 9 years

Main Skills:
- GCP (Google Cloud Platform) expertise
- CI/CD (Continuous Integration/Continuous Deployment) implementation
- Infrastructure as Code (IaC)
- Containerization and orchestration with Docker and Kubernetes
- Monitoring and alerting tools
- Scripting and automation using languages like Python, Bash, or PowerShell
- Security and compliance best practices
- Backup and disaster recovery planning
- Performance optimization and troubleshooting
- Cross-functional collaboration and communication
- Agile methodologies and DevOps culture
- Version control with Git and GitHub/GitLab
- Configuration management
- Networking and load balancing in GCP
- GCP resource management and cost optimization
- Familiarity with GCP services like Compute Engine, App Engine, Cloud Storage, and BigQuery

Secondary Skills:
- Set up and maintain cloud-based development and production environments on GCP
- Monitor and optimize cloud infrastructure performance
- Automate deployment of applications and services to GCP
- Develop and maintain scripts and tools to manage GCP resources
- Troubleshoot and debug GCP-related issues
- Implement and maintain security policies and procedures
- Develop and maintain CI/CD pipelines
- Monitor and analyze application and system logs
- Implement and maintain monitoring and alerting systems
- Develop and maintain backup and disaster recovery plans
- Participate in code reviews and provide feedback
- Research and recommend new technologies and best practices

Location: Hyderabad
CTC Range: 25 LPA to 30 LPA
Notice Period: Any
Shift Timing: N/A
Mode of Interview: Virtual
Mode of Work: Work from Office

Vardhani
IT Staffing Analyst
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, INDIA
8686127477 I vardhani@blackwhite.in I www.blackwhite.in

Posted 4 weeks ago

Apply

8.0 - 13.0 years

30 - 45 Lacs

Pune, Bengaluru

Hybrid

Technical Project Manager - GCP DevOps (Immediate Joiner Preferred)

Job Summary: We are looking for a seasoned Project Manager with a strong background in Google Cloud Platform (GCP) and DevOps methodologies. The ideal candidate will be responsible for planning, executing, and finalizing projects according to strict deadlines and within budget. This includes acquiring resources and coordinating the efforts of team members and third-party contractors or consultants in order to deliver projects according to plan. The GCP DevOps Project Manager will also define the project's objectives and oversee quality control throughout its life cycle.

Key Responsibilities:
- Project Leadership: Lead and manage the end-to-end lifecycle of complex cloud infrastructure and DevOps projects on Google Cloud Platform.
- Planning & Scoping: Define project scope, goals, and deliverables that support business objectives in collaboration with senior management and stakeholders.
- Agile/Scrum Management: Facilitate sprint planning, daily stand-ups, retrospectives, and sprint demos within an Agile framework.
- Resource Management: Effectively communicate project expectations to team members and stakeholders in a timely and clear fashion; manage and allocate resources efficiently.
- Risk & Issue Management: Proactively identify, track, and mitigate project risks and issues. Develop and implement effective contingency plans.
- Budget & Timeline: Develop and manage project budgets, timelines, and resource allocation plans. Track project milestones and deliverables.
- Stakeholder Communication: Serve as the primary point of contact for project stakeholders. Prepare and present regular status reports on project progress, problems, and solutions.
- Technical Oversight: Work closely with technical leads and architects to ensure solutions are designed and implemented in line with best practices for security, reliability, and scalability on GCP.
- CI/CD Pipeline Management: Oversee the implementation and optimization of CI/CD pipelines to automate the deployment, testing, and delivery of software.
- Quality Assurance: Ensure that all project deliverables meet high-quality standards and are fully tested before release.

Required Skills and Qualifications:
- Experience: 5+ years of experience in technical project management, with at least 2-3 years focused on cloud infrastructure projects, specifically on GCP.
- GCP Expertise: Strong understanding of core GCP services (e.g., Compute Engine, GKE, Cloud Storage, BigQuery, Cloud SQL, IAM, Cloud Build).
- DevOps Acumen: In-depth knowledge of DevOps principles and hands-on experience with CI/CD tools (e.g., Jenkins, GitLab CI, CircleCI, Cloud Build), infrastructure as code (e.g., Terraform, Deployment Manager), and containerization (e.g., Docker, Kubernetes).
- Project Management Methodology: Proven experience with Agile, Scrum, and/or Kanban methodologies. PMP or Certified ScrumMaster (CSM) certification is a strong plus.
- Leadership: Demonstrated ability to lead and motivate cross-functional technical teams in a fast-paced environment.
- Communication: Exceptional verbal, written, and interpersonal communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
- Problem-Solving: Strong analytical and problem-solving skills with a high attention to detail.

Preferred Qualifications:
- GCP Professional Cloud Architect or Professional Cloud DevOps Engineer certification.
- Experience with hybrid or multi-cloud environments.
- Background in software development or systems administration.
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, ELK Stack, Google Cloud's operations suite).

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Pune, Ahmedabad

Work from Office

We are seeking a skilled and motivated Google / AWS Cloud DevOps Engineer with over 3 years of hands-on experience in building and maintaining scalable, reliable, and secure cloud infrastructure. You will be part of a dynamic team that focuses on delivering robust DevOps solutions using Google Cloud Platform (GCP) and AWS, helping streamline CI/CD pipelines, automate infrastructure provisioning, and optimize cloud-based deployments.

Key Responsibilities:
- Design, implement, and manage scalable and secure infrastructure on Google Cloud Platform / AWS.
- Develop and maintain CI/CD pipelines using tools such as Cloud Build, Jenkins, GitLab CI/CD, or similar.
- Implement infrastructure as code (IaC) using Terraform or Pulumi.
- Monitor system health and performance using AWS tools / GCP's operations suite (formerly Stackdriver).
- Automate manual processes to improve system reliability and deployment frequency.
- Collaborate with software engineers to ensure best DevOps practices are followed in application development and deployment.
- Handle incident response and root cause analysis for production issues.
- Ensure compliance with security and governance policies on AWS / GCP.
- Optimize cost and resource utilization across cloud services.

Required Qualifications:
- 3+ years of hands-on experience with DevOps tools and practices in a cloud environment.
- Strong experience with Google Cloud Platform (GCP) / AWS services (Compute Engine, Kubernetes Engine, Cloud Functions, Cloud Storage, VPC, etc.).
- A Google or AWS Professional Cloud DevOps Engineer certification is mandatory.
- Proficiency with CI/CD tools and version control systems (e.g., Git, GitHub/GitLab, Cloud Build).
- Solid scripting skills in Bash, Python, or similar languages.
- Experience with Docker and Kubernetes.
- Familiarity with monitoring/logging tools such as Prometheus, Grafana, and Cloud Monitoring.
- Knowledge of networking, security best practices, and IAM on GCP / AWS.

Preferred Qualifications:
- Experience with multi-cloud or hybrid cloud environments.
- Familiarity with Agile and DevOps culture and practices.
- Experience with serverless architectures and event-driven design patterns.
- Knowledge of cost optimization and GCP/AWS billing.
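Since the role lists Pulumi alongside Terraform for IaC, here is a minimal Pulumi (Python) sketch of declaring two GCP resources. Resource names, location, and machine settings are placeholders, not a recommended production configuration.

```python
# __main__.py in a Pulumi project configured for GCP.
import pulumi
import pulumi_gcp as gcp

# A regional bucket for build artifacts (placeholder name and location).
artifact_bucket = gcp.storage.Bucket(
    "artifact-bucket",
    location="ASIA-SOUTH1",
    uniform_bucket_level_access=True,
)

# A small VM that could host an internal tool or build agent.
build_agent = gcp.compute.Instance(
    "build-agent",
    machine_type="e2-small",
    zone="asia-south1-a",
    boot_disk=gcp.compute.InstanceBootDiskArgs(
        initialize_params=gcp.compute.InstanceBootDiskInitializeParamsArgs(
            image="debian-cloud/debian-12",
        ),
    ),
    network_interfaces=[gcp.compute.InstanceNetworkInterfaceArgs(network="default")],
)

# Surface useful values as stack outputs.
pulumi.export("bucket_name", artifact_bucket.name)
pulumi.export("build_agent_name", build_agent.name)
```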

Posted 1 month ago

Apply

1.0 - 3.0 years

2 - 4 Lacs

Kolkata

Hybrid

Required Skills:
- Strong proficiency in Python (3.x) and Django (2.x/3.x/4.x)
- Hands-on experience with Django REST Framework (DRF)
- Expertise in relational databases like PostgreSQL or MySQL
- Proficiency with Git and Bitbucket
- Solid understanding of RESTful API design and integration
- Experience in domain pointing and hosting setup on AWS or GCP
- Deployment knowledge on EC2, GCP Compute Engine, etc.
- SSL certificate installation and configuration
- Familiarity with CI/CD pipelines (GitHub Actions, Bitbucket Pipelines, GitLab CI)
- Basic usage of Docker for development and containerization
- Ability to independently troubleshoot server/deployment issues
- Experience managing cloud resources like S3, load balancers, and IAM roles

Preferred Skills:
- Experience with Celery and Redis / RabbitMQ for asynchronous task handling
- Familiarity with front-end frameworks like React or Vue.js
- Exposure to Cloudflare or similar CDN/DNS tools
- Experience with monitoring tools: Prometheus, Grafana, Sentry, or CloudWatch

Why Join Us?
- Work on impactful and modern web solutions
- Growth opportunities across technologies and cloud platforms
- Collaborative, inclusive, and innovation-friendly work environment
- Exposure to challenging and rewarding projects
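To illustrate the Django REST Framework skill above, here is a hedged sketch of a small read-only API endpoint. The app name, model, and fields (myapp, Project, etc.) are hypothetical examples, not part of any real codebase.

```python
from rest_framework import serializers, status
from rest_framework.response import Response
from rest_framework.views import APIView

from myapp.models import Project  # hypothetical app and model


class ProjectSerializer(serializers.ModelSerializer):
    """Serializes a hypothetical Project model for the API."""
    class Meta:
        model = Project
        fields = ["id", "name", "status", "created_at"]


class ProjectListView(APIView):
    """GET /api/projects/ returns all projects as JSON, newest first."""

    def get(self, request):
        projects = Project.objects.all().order_by("-created_at")
        serializer = ProjectSerializer(projects, many=True)
        return Response(serializer.data, status=status.HTTP_200_OK)
```

In a typical project this view would be wired up in urls.py with something like path("api/projects/", ProjectListView.as_view()).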

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

We are looking for an experienced and motivated Senior GCP Data Engineer to join our dynamic data team. In this role, you will be responsible for designing, building, and optimizing data pipelines, implementing advanced analytics solutions, and maintaining robust data infrastructure using Google Cloud Platform (GCP) services. You will play a key role in enabling data-driven decision-making and enhancing the performance and scalability of our data ecosystem.

Key Responsibilities:
- Design, implement, and optimize data pipelines using GCP services, including Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
- Lead the design and optimization of schemas for large-scale data systems, ensuring data consistency, integrity, and scalability.
- Work closely with cross-functional teams to understand data requirements and deliver efficient, high-performance solutions.
- Design and execute complex SQL queries for BigQuery and other databases, ensuring optimal performance and efficiency.
- Implement efficient data processing workflows and streaming data solutions using Cloud Pub/Sub and Dataflow.
- Develop and maintain data models, schemas, and data marts to ensure consistency and scalability across datasets.
- Ensure the scalability, reliability, and security of cloud-based data architectures.
- Optimize cloud storage, compute, and query performance, driving cost-effective solutions.
- Collaborate with data scientists, analysts, and software engineers to create actionable insights and drive business outcomes.
- Implement best practices for data management, including governance, quality, and monitoring of data pipelines.
- Provide mentorship and guidance to junior data engineers and collaborate with them to achieve team goals.

Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- 5+ years of experience in data engineering, with a strong focus on Google Cloud Platform (GCP).
- Extensive hands-on experience with GCP Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
- Strong expertise in SQL for query optimization and performance tuning on large-scale datasets.
- Solid experience in designing data schemas, data pipelines, and ETL processes.
- Strong understanding of data modeling techniques and experience with schema design for both transactional and analytical systems.
- Proven experience optimizing BigQuery performance, including partitioning, clustering, and cost optimization strategies.
- Experience with managing and processing streaming data and batch data processing workflows.
- Knowledge of AlloyDB for managing transactional databases in the cloud and integrating them into data pipelines.
- Familiarity with data security, governance, and compliance best practices on GCP.
- Excellent problem-solving skills, with the ability to troubleshoot complex data issues and find efficient solutions.
- Strong communication and collaboration skills, with the ability to work with both technical and non-technical stakeholders.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Familiarity with infrastructure as code tools like Terraform or Cloud Deployment Manager.
- GCP certifications (e.g., Google Cloud Professional Data Engineer or Cloud Architect).
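As a hedged sketch of the BigQuery partitioning and clustering optimization called out above, the snippet below creates a date-partitioned, clustered table via DDL using the BigQuery Python client. The project, dataset, table, and column names are placeholders.

```python
from google.cloud import bigquery

# Placeholder project.
client = bigquery.Client(project="my-gcp-project")

# Partition by order date and cluster by customer to prune scanned data
# (and cost) on typical date- and customer-filtered queries.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.orders
(
  order_id STRING,
  customer_id STRING,
  amount NUMERIC,
  order_ts TIMESTAMP
)
PARTITION BY DATE(order_ts)
CLUSTER BY customer_id
OPTIONS (partition_expiration_days = 365)
"""

client.query(ddl).result()  # waits for the DDL job to finish
print("Partitioned and clustered table is in place.")
```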

Posted 1 month ago

Apply

6.0 - 10.0 years

12 - 18 Lacs

Hyderabad

Hybrid

Role Overview: We are seeking a talented and forward-thinking DevOps Engineer for one of the large financial services GCCs based in Hyderabad. Responsibilities include designing, implementing, and maintaining CI/CD pipelines, monitoring system performance, automating deployments, ensuring infrastructure scalability and security, collaborating with development and IT teams, and optimizing workflow efficiency.

Technical Requirements:
- Experienced in setting and delivering DevOps strategy
- Proficient in collaborating with engineering teams to understand their needs
- Skilled in setting up, maintaining, optimizing, and evolving DevOps tooling and infrastructure
- Strong knowledge of automating development, quality engineering, deployment, and release processes
- Familiarity with Agile and Waterfall methodologies and supporting toolchains
- Ability to identify technical problems and develop effective solutions
- Hands-on experience with a variety of technologies including Git, Kubernetes, Docker, Jenkins, and scripting/programming languages
- Competence in implementing DevOps and Agile patterns such as CI/CD pipelines, source code management, automation, and infrastructure as code
- Understanding of IT management practices, software currency, and security measures
- Experience with GCP infrastructure, Terraform, and Harness for CI/CD automation and deployments
- Proficiency in team leadership, communication, and problem-solving

Functional Requirements:
- Demonstrated team leadership and DevOps experience
- Exposure to GCP infrastructure including Compute Engine, VPC, IAM, Cloud Functions, and GKE
- Hands-on experience with various DevOps technologies such as Git, Kubernetes, Docker, Jenkins, SonarQube, and scripting/programming languages
- Strong organizational, time management, and multitasking skills
- Ability to work collaboratively, build relationships, and adapt to various domains and disciplines
- Passion for developing new technologies and optimizing software delivery processes
- Understanding of security compliance, networking, and firewalls
- Willingness to learn, grow, and develop within a supportive and inclusive environment
- Ability to propose new technologies and methodologies for software delivery optimization

This role offers a compelling opportunity for a seasoned DevOps engineer to drive transformative cloud initiatives within the financial sector, leveraging deep experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification: Engineering Graduate / Postgraduate

Criteria:
- Helm experience
- Networking and security (firewalls, IAM roles) experience
- Security compliance understanding
- Relevant experience: 6-9 years

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Chennai, Tamil Nadu

Work from Office

Duration: 12 months
Work Type: Onsite

Position Description: We are seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise, activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutions and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required:
- Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
- Implement methods for automation of all parts of the pipeline to minimize labor in development and production.
- Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products.
- Experience working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting.
- Experience working with all stakeholders to formulate business problems as technical data requirements, identifying and implementing technical solutions while ensuring key business drivers are captured in collaboration with product management.
- Proficiency in machine learning model architecture, data pipeline interaction, and metrics interpretation, including designing and deploying a pipeline with automated data lineage.
- Identify, develop, evaluate, and summarize proofs of concept to prove out solutions; test and compare competing solutions and report a point of view on the best solution.
- Integration between GCP Data Catalog and Informatica EDC.
- Design and build production data engineering solutions to deliver pipeline patterns using GCP services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.

Skills Preferred:
- Strong drive for results and the ability to multitask and work independently.
- Self-starter with proven innovation skills.
- Ability to communicate and work with cross-functional teams and all levels of management.
- Demonstrated commitment to quality and project timing.
- Demonstrated ability to document complex systems.
- Experience creating and executing detailed test plans.

Experience Required: 3 to 5 years
Education Required: BE or equivalent
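For illustration of the Dataproc pipeline pattern listed above, here is a hedged PySpark sketch that aggregates raw Parquet files from Cloud Storage and writes the result to BigQuery. It assumes the spark-bigquery connector is available on the cluster; the bucket, dataset, table, and column names are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-orders-rollup").getOrCreate()

# Read raw order events landed in a Cloud Storage "raw zone" (placeholder path).
orders = spark.read.parquet("gs://raw-zone-bucket/orders/")

# Roll the events up to one row per day and region.
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Write the rollup to BigQuery via the spark-bigquery connector.
(
    daily_totals.write
    .format("bigquery")
    .option("table", "my-gcp-project.analytics.daily_order_totals")
    .option("temporaryGcsBucket", "dataproc-staging-bucket")
    .mode("overwrite")
    .save()
)
```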

Posted 2 months ago

Apply

4.0 - 7.0 years

8 - 14 Lacs

Noida

Hybrid

Data Engineer (L3) | GCP Certified
Employment Type: Full-Time
Work Mode: In-office / Hybrid
Notice: Immediate joiners

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
- Design, develop, and support data pipelines and related data products and platforms.
- Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
- Perform application impact assessments, requirements reviews, and develop work estimates.
- Develop test strategies and site reliability engineering measures for data products and solutions.
- Participate in agile development "scrums" and solution reviews.
- Mentor junior Data Engineers.
- Lead the resolution of critical operations issues, including post-implementation reviews.
- Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
- Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
- Demonstrate SQL and database proficiency in various data engineering tasks.
- Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (see the sketch after this listing).
- Develop Unix scripts to support various data operations.
- Model data to support business intelligence and analytics initiatives.
- Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
- Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Keywords: data pipelines, agile development, scrums, GCP data technologies, Python, DAGs, Control-M, Apache Airflow, data solution architecture

Qualifications:
- Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
- 4+ years of data engineering experience.
- 2 years of data solution architecture and design experience.
- GCP Certified Data Engineer (preferred).

Job Type: Full-time
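A minimal Apache Airflow / Cloud Composer DAG sketch for the orchestration skill mentioned above. The DAG id, task logic, and schedule are placeholders; the tasks only print messages where real extract and load steps would go.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    print("Pulling the day's files from the landing bucket...")  # placeholder step


def load() -> None:
    print("Loading cleaned records into BigQuery...")  # placeholder step


with DAG(
    dag_id="daily_ingest_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["example"],
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run extract before load.
    extract_task >> load_task
```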

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
