5 - 8 years
15 - 25 Lacs
Pune
Hybrid
Role & responsibilities:
- Data Pipeline Development: Design, develop, and maintain data pipelines using Google Cloud Platform (GCP) services such as Dataflow, Dataproc, and Pub/Sub.
- Data Ingestion & Transformation: Build and implement data ingestion and transformation processes using tools such as Apache Beam and Apache Spark.
- Data Storage Management: Optimize and manage data storage solutions on GCP, including BigQuery, Cloud Storage, and Cloud SQL.
- Security Implementation: Implement data security protocols and access controls with GCP's Identity and Access Management (IAM) and Cloud Security Command Center.
- System Monitoring & Troubleshooting: Monitor and troubleshoot data pipelines and storage solutions using GCP's Stackdriver and Cloud Monitoring tools.
- Generative AI Systems: Develop and maintain scalable systems for deploying and operating generative AI models, ensuring efficient use of computational resources.
- Gen AI Capability Building: Build generative AI capabilities among engineers, covering areas such as knowledge engineering, prompt engineering, and platform engineering.
- Knowledge Engineering: Gather and structure domain-specific knowledge so that large language models (LLMs) can use it effectively.
- Prompt Engineering: Design effective prompts to guide generative AI models, ensuring relevant, accurate, and creative text output.
- Collaboration: Work with data experts, analysts, and product teams to understand data requirements and deliver tailored solutions.
- Automation: Automate data processing tasks using scripting languages such as Python.
- Best Practices: Participate in code reviews and contribute to establishing data engineering best practices on GCP.
- Continuous Learning: Stay current with GCP service innovations and advancements in core data services (GCS, BigQuery, Cloud Storage, Dataflow, etc.).

Skills and Experience:
- Experience: 5+ years in Data Engineering or similar roles.
- Proficiency in GCP: Expertise in designing, developing, and deploying data pipelines, with strong knowledge of GCP core data services (GCS, BigQuery, Cloud Storage, Dataflow, etc.).
- Generative AI & LLMs: Hands-on experience with generative AI models and large language models (LLMs) such as GPT-4, Llama 3, and Gemini 1.5, with the ability to integrate these models into data pipelines and processes.
- Experience in web scraping.
- Technical Skills: Strong proficiency in Python and SQL for data manipulation and querying. Experience with distributed data processing frameworks like Apache Beam or Apache Spark is a plus.
- Security Knowledge: Familiarity with data security and access control best practices.
- Collaboration: Excellent communication and problem-solving skills, with a demonstrated ability to collaborate across teams.
- Project Management: Ability to work independently, manage multiple projects, and meet deadlines.
- Preferred Knowledge: Familiarity with Sustainable Finance, ESG Risk, CSRD, Regulatory Reporting, cloud infrastructure, and data governance best practices.
- Bonus Skills: Knowledge of Terraform is a plus.

Education:
- Degree: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Experience: 3-5 years of hands-on experience in data engineering.
- Certification: Google Professional Data Engineer.
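To make the pipeline-development expectations above concrete, here is a minimal sketch of a batch Apache Beam pipeline that reads JSON events from Cloud Storage and loads them into BigQuery. The bucket, project, dataset, and field names are hypothetical placeholders, not details from the posting.

```python
# Illustrative only: a minimal Apache Beam batch pipeline (GCS -> transform -> BigQuery).
# Bucket, project, dataset, and field names are hypothetical placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(line: str) -> dict:
    """Parse one JSON line and keep only the fields the target table expects."""
    record = json.loads(line)
    return {"user_id": record["user_id"], "amount": float(record["amount"])}

options = PipelineOptions()  # pass --runner=DataflowRunner, --project, etc. to run on Dataflow

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.Map(parse_event)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

The same code runs locally with the DirectRunner and on Dataflow by switching the runner option, which is the usual way such pipelines are developed and then deployed.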
Posted 2 months ago
6 - 10 years
1 - 2 Lacs
Noida, Pune, Bengaluru
Hybrid
Role & responsibilities:
- Business Intelligence / Data Warehousing experience.
- 3-4 years of hands-on experience with the Snowflake database.
- Strong SQL, PL/SQL, and Snowflake functionality experience.
- Strong exposure to other RDBMSs like Oracle, SQL Server, etc.
- Exposure to cloud storage services like AWS S3.
- 2-3 years of Informatica PowerCenter and/or IDMC experience.
- Decent exposure to dimensional data modeling techniques and their implementation.
- Decent exposure to data warehousing approaches.
- Basic exposure to reporting/dashboards using Power BI or similar tools.
- Basic exposure to Linux environments and shell scripting.
- Exposure to Microsoft Fabric is a nice-to-have.
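As a small illustration of the Snowflake plus SQL work described above, here is a hedged sketch using the official Snowflake Python connector. The account, credentials, warehouse, and table names are placeholders, not details from the posting.

```python
# Illustrative only: querying Snowflake from Python with the official connector.
# Account, credentials, warehouse, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # hypothetical account locator
    user="ETL_USER",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # A typical warehousing sanity check: row counts per load date in a staging table.
    cur.execute(
        "SELECT load_date, COUNT(*) AS row_count "
        "FROM STG_ORDERS GROUP BY load_date ORDER BY load_date"
    )
    for load_date, row_count in cur.fetchall():
        print(load_date, row_count)
finally:
    conn.close()
```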
Posted 2 months ago
7 - 12 years
13 - 17 Lacs
Gurugram
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data against functional business requirements and interact directly with the customer.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge maintained by attending educational workshops and reviewing publications, with a focus on the required skills.
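For the BigQuery analysis work this role centres on, a minimal sketch with the BigQuery Python client is shown below. The project, dataset, and table names are hypothetical placeholders.

```python
# Illustrative only: running an analytical query against BigQuery from Python.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT status, COUNT(*) AS orders
    FROM `example-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY status
    ORDER BY orders DESC
"""

for row in client.query(sql).result():
    print(row.status, row.orders)
```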
Posted 2 months ago
1 - 3 years
3 - 6 Lacs
Bengaluru
Work from Office
Skill required: Record To Report - Invoice Processing
Designation: Record to Report Ops Associate
Qualifications: BCom/MCom
Years of Experience: 1 to 3 years
Language - Ability: English (Domestic) - Expert

What would you do?
You will be aligned with our Finance Operations vertical and will help us determine financial outcomes by collecting operational data/reports, conducting analysis, and reconciling transactions. The work includes posting journal entries, preparing balance sheet reconciliations, reviewing entries and reconciliations, preparing cash forecasting statements, supporting month-end closing, preparing reports, and supporting audits. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice processing workflow, improving efficiency and reducing the likelihood of errors.

What are we looking for?
- Google Cloud SQL
- Adaptable and flexible
- Ability to perform under pressure
- Problem-solving skills
- Agility for quick learning
- Commitment to quality

Roles and Responsibilities:
- In this role you are required to solve routine problems, largely through precedent and referral to general guidelines.
- Your expected interactions are within your own team and with your direct supervisor.
- You will be provided detailed to moderate levels of instruction on daily work tasks and detailed instruction on new assignments.
- The decisions that you make impact your own work.
- You will be an individual contributor as part of a team, with a predetermined, focused scope of work.
- Please note that this role may require you to work in rotational shifts.

Qualification: BCom, MCom
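The invoice-matching step described above (checking an invoice against its purchase order and goods receipt) is the part most often automated. Below is a deliberately simplified, purely illustrative Python sketch of that three-way match; the field names and price tolerance are assumptions for the example, not details from the posting.

```python
# Purely illustrative: a simplified "three-way match" of invoice, purchase order,
# and goods receipt, as described in the invoice-processing workflow above.
from dataclasses import dataclass

@dataclass
class Document:
    po_number: str
    quantity: int
    unit_price: float

def three_way_match(invoice: Document, purchase_order: Document,
                    receipt: Document, price_tolerance: float = 0.01) -> list[str]:
    """Return a list of discrepancies; an empty list means the invoice can be paid."""
    issues = []
    if not (invoice.po_number == purchase_order.po_number == receipt.po_number):
        issues.append("PO number mismatch")
    if invoice.quantity != receipt.quantity:
        issues.append("billed quantity differs from quantity received")
    if abs(invoice.unit_price - purchase_order.unit_price) > price_tolerance:
        issues.append("unit price differs from purchase order")
    return issues

# Example: a clean match produces no discrepancies.
inv = Document("PO-1001", 10, 25.00)
po = Document("PO-1001", 10, 25.00)
grn = Document("PO-1001", 10, 25.00)
print(three_way_match(inv, po, grn))  # -> []
```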
Posted 2 months ago
7 - 10 years
16 - 21 Lacs
Mumbai
Work from Office
Position Overview:
The Google Cloud Data Engineering Lead role is ideal for an experienced Google Cloud Data Engineer who will drive the design, development, and optimization of data solutions on the Google Cloud Platform (GCP). The role requires the candidate to lead a team of data engineers and collaborate with data scientists, analysts, and business stakeholders to enable scalable, secure, and high-performance data pipelines and analytics platforms.

Key Responsibilities:
- Lead and manage a team of data engineers delivering end-to-end data pipelines and platforms on GCP.
- Design and implement robust, scalable, and secure data architectures using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Develop and maintain batch and real-time ETL/ELT workflows using tools such as Apache Beam, Dataflow, or Composer (Airflow).
- Collaborate with data scientists, analysts, and application teams to gather requirements and ensure data availability and quality.
- Define and enforce data engineering best practices, including version control, testing, code reviews, and documentation.
- Drive automation and infrastructure-as-code approaches using Terraform or Deployment Manager for provisioning GCP resources.
- Implement and monitor data quality, lineage, and governance frameworks across the data platform.
- Optimize query performance and storage strategies, particularly within BigQuery and other GCP analytics tools.
- Mentor team members and contribute to the growth of technical capabilities across the organization.

Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience: 7+ years of experience in data engineering, including 3+ years working with GCP data services; proven leadership experience in managing and mentoring data engineering teams.
- Skills: Expert-level understanding of BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub; strong SQL and Python skills for data processing and orchestration; experience with workflow orchestration tools (Airflow/Composer); hands-on experience with CI/CD, Git, and infrastructure-as-code tools (e.g., Terraform); familiarity with data security, governance, and compliance practices in cloud environments.
- Certifications: GCP Professional Data Engineer certification.
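As a sketch of the Composer (Airflow) orchestration work mentioned above, here is a minimal DAG that schedules a BigQuery ELT step. The DAG id, schedule, project, dataset, and table names are hypothetical placeholders and this is not the employer's actual code.

```python
# Illustrative only: a minimal Cloud Composer (Airflow) DAG scheduling one
# BigQuery ELT step. All identifiers are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    build_daily_summary = BigQueryInsertJobOperator(
        task_id="build_daily_summary",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example-project.analytics.daily_orders` AS
                    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
                    FROM `example-project.raw.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )
```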
Posted 2 months ago
10 - 20 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Hi,
Hope you are looking for a job change. We have an opening for a GCP Data Architect with an MNC, for Pan-India locations. I'm sharing the JD with you; please have a look and revert with the details below and your updated resume. Apply only if you can join within 10 days. It is 5 days' work from office. We do not process candidates with long notice periods.

Role: GCP Data Architect
Experience: 10+ years
Mode: Permanent
Work Location: Pan India
Notice Period: Immediate to 10 days
Mandatory Skills: GCP, architecture experience, Big Data, data modelling, BigQuery

Details required: Full Name (as per Aadhaar card), Email ID, Mobile Number, Alternate Number, Qualification, Graduation Year, Regular Course, Total Experience, Relevant Experience, Current Organization, Working as Permanent Employee, Payroll Company, Experience in GCP, Experience in Architecture, Experience in GCP Data Architecture, Experience in Big Data, Experience in BigQuery, Experience in Data Management, Official Notice Period, Serving Notice Period, Current Location, Preferred Location, Current CTC, Expected CTC, CTC Breakup, PAN Card Number, Date of Birth, Any Offer in Hand, Offered CTC, Last Working Day, Can You Join Immediately, Ready to Work from Office for 5 Days.

Job Description: GCP Data Architect
We are seeking a skilled Data Solution Architect to design solutions and lead implementation on GCP. The ideal candidate will possess extensive experience in data architecture, solution design, and data management practices.

Responsibilities:
- Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms.
- Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
- Develop high-level and detailed data architecture and design documentation.
- Implement data management and data governance strategies, ensuring compliance with industry standards.
- Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
- Design and manage data pipeline processes for historic data migration and data integration.
- Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
- Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
- Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively.
- Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements:
- Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments.
- Experience with common GCP services such as BigQuery, Dataflow, GCS, service accounts, and Cloud Functions.
- Extremely strong in BigQuery design and development.
- Extensive knowledge and implementation experience in data management, governance, and security frameworks.
- Proven experience in creating high-level and detailed data architecture and design documentation.
- Strong aptitude for business analysis to understand domain data requirements.
- Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred.
- Hands-on experience with architecting end-to-end data solutions for both batch and real-time designs (a short streaming-ingest sketch follows after this posting).
- Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
- Familiarity with Data Fabric and Data Mesh architecture is a plus.
- Excellent verbal and written communication skills.

Regards,
Rejeesh S
Email: rejeesh.s@jobworld.jobs
Mobile: +91 - 9188336668
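For the real-time designs referenced in the requirements above, the usual GCP entry point is a Pub/Sub topic. A minimal, hedged sketch of publishing an event follows; the project and topic names are hypothetical placeholders.

```python
# Illustrative only: publishing an event to a Pub/Sub topic, a common entry
# point for real-time ingestion on GCP. Names are hypothetical placeholders.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "order-events")

event = {"order_id": "O-1001", "amount": 499.0, "status": "CREATED"}
future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print("Published message id:", future.result())
```

Downstream, such a topic is typically consumed by a Dataflow streaming job or a push subscription into BigQuery, which is the batch/real-time split the posting describes.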
Posted 2 months ago
11 - 20 years
45 - 75 Lacs
Gurugram
Work from Office
Role & responsibilities:
- Proven success architecting and scaling complex software solutions; familiarity with interface design.
- Experience and ability to drive a project/module independently from an execution standpoint.
- Prior experience with scalable architecture and distributed processing.
- Strong programming expertise in Python, SQL, and Scala.
- Hands-on experience with any major big data solution such as Spark, Kafka, or Hive.
- Strong data management skills across ETL, DWH, data quality, and data governance.
- Hands-on experience with microservices architecture, Docker, and Kubernetes as orchestration.
- Experience with cloud-based data stores such as Redshift and BigQuery.
- Experience in cloud solution architecture.
- Experience architecting Spark jobs on Kubernetes and optimizing Spark jobs.
- Experience with MLOps architectures/tools/orchestrators such as Kubeflow and MLflow.
- Experience with logging, metrics, and distributed tracing systems (e.g., Prometheus/Grafana/Kibana).
- Experience in CI/CD using Octopus/TeamCity/Jenkins.

Interested candidates can share their updated resume at surinder.kaur@mounttalent.com
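As a small illustration of the distributed-processing expertise listed above, here is a hedged PySpark sketch. The input path, column names, and output path are hypothetical placeholders; on Kubernetes the same job would be submitted with spark-submit against a k8s master rather than run locally.

```python
# Illustrative only: a small PySpark aggregation job. Paths and column names
# are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_revenue").getOrCreate()

orders = spark.read.parquet("s3a://example-bucket/orders/")  # or gs:// / hdfs://

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

daily_revenue.write.mode("overwrite").parquet("s3a://example-bucket/marts/daily_revenue/")
spark.stop()
```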
Posted 2 months ago
4 - 6 years
18 - 20 Lacs
Noida
Work from Office
Job Description
Role: Google Cloud Engineer
Skillset: GCP, Kubernetes, Docker, CI/CD
Experience: 4-6 years
Location: Noida and Chennai

Required Skillset:
- 4-6 years of experience in DevOps, with a focus on GCP.
- Strong experience with GCP services including GKE, GCE, Cloud DNS, VPC, Cloud Storage, Cloud Monitoring, Cloud Logging, Cloud NAT, and Anthos Service Mesh.
- Proficiency in Docker and Docker Compose.
- Experience with build tools like Maven and Gradle.
- Strong knowledge of Jenkins for CI/CD and the Jenkins shared library concept.
- Familiarity with code quality tools like SonarQube.
- Experience with JFrog Artifactory.
- Strong understanding of networking, security, and cloud architecture.
- Basic knowledge of different application servers, such as JBoss, WildFly, and WebSphere.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Understanding of monitoring tools like Prometheus.
- Deep understanding of Helm charts.
- Good understanding of infrastructure provisioning tools such as Terraform.
- Proficient in scripting languages such as Bash, Python, and Groovy.

Responsibilities:
- Utilize GCP services such as Google Kubernetes Engine (GKE), Google Compute Engine (GCE), Cloud DNS, Virtual Private Cloud (VPC), Cloud Storage, and Cloud Monitoring.
- Automate deployment, monitoring, and management of cloud applications.
- Implement and manage CI/CD pipelines using Jenkins.
- Use Docker and Docker Compose for containerization and orchestration.
- Manage build tools like Maven and Gradle.
- Integrate and maintain code quality tools such as SonarQube.
- Manage artifact repositories using JFrog Artifactory.
- Monitor system performance, identify issues, and implement solutions to ensure high availability and performance.
- Ensure security best practices and compliance with industry standards.
- Troubleshoot and resolve infrastructure-related issues.
- Stay updated with the latest industry trends and best practices in cloud computing and DevOps.

Good-to-have Certifications: CKA; GCP certification (e.g., Professional Cloud DevOps Engineer).

Qualifications: Bachelor of Engineering, Master of Computer Applications
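For the GKE automation and monitoring responsibilities above, a minimal sketch with the official Kubernetes Python client is shown below. The namespace is a hypothetical placeholder; credentials are assumed to come from the local kubeconfig (for example after running `gcloud container clusters get-credentials`).

```python
# Illustrative only: checking pod health in a GKE cluster with the official
# Kubernetes Python client. Namespace is a hypothetical placeholder.
from kubernetes import client, config

config.load_kube_config()   # use config.load_incluster_config() when running inside the cluster
v1 = client.CoreV1Api()

pods = v1.list_namespaced_pod(namespace="default")
for pod in pods.items:
    print(f"{pod.metadata.name}: {pod.status.phase}")
```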
Posted 2 months ago
4 - 6 years
14 - 16 Lacs
Noida
Work from Office
Role: Google Cloud Engineer
Skillset: GCP, Kubernetes, Docker, CI/CD
Experience: 4-6 years
Location: Noida and Chennai

Required Skillset:
- 4-6 years of experience in DevOps, with a focus on GCP.
- Strong experience with GCP services including GKE, GCE, Cloud DNS, VPC, Cloud Storage, Cloud Monitoring, Cloud Logging, Cloud NAT, and Anthos Service Mesh.
- Proficiency in Docker and Docker Compose.
- Experience with build tools like Maven and Gradle.
- Strong knowledge of Jenkins for CI/CD and the Jenkins shared library concept.
- Familiarity with code quality tools like SonarQube.
- Experience with JFrog Artifactory.
- Strong understanding of networking, security, and cloud architecture.
- Basic knowledge of different application servers, such as JBoss, WildFly, and WebSphere.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Understanding of monitoring tools like Prometheus.
- Deep understanding of Helm charts.
- Good understanding of infrastructure provisioning tools such as Terraform.
- Proficient in scripting languages such as Bash, Python, and Groovy.

Responsibilities:
- Utilize GCP services such as Google Kubernetes Engine (GKE), Google Compute Engine (GCE), Cloud DNS, Virtual Private Cloud (VPC), Cloud Storage, and Cloud Monitoring.
- Automate deployment, monitoring, and management of cloud applications.
- Implement and manage CI/CD pipelines using Jenkins.
- Use Docker and Docker Compose for containerization and orchestration.
- Manage build tools like Maven and Gradle.
- Integrate and maintain code quality tools such as SonarQube.
- Manage artifact repositories using JFrog Artifactory.
- Monitor system performance, identify issues, and implement solutions to ensure high availability and performance.
- Ensure security best practices and compliance with industry standards.
- Troubleshoot and resolve infrastructure-related issues.
- Stay updated with the latest industry trends and best practices in cloud computing and DevOps.

Good-to-have Certifications: CKA; GCP certification (e.g., Professional Cloud DevOps Engineer).

Qualifications: Bachelor of Engineering, Master of Computer Applications
Posted 2 months ago
2 - 6 years
8 - 12 Lacs
Pune
Work from Office
About The Role
Job Title: Senior Engineer - Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AVP
Location: Pune, India

Role Description
The Senior Engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills are developed by addressing enhancements and fixes to products, and reliability and resiliency are built into solutions through early testing, peer reviews, and automation of the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, should be able to work in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working within an agile methodology. The role demands working alongside a geographically dispersed team. The position is part of the buildout of the Compliance tech internal development team in India; the overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various common regulatory commitments and mandated monitors.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your key responsibilities
- Analyze data sets; design and code stable, scalable data ingestion workflows and integrate them into existing workflows.
- Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
- Work as a senior developer building analytics algorithms on top of ingested data.
- Work as a senior developer for various data sourcing in Hadoop and GCP.
- Own unit testing, UAT deployment, end-user sign-off, and production go-live.
- Ensure new code is tested at both unit and system level; design, develop, and peer review new code and functionality.
- Operate as a member of an agile scrum team.
- Apply root cause analysis skills to identify bugs and issues behind failures.
- Support production support and release management teams in their tasks.

Your skills and experience
- 10+ years of coding experience in experienced and reputed organizations
- Hands-on experience with Bitbucket and CI/CD pipelines
- Proficient in Hadoop, Python, Spark, SQL, Unix, and Hive
- Basic understanding of on-prem and GCP data security
- Hands-on development experience on large ETL/big data systems; GCP is a big plus
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions such as consistency, completeness, accuracy, and lineage (a small illustrative check is sketched below)
- Hands-on business and systems knowledge gained in a regulatory delivery environment
- Desired: banking experience, regulatory and cross-product knowledge
- Passionate about test-driven development
- Prior experience with release management tasks and responsibilities
- Data visualization experience is good to have
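The data quality dimensions named above are usually expressed as simple programmatic checks. Here is a hedged pandas sketch with made-up column names and a toy data frame, purely to illustrate what such checks look like.

```python
# Illustrative only: simple data quality checks along the dimensions named above
# (completeness, accuracy/validity, consistency). Columns and data are made up.
import pandas as pd

df = pd.DataFrame({
    "account_id": ["A1", "A2", None, "A4"],
    "balance": [100.0, -5.0, 250.0, 75.0],
    "currency": ["INR", "INR", "USD", "inr"],
})

completeness = df["account_id"].notna().mean()                        # share of non-null keys
accuracy = (df["balance"] >= 0).mean()                                # share of values in a valid range
consistency = (df["currency"] == df["currency"].str.upper()).mean()   # consistent code casing

print(f"completeness={completeness:.2f}, accuracy={accuracy:.2f}, consistency={consistency:.2f}")
```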
How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
Posted 2 months ago
7 - 12 years
15 - 20 Lacs
Navi Mumbai, Bengaluru, Mumbai (All Areas)
Work from Office
Key Responsibilities:
- Design, implement, and maintain GCP cloud infrastructure using Infrastructure as Code (IaC) tools
- Manage and optimize Kubernetes clusters on GKE (Google Kubernetes Engine)
- Build and maintain CI/CD pipelines for efficient application delivery
- Monitor GCP infrastructure costs and drive optimization strategies
- Develop observability solutions using GCP-native and third-party tools
- Collaborate with engineering teams to streamline deployment and operations workflows
- Enforce security best practices and ensure compliance with internal and industry standards
- Design and implement high availability (HA) and disaster recovery (DR) architectures

Mandatory Technical Skills:
- GCP services: Compute Engine, VPC, Cloud Storage, Cloud SQL, IAM, Cloud DNS, Cloud Monitoring
- Infrastructure as Code: Terraform (preferred), Deployment Manager
- Containerization: Docker, Kubernetes (GKE expertise required)
- CI/CD tools: GitHub Actions, Cloud Build, Jenkins, or similar
- Version control: Git
- Scripting languages: Python, Bash
- Monitoring & logging: Stackdriver, Prometheus, Grafana, ELK Stack
- Strong experience with automation and configuration management (Terraform, Ansible, etc.)
- Solid understanding of cloud security best practices
- Experience designing fault-tolerant, resilient cloud-native architectures
- 4-7 years in DevOps/Cloud Engineering roles
- Minimum 2+ years hands-on with GCP infrastructure and services
- Proven experience managing CI/CD pipelines and container-based deployments
- Strong background in modern DevOps tools and cloud-native architectures

Preferred candidate profile
Posted 2 months ago
5 - 10 years
9 - 13 Lacs
Chennai
Work from Office
Overview
All GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) plus Teradata
Responsibilities
All GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) plus Teradata
Requirements
All GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) plus Teradata
Posted 2 months ago
8 - 13 years
10 - 15 Lacs
Jaipur, Rajasthan
Work from Office
Job Summary
Auriga is looking for a Data Engineer to design and maintain cloud-native data pipelines supporting real-time analytics and machine learning. You'll work with cross-functional teams to build scalable, secure data solutions using GCP (BigQuery, Looker), SQL, Python, and orchestration tools like Dagster and DBT. Mentoring junior engineers and ensuring data best practices will also be part of your role.

WHAT YOU'LL DO:
- Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads.
- Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources.
- Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications.
- Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency.
- Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations.
- Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities.
- Develop and maintain data documentation, including data dictionaries, lineage tracking, and metadata management.
- Monitor, troubleshoot, and optimize data pipelines, ensuring high availability and reliability.
- Stay up to date with emerging data engineering technologies and best practices, continuously improving our data infrastructure.

WHAT WE'RE LOOKING FOR:
- Strong proficiency in English (written and verbal communication) is required.
- Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones.
- 5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures.
- Strong proficiency in SQL for data modeling, transformation, and performance optimization.
- Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio).
- Expertise in Python for data processing, automation, and pipeline development.
- Experience with cloud data platforms, particularly Google Cloud Platform (GCP); hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub.
- Strong knowledge of ETL/ELT frameworks such as DBT, Dataflow, or Apache Beam.
- Familiarity with workflow orchestration tools like Dagster, Apache Airflow, or Google Cloud Workflows.
- Understanding of data privacy, security, and compliance best practices.
- Strong problem-solving skills, with the ability to debug and optimize complex data workflows.
- Excellent communication and collaboration skills.

NICE TO HAVES:
- Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis).
- Familiarity with machine learning workflows and MLOps best practices.
- Knowledge of Terraform for Infrastructure as Code (IaC) in data environments.
- Familiarity with data integrations involving Contentful, Algolia, Segment, etc.
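For the Dagster orchestration mentioned above, here is a minimal, hedged sketch of two dependent assets. The asset names, the toy data frame, and the transformation are hypothetical placeholders; a real pipeline would read from BigQuery or GCS and hand the transformation to DBT or SQL.

```python
# Illustrative only: a tiny Dagster asset pair showing how one asset depends
# on another. Names and data are hypothetical placeholders.
import pandas as pd
from dagster import asset, materialize

@asset
def raw_orders() -> pd.DataFrame:
    """Pretend extraction step; in practice this might read from BigQuery or GCS."""
    return pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})

@asset
def daily_revenue(raw_orders: pd.DataFrame) -> pd.DataFrame:
    """Simple transformation downstream of raw_orders."""
    return pd.DataFrame({"total_amount": [raw_orders["amount"].sum()]})

if __name__ == "__main__":
    result = materialize([raw_orders, daily_revenue])
    print("Materialization succeeded:", result.success)
```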
Posted 2 months ago
4 - 8 years
15 - 30 Lacs
Bengaluru
Remote
Job Title: Senior GCP Data DevOps Engineer
Job Type: Remote
Experience: 4+ years

Position Overview:
As a Senior DevOps Engineer specializing in Google Cloud Platform (GCP), you will play a crucial role in designing, implementing, and managing our cloud infrastructure to ensure optimal performance, scalability, and reliability. You will collaborate closely with cross-functional teams to streamline development processes, automate deployment pipelines, and enhance overall system efficiency.

Responsibilities:
- Design, implement, and manage scalable and highly available cloud infrastructure on Google Cloud Platform (GCP) to support our applications and services.
- Develop and maintain CI/CD pipelines to automate the deployment, testing, and monitoring of applications and microservices.
- Collaborate with software engineering teams to optimize application performance, troubleshoot issues, and ensure smooth deployment processes.
- Implement and maintain infrastructure as code (IaC) using tools such as Terraform, Ansible, or Google Deployment Manager.
- Monitor system health, performance, and security metrics, and implement proactive measures to ensure reliability and availability.
- Implement best practices for security, compliance, and data protection in cloud environments.
- Continuously evaluate emerging technologies and industry trends to drive innovation and improve infrastructure efficiency.
- Mentor junior team members and provide technical guidance and support as needed.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4-8 years of experience in a DevOps role, with a focus on Google Cloud Platform (GCP).
- In-depth knowledge of GCP services such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, Pub/Sub, and BigQuery.
- Proficiency in scripting languages such as Python, Bash, or PowerShell.
- Experience with containerization technologies such as Docker and container orchestration platforms like Kubernetes.
- Strong understanding of CI/CD concepts and experience with CI/CD tools such as Jenkins, GitLab CI/CD, or CircleCI.
- Solid understanding of infrastructure as code (IaC) principles and experience with tools such as Terraform, Ansible, or Google Deployment Manager.
- Experience with monitoring and logging tools such as Prometheus, Grafana, Stackdriver, or the ELK Stack.
- Knowledge of security best practices and experience implementing security controls in cloud environments.
- Excellent problem-solving skills and ability to troubleshoot complex issues in distributed systems.
- Strong communication skills and ability to collaborate effectively with cross-functional teams.

Preferred Qualifications:
- Google Cloud certification (e.g., Professional Cloud DevOps Engineer, Professional Cloud Architect).
- Experience with other cloud platforms such as AWS or Azure.
- Familiarity with agile methodologies and DevOps practices.
- Experience with software development using languages such as Java, Node.js, or Go.
- Knowledge of networking concepts and experience with configuring network services in cloud environments.

Skills: GCP, Cloud SQL, BigQuery, Kubernetes, IaC tools, CI/CD pipelines, Terraform, Python, Airflow, Snowflake, Power BI, Dataflow, Pub/Sub, Cloud Storage, cloud computing
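In the spirit of the automation responsibilities above, here is a small, hedged sketch of a routine task scripted with the Cloud Storage Python client. The project, bucket, and object names are hypothetical placeholders.

```python
# Illustrative only: a small automation task with the Cloud Storage Python client
# (upload a build artifact, then list the releases/ prefix). Names are placeholders.
from google.cloud import storage

client = storage.Client(project="example-project")
bucket = client.bucket("example-artifacts-bucket")

blob = bucket.blob("releases/app-1.4.2.tar.gz")
blob.upload_from_filename("dist/app-1.4.2.tar.gz")

for existing in client.list_blobs("example-artifacts-bucket", prefix="releases/"):
    print(existing.name, existing.size)
```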
Posted 2 months ago
5 - 10 years
9 - 19 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
Google BigQuery
Location: Pan India
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.

Key Responsibilities:
- Analyze and model client market and key performance data; use analytical tools and techniques to develop business insights and improve decision-making.
- Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (mandatory, no flexibility).
- Proven track record of delivering data integration and data warehousing solutions.
- Strong SQL and hands-on proficiency in the BigQuery SQL language; experience in shell scripting and Python (mandatory, no flexibility).
- Experience with data integration and migration projects; Oracle SQL.

Technical Experience: Google BigQuery
- Expert in Python and strong hands-on knowledge of SQL (mandatory, no flexibility); Python programming using Pandas and NumPy, a deep understanding of data structures (dictionary, array, list, tree, etc.), and experience with pytest and code coverage.
- Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (mandatory, no flexibility).
- Proficient with tools to automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence.
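Since the requirements above explicitly call out Pandas plus pytest with code coverage, here is a minimal, hedged sketch of that style of work. The function, column names, and data are made up for the example.

```python
# Illustrative only: a small pandas transformation with a pytest test, matching
# the Pandas/pytest/coverage skills named above. Names and data are made up.
import pandas as pd

def add_revenue_share(df: pd.DataFrame) -> pd.DataFrame:
    """Add each row's share of total revenue as a new column."""
    out = df.copy()
    out["revenue_share"] = out["revenue"] / out["revenue"].sum()
    return out

def test_add_revenue_share():
    df = pd.DataFrame({"region": ["N", "S"], "revenue": [75.0, 25.0]})
    result = add_revenue_share(df)
    assert list(result["revenue_share"]) == [0.75, 0.25]
    assert result["revenue_share"].sum() == 1.0

# Run with: pytest -q --cov   (coverage via the pytest-cov plugin)
```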
Posted 3 months ago