Senior DevOps + Data Engineer

Experience: 10 years

Posted: 1 week ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Key Responsibilities

DevOps & Infrastructure:

  • Design, implement, and manage CI/CD pipelines for multiple environments.
  • Automate provisioning, deployment, and scaling of infrastructure across AWS, GCP, or Azure.
  • Implement monitoring, alerting, and logging frameworks using tools such as Prometheus, Grafana, ELK Stack, or CloudWatch (a minimal metrics sketch follows this list).
  • Ensure high availability, disaster recovery, and adherence to infrastructure security best practices.
  • Manage containerization and orchestration using Docker, Kubernetes, and Helm.
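
By way of illustration for the monitoring bullet above, a minimal sketch of a Python service exposing custom metrics for Prometheus to scrape, using the prometheus_client library. The metric names, the simulated work, and port 8000 are hypothetical placeholders, not a prescribed setup.

```python
# Minimal sketch: expose request metrics from a Python service so
# Prometheus can scrape them at /metrics. Names and port are hypothetical.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Hypothetical metrics for a request-serving process.
REQUESTS_TOTAL = Counter("app_requests_total", "Total requests handled")
REQUEST_LATENCY = Histogram("app_request_latency_seconds", "Request latency")

def handle_request() -> None:
    """Stand-in for real request handling; records count and latency."""
    with REQUEST_LATENCY.time():          # times the block and observes it
        time.sleep(random.uniform(0.01, 0.1))  # simulated work
    REQUESTS_TOTAL.inc()

if __name__ == "__main__":
    start_http_server(8000)  # serves /metrics on :8000 for Prometheus
    while True:
        handle_request()
```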

Data Engineering:

  • Design, build, and optimize data pipelines for batch and real-time processing.
  • Work with ETL frameworks such as Airflow, dbt, or AWS Glue to ensure reliable data flow and transformation (see the DAG sketch after this list).
  • Collaborate with data scientists and analysts to maintain data quality, lineage, and governance.
  • Optimize data storage and retrieval using cloud data warehouses like BigQuery, Redshift, or Snowflake.
  • Integrate diverse data sources and maintain APIs or microservices for data exchange.
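
As a concrete example of the orchestration work described above, a minimal Airflow DAG sketch, assuming Airflow 2.4+ (which accepts `schedule`; older releases use `schedule_interval`). The DAG id, task bodies, and schedule are hypothetical placeholders for a daily batch pipeline.

```python
# Minimal sketch of a daily extract-transform-load pipeline as an Airflow DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    ...  # pull raw records from a source system (placeholder)

def transform() -> None:
    ...  # clean and reshape the extracted data (placeholder)

def load() -> None:
    ...  # write results to the warehouse (placeholder)

with DAG(
    dag_id="example_daily_etl",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,                # do not backfill missed runs
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run order: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```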

Collaboration & Strategy:

  • Partner with development, analytics, and infrastructure teams to improve deployment velocity and system reliability.
  • Define best practices, standards, and guidelines for DevOps and data operations.
  • Mentor junior engineers and promote a culture of automation, reliability, and data excellence.


Required Skills & Experience:

  • 10+ years of experience in DevOps, Infrastructure Automation, or related roles.
  • Proven experience with cloud platforms (AWS, GCP, or Azure).
  • Strong expertise in Infrastructure as Code (IaC) using Terraform, CloudFormation, or Pulumi.
  • Proficiency in scripting and automation using Python, Bash, or Go.
  • Hands-on experience with CI/CD tools (GitHub Actions, Jenkins, GitLab CI, CircleCI, etc.).
  • Experience managing Kubernetes clusters and microservice-based architectures (see the pod-listing sketch after this list).
  • Strong understanding of data pipelines, ETL tools, and data orchestration frameworks.
  • Familiarity with data modeling, warehousing, and streaming technologies (Kafka, Spark, Flink, etc.).
  • Solid knowledge of monitoring, observability, and security principles.
  • Excellent problem-solving, communication, and leadership abilities.
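
For the Kubernetes item above, a minimal sketch using the official Python client to list pods in a namespace. The namespace is hypothetical, and a local kubeconfig is assumed; inside a cluster you would call load_incluster_config() instead.

```python
# Minimal sketch: list pods in a namespace with the official
# Kubernetes Python client (the "kubernetes" package).
from kubernetes import client, config

def list_pods(namespace: str = "default") -> None:
    config.load_kube_config()      # reads ~/.kube/config (assumed present)
    v1 = client.CoreV1Api()
    for pod in v1.list_namespaced_pod(namespace).items:
        print(pod.metadata.name, pod.status.phase)

if __name__ == "__main__":
    list_pods()
```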


Good to Have:

  • Experience with machine learning pipelines and MLOps frameworks.
  • Exposure to serverless architectures (AWS Lambda, Google Cloud Functions, etc.); a minimal handler sketch follows this list.
  • Knowledge of cost optimization and performance tuning in cloud environments.
  • Contributions to open-source projects or technical blogs/publications.
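
As a sketch of the serverless item above, a minimal AWS Lambda handler in Python. The event field and response body are hypothetical; Lambda invokes the handler with the event payload and a runtime context object.

```python
# Minimal sketch of an AWS Lambda handler. The "name" event field
# and the response shape are hypothetical examples.
import json

def lambda_handler(event, context):
    name = event.get("name", "world")  # hypothetical event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```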

