DevOps Engineer-Senior

5 years

0 Lacs

Posted: 1 day ago | Platform: LinkedIn


Work Mode

On-site

Job Type

Full Time

Job Description

At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

Data Engineer

  • Develop and maintain scalable data pipelines using tools like Apache Kafka, Apache Spark, or AWS Glue to ingest, process, and transform large datasets from various sources, ensuring efficient data flow and processing.
  • Design and implement data models and schemas in data warehouses (e.g., Amazon Redshift, Google BigQuery) and data lakes (e.g., AWS S3, Azure Data Lake) to support analytics and reporting needs.
  • Collaborate with data scientists and analysts to understand data requirements, ensuring data availability and accessibility for analytics, machine learning, and reporting.
  • Utilize ETL tools and frameworks (e.g., Apache NiFi, Talend, or custom Python scripts) to automate data extraction, transformation, and loading processes, ensuring data quality and integrity.
  • Monitor and optimize data pipeline performance using tools like Apache Airflow or AWS Step Functions, implementing best practices for data processing and workflow management.
  • Write, test, and maintain scripts in Python, SQL, or Bash for data processing, automation tasks, and data validation, ensuring high code quality and performance.
  • Implement CI/CD practices for data engineering workflows using tools like Jenkins, GitLab CI, or Azure DevOps, automating the deployment of data pipelines and infrastructure changes.
  • Collaborate with DevOps teams to integrate data solutions into existing infrastructure, leveraging Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation for provisioning and managing resources.
  • Manage containerized data applications using Docker and orchestrate them with Kubernetes, ensuring scalability and reliability of data processing applications.
  • Implement monitoring and logging solutions using tools like Prometheus, Grafana, or ELK Stack to track data pipeline performance, troubleshoot issues, and ensure data quality.
  • Ensure compliance with data governance, security best practices, and data privacy regulations, embedding DevSecOps principles in data workflows.
  • Participate in code reviews and contribute to the development of best practices for data engineering, data quality, and DevOps methodologies.
  • Mentor junior data engineers, providing guidance on data engineering practices, data architecture, and DevOps tools and techniques.
  • Contribute to the documentation of data architecture, processes, and workflows for knowledge sharing, compliance, and onboarding purposes.
  • Demonstrate strong communication skills to collaborate effectively with cross-functional teams, including data science, analytics, and business stakeholders.

Desired Profile

  • Seeking a DevOps Engineer with 5+ years of hands-on Cloud and DevOps experience, including significant leadership experience. Requires a Bachelor’s or Master’s degree in Computer Science.
  • Must have expert proficiency in Terraform and extensive experience across at least two major cloud platforms (AWS, Azure, GCP). Strong hands-on experience with Kubernetes, Helm charts, and designing/optimizing CI/CD pipelines (e.g., Jenkins, GitLab CI) is essential. Proficiency in Python and scripting (Bash/PowerShell) is also a must.
  • Valued experience includes leading cloud migrations, contributing to RFP/RFI processes, and mentoring teams. Excellent problem-solving, communication, and collaboration skills are critical. Experience with configuration management (Ansible, Puppet) and DevSecOps principles is required; OpenShift is a plus.

Experience

  • 10 years and above

Education

  • B.Tech. / BS in Computer Science

Technical Skills & Certifications

  • Certifications in cloud platforms (e.g., AWS Certified Solutions Architect, Azure Administrator, Google Professional Cloud Architect).
  • Terraform, Kubernetes, Python, CI/CD, Ansible, Security tools, Monitoring tools.

EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

EY

Professional Services

London
