Data Engineer Specialist

5 - 8 years

0 Lacs

Posted: 1 day ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

Job Title: Lead Data Engineer Specialist ACS SONG

Management Level: Level 9 Specialist

Location: Kochi, Coimbatore, Trivandrum

Must-have skills: Databricks, including Spark-based ETL and Delta Lake

Good-to-have skills: PySpark

Job Summary

We are seeking a highly skilled and experienced Senior Data Engineer to join our growing Data and Analytics team. The ideal candidate will have deep expertise in Databricks and cloud data warehousing, with a proven track record of designing and building scalable data pipelines, optimizing data architectures, and enabling robust analytics capabilities. This role involves working collaboratively with cross-functional teams to ensure the organization leverages data as a strategic asset. Your responsibilities will include:

Roles And Responsibilities

  • Design, build, and maintain scalable data pipelines and ETL processes using Databricks and other modern tools (a minimal sketch follows this list).
  • Architect, implement, and manage cloud-based data warehousing solutions on Databricks (Lakehouse Architecture).
  • Develop and maintain optimized data lake architectures to support advanced analytics and machine learning use cases.
  • Collaborate with stakeholders to gather requirements, design solutions, and ensure high-quality data delivery.
  • Optimize data pipelines for performance and cost efficiency.
  • Implement and enforce best practices for data governance, access control, security, and compliance in the cloud.
  • Monitor and troubleshoot data pipelines to ensure reliability and accuracy.
  • Lead and mentor junior engineers, fostering a culture of continuous learning and innovation.
  • Excellent communication skills.
  • Ability to work independently as well as collaboratively with clients based in Western Europe.
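
As a rough illustration of the Spark-based ETL work referenced in the first responsibility above, the sketch below shows a minimal PySpark job that cleans raw data and writes it to a Delta table. The paths, table names, and columns (/mnt/bronze/orders, silver.orders, order_id, and so on) are assumptions made for the example, not details of this role.

```python
# Minimal PySpark -> Delta Lake ETL sketch (illustrative only; all paths,
# table names, and columns are assumed). On Databricks, a SparkSession is
# already available as `spark`.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-bronze-to-silver").getOrCreate()

# Extract: read raw JSON landed in an assumed "bronze" location.
raw = spark.read.json("/mnt/bronze/orders")

# Transform: basic cleansing, typing, and de-duplication.
clean = (
    raw.filter(F.col("order_id").isNotNull())
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("order_date", F.to_date("order_ts"))
    .dropDuplicates(["order_id"])
)

# Load: append to a Delta table in an assumed "silver" schema,
# partitioned by date for downstream query pruning.
(
    clean.write.format("delta")
    .mode("append")
    .partitionBy("order_date")
    .saveAsTable("silver.orders")
)
```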

Professional And Technical Skills

  • 5-8 years of experience in data engineering roles with a focus on cloud platforms.
  • Proficiency in Databricks, including Spark-based ETL, Delta Lake, and SQL.
  • Strong experience with one or more cloud platforms (AWS preferred).
  • Experience with Snowflake and Azure SQL is an added advantage.
  • Hands-on experience with Delta Lake, Unity Catalog, and Lakehouse architecture concepts.
  • Strong programming skills in Python and SQL; experience with PySpark is a plus.
  • Solid understanding of data modeling concepts and practices (e.g., star schema, dimensional modeling); a minimal sketch follows this list.
  • Knowledge of CI/CD practices and version control systems (e.g., Git).
  • Familiarity with data governance and security practices, including GDPR and CCPA compliance.
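
To make the dimensional-modeling point above concrete, the sketch below shows the typical shape of a star-schema query on Spark: a fact table joined to its dimensions and aggregated. The fact_sales, dim_date, and dim_customer tables and their columns are invented for illustration.

```python
# Illustrative star-schema query in Spark SQL. The fact and dimension
# tables (fact_sales, dim_date, dim_customer) and their columns are
# assumed purely for the example.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-example").getOrCreate()

monthly_revenue = spark.sql("""
    SELECT d.year,
           d.month,
           c.customer_segment,
           SUM(f.sales_amount) AS revenue
    FROM   fact_sales   AS f
    JOIN   dim_date     AS d ON f.date_key     = d.date_key
    JOIN   dim_customer AS c ON f.customer_key = c.customer_key
    GROUP  BY d.year, d.month, c.customer_segment
""")

monthly_revenue.show()
```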

Additional Information

  • Experience with Airflow or similar workflow orchestration tools (see the sketch after this list).
  • Exposure to machine learning workflows and MLOps.
  • Certifications in Databricks or AWS.
  • Familiarity with data visualization tools such as Power BI.
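
For the Airflow point above, a bare-bones orchestration DAG might look like the following. The DAG id, schedule, and placeholder callable are assumptions for illustration; in practice a Databricks job would more likely be triggered through the Databricks provider's operators than a plain PythonOperator.

```python
# Minimal Airflow DAG sketch (illustrative; dag_id, task ids, and the
# callable body are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def trigger_databricks_job(**context):
    # Placeholder: a real task would call a Databricks job or notebook,
    # e.g. through the Databricks provider package or the REST API.
    print("Would trigger the Databricks ETL job for", context["ds"])


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_etl = PythonOperator(
        task_id="run_databricks_etl",
        python_callable=trigger_databricks_job,
    )
```
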
About Our Company | Accenture
