Lead Data Engineer

8 - 12 years

15 - 20 Lacs

Bengaluru, Mumbai (All Areas)

Posted: 4 hours ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Gracenote is the content data business unit of Nielsen that powers innovative entertainment experiences for the world's leading media companies, connecting consumers to the entertainment they love and powering a better media future for all people. Our entertainment metadata and connected IDs deliver advanced content navigation and discovery, helping consumers find the content they love and discover new favorites.

Gracenote's industry-leading datasets cover TV programs, movies, sports, music, and podcasts in 80 countries and 35 languages. Our common identifiers are universally adopted by the world's leading media companies to deliver powerful cross-media entertainment experiences. Machine-driven, human-validated, best-in-class data and images fuel new search and discovery experiences across every screen.

Gracenote's Data Organization is a dynamic, innovative group essential to delivering business outcomes through data, insights, and predictive and prescriptive analytics. It is a highly motivated team that values creativity and experimentation through continuous learning in an agile, collaborative environment. From designing, developing, and maintaining data architecture that satisfies our business goals to managing data governance and region-specific regulations, the data team oversees the whole data lifecycle.

Role Overview:

We are seeking an experienced Senior Data Engineer with 4-9 years of experience to join our Video Engineering team at Gracenote, a NielsenIQ company. In this role, you will design, build, and maintain our data processing systems and pipelines. You will work closely with product managers, architects, analysts, and other stakeholders to ensure data is accessible, reliable, and optimized for business, analytical, and operational needs.

Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines and ETL processes
  • Architect and implement data warehousing solutions and data lakes
  • Optimize data flow and collection for cross-functional teams
  • Build the infrastructure required for optimal extraction, transformation, and loading of data
  • Ensure data quality, reliability, and integrity across all data systems
  • Collaborate with data scientists and analysts to help implement models and algorithms
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc.
  • Create and maintain comprehensive technical documentation
  • Evaluate and integrate new data management technologies and tools
  • Implement optimization strategies to enable and maintain sub-second latency
  • Oversee data infrastructure to ensure robust deployment and monitoring of pipelines and processes
  • Stay ahead of emerging trends in data and cloud, integrating new research into practical applications
  • Mentor and grow a team of junior data engineers, providing technical leadership

Required Qualifications and Skills:

  • Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred
  • Expert-level proficiency in Python, SQL, and big data tools (Spark, Kafka, Airflow)
  • Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata)
  • Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink)
  • Proficiency in at least one programming language such as Python, Java, or Scala
  • Experience with data modeling, data warehousing, and building ETL pipelines
  • Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi)
  • Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred
  • Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis; Flink preferred
  • Understanding of data governance and data security principles
  • Experience with version control systems (e.g., Git) and CI/CD practices
  • Proven leadership skills in growing data engineering teams

Preferred Skills

  • Experience with containerization and orchestration tools (Docker, Kubernetes)
  • Basic knowledge of machine learning workflows and MLOps
  • Experience with NoSQL databases (MongoDB, Cassandra, etc.)
  • Familiarity with data visualization tools (Tableau, Power BI, etc.)
  • Experience with real-time data processing
  • Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.)
  • Experience with infrastructure-as-code tools (Terraform, CloudFormation)

Nielsen Media

Software Development

New York, NY
