Lead - Data Engineer

10 - 12 years

35 - 40 Lacs

Posted: 1 week ago | Platform: Naukri

Apply

Work Mode

Work from Office

Job Type

Full Time

Job Description

Gracenote, a Nielsen company, is dedicated to connecting audiences to the entertainment they love, powering a better media future for all people. Gracenote is the content data business unit of Nielsen that powers innovative entertainment experiences for the world's leading media companies. Our entertainment metadata and connected IDs deliver advanced content navigation and discovery, connecting consumers to the content they love and helping them discover new favorites. Gracenote's industry-leading datasets cover TV programs, movies, sports, music and podcasts in 80 countries and 35 languages. Our common identifiers are universally adopted by the world's leading media companies to deliver powerful cross-media entertainment experiences, and machine-driven, human-validated, best-in-class data and images fuel new search and discovery experiences across every screen.

Gracenote's Data Organization is a dynamic and innovative group that is essential in delivering business outcomes through data, insights, and predictive and prescriptive analytics. It is a highly motivated team that values creativity and experimentation through continuous learning in an agile and collaborative manner. From designing, developing and maintaining data architecture that satisfies our business goals to managing data governance and region-specific regulations, the data team oversees the whole data lifecycle.

Role Overview

We are seeking an experienced Senior Data Engineer with 10-12 years of experience to join the Video Engineering team at Gracenote, a Nielsen company. In this role, you will design, build, and maintain our data processing systems and pipelines. You will work closely with product managers, architects, analysts, and other stakeholders to ensure data is accessible, reliable, and optimized for business, analytical and operational needs.

Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes
  • Architect and implement data warehousing solutions and data lakes
  • Optimize data flow and collection for cross-functional teams
  • Build the infrastructure required for optimal extraction, transformation, and loading of data
  • Ensure data quality, reliability, and integrity across all data systems
  • Collaborate with data scientists and analysts to help implement models and algorithms
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc.
  • Create and maintain comprehensive technical documentation
  • Mentor junior engineers and provide technical leadership
  • Evaluate and integrate new data management technologies and tools
  • Implement optimization strategies to enable and maintain sub-second latency
  • Oversee data infrastructure to ensure robust deployment and monitoring of pipelines and processes
  • Stay ahead of emerging trends in data and cloud, integrating new research into practical applications
  • Mentor and grow a team of junior data engineers
Required Qualifications and Skills

  • Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred
  • Expert-level proficiency in Python, SQL, and big data tools (Spark, Kafka, Airflow)
  • Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata)
  • Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink)
  • Proficiency in at least one programming language such as Python, Java, or Scala
  • Experience with data modeling, data warehousing, and building ETL pipelines
  • Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi)
  • Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred
  • Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis; Flink preferred
  • Understanding of data governance and data security principles
  • Experience with version control systems (e.g., Git) and CI/CD practices
  • Proven leadership skills in mentoring and growing data engineering teams

Preferred Skills

  • Experience with containerization and orchestration tools (Docker, Kubernetes)
  • Basic knowledge of machine learning workflows and MLOps
  • Experience with NoSQL databases (MongoDB, Cassandra, etc.)
  • Familiarity with data visualization tools (Tableau, Power BI, etc.)
  • Experience with real-time data processing
  • Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.)
  • Experience with infrastructure-as-code tools (Terraform, CloudFormation)

Nielsen Sports

Market Research / Sports Analytics

Chicago

N/A Employees

115 Jobs

    Key People

  • Kathy Leahy - VP, Nielsen Sports
  • David Burch - SVP, Business Development
