Mumbai, Gurugram, Bengaluru
INR 25.0 - 40.0 Lacs P.A.
Hybrid
Full Time
Nielsen is seeking an organized, detail-oriented team player to join the Engineering team in the role of Software Machine Learning Engineer. Nielsen's Audience Measurement Engineering platforms support the measurement of television viewing in more than 30 countries around the world. The Software Engineer will be responsible for defining, developing, testing, analyzing, and delivering technology solutions within Nielsen's Collections platforms.

Required Skills:
- Bachelor's degree in Computer Science or an equivalent degree
- 3+ years of software experience
- Experience with machine learning frameworks and models; PyTorch experience preferred
- Strong understanding of statistical analysis and mathematical data manipulation
- Experience with web technologies including Java, Python, JavaScript, React/Redux, and Kotlin
- Adherence to best practices for software development and deployment
- Understanding of relational databases and big data, and experience with SQL
- Proficiency with Git, GitFlow, JIRA, GitLab, and Confluence
- Strong analytical and problem-solving skills
- Open-minded and passionate about learning and growing technology skills
- Strong sense of accountability
- Solution-focused, with the ability to drive change within the organization
- Experience writing unit and integration tests, including test automation
- Strong testing and debugging abilities: functional, analytical, and technical skills, the ability to find bugs, attention to detail, and troubleshooting

Additional Useful Skills:
- A fundamental understanding of the AWS ecosystem (EC2, S3, EMR, Lambda, etc.)
- Experience building RESTful APIs
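For context on the kind of PyTorch work this listing refers to, here is a minimal, hypothetical sketch of a training loop for a small classifier on synthetic data. The model shape, data, and hyperparameters are illustrative assumptions only, not Nielsen code.

```python
import torch
import torch.nn as nn

# Hypothetical toy example: a small feed-forward classifier trained on
# synthetic data, illustrating a basic PyTorch workflow.
torch.manual_seed(0)
X = torch.randn(256, 10)            # 256 samples, 10 features (made-up data)
y = (X.sum(dim=1) > 0).long()       # synthetic binary labels

model = nn.Sequential(
    nn.Linear(10, 16),
    nn.ReLU(),
    nn.Linear(16, 2),
)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(20):
    optimizer.zero_grad()
    logits = model(X)
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()

accuracy = (model(X).argmax(dim=1) == y).float().mean().item()
print(f"final loss {loss.item():.4f}, train accuracy {accuracy:.2%}")
```

In a real measurement pipeline the data loading, evaluation, and model architecture would of course be far more involved; this only shows the framework basics the role assumes.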
Bengaluru, Mumbai (All Areas)
INR 15.0 - 30.0 Lacs P.A.
Work from Office
Full Time
Gracenote, a Nielsen company, is dedicated to connecting audiences to the entertainment they love, powering a better media future for all people. Gracenote is the content data business unit of Nielsen that powers innovative entertainment experiences for the world's leading media companies. Our entertainment metadata and connected IDs deliver advanced content navigation and discovery, connecting consumers to the content they love and helping them discover new content. Gracenote's industry-leading datasets cover TV programs, movies, sports, music, and podcasts in 80 countries and 35 languages. Its common identifiers are universally adopted by the world's leading media companies to deliver powerful cross-media entertainment experiences, and its machine-driven, human-validated, best-in-class data and images fuel new search and discovery experiences across every screen.

Gracenote's Data Organization is a dynamic and innovative group that is essential in delivering business outcomes through data, insights, and predictive and prescriptive analytics. It is a highly motivated team that values creativity and experimentation through continuous learning in an agile and collaborative manner. From designing, developing, and maintaining data architecture that satisfies our business goals to managing data governance and region-specific regulations, the data team oversees the whole data lifecycle.

Role Overview: We are seeking an experienced Senior Data Engineer with 10-12 years of experience to join our Video Engineering team at Gracenote, a Nielsen company. In this role, you will design, build, and maintain our data processing systems and pipelines. You will work closely with product managers, architects, analysts, and other stakeholders to ensure data is accessible, reliable, and optimized for business, analytical, and operational needs.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes
- Architect and implement data warehousing solutions and data lakes
- Optimize data flow and collection for cross-functional teams
- Build the infrastructure required for optimal extraction, transformation, and loading of data
- Ensure data quality, reliability, and integrity across all data systems
- Collaborate with data scientists and analysts to help implement models and algorithms
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc.
- Create and maintain comprehensive technical documentation
- Mentor junior engineers and provide technical leadership
- Evaluate and integrate new data management technologies and tools
- Implement optimization strategies to enable and maintain sub-second latency
- Oversee data infrastructure to ensure robust deployment and monitoring of pipelines and processes
- Stay ahead of emerging trends in data and cloud, integrating new research into practical applications
- Mentor and grow a team of junior data engineers

Required Qualifications and Skills:
- Expert-level proficiency in Python, SQL, and big data tools (Spark, Kafka, Airflow)
- Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred
- Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata)
- Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink)
- Proficiency in at least one programming language such as Python, Java, or Scala
- Experience with data modeling, data warehousing, and building ETL pipelines
- Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi)
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred
- Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis; Flink preferred
- Understanding of data governance and data security principles
- Experience with version control systems (e.g., Git) and CI/CD practices
- Proven leadership skills in mentoring and growing data engineering teams

Preferred Skills:
- Experience with containerization and orchestration tools (Docker, Kubernetes)
- Basic knowledge of machine learning workflows and MLOps
- Experience with NoSQL databases (MongoDB, Cassandra, etc.)
- Familiarity with data visualization tools (Tableau, Power BI, etc.)
- Experience with real-time data processing
- Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.)
- Experience with infrastructure-as-code tools (Terraform, CloudFormation)
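As a rough illustration of the pipeline and workflow-management work described above, below is a minimal Airflow ETL sketch. It assumes Airflow 2.4+ with the TaskFlow API; the DAG name, schedule, tasks, and data are hypothetical placeholders, not Gracenote's actual pipelines.

```python
# Hypothetical sketch of a small extract-transform-load DAG.
# Assumes Apache Airflow 2.4+ (TaskFlow API, "schedule" parameter).
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def content_metadata_etl():
    @task
    def extract() -> list[dict]:
        # In practice this would read from an upstream source (API, S3, Kafka).
        return [{"title": " example show ", "country": "IN"}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Normalize titles before loading.
        return [{**r, "title": r["title"].strip().title()} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # In practice this would write to a warehouse table (e.g. Redshift).
        print(f"loaded {len(records)} records")

    load(transform(extract()))


content_metadata_etl()
```

The call chain `load(transform(extract()))` is how the TaskFlow API wires task dependencies; the same pattern scales to the larger, monitored pipelines the role describes.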
Bengaluru, Mumbai (All Areas)
INR 15.0 - 30.0 Lacs P.A.
Hybrid
Full Time
Purpose: You will join the Stream Team, an experienced, informal, and enthusiastic scrum team of five developers working on stream-processing components that improve our data publication platform. The team is responsible for combining different sources of sports data from all over the world into a single unified product, all in real time. Part of the job is working with international teams of developers located in Gracenote offices around the world.

Job Requirements:
- Experience with Scala, or with other JVM languages and the ability to learn Scala
- Understanding of stream processing (preferably with Kafka Streams and/or Akka Streams)
- Comfort in a DevOps culture, knowing how to get your work into production
- Relevant work experience with both NoSQL (MongoDB) and SQL databases (Postgres, SQL Server)
- Affinity with data and data streams
- Experience working in an Agile environment
- Good communication skills and the ability to share your knowledge with the team
- Good knowledge of the English language, both spoken and written

Good-to-Have Skills:
- An affinity with sports, active or passive
- An understanding of schemas and an interest in data modelling
- Experience working with the Scrum framework
- Experience with other programming languages (some other languages we use are Python, TypeScript, and Java)

Qualifications:
- B.E. / B.Tech / BCA / MCA in Computer Science, Engineering, or a related subject
- Strong computer science fundamentals
- Comfortable with version control systems such as Git
- A thirst for learning new tech and keeping up with industry advances
- Excellent communication and knowledge-sharing skills
- Comfortable working with technical and non-technical teams
- Strong debugging skills
- Comfortable providing and receiving code review feedback
- A positive attitude, adaptability, enthusiasm, and a growth mindset
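To give a feel for the consume-transform-republish pattern this team works with: the team itself builds in Scala with Kafka Streams or Akka Streams, but as a minimal illustration, here is a Python sketch using the kafka-python client. The broker address, topic names, and record fields are made-up assumptions, not the team's actual data model.

```python
# Hypothetical sketch: consume raw sports events, unify them, republish.
# Assumes the kafka-python package and a local broker; all names are illustrative.
import json

from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "sports-events-raw",                       # hypothetical input topic
    bootstrap_servers="localhost:9092",
    group_id="stream-team-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    event = message.value
    # Normalize provider-specific records into one unified shape.
    unified = {
        "source": event.get("provider", "unknown"),
        "match_id": event.get("id"),
        "score": event.get("score"),
    }
    producer.send("sports-events-unified", unified)  # hypothetical output topic
```

A Kafka Streams topology in Scala expresses the same idea declaratively (stream, map, to), with the framework handling partitioning, state, and fault tolerance.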