Posted: 3 months ago
Work from Office
Full Time
Ever wonder how Nielsen figured out that 127.3 million people tuned in to watch Super Bowl LVIII? That's what we do here. Come and join our team of professionals, and together we will build a cutting-edge television measurement system. On this specialized team, we use a combination of technology, curiosity, and culture to empower our teams and people to be successful and focused on delivering highly reliable and accurate systems. Our focus is to develop and deliver the content recognition engines used in measuring streaming video, commercials, and broadcast TV.

Language requirements: GoLang, Python
Technical skills: Kubernetes, Pulumi, Terraform
Knowledge of specific systems: AWS S3, SQS, EC2, RDS

Responsibilities
The duties of this position include the development and refinement of a high-resolution content identification system used to identify television programs and commercials. You will work on a scrum team with other skilled developers, sharing best practices and exploring new technologies and algorithms that advance the excellence of our measurement. You will build and maintain the microservices that power the content identification services used in Television Audience Measurement. These microservices run in AWS and consume and process data using advanced algorithms tuned for efficiency. You will also be responsible for the efficient use of AWS resources on our projects. Critical thinking and innovation are highly valued on this team, and everyone is expected to think outside the box, bring new ideas, and challenge what we do and how we do it.

Qualifications
- Bachelor's degree in Computer Engineering or a related field (or equivalent experience).
- 7+ years of experience programming back-office services.
- Skilled in GoLang and fluent in Python.
- Skilled in the AWS programming APIs for S3 and SQS.
- Experienced in Kubernetes.
- Experienced in writing infrastructure as code with Pulumi and/or Terraform.
- Experienced in operating and writing Airflow DAGs.
- Fundamental skills in signal processing (FFT, Nyquist cutoff, etc.).
- Strong skills in hashing data and working with large hash tables.
- Past experience working efficiently with large data sets.
- Strong abstract reasoning and problem-solving skills.
- Demonstrated ability to perform root cause analysis.

Please be aware that job seekers may be targeted by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.