Work from Office
Full Time
In this position, you contribute to the successful implementation of our software solutions for advertisers, media owners, and media agencies. Our solutions help these companies make decisions about their marketing communication budgets. Combining research data with analytical and econometric approaches, our solutions answer questions like "What budget is needed to generate a 10% increase in brand preference?" and "Which media mix contributes the most to revenue?"
Data Analysis and Reporting: You'll lead complex data analysis projects, leveraging core SQL (including query services such as AWS Athena) and Python to extract, manipulate, and analyze large datasets from sources like S3. You'll be responsible for developing and automating reports and dashboards that provide key insights to clients and stakeholders.
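As an illustration of the kind of analysis this responsibility describes, here is a minimal Python sketch that aggregates rows (shaped as an Athena query over S3-backed data might return them) into each channel's share of total revenue. All channel names and figures are hypothetical, not real data:

```python
from collections import defaultdict

# Hypothetical (channel, revenue) rows, as they might come back
# from an Athena query over S3 data; values are illustrative only.
rows = [
    ("tv", 120.0),
    ("digital", 80.0),
    ("tv", 60.0),
    ("radio", 40.0),
]

def revenue_share_by_channel(rows):
    """Aggregate revenue per channel and return each channel's share of the total."""
    totals = defaultdict(float)
    for channel, revenue in rows:
        totals[channel] += revenue
    grand_total = sum(totals.values())
    return {channel: total / grand_total for channel, total in totals.items()}

shares = revenue_share_by_channel(rows)
print(shares)  # -> {'tv': 0.6, 'digital': 0.26666..., 'radio': 0.13333...}
```

In practice the aggregation itself would usually live in the SQL (a `GROUP BY channel` in Athena), with Python handling the downstream reporting and automation.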
Data Engineering and Solutions: You'll use your engineering skills to design, build, and maintain scalable ETL and ELT data pipelines within the AWS ecosystem. This includes cleaning, transforming, and ingesting data from diverse sources, including third-party APIs, to ensure it is accurate and ready for analysis. You will be responsible for developing, scheduling, and monitoring complex workflows using Airflow (building and maintaining DAGs). You'll also work on developing scalable data solutions that meet client requirements, often defining pipeline-as-code and configurations using YAML.
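To make the pipeline-as-code idea above concrete, here is a hedged Python sketch (the config shape and task names are hypothetical, not an actual schema) that takes a pipeline definition, shaped as it might look after parsing a YAML file, and resolves a valid task execution order from the declared dependencies; this is the same kind of ordering an orchestrator like Airflow derives from a DAG:

```python
# Hypothetical pipeline config, shaped as it might be after parsing a YAML file.
# Task ids and dependencies are illustrative only.
pipeline_config = {
    "pipeline": "daily_brand_metrics",
    "tasks": [
        {"id": "clean_surveys", "depends_on": []},
        {"id": "join_spend", "depends_on": ["clean_surveys"]},
        {"id": "publish_report", "depends_on": ["join_spend"]},
    ],
}

def execution_order(config):
    """Topologically sort tasks so every task runs after its dependencies."""
    deps = {t["id"]: set(t["depends_on"]) for t in config["tasks"]}
    order = []
    while deps:
        # Tasks with no unsatisfied dependencies are ready to run.
        ready = sorted(t for t, d in deps.items() if not d)
        if not ready:
            raise ValueError("cycle detected in task dependencies")
        for task in ready:
            order.append(task)
            del deps[task]
        for d in deps.values():
            d.difference_update(ready)
    return order

print(execution_order(pipeline_config))
# -> ['clean_surveys', 'join_spend', 'publish_report']
```

In a real Airflow deployment the dependency resolution is done by the scheduler itself; a loader like this is only one possible way to turn a YAML definition into generated DAG tasks.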
Client and Stakeholder Management: You'll serve as a primary point of contact for internal and external stakeholders, developing a deep product understanding and taking technical ownership of specific data markets or domains. You'll use this expertise to understand business challenges, provide expert guidance on how to leverage data, and translate functional requirements into technical solutions. Strong communication and interpersonal skills are essential for presenting findings and collaborating with both technical and non-technical stakeholders.
Qualifications
Bachelor of Technology in Computer Science (preferred) with 5+ years of experience in a data-focused role.
Expert-level proficiency in SQL for data querying and manipulation; experience with AWS Athena is highly preferred.
Advanced proficiency in Python for data analysis, scripting, automation, and building/consuming APIs.
Hands-on experience building and managing data orchestration tools, specifically Airflow (writing and maintaining DAGs).
Strong experience with the AWS cloud stack, particularly S3, Athena, and other related services (e.g., Glue, Lambda).
Proven experience in data analytics, including statistical analysis and reporting.
Experience with data modeling and building scalable ETL/ELT data pipelines.
Familiarity with configuration languages like YAML.
[Good to have] Strong experience with Git-based version control systems, particularly GitLab, for code management and CI/CD.
[Good to have] Experience with distributed data processing frameworks, such as Pyspark.
Excellent communication and problem-solving skills, with the ability to work independently and as part of a team, and a strong sense of product/market ownership.
Experience working in geographically distributed teams.
Able to communicate effectively both orally and in writing.
Proactive and a self-learner.
Nielsen Sports