Data Engineer, PySpark, AWS, AVP

Experience: 6 - 10 years

Salary: 8 - 12 Lacs

Posted: 3 days ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Join us as a Data Engineer, PySpark, AWS
We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling and ETL design, while aspiring to be commercially successful through insights. If you're ready for a new challenge and want to bring a competitive edge to your career profile by delivering streaming data ingestions, this could be the role for you. We're offering this role at associate vice president level.

What you'll do

Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, and advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers.

You'll also be responsible for:

- Driving customer value by understanding complex business problems and requirements, and correctly applying the most appropriate and reusable tools to build data solutions
- Participating in the data engineering community to deliver opportunities that support our strategic direction
- Carrying out complex data engineering tasks to build a scalable data architecture, and transforming data to make it usable to analysts and data scientists
- Building advanced automation of data engineering pipelines by removing manual stages
- Leading the planning and design of complex products, and providing guidance to colleagues and the wider team when required

The skills you'll need

To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer, as well as experience of extracting value and features from large-scale data. You'll need at least eight years of experience working with Python, PySpark and SQL. You'll also need experience in AWS architecture using EMR, EC2, S3, Lambda and Glue, and experience with Apache Airflow, Anaconda and SageMaker.

You'll also need:

- Experience of using programming languages alongside knowledge of data and software engineering fundamentals
- Experience with performance optimization and tuning
- Good knowledge of modern code development practices
- Great communication skills, with the ability to proactively engage with a range of stakeholders
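By way of illustration, below is a minimal sketch of the kind of PySpark ETL work the role describes: reading raw data from S3, applying a simple transformation, and writing partitioned Parquet back to S3. The bucket names, paths, and column names are hypothetical, and the s3:// scheme assumes the job runs on EMR or another environment with an S3 filesystem connector configured.

```python
# Minimal PySpark ETL sketch: read raw CSV from S3, clean and enrich it,
# and write partitioned Parquet back to S3.
# Bucket names, paths, and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("example-s3-etl")
    .getOrCreate()
)

# Read raw transaction data landed in S3 (path is hypothetical).
raw = spark.read.csv(
    "s3://example-raw-bucket/transactions/",
    header=True,
    inferSchema=True,
)

# Basic cleansing and enrichment: drop incomplete rows and derive a
# partition column from the event timestamp.
cleaned = (
    raw.dropna(subset=["transaction_id", "amount"])
       .withColumn("event_date", F.to_date("event_timestamp"))
)

# Write back as partitioned Parquet for downstream analysts and data scientists.
(
    cleaned.write
           .mode("overwrite")
           .partitionBy("event_date")
           .parquet("s3://example-curated-bucket/transactions/")
)

spark.stop()
```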
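And since the posting asks for Apache Airflow experience, here is a minimal sketch of how a daily PySpark ETL job like the one above might be scheduled, assuming Airflow 2.x; the DAG id, schedule, and script location are hypothetical.

```python
# Minimal Airflow 2.x DAG sketch: trigger a nightly PySpark ETL job via
# spark-submit. DAG id, schedule, and script path are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_transactions_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark job to the cluster once per day.
    run_etl = BashOperator(
        task_id="run_pyspark_etl",
        bash_command=(
            "spark-submit --deploy-mode cluster "
            "s3://example-artifacts-bucket/jobs/transactions_etl.py"
        ),
    )
```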
