Senior Associate - Data Engineering

4 - 7 years

8 - 13 Lacs

Posted: 2 days ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

We are seeking a Senior Data Engineer to design, build, and optimize scalable data pipelines and solutions in a cloud environment. This individual contributor role requires strong expertise in AWS services (IAM, EMR Serverless, Glue, Athena), advanced PySpark skills, and a solid understanding of data warehousing techniques. The ideal candidate will have experience in modern ETL frameworks and cloud-native data engineering practices. Familiarity with visualization tools like Power BI is a plus but not mandatory. You will work on building robust, secure, and high-performance data platforms that enable analytics and business intelligence across the organization.

Key Responsibilities

  • Interpret and analyze business requirements and convert them into high- and low-level designs.
  • Design, develop, configure, test, and deploy cloud automation for the Finance business unit using tools such as CloudFormation, Terraform, and Ansible, following the capability domain's engineering standards in an Agile environment.
  • Take end-to-end ownership of developing, configuring, unit testing, and deploying code with quality and minimal supervision.
  • Work closely with customers, business analysts, and technology and project teams to understand business requirements; drive the analysis and design of quality technical solutions that align with business and technology strategies and comply with the organization's architectural standards.
  • Follow change management procedures to implement project deliverables.
  • Coordinate with support groups such as Enterprise Cloud Engineering, DevSecOps, and Monitoring to resolve issues with a quick turnaround.
  • Work with the data science user community to address issues in the ML (machine learning) development life cycle.

Required Qualifications

  • Bachelor's or Master's degree in Computer Science or a similar field
  • 4 to 7 years of experience in automation on a major cloud (AWS, Azure, or GCP)
  • Experience in infrastructure provisioning using Ansible, AWS CloudFormation, or Terraform, with Python or PowerShell
  • Working knowledge of AWS services such as EC2, CloudFormation, IAM, S3, EMR, and ECS/EKS
  • Advanced proficiency in PySpark for distributed data processing and transformation
  • Working knowledge of CI/CD tools and containers
  • Experience in Hadoop administration, including resolving Hive/Spark-related issues
  • Proven understanding of common development tools, patterns, and practices for the cloud
  • Experience writing automated unit tests in a major programming language
  • Proven ability to write quality code following best practices and guidelines
  • Strong problem-solving, multitasking, and organizational skills
  • Good written and verbal communication skills
  • Demonstrable experience working on a geographically dispersed team
  • Experience with SQL and query optimization across large datasets
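The SQL and query-optimization requirement above can be illustrated with a small, self-contained sketch using Python's built-in sqlite3 module — the table, column, and index names are invented for this example, and production workloads would run against a warehouse engine rather than SQLite:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

# Without an index, filtering on customer_id scans the whole table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 7"
).fetchall()
assert any("SCAN" in row[-1] for row in plan)

# Adding an index lets the planner seek directly to the matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 7"
).fetchall()
assert any("USING INDEX" in row[-1] for row in plan)
```

Reading the query plan before and after adding an index is the same habit that matters at warehouse scale, where a full scan over a large fact table is far more expensive.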

Preferred Qualifications

  • Experience managing a Hadoop platform and debugging Hive/Spark-related issues
  • Cloud certification (AWS, Azure, or GCP)
  • Knowledge of UNIX/Linux shell scripting

Timings

2:00 PM - 10:30 PM
Ameriprise Financial

Financial Services

Minneapolis