AWS PySpark Data Engineer

1 - 3 years

3 - 6 Lacs

Hyderabad

Posted: 1 month ago | Platform: Naukri

Skills Required

Data analysis Cloud Agile Data processing Apache AWS Python

Work Mode

Work from Office

Job Type

Full Time

Job Description

We are seeking a highly skilled and experienced Senior Data Engineer to lead the end-to-end development of complex models for compliance and supervision. The ideal candidate will have deep expertise in cloud-based infrastructure, ETL pipeline development, and financial domains, with a strong focus on creating robust, scalable, and efficient solutions.

Key Responsibilities:

- Model Development: Lead the development of advanced models using AWS services such as EMR, Glue, and Glue Notebooks.
- Cloud Infrastructure: Design, build, and optimize scalable cloud infrastructure solutions with a minimum of 5 years of experience.
- ETL Pipeline Development: Create, manage, and optimize ETL pipelines using PySpark for large-scale data processing.
- CI/CD Implementation: Build and maintain CI/CD pipelines for deploying and maintaining cloud-based applications.
- Data Analysis: Perform detailed data analysis and deliver actionable insights to stakeholders.
- Collaboration: Work closely with cross-functional teams to understand requirements, present solutions, and ensure alignment with business goals.
- Agile Methodology: Operate effectively in agile or hybrid agile environments, delivering high-quality results within tight deadlines.
- Framework Development: Enhance and expand existing frameworks and capabilities to support evolving business needs.
- Documentation and Communication: Create clear documentation and present technical solutions to both technical and non-technical audiences.

Requirements

Required Qualifications:

- 5+ years of experience with Python programming.
- 5+ years of experience in cloud infrastructure, particularly AWS.
- 3+ years of experience with PySpark, including usage with EMR or Glue Notebooks.
- 3+ years of experience with Apache Airflow for workflow orchestration.
- Solid experience with data analysis in fast-paced environments.
- Strong understanding of capital markets, financial systems, or prior experience in the financial domain is a must.
- Proficiency with cloud-native technologies and frameworks.
- Familiarity with CI/CD practices and tools like Jenkins, GitLab CI/CD, or AWS CodePipeline.
- Experience with notebooks (e.g., Jupyter, Glue Notebooks) for interactive development.
- Excellent problem-solving skills and ability to handle complex technical challenges.
- Strong communication and interpersonal skills for collaboration across teams and presenting solutions to diverse audiences.
- Ability to thrive in a fast-paced, dynamic environment.

Benefits

Standard Company Benefits

Data Economy

IT Services and IT Consulting

Data City

2-10 Employees

17 Jobs

Key People

• John Doe, CEO
• Jane Smith, CTO
