Posted: 21 hours ago | Platform: Foundit

Work Mode: On-site

Job Type: Full Time

Job Description

Job Summary

ETL & Data Warehouse Developer

Key Responsibilities

1. ETL Development

  • Design, develop, and implement robust ETL processes using AWS EMR, AWS Data Pipeline, or custom scripts.
  • Ensure efficient extraction, transformation, and loading of data from multiple sources into the data warehouse.
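The extract-transform-load cycle described above can be sketched in plain Python. This is a minimal illustration only; every name (the record fields, the list-backed "warehouse") is hypothetical, and a production pipeline would read from S3 or JDBC sources and load into Redshift rather than in-memory lists.

```python
# Minimal ETL sketch. All field names (customer_id, amount) are
# hypothetical stand-ins for real source-system columns.

def extract(source_rows):
    """Pull raw records from a source system (stubbed as a list here)."""
    return list(source_rows)

def transform(rows):
    """Normalize numeric fields and drop records missing a customer id."""
    cleaned = []
    for row in rows:
        if not row.get("customer_id"):
            continue  # skip unusable records
        cleaned.append({
            "customer_id": row["customer_id"],
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned

def load(rows, target):
    """Append transformed rows to the target store (a list-backed stub)."""
    target.extend(rows)
    return len(rows)

raw = [
    {"customer_id": "C1", "amount": "19.994"},
    {"customer_id": None, "amount": "5.00"},   # dropped by transform
    {"customer_id": "C2", "amount": "7.5"},
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

The same extract/transform/load separation carries over directly when the stubs are replaced with EMR jobs or Data Pipeline activities.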

2. Data Warehousing

  • Design and maintain scalable, high-performance data warehouse solutions on AWS.
  • Implement optimized data models and structures for AWS Redshift or equivalent environments.

3. AWS Service Utilization

  • Leverage AWS services such as S3, Lambda, Redshift, EMR, and Step Functions to build end-to-end data solutions.
  • Stay updated with AWS advancements and recommend new tools or services to enhance data infrastructure.
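Orchestrating these services end-to-end is typically done with a Step Functions state machine. The fragment below is a hedged sketch, not a deployable definition: the cluster id, bucket path, and Lambda function name are all placeholders.

```json
{
  "Comment": "Hypothetical nightly ETL: run an EMR step, then refresh Redshift via Lambda",
  "StartAt": "RunEmrStep",
  "States": {
    "RunEmrStep": {
      "Type": "Task",
      "Resource": "arn:aws:states:::elasticmapreduce:addStep.sync",
      "Parameters": {
        "ClusterId": "j-EXAMPLE",
        "Step": {
          "Name": "transform",
          "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://example-bucket/jobs/transform.py"]
          }
        }
      },
      "Next": "RefreshRedshift"
    },
    "RefreshRedshift": {
      "Type": "Task",
      "Resource": "arn:aws:states:::lambda:invoke",
      "Parameters": {
        "FunctionName": "refresh-warehouse"
      },
      "End": true
    }
  }
}
```

The `.sync` integration pattern makes the state machine wait for the EMR step to finish before invoking the downstream Lambda.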

4. SQL Expertise

  • Develop and optimize complex SQL queries, stored procedures, and views for analytics and reporting.
  • Troubleshoot and fine-tune SQL performance to ensure efficient data retrieval.
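The tuning workflow above usually starts by inspecting the query plan. A small runnable illustration, using the stdlib `sqlite3` module as a stand-in for Redshift (table, column, and index names are hypothetical):

```python
import sqlite3

# Without an index, the lookup below scans the whole table; after adding
# one, the planner switches to an index search. Names (sales, region,
# idx_sales_region) are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("north", 10.0), ("south", 20.0), ("north", 30.0)],
)

# Before: EXPLAIN QUERY PLAN reports a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = 'north'"
).fetchall()

conn.execute("CREATE INDEX idx_sales_region ON sales (region)")

# After: the plan searches via the index for the WHERE clause.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = 'north'"
).fetchall()

total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'north'"
).fetchone()[0]
```

Redshift's equivalent step is `EXPLAIN` plus the system tables for query diagnostics; the habit of reading the plan before and after a change is the same.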

5. PySpark Expertise

  • Design, develop, and optimize ETL pipelines using PySpark.
  • Leverage optimization and data modeling capabilities to design data pipelines.
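A PySpark ETL pipeline typically chains a filter, a projection, and an aggregation. Because `pyspark` may not be installed where this is read, the sketch below mimics that shape with plain Python over dicts; in PySpark the same steps would be roughly `df.filter(col("amount") > 0).groupBy("region").agg(sum("amount"))`. Column names (`region`, `amount`, `status`) are hypothetical.

```python
# Plain-Python stand-in for the filter -> select -> groupBy/agg shape of
# a PySpark DataFrame pipeline. Column names are illustrative only.
from collections import defaultdict

rows = [
    {"region": "north", "amount": 10.0, "status": "ok"},
    {"region": "north", "amount": -1.0, "status": "bad"},  # filtered out
    {"region": "south", "amount": 20.0, "status": "ok"},
]

# filter(): drop non-positive amounts
filtered = (r for r in rows if r["amount"] > 0)

# select(): project only the columns the aggregation needs
projected = ({"region": r["region"], "amount": r["amount"]} for r in filtered)

# groupBy().agg(sum): total amount per region
totals = defaultdict(float)
for r in projected:
    totals[r["region"]] += r["amount"]
```

Keeping the projection before the aggregation mirrors the PySpark habit of narrowing columns early, which reduces shuffle volume on a real cluster.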

6. Performance Optimization

  • Continuously monitor and enhance ETL and query performance.
  • Identify and resolve processing bottlenecks across data pipelines and warehouse layers.

7. Data Integration & Collaboration

  • Collaborate with cross-functional teams to integrate data from multiple systems.
  • Partner with business stakeholders to understand and deliver on data requirements.

8. Security & Compliance

  • Implement strong data security and governance measures.
  • Ensure compliance with relevant industry standards and organizational policies.

9. Documentation

  • Maintain detailed documentation for ETL processes, data models, and configurations.
  • Ensure proper handover and knowledge transfer within the team.

Qualifications & Skills

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Experience with Spark Streaming will be an advantage.
  • Proven experience as an ETL / Data Warehouse Developer with expertise in AWS and SQL.
  • Strong proficiency in SQL, including complex queries and performance tuning.
  • Hands-on experience developing and maintaining ETL pipelines using PySpark.
  • Excellent understanding of Spark architecture.
  • Hands-on experience with AWS services such as S3, EMR, Redshift, Lambda, and Step Functions.
  • Solid understanding of data modeling, data integration, and warehousing principles.
  • Familiarity with data security, compliance, and governance best practices.
  • Strong analytical, problem-solving, and communication skills.
  • AWS Certification or equivalent credentials preferred.
