Python + Data Engineer

5 - 10 years

5 - 7 Lacs

Posted: 3 months ago | Platform: Foundit


Work Mode: On-site

Job Type: Full Time

Job Description

Key Responsibilities:

  • Data Pipeline Design & Development: Design, develop, and optimize data pipelines using Python and core AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis. Implement ETL/ELT processes to extract, transform, and load data from multiple sources into centralized repositories (e.g., data lakes or data warehouses).
  • Collaboration & Requirement Gathering: Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
  • Monitoring & Troubleshooting: Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
  • Data Quality & Governance: Ensure data quality and consistency by implementing validation and governance practices.
  • Security & Compliance: Apply data security best practices in compliance with organizational policies and regulations.
  • Automation: Automate repetitive data engineering tasks using Python scripts and frameworks.
  • CI/CD Pipelines: Leverage CI/CD pipelines to deploy data workflows on AWS.
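To illustrate the kind of pipeline work described above, here is a minimal sketch of an ETL step in Python using pandas, with the load step stubbed out via boto3. All table, bucket, and column names are hypothetical examples, not details from this posting; an actual pipeline would source its data and destinations from the employer's systems.

```python
# Minimal ETL sketch: extract raw records, transform with pandas,
# load to S3 via boto3. All names here are hypothetical examples.
import io

import pandas as pd


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Clean raw order records: drop rows missing an order id,
    normalise column names, and add a derived total column."""
    df = df.dropna(subset=["order_id"]).copy()
    df.columns = [c.strip().lower() for c in df.columns]
    df["total"] = df["quantity"] * df["unit_price"]
    return df


def load_to_s3(df: pd.DataFrame, bucket: str, key: str) -> None:
    """Write the frame to S3 as CSV (requires AWS credentials)."""
    import boto3  # deferred import so transform() is testable offline

    buf = io.StringIO()
    df.to_csv(buf, index=False)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buf.getvalue())


# Extract step stubbed with an in-memory frame for illustration.
raw = pd.DataFrame(
    {
        "order_id": [1, 2, None],
        "quantity": [2, 1, 3],
        "unit_price": [10.0, 5.5, 4.0],
    }
)
clean = transform(raw)
# load_to_s3(clean, "example-data-lake", "orders/clean.csv")  # hypothetical
```

In a Glue or Lambda job the extract step would read from the real source (e.g. S3 or a JDBC connection) rather than an in-memory frame, and the load call would run with an IAM role granting `s3:PutObject`.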

Required Skills & Qualifications:

  • Professional Experience: 5+ years of experience in data engineering or a related field.
  • Programming Skills: Strong proficiency in Python, with experience in libraries such as pandas, pyspark, and boto3.
  • AWS Expertise: Hands-on experience with core AWS services for data engineering: AWS Glue for ETL/ELT processes, S3 for storage, Redshift or Athena for data warehousing and querying, Lambda for serverless compute, Kinesis or SNS/SQS for data streaming, and IAM roles for security.
  • Database Skills: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
  • Data Processing Frameworks: Knowledge of big data frameworks such as Hadoop and Spark is a plus.
  • DevOps & CI/CD: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
  • Version Control: Proficient with Git-based workflows.
  • Problem-Solving: Excellent analytical and debugging skills.
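The SQL proficiency called for above can be sketched with a small, self-contained example using Python's built-in sqlite3 module. The table and column names are invented for illustration; the query shape (grouped aggregation) is typical of the warehouse-style queries the role describes for Redshift or Athena.

```python
# Small SQL illustration using Python's built-in sqlite3 module.
# Table and column names are invented for this example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "east", 100.0), (2, "east", 50.0), (3, "west", 75.0)],
)

# Aggregate revenue per region, ordered largest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue FROM orders "
    "GROUP BY region ORDER BY revenue DESC"
).fetchall()
# rows == [('east', 150.0), ('west', 75.0)]
```

The same GROUP BY/aggregate pattern carries over to PostgreSQL, MySQL, Redshift, and Athena with only dialect-level changes.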

Wissen Infotech

Information Technology & Services

San Francisco
