Data Engineer (Python + AWS + Airflow)

Experience: 5 years

Salary: 0 Lacs

Posted: 2 weeks ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

We are seeking an enthusiastic and highly skilled Senior Data Engineer to play an instrumental role in creating and evolving complex data-centric solutions that improve decision-making for our clients and internal staff.


You will be part of a data engineering team responsible for building and maintaining scalable data systems and pipelines. The team manages acquisition, storage, and processing of data from internal and external sources for multiple analytical products. This includes data mapping, geocoding, validation, metadata management, and automation processes. The team uses Python scripting to implement and automate data workflows on top of relational and columnar databases.
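
To give a flavor of this work, below is a minimal, purely illustrative sketch of the kind of Python-automated workflow the team builds, assuming Airflow 2.4+ and its TaskFlow API. The DAG name, schedule, and record fields are hypothetical and not taken from this posting.

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_acquisition_pipeline():
    @task
    def extract() -> list[dict]:
        # Acquire raw records from an internal or external source (stubbed here).
        return [{"address": "1 Main St", "value": 100_000}]

    @task
    def validate(records: list[dict]) -> list[dict]:
        # Keep only records that pass basic quality checks.
        return [r for r in records if r.get("address") and r.get("value", 0) > 0]

    @task
    def load(records: list[dict]) -> None:
        # Persist validated records to the warehouse (stubbed here).
        print(f"Loading {len(records)} validated records")

    load(validate(extract()))


example_acquisition_pipeline()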

In this role, you will design and optimize data infrastructure, ensuring the reliability, scalability, and performance of our platforms. You will collaborate with stakeholders and development teams to deliver solutions from concept through design, deployment, operations, and validation — all within a fast-paced, agile environment.


Responsibilities and Duties:

• Design, build, and maintain robust, scalable, and efficient data pipelines and architectures.

• Implement complex transformations for large-scale structured and unstructured datasets.

• Develop and optimize ETL/ELT workflows using modern tools and orchestration frameworks.

• Ensure data quality, governance, security, and reliability across platforms.

• Prepare and deliver data structures that support predictive modeling, advanced analytics, and reporting.

• Implement monitoring, alerting, and performance optimization for data pipelines and platforms (a sketch of one such check follows this list).

• Collaborate with product, engineering, and analytics teams to integrate and deploy data-driven solutions.

• Provide feedback and technical mentorship through peer code reviews and shared best practices.

• Contribute to architecture decisions for cloud-native, modern data warehouse environments.
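
To make the monitoring and alerting duty above concrete, here is a hedged sketch in plain Python of a post-load check that flags abnormal data volumes; the function name, logger name, and 20% threshold are illustrative assumptions, not requirements from this posting.

import logging

logger = logging.getLogger("pipeline.monitoring")


def check_row_count(current: int, baseline: int, max_drift: float = 0.2) -> None:
    # Raise if today's row count deviates from the baseline by more than max_drift.
    if baseline <= 0:
        raise ValueError("baseline must be positive")
    drift = abs(current - baseline) / baseline
    if drift > max_drift:
        logger.error("Row count drift %.1f%% exceeds %.0f%% threshold",
                     drift * 100, max_drift * 100)
        raise RuntimeError(f"Data volume anomaly: {current} rows vs baseline {baseline}")
    logger.info("Row count within tolerance (drift %.1f%%)", drift * 100)

A check like this would typically run as the final task of a pipeline, failing the run loudly rather than letting a bad load pass silently.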


Qualifications

• Bachelor’s or Master’s degree in Computer Science.

• 5+ years of experience working with large-scale data systems, pipelines, and architectures.

• Advanced proficiency in SQL and data modeling for analytical and transactional workloads.

• Experience with PostgreSQL and MS SQL Server.

• Strong programming skills in Python, with experience in automating and optimizing data workflows.

• Hands-on experience with ETL/ELT frameworks and workflow orchestration tools (e.g., Airflow, dbt).

• Experience with modern cloud-based data warehouse platforms (e.g., Snowflake, Redshift).

• Experience with cloud platforms, preferably AWS, including storage, compute, and data services.

• Familiarity with DevOps practices, CI/CD pipelines, containerization (Docker, Kubernetes), and infrastructure-as-code tooling (Terraform).

• Ability to leverage AI-assisted coding tools (e.g., GitHub Copilot, Claude, ChatGPT, or similar) to improve development efficiency and code quality.

• Strong analytical, problem-solving, and communication skills; ability to thrive in a fast-paced, collaborative environment.
