Data Engineer (Python, GCP, Airflow/Autosys)

Experience: 6 years

Posted: 6 days ago | Platform: LinkedIn

Work Mode: Remote

Job Type: Contractual

Job Description

Job Title: Data Engineer (Python, GCP, Airflow/Autosys)

Location: Remote

Experience: 6+ years

Employment Type: Long-Term Contract (through end of 2025)

Work Hours: Must overlap with the EST time zone

Dual Employment: Not permitted


Role Overview:

We are seeking an experienced *Data Engineer* with strong expertise in scheduling/orchestration tools (*Autosys*, *Airflow*), *Python scripting*, and *Google Cloud Platform (GCP)* to support a critical data engineering initiative. The role focuses on building, optimizing, and automating data pipelines in GCP using BigQuery and DAG orchestration. The ideal candidate is hands-on, proactive, and eager to leverage modern cloud tools to deliver scalable data solutions.


Key Responsibilities:

* Design, develop, and maintain data pipelines leveraging *Airflow (DAGs)* and *Autosys* for scheduling and orchestration (see the sketch after this list).

* Build, optimize, and maintain scalable solutions in *Google Cloud Platform (BigQuery, Dataflow, Cloud Composer)*.

* Write, test, and deploy *Python scripts* for automation, data transformations, and workflow efficiency.

* Collaborate with cross-functional teams to gather requirements and deliver reliable data solutions.

* Troubleshoot, debug, and optimize workflows for reliability, scalability, and performance.

* Maintain clear documentation of data pipelines, workflows, and implemented solutions.
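
For context, below is a minimal sketch of the kind of Airflow DAG this role would build and maintain, assuming Airflow 2.x (2.4+) with the apache-airflow-providers-google package installed. The DAG ID, project, dataset, and table names are illustrative placeholders, not details taken from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator


def validate_load(**context):
    """Placeholder Python step: in practice this might check source row counts
    or publish metrics before the downstream BigQuery query runs."""
    print(f"Validating source data for run date {context['ds']}")


with DAG(
    dag_id="example_daily_bq_load",   # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    tags=["gcp", "bigquery"],
) as dag:
    validate = PythonOperator(
        task_id="validate_source",
        python_callable=validate_load,
    )

    # The configuration dict follows the standard BigQuery Jobs API shape;
    # `my-project.analytics.events` and `daily_event_counts` are placeholders.
    aggregate = BigQueryInsertJobOperator(
        task_id="aggregate_daily_events",
        configuration={
            "query": {
                "query": (
                    "SELECT event_date, COUNT(*) AS event_count "
                    "FROM `my-project.analytics.events` "
                    "WHERE event_date = DATE('{{ ds }}') "
                    "GROUP BY event_date"
                ),
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_event_counts",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
        location="US",
    )

    validate >> aggregate
```

The same orchestration pattern applies whether the schedule is driven by Airflow itself or the DAG run is triggered from an external scheduler such as Autosys.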


Required Qualifications:

* Hands-on experience with *Autosys and Airflow* for scheduling and orchestration.

* Strong *Python scripting* skills for automation and data engineering tasks.

* Proven experience with *GCP (BigQuery, Dataflow, Cloud Composer)*.

* Solid understanding of workflow automation, data integration, and monitoring.

* Independent problem solver with strong analytical and troubleshooting skills.

* Excellent communication and documentation abilities.


Nice to Have:

* Experience with *ETL frameworks*.

* Familiarity with *SQL optimization*.

* Exposure to *real-time data pipelines*.


Interview Process:

1. *Technical Assessment / Coding Test*

2. *HR Interview*

3. *1–2 Technical Interviews*


Apply Now: If you match the criteria above, share your profile at hiring@khey-digit.com.
