Data Engineer

Experience: 5 - 10 years

Salary: 7 - 17 Lacs

Posted: 1 week ago | Platform: Naukri

Work Mode: Remote

Job Type: Full Time

Job Description

We are looking for a skilled and motivated Data Engineer with deep expertise in GCP, BigQuery, and Apache Airflow to join our data platform team. The ideal candidate has hands-on experience building scalable data pipelines, automating workflows, migrating large-scale datasets, and optimizing distributed systems, as well as experience building web APIs with Python. This role plays a key part in designing and maintaining robust data engineering solutions across cloud and on-prem environments.

Key Responsibilities

BigQuery & Cloud Data Pipelines:
  • Design and implement scalable ETL pipelines for ingesting large-scale datasets.
  • Build solutions for efficient querying of tables in BigQuery.
  • Automate scheduled data ingestion using Google Cloud services and scheduled Apache Airflow DAGs.

Airflow DAG Development & Automation:
  • Build dynamic, configurable DAGs driven by JSON-based input that can be reused across multiple data processes (see the sketch after this description).
  • Create DAGs for data migration to/from BigQuery and external systems (SFTP, SharePoint, email, etc.).
  • Develop custom Airflow operators to meet business needs.

Data Security & Encryption:
  • Build secure data pipelines with end-to-end encryption for external data exports and imports.

Data Migration & Integration:
  • Migrate and replicate data across various systems, including Salesforce, MySQL, SQL Server, and BigQuery.

Required Skills & Qualifications
  • Strong hands-on experience with Google BigQuery, Apache Airflow, and cloud storage (GCS/S3)
  • Deep understanding of ETL/ELT concepts, data partitioning, and pipeline scheduling
  • Proven ability to automate complex workflows and build reusable pipeline frameworks
  • Programming knowledge in Python, SQL, and scripting for automation
  • Hands-on experience building web APIs/applications with Python
  • Familiarity with cloud platforms (GCP/AWS) and distributed computing frameworks
  • Strong problem-solving, analytical thinking, and debugging skills
  • Basic understanding of object-oriented fundamentals
  • Working knowledge of version control tools such as GitLab and Bitbucket

Industry Knowledge & Experience
  • Deep expertise in Google BigQuery, Apache Airflow, Python, and SQL
  • Experience with BI tools such as DOMO, Looker, and Tableau
  • Working knowledge of Salesforce and data extraction methods
  • Prior experience with data encryption, SFTP automation, or ad-hoc data requests
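As context for the configuration-driven DAG work described above, below is a minimal sketch of a JSON-driven Airflow DAG that creates one load task per configured table. The config keys, DAG id, schedule, and the load_table placeholder are illustrative assumptions, not details from this posting.

```python
# Minimal sketch of a JSON-driven Airflow DAG. All names and config keys
# (dag_id, schedule, tables, load_table) are illustrative assumptions.
import json
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical JSON input describing one reusable ingestion process;
# in practice this could come from a file, a Variable, or an API call.
CONFIG = json.loads("""
{
  "dag_id": "example_bq_ingest",
  "schedule": "0 2 * * *",
  "tables": ["orders", "customers"]
}
""")


def load_table(table_name, **_):
    # Placeholder for the real ingestion step (e.g. a GCS -> BigQuery load).
    print(f"Loading {table_name} into BigQuery")


with DAG(
    dag_id=CONFIG["dag_id"],
    start_date=datetime(2024, 1, 1),
    schedule_interval=CONFIG["schedule"],  # `schedule` in newer Airflow releases
    catchup=False,
) as dag:
    # One task per configured table, so the same DAG definition is reused
    # across processes simply by swapping the JSON config.
    for table in CONFIG["tables"]:
        PythonOperator(
            task_id=f"load_{table}",
            python_callable=load_table,
            op_kwargs={"table_name": table},
        )
```

In a real pipeline the PythonOperator placeholder would typically be replaced by a provider operator (for example, a GCS-to-BigQuery transfer operator) or a custom operator, as the responsibilities above suggest.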

Trantor

Data Engineering and Analytics

Las Vegas

51-200 Employees

30 Jobs

Key People

  • Ravi S. Rachakonda, CEO
  • Pradeep J. Namdeo, CTO
