Trantor | Remote | Full Time | Posted: 1 week ago
We are looking for a skilled and motivated Data Engineer with deep expertise in GCP, BigQuery, and Apache Airflow to join our data platform team. The ideal candidate has hands-on experience building scalable data pipelines, automating workflows, migrating large-scale datasets, and optimizing distributed systems, as well as experience building web APIs in Python. This role will play a key part in designing and maintaining robust data engineering solutions across cloud and on-prem environments.

Key Responsibilities

BigQuery & Cloud Data Pipelines:
- Design and implement scalable ETL pipelines for ingesting large-scale datasets.
- Build solutions for efficient querying of tables in BigQuery.
- Automate scheduled data ingestion using Google Cloud services and scheduled Apache Airflow DAGs.

Airflow DAG Development & Automation:
- Build dynamic, configurable DAGs driven by JSON-based input so they can be reused across multiple data processes (a sketch of this pattern appears below).
- Create DAGs for data migration to and from BigQuery and external systems (SFTP, SharePoint, email, etc.).
- Develop custom Airflow operators to meet business needs (see the operator sketch below).

Data Security & Encryption:
- Build secure data pipelines with end-to-end encryption for external data exports and imports (see the encryption sketch below).

Data Migration & Integration:
- Migrate and replicate data across systems including Salesforce, MySQL, SQL Server, and BigQuery.

Required Skills & Qualifications:
- Strong hands-on experience with Google BigQuery, Apache Airflow, and cloud object storage (GCS/S3)
- Deep understanding of ETL/ELT concepts, data partitioning, and pipeline scheduling
- Proven ability to automate complex workflows and build reusable pipeline frameworks
- Programming knowledge in Python, SQL, and scripting for automation
- Hands-on experience building web APIs/applications with Python
- Familiarity with cloud platforms (GCP/AWS) and distributed computing frameworks
- Strong problem-solving, analytical thinking, and debugging skills
- Basic understanding of object-oriented programming fundamentals
- Working knowledge of version control tools such as GitLab and Bitbucket

Industry Knowledge & Experience:
- Deep expertise in Google BigQuery, Apache Airflow, Python, and SQL
- Experience with BI tools such as Domo, Looker, and Tableau
- Working knowledge of Salesforce and its data extraction methods
- Prior experience with data encryption, SFTP automation, or ad-hoc data requests
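To make the JSON-driven DAG pattern above concrete, here is a minimal sketch, assuming Airflow 2.4+ with the apache-airflow-providers-google package installed. The config entries, bucket, tables, and dag_ids are illustrative placeholders, not a description of our actual stack:

```python
# Minimal sketch of JSON-driven dynamic DAG generation (Airflow 2.4+,
# apache-airflow-providers-google installed). Config values below are
# illustrative; a real deployment would load them from a config file,
# e.g. json.loads(Path("pipelines.json").read_text()).
import json
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

PIPELINES = json.loads("""
[
  {"name": "orders",    "bucket": "example-landing", "prefix": "orders/*.json",
   "table": "analytics.orders_raw",    "schedule": "@hourly"},
  {"name": "customers", "bucket": "example-landing", "prefix": "customers/*.json",
   "table": "analytics.customers_raw", "schedule": "@daily"}
]
""")

# One DAG is generated per config entry, so new pipelines are added by
# editing JSON rather than writing new DAG code.
for cfg in PIPELINES:
    with DAG(
        dag_id=f"ingest_{cfg['name']}",
        start_date=datetime(2024, 1, 1),
        schedule=cfg["schedule"],
        catchup=False,
        tags=["generated"],
    ) as dag:
        GCSToBigQueryOperator(
            task_id="gcs_to_bq",
            bucket=cfg["bucket"],
            source_objects=[cfg["prefix"]],
            destination_project_dataset_table=cfg["table"],
            source_format="NEWLINE_DELIMITED_JSON",
            autodetect=True,
            write_disposition="WRITE_APPEND",
        )
    # Expose each generated DAG at module level so the scheduler finds it.
    globals()[dag.dag_id] = dag
```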
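The custom-operator responsibility follows the same spirit; below is a minimal sketch of one, assuming the apache-airflow-providers-sftp package. The operator name, connection ID, and paths are hypothetical:

```python
# Minimal sketch of a custom Airflow operator (assumes
# apache-airflow-providers-sftp). Connection ID and paths are placeholders.
from airflow.models.baseoperator import BaseOperator
from airflow.providers.sftp.hooks.sftp import SFTPHook


class SFTPDownloadOperator(BaseOperator):
    """Hypothetical operator: pull a single remote file over SFTP."""

    # Allow Jinja templating so paths can embed {{ ds }} and friends.
    template_fields = ("remote_path", "local_path")

    def __init__(self, *, sftp_conn_id: str, remote_path: str,
                 local_path: str, **kwargs):
        super().__init__(**kwargs)
        self.sftp_conn_id = sftp_conn_id
        self.remote_path = remote_path
        self.local_path = local_path

    def execute(self, context):
        hook = SFTPHook(ssh_conn_id=self.sftp_conn_id)
        self.log.info("Downloading %s to %s", self.remote_path, self.local_path)
        hook.retrieve_file(self.remote_path, self.local_path)
        return self.local_path
```

Inside a DAG this would be instantiated like any built-in operator, e.g. SFTPDownloadOperator(task_id="pull_feed", sftp_conn_id="partner_sftp", remote_path="/outbound/feed_{{ ds }}.csv", local_path="/tmp/feed.csv").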
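Finally, a minimal sketch of the encrypted-export idea, using symmetric Fernet encryption from the cryptography package. This is one common building block, not our prescribed method; external exchanges often use asymmetric PGP instead, and keys would come from a secret store rather than being generated inline:

```python
# Minimal sketch of encrypting an export before it leaves the pipeline,
# using symmetric Fernet encryption from the `cryptography` package.
# File names are illustrative; key management (KMS, Secret Manager) is
# out of scope here.
from pathlib import Path

from cryptography.fernet import Fernet


def encrypt_file(src: Path, dst: Path, key: bytes) -> None:
    """Encrypt src into dst; the holder of `key` can decrypt."""
    dst.write_bytes(Fernet(key).encrypt(src.read_bytes()))


if __name__ == "__main__":
    key = Fernet.generate_key()  # demo only: load from a secret store in prod
    Path("export.csv").write_text("id,amount\n1,9.99\n")  # demo input
    encrypt_file(Path("export.csv"), Path("export.csv.enc"), key)
    # The receiving side reverses it with the same key:
    print(Fernet(key).decrypt(Path("export.csv.enc").read_bytes()).decode())
```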