Freelance Opportunity: Data Engineer – Teradata to GCP Migration

Experience: 5 years
Salary: 0 Lacs
Posted: 2 months ago | Platform: LinkedIn
Work Mode: On-site
Job Type: Part Time

Job Description

Company Description

ThreatXIntel is a cybersecurity startup dedicated to providing customizable, affordable security solutions for businesses and organizations. Our proactive approach involves continuous monitoring and testing to identify vulnerabilities before they can be exploited. We offer services such as cloud security, web and mobile security testing, and DevSecOps to help clients protect their digital assets.


Role Description

We are seeking freelance/contract Data Engineers to lead the migration of data and ETL workflows from Teradata to Google Cloud Platform (GCP).

The role is hands-on and requires strong expertise in both Teradata and GCP services to ensure seamless migration, optimized performance, and minimal disruption to business operations.

Key Responsibilities

  • Lead the migration of data and ETL workflows from Teradata to GCP services such as BigQuery, Dataflow, Cloud Storage, Dataproc, and Composer (Airflow).
  • Analyze and map Teradata workloads to appropriate GCP equivalents.
  • Refactor and rewrite SQL, scripts, and procedures in GCP-compliant formats (BigQuery Standard SQL); see the query sketch after this list.
  • Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
  • Develop automated data pipelines using GCP-native tools and/or custom scripts (Python/Java).
  • Optimize data storage, query performance, and costs within GCP.
  • Implement monitoring, logging, and alerting for migrated pipelines and production workloads; see the logging sketch after this list.
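
To illustrate the refactoring work above, here is a minimal sketch of a Teradata query rewritten in BigQuery Standard SQL and run with the google-cloud-bigquery Python client. The project and table names are hypothetical, and the Teradata original is only a representative example.

```python
from google.cloud import bigquery

# Representative Teradata original (SEL, TOP, and integer date arithmetic
# are Teradata-specific and must be rewritten):
#
#   SEL TOP 10 customer_id, txn_date, txn_date + 30 AS due_date
#   FROM sales.transactions
#   WHERE txn_date >= DATE '2024-01-01'
#   QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id
#                              ORDER BY txn_date DESC) = 1;

BQ_SQL = """
SELECT
  customer_id,
  txn_date,
  DATE_ADD(txn_date, INTERVAL 30 DAY) AS due_date  -- replaces txn_date + 30
FROM `example-project.sales.transactions`          -- hypothetical table
WHERE txn_date >= DATE '2024-01-01'
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id
                           ORDER BY txn_date DESC) = 1
ORDER BY txn_date DESC
LIMIT 10                                           -- replaces TOP 10
"""

client = bigquery.Client()
for row in client.query(BQ_SQL).result():
    print(row.customer_id, row.due_date)
```

Note that BigQuery supports QUALIFY directly, so window-function filters often carry over with only dialect-level changes such as date arithmetic and table naming.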
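
And for the monitoring bullet, a minimal sketch of structured logging with the google-cloud-logging client; a log-based alerting policy in Cloud Monitoring (configured separately) can then key on the ERROR severity. The log name, check, and row counts are hypothetical.

```python
from google.cloud import logging as cloud_logging

client = cloud_logging.Client()
logger = client.logger("teradata-migration-validation")  # hypothetical log name

def report_row_count_check(table: str, teradata_rows: int, bigquery_rows: int) -> None:
    """Log a post-migration row-count comparison as a structured entry."""
    severity = "INFO" if teradata_rows == bigquery_rows else "ERROR"
    logger.log_struct(
        {
            "check": "row_count",
            "table": table,
            "teradata_rows": teradata_rows,
            "bigquery_rows": bigquery_rows,
        },
        severity=severity,
    )

# In practice the counts would come from the source and target systems.
report_row_count_check("sales.transactions", 1_000_000, 1_000_000)
```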

Required Skills

  • 5+ years of Data Engineering experience, including 2+ years on GCP.
  • Strong background in Teradata, including BTEQ and complex SQL.
  • Hands-on expertise with GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
  • Experience building and maintaining ETL/ELT pipelines with Python or Java; see the DAG sketch after this list.
  • Proven ability to translate and optimize legacy Teradata logic for GCP.
  • Familiarity with CI/CD pipelines, Git, Argo CD, and DevOps practices in cloud environments.
  • Strong problem-solving skills, attention to detail, and the ability to work independently.
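
As a sketch of the pipeline skill above, here is a minimal Composer (Airflow 2) DAG that stages a Teradata export already landed in Cloud Storage into BigQuery and then runs a rewritten transform. The DAG id, bucket, dataset, and query are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="teradata_export_to_bigquery",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Stage the nightly export into BigQuery; WRITE_TRUNCATE keeps reruns idempotent.
    load_staging = GCSToBigQueryOperator(
        task_id="load_staging",
        bucket="example-migration-bucket",  # hypothetical bucket
        source_objects=["exports/transactions/*.csv"],
        destination_project_dataset_table="example-project.staging.transactions",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Apply the rewritten BigQuery Standard SQL transform to build the mart table.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_mart",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `example-project.mart.transactions` AS "
                    "SELECT * FROM `example-project.staging.transactions`"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_staging >> transform
```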

Preferred Qualifications

  • GCP Professional Data Engineer certification.
  • Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
  • Prior experience in the healthcare domain.
  • Knowledge of data governance, security, and compliance in cloud ecosystems.

