GCP Data Engineer (Airflow / GCS / BigQuery / Migration)

Experience: 0 years

Salary: 0 Lacs

Posted: 16 hours ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Part Time

Job Description

Company Description

ThreatXIntel is a startup cybersecurity company specializing in customized, affordable security solutions that protect businesses and organizations from cyber threats. With a focus on cloud security, web and mobile security testing, and DevSecOps, we tailor our services to each client's needs, regardless of size. Our proactive approach includes continuous monitoring and testing to identify vulnerabilities before they can be exploited. Committed to supporting startups and small businesses, ThreatXIntel empowers clients to grow their operations securely and with confidence.


Role Description

Freelance GCP Data Engineer

Key Responsibilities

  • Build and maintain Airflow DAGs for batch/near-real-time pipelines (scheduling, retries, backfills, SLAs); a minimal DAG sketch follows this list.
  • Implement ingestion patterns into GCS (file drops, API pulls, database extracts), including partitioning and folder conventions.
  • Develop BigQuery transformations (ELT/ETL) using SQL, dbt (if used), or Dataflow/Spark (if needed).
  • Perform migration activities such as:
      • Moving data from on-prem/legacy DWH/other clouds to GCS + BigQuery
      • Rebuilding legacy jobs into Airflow orchestration
      • Translating existing SQL/procs/models into BigQuery-optimized designs
  • Optimize BigQuery performance and cost: partitioning, clustering, materialized views, incremental loads, query tuning (see the second sketch after this list).
  • Implement data quality checks: schema validation, deduplication, late-arriving data handling, reconciliation reports.
  • Set up monitoring and alerting: Airflow task logs, failure notifications, pipeline health dashboards.
  • Collaborate with stakeholders to define requirements and deliver clean, documented datasets.
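
For illustration, here is a minimal sketch of the kind of DAG this role involves: a daily batch load from GCS into BigQuery with retries and an SLA, written for Airflow 2.x with the Google provider package. The bucket, project, dataset, and table names are hypothetical placeholders, not part of the posting.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    # Hypothetical names, for illustration only.
    BUCKET = "example-ingest-bucket"
    TARGET_TABLE = "example-project.raw.orders"

    default_args = {
        "retries": 2,                           # retry transient failures
        "retry_delay": timedelta(minutes=5),
        "sla": timedelta(hours=1),              # flag runs that exceed 1 hour
    }

    with DAG(
        dag_id="gcs_to_bigquery_orders",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                      # daily batch
        catchup=False,
        default_args=default_args,
    ) as dag:
        # Load the files for the logical run date ({{ ds }}) into BigQuery.
        load_orders = GCSToBigQueryOperator(
            task_id="load_orders",
            bucket=BUCKET,
            source_objects=["orders/dt={{ ds }}/*.parquet"],
            destination_project_dataset_table=TARGET_TABLE,
            source_format="PARQUET",
            write_disposition="WRITE_APPEND",
        )

With a date-partitioned folder convention like this, backfills come almost for free: re-running a past logical date reloads exactly that day's dt= prefix.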
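
And a sketch of the partitioning/clustering and incremental-load pattern behind the cost-optimization bullet, using the google-cloud-bigquery client; the table and column names (order_ts, customer_id, order_id) are again hypothetical.

    import datetime

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Partition by date and cluster by a frequent filter column so queries
    # scan only the partitions and blocks they actually need.
    client.query(
        """
        CREATE TABLE IF NOT EXISTS curated.orders
        PARTITION BY DATE(order_ts)
        CLUSTER BY customer_id
        AS SELECT * FROM raw.orders WHERE FALSE
        """
    ).result()

    # Incremental load: merge one day's slice instead of rewriting the
    # whole table on every run.
    client.query(
        """
        MERGE curated.orders AS t
        USING (
          SELECT * FROM raw.orders WHERE DATE(order_ts) = @run_date
        ) AS s
        ON t.order_id = s.order_id
        WHEN MATCHED THEN UPDATE SET t.status = s.status
        WHEN NOT MATCHED THEN INSERT ROW
        """,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter(
                    "run_date", "DATE", datetime.date(2024, 1, 1)
                )
            ]
        ),
    ).result()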

Required Skills & Experience

  • Strong hands-on experience with Apache Airflow (or Cloud Composer) in production.
  • Solid experience with Google Cloud Storage (GCS) and ingestion patterns.
  • Strong BigQuery expertise (data modeling, SQL performance, cost optimization).
  • Migration experience (data + pipelines) with a clear approach to cutover, validation, and rollback planning; a reconciliation sketch follows this list.
  • Python for Airflow operators/hooks, automation scripts, and pipeline utilities.
  • Git + CI/CD familiarity for deploying DAGs, SQL, and infrastructure changes (preferred).
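
As an example of the validation step called out above, a minimal pre-cutover reconciliation check might compare row counts between the staged legacy data and the migrated table (the dataset and table names here are hypothetical):

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Compare row counts between the staged legacy extract and the
    # migrated table; a mismatch blocks cutover while rollback is still
    # cheap (the legacy system remains the system of record).
    row = list(
        client.query(
            """
            SELECT
              (SELECT COUNT(*) FROM staging.legacy_orders) AS source_rows,
              (SELECT COUNT(*) FROM curated.orders)        AS target_rows
            """
        ).result()
    )[0]

    if row.source_rows != row.target_rows:
        raise ValueError(
            f"Reconciliation failed: source={row.source_rows}, "
            f"target={row.target_rows}"
        )
    print("Row counts match; cutover can proceed.")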

Nice to Have

  • dbt, Dataflow, Dataproc/Spark, Pub/Sub
  • IAM, service accounts, secrets management, and basic GCP networking knowledge
  • Experience building analytics layers (star schema, marts, KPI definitions)

Deliverables (Typical)

  • Airflow DAGs + documentation/runbooks
  • GCS ingestion layout + schema/versioning strategy (see the path-convention example after this list)
  • BigQuery datasets/tables/views (curated + marts)
  • Migration plan (cutover steps, validation checks, rollback)
  • Monitoring/alerting for pipeline reliability
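
As a concrete (hypothetical) illustration of the GCS ingestion layout deliverable, a Hive-style, date-partitioned path convention with an explicit schema version keeps backfills and schema evolution manageable:

    # Hypothetical convention: one prefix per source system and table,
    # a schema-version segment (v1, v2, ...) so readers can evolve the
    # schema without breaking old files, and Hive-style dt= partitions
    # that map directly onto Airflow logical dates and BigQuery loads.
    def gcs_prefix(source: str, table: str, version: int, dt: str) -> str:
        return f"gs://example-ingest-bucket/{source}/{table}/v{version}/dt={dt}/"

    print(gcs_prefix("crm", "orders", 1, "2024-01-01"))
    # gs://example-ingest-bucket/crm/orders/v1/dt=2024-01-01/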

