Data Engineer - Contract (Remote)

Experience: 4 years

Salary: 0 Lacs

Posted: 1 week ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Contractual

Job Description

Position:

Location:

Engagement Type:

Language:

Experience:



About the Role


We’re looking for a hands-on Data/Backend Engineer to build and operate production-grade data pipelines and services. You’ll work across ingestion, transformation, orchestration, and APIs; automate environments with Terraform; instrument observability; and write tests to keep things reliable. Azure experience is preferred; strong AWS experience is also acceptable. You should communicate clearly (async chat and live sessions), learn fast, and use AI coding tools (e.g., Claude Code) effectively without relying on them exclusively.



Key Responsibilities


  • Data pipelines: Design, build, and operate reliable batch and change-driven pipelines from multiple sources; schedule/orchestrate jobs; handle schema evolution and failures gracefully.
  • Transformations & modeling: Implement clean, tested merge and transformation pipelines and produce models that are easy to consume for products and analytics; tune SQL for performance.
  • APIs & integration: Build or collaborate on APIs for data access and updates; design request/response contracts; handle auth (OAuth2/OIDC/JWT), idempotency, validation, and auditing.
  • Infrastructure & DevOps: Provision and manage cloud resources with Terraform; automate build/test/deploy with CI/CD; write solid shell scripts for glue/ops tasks.
  • Observability & reliability: Instrument structured logs, metrics, and traces; define alert policies and on-call runbooks; track SLOs and drive root-cause prevention.
  • Security basics: Apply least-privilege access, secrets management, and encryption in transit and at rest; collaborate with the team on compliance requirements.
  • Quality & tests: Write unit, integration, and data-quality tests (including SQL tests) to keep pipelines and services correct and maintainable.
  • Collaboration & docs: Communicate clearly in writing and in person; produce concise docs (data contracts, mappings, runbooks); work closely with product/engineering/analytics.



Required Skills & Qualifications


  • Strong SQL (query tuning, indexing, query plans) and solid RDBMS fundamentals (e.g., Postgres, SQL Server, MySQL).
  • Python for data work and services; comfortable with shell scripting (Bash preferred; PowerShell acceptable if not Bash).
  • Terraform and IaC fundamentals; CI/CD experience (version-control workflows, automated tests, environment promotion).
  • Cloud: Azure preferred (e.g., storage, compute, identity, monitoring); otherwise a very strong AWS background.
  • Data integration: experience with ingestion (batch + change-driven), orchestration, schema versioning, and resilient retries/replays.
  • Workflow orchestration & scheduling: Prefect preferred (flows/deployments, retries/backoff, scheduling, observability); Airflow, Dagster, or similar also acceptable.
  • APIs & auth: practical experience with REST patterns, pagination, validation, rate limiting, and OAuth2/OIDC/JWT.
  • Observability: familiar with logs/metrics/traces and with setting actionable alerts and dashboards.
  • Testing mindset: a habit of writing tests for code and SQL; comfortable with fixtures/test data and CI.
  • Communication: clear, concise written and verbal communication; comfortable in async chat and live sessions.
  • AI tooling: ready to use Claude Code (and similar assistants) from day one to accelerate work, while exercising judgment and writing your own code/tests.
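The "data-quality tests" expectation above can be sketched with a small, stdlib-only check over a batch of records. The function and field names (`check_batch`, `id`, `amount`) are hypothetical illustrations; a real pipeline would run checks like these in CI or as a pipeline step.

```python
def check_batch(rows, key="id", required=("id", "amount")):
    """Collect data-quality violations for a batch of records:
    missing required fields and duplicate key values."""
    problems = []
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                problems.append(f"row {i}: missing {col}")
        k = row.get(key)
        if k in seen:
            problems.append(f"row {i}: duplicate {key}={k}")
        seen.add(k)
    return problems

# A clean batch and a batch with a null field and a duplicate key.
good = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
bad = [{"id": 1, "amount": None}, {"id": 1, "amount": 3.0}]
```

Returning a list of violations (rather than raising on the first one) lets a pipeline log every problem in a batch before deciding whether to fail or quarantine it.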



Preferred Qualifications (Nice to Have)


  • Regulatory/compliance awareness (e.g., data protection or public-sector standards) and how it impacts design/operations.
  • Analytics experience (dim/fact modeling, BI consumption patterns).
  • Full-stack exposure (can prototype simple UI/ops views when needed).
  • Data platform architecture exposure (storage/layout choices, catalog/lineage, governance concepts).
  • Familiarity with event streaming/message queues, job schedulers/orchestrators, and columnar formats (e.g., Parquet).
  • Experience with cloud monitoring stacks (Azure Monitor/Application Insights, CloudWatch, Datadog, New Relic, etc.).



Engagement Details


Duration

Schedule

Compensation

Mode



How to Apply


Please share:

  • Resume/CV highlighting data engineering, DevOps/IaC, and/or API/auth work
  • Brief note (one paragraph) about a recent pipeline/service: requirements, your role, tech, validation/tests, deployment/ops, outcome
  • Links to code samples or design docs (if available and shareable)
