
Black Dog Labs

2 job openings at Black Dog Labs
Data Engineer - Contract (Remote)
India | 4 years | Not disclosed | Remote | Contractual

Position: Data Engineer
Location: Remote (collaboration across time zones), India or LATAM preferred
Engagement Type: Contract
Language: Proficient English communication
Experience: 4+ years in Data Engineering / Backend Engineering / DevOps

About the Role
We’re looking for a hands-on Data/Backend Engineer to build and operate production-grade data pipelines and services. You’ll work across ingestion, transformation, orchestration, and APIs; automate environments with Terraform; instrument observability; and write tests to keep things reliable. Azure experience is preferred; strong AWS experience is also acceptable. You should communicate clearly (async chat and live sessions), learn fast, and use AI coding tools (e.g., Claude Code) effectively without relying on them exclusively.

Key Responsibilities
- Data pipelines: Design, build, and operate reliable batch and change-driven pipelines from multiple sources; schedule/orchestrate jobs; handle schema evolution and failures gracefully.
- Transformations & modeling: Implement clean, tested merge and transformation pipelines; produce models that are easy to consume for products and analytics; tune SQL for performance.
- APIs & integration: Build or collaborate on APIs for data access and updates; design request/response contracts; handle auth (OAuth2/OIDC/JWT), idempotency, validation, and auditing.
- Infrastructure & DevOps: Provision and manage cloud resources with Terraform; automate build/test/deploy with CI/CD; write solid shell scripts for glue/ops tasks.
- Observability & reliability: Instrument structured logs, metrics, and traces; define alert policies and on-call runbooks; track SLOs and drive root-cause prevention.
- Security basics: Apply least-privilege access, secrets management, and encryption in transit and at rest; collaborate with the team on compliance requirements.
- Quality & tests: Write unit, integration, and data-quality tests (including SQL tests) to keep pipelines and services correct and maintainable.
- Collaboration & docs: Communicate clearly in writing and in person; produce concise docs (data contracts, mappings, runbooks); work closely with product/engineering/analytics.

Required Skills & Qualifications
- Strong SQL (query tuning, indexing, query plans) and solid RDBMS fundamentals (e.g., Postgres, SQL Server, MySQL).
- Python for data work and services; comfortable with shell scripting (Bash preferred; PowerShell proficiency acceptable if not Bash).
- Terraform and IaC fundamentals; CI/CD experience (version control workflows, automated tests, environment promotion).
- Cloud: Azure preferred (e.g., storage, compute, identity, monitoring); otherwise a very strong AWS background.
- Data integration: Experience with ingestion (batch + change-driven), orchestration, schema/versioning, and resilient retries/replays.
- Workflow orchestration & scheduling: Prefect preferred (flows/deployments, retries/backoff, scheduling, observability); Airflow, Dagster, or similar also acceptable; a short sketch follows this list.
- APIs & auth: Practical experience with REST patterns, pagination, validation, rate limiting, and OAuth2/OIDC/JWT.
- Observability: Familiar with logs/metrics/traces and setting actionable alerts and dashboards.
- Testing mindset: Habit of writing tests for code and SQL; comfortable with fixtures/test data and CI.
- Communication: Clear, concise writing and verbal communication; comfortable in async chat and live sessions.
- AI tooling: Ready to use Claude Code (and similar assistants) from day one to accelerate work, while exercising judgment and writing your own code/tests.
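For context on the orchestration requirement above, here is a minimal sketch of the shape of flow this role owns, assuming Prefect 2.x; the task bodies and names are hypothetical placeholders, not Black Dog Labs code:

```python
# A minimal sketch of a retrying batch flow, assuming Prefect 2.x
# (pip install prefect) and Python 3.9+. Everything below is a
# hypothetical placeholder, for illustration only.
from __future__ import annotations

from prefect import flow, task


@task(retries=3, retry_delay_seconds=[10, 60, 300])  # simple backoff schedule
def extract_batch() -> list[dict]:
    # Stand-in for a real source pull (API page, file drop, CDC batch).
    return [{"id": 1, "value": "a"}, {"id": 2, "value": None}]


@task
def transform(rows: list[dict]) -> list[dict]:
    # Drop rows that fail a basic validity check before loading.
    return [r for r in rows if r["value"] is not None]


@task(retries=2, retry_delay_seconds=30)
def load(rows: list[dict]) -> int:
    # Stand-in for an idempotent merge/upsert into the warehouse.
    print(f"loaded {len(rows)} rows")
    return len(rows)


@flow(log_prints=True)
def nightly_ingest() -> None:
    load(transform(extract_batch()))


if __name__ == "__main__":
    nightly_ingest()  # ad hoc local run; production runs come from a scheduled deployment
```

In production, a flow like this runs from a deployment with a schedule attached, so retries, run history, and alerting are visible in the orchestrator's UI rather than in an ad hoc terminal session.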
Preferred Qualifications (Nice to Have)
- Regulatory/compliance awareness (e.g., data protection or public-sector standards) and how it impacts design/operations.
- Analytics experience (dim/fact modeling, BI consumption patterns).
- Full-stack exposure (can prototype simple UI/ops views when needed).
- Data platform architecture exposure (storage/layout choices, catalog/lineage, governance concepts).
- Familiarity with event streaming/message queues, job schedulers/orchestrators, and columnar formats (e.g., Parquet).
- Experience with cloud monitoring stacks (Azure Monitor/Application Insights, CloudWatch, Datadog, New Relic, etc.).

Engagement Details
- Duration: 6+ months (extension/full-time possible)
- Schedule: Flexible, with overlap for key meetings across time zones
- Compensation: Competitive, based on experience
- Mode: Remote; available for virtual collaboration and on-call windows as agreed

How to Apply
Please share:
- A resume/CV highlighting data engineering, DevOps/IaC, and/or API/auth work
- A brief note (one paragraph) about a recent pipeline/service: requirements, your role, tech, validation/tests, deployment/ops, outcome
- Links to code samples or design docs (if available and shareable)

Apply via LinkedIn or by sending an email to careers@blackdoglabs.io

Senior Data Engineer
India | 7 years | Not disclosed | Remote | Contractual

Position: Senior Data Engineer
Location: Remote (collaboration across time zones), India or LATAM preferred
Engagement Type: Contract
Language: Proficient English communication
Experience: 7+ years in Data Engineering / Backend Engineering / DevOps

About the Role
We’re looking for a hands-on Senior Data/Backend Engineer to build and operate production-grade data pipelines and services. You’ll work across ingestion, transformation, orchestration, and APIs; automate environments with Terraform; instrument observability; and write tests to keep things reliable. Azure experience is preferred; strong AWS experience is also acceptable. You should communicate clearly (async chat and live sessions), learn fast, and use AI coding tools (e.g., Claude Code) effectively without relying on them exclusively.

Key Responsibilities
- Data pipelines: Design, build, and operate reliable batch and change-driven pipelines from multiple sources; schedule/orchestrate jobs; handle schema evolution and failures gracefully.
- Transformations & modeling: Implement clean, tested merge and transformation pipelines; produce models that are easy to consume for products and analytics; tune SQL for performance.
- APIs & integration: Build or collaborate on APIs for data access and updates; design request/response contracts; handle auth (OAuth2/OIDC/JWT), idempotency, validation, and auditing.
- Infrastructure & DevOps: Provision and manage cloud resources with Terraform; automate build/test/deploy with CI/CD; write solid shell scripts for glue/ops tasks.
- Observability & reliability: Instrument structured logs, metrics, and traces; define alert policies and on-call runbooks; track SLOs and drive root-cause prevention.
- Security basics: Apply least-privilege access, secrets management, and encryption in transit and at rest; collaborate with the team on compliance requirements.
- Quality & tests: Write unit, integration, and data-quality tests (including SQL tests) to keep pipelines and services correct and maintainable.
- Collaboration & docs: Communicate clearly in writing and in person; produce concise docs (data contracts, mappings, runbooks); work closely with product/engineering/analytics.

Required Skills & Qualifications
- Strong SQL (query tuning, indexing, query plans) and solid RDBMS fundamentals (e.g., Postgres, SQL Server, MySQL).
- Python for data work and services; comfortable with shell scripting (Bash preferred; PowerShell proficiency acceptable if not Bash).
- Terraform and IaC fundamentals; CI/CD experience (version control workflows, automated tests, environment promotion).
- Cloud: Azure preferred (e.g., storage, compute, identity, monitoring); otherwise a very strong AWS background.
- Data integration: Experience with ingestion (batch + change-driven), orchestration, schema/versioning, and resilient retries/replays.
- Workflow orchestration & scheduling: Prefect preferred (flows/deployments, retries/backoff, scheduling, observability); Airflow, Dagster, or similar also acceptable.
- APIs & auth: Practical experience with REST patterns, pagination, validation, rate limiting, and OAuth2/OIDC/JWT.
- Observability: Familiar with logs/metrics/traces and setting actionable alerts and dashboards.
- Testing mindset: Habit of writing tests for code and SQL; comfortable with fixtures/test data and CI.
- Communication: Clear, concise writing and verbal communication; comfortable in async chat and live sessions.
- AI tooling: Ready to use Claude Code (and similar assistants) from day one to accelerate work, while exercising judgment and writing your own code/tests.

Preferred Qualifications (Nice to Have)
- Regulatory/compliance awareness (e.g., data protection or public-sector standards) and how it impacts design/operations
- Analytics experience (dim/fact modeling, BI consumption patterns)
- Full-stack exposure (can prototype simple UI/ops views when needed)
- Data platform architecture exposure (storage/layout choices, catalog/lineage, governance concepts)
- Familiarity with event streaming/message queues, job schedulers/orchestrators, and columnar formats (e.g., Parquet)
- Experience with cloud monitoring stacks (Azure Monitor/Application Insights, CloudWatch, Datadog, New Relic, etc.)

Engagement Details
- Duration: 6+ months (extension/full-time possible)
- Schedule: Flexible, with overlap for key meetings across time zones
- Compensation: Competitive, based on experience
- Mode: Remote; available for virtual collaboration and on-call windows as agreed

How to Apply
Please share:
- A resume/CV highlighting data engineering, DevOps/IaC, and/or API/auth work
- A brief note (one paragraph) about a recent pipeline/service: requirements, your role, tech, validation/tests, deployment/ops, outcome
- Links to code samples or design docs (if available and shareable)

Apply via LinkedIn or by sending an email to careers@blackdoglabs.io