Posted: 10 hours ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Job Title:

Location:

Job Type:

About Us

At Best Nanotech, we are [brief mission statement, e.g., revolutionizing supply chain logistics]. We believe that data is our most valuable asset. We are building a world-class Data Platform team to ensure that our Analysts, Data Scientists, and Leadership have access to clean, reliable, and timely data.

The Role

You won't just be moving data from A to B; you will be designing the architecture that makes that movement fast, reliable, and cost-effective.

Key Responsibilities

  • Pipeline Development: Design, build, and maintain scalable ETL/ELT pipelines using Python and SQL.
  • Architecture: Help manage our cloud data warehouse (Snowflake / BigQuery / Redshift) and data lake.
  • Orchestration: Schedule and monitor workflows using Airflow, Dagster, or Prefect.
  • Data Quality: Implement automated testing and validation to ensure data accuracy and consistency.
  • Collaboration: Work closely with Data Scientists to prepare data for ML models and with Product teams to capture new data events.
  • Performance Tuning: Optimize slow-running queries and minimize cloud compute costs.

What We Are Looking For

  • Experience: 2+ years of experience in Data Engineering.
  • Coding: Advanced proficiency in SQL (window functions and CTEs are second nature) and Python.
  • Cloud: Experience with a major cloud provider (AWS, GCP, or Azure).
  • Big Data Tools: Familiarity with distributed computing frameworks like Spark or Databricks is a plus.
  • Concepts: Strong understanding of Data Modeling (Star Schema, Snowflake Schema) and Data Warehousing concepts.
  • Engineering Mindset: Experience with Git, CI/CD pipelines, and containerization (Docker).

Bonus Points

  • Experience with Streaming technologies (Kafka, Kinesis, Flink).
  • Experience with Infrastructure as Code (Terraform).
  • Knowledge of dbt (Data Build Tool).

Why Join Us?

  • Impact: Your work will directly influence company strategy and product features.
  • Compensation: Competitive salary + equity/stock options.
  • Learning: Budget for certifications (AWS/GCP), conferences, and workshops.
  • Culture: A "no-blame" culture that values learning from incidents.
  • Health: Premium medical, dental, and vision coverage.

Ready to apply?
