Senior Data Engineer

5 - 10 years

30 - 35 Lacs

Posted: Just now | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

  • Oversee technology choices and the implementation of data pipelines and warehousing philosophy

  • Execute and serve as lead and/or SME on cross-organizational and cross-divisional projects automating our data value chain processes
  • Promote technical best practices throughout the data organization

  • Design data architecture that is simple and maintainable while enabling Data Analysts, Data Scientists, and stakeholders to efficiently work with data.

  • Mentor data team members in architecture and coding techniques.
  • Serve as a source of knowledge for the Data Engineering team for process improvement, automation and new technologies available to enable best-in-class timeliness and data coverage
  • Design data pipelines utilizing ETL tools, event-driven software, and other streaming software.
  • Partner with both data scientists and engineers to bring our amazing concepts to reality. This requires learning to speak the language of statisticians as well as software engineers.
  • Ensure reliability in data pipelines and enforce data governance, security, and protection of our customers' information while balancing tech debt.
  • Demonstrate innovation, customer focus, and an experimentation mindset
  • Partner with product and engineering teams to design data models for downstream data maximization.
  • Evaluate and champion new engineering tools that help us move faster and scale our team
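
As a loose illustration of the kind of pipeline component described above, here is a minimal, testable transform step in Python. The function name and record fields (`clean_orders`, `order_id`, `ts_epoch`) are hypothetical examples, not taken from this posting; real pipelines here would run under Airflow/dbt rather than as bare functions.

```python
# Hypothetical sketch of a small, idempotent pipeline transform step.
# Field names and function name are illustrative, not from this posting.
from datetime import datetime, timezone


def clean_orders(raw_rows):
    """Deduplicate rows by order_id and normalize epoch timestamps to UTC ISO-8601."""
    seen = set()
    cleaned = []
    for row in raw_rows:
        if row["order_id"] in seen:
            # Skipping duplicates makes re-runs on overlapping batches safe.
            continue
        seen.add(row["order_id"])
        ts = datetime.fromtimestamp(row["ts_epoch"], tz=timezone.utc)
        cleaned.append({"order_id": row["order_id"], "ts": ts.isoformat()})
    return cleaned
```

Keeping each step a pure function over its input batch is one common way to make pipeline stages easy to unit-test and safe to retry.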

WHAT YOU'LL NEED:

  • A technical Bachelor's/Master's degree with 5+ years of experience across Data Engineering (data pipelining, warehousing, ETL tools, etc.)

  • Extensive experience with data engineering techniques, Python, and SQL

  • Familiarity and working knowledge of Airflow and dbt

  • You are comfortable with, and have expertise in, data engineering tooling such as Jira, Git, Buildkite, Terraform, Airflow, dbt, and containers, as well as the GCP suite (Kubernetes, Cloud Functions, etc.)

  • You understand standard ETL patterns, modern data warehousing ideas such as data mesh or data vaulting, and data quality practices such as test-driven design and data observability.

  • You enjoy being a high-level architect sometimes, and a low-level coder sometimes
  • You are passionate about all things data: Big data, small data, moving and transforming it, its quality, its accessibility, and delivering value from it to internal and external clients
  • You want ownership to solve for and lead a team to deliver modern and efficient data pipeline components
  • You are passionate about a culture of learning and teaching
  • You love challenging yourself to constantly improve, and sharing your knowledge to empower others
  • You like to take risks when looking for novel solutions to complex problems. If faced with roadblocks, you continue to reach higher to make greatness happen
  • Technologies you will use:

    • Python for data pipelining and automation.

    • Airbyte for ETL

    • Google Cloud Platform, Terraform, Kubernetes, Cloud SQL, Cloud Functions, BigQuery, DataStore, and more: we keep adopting new tools as we grow!

    • Airflow and dbt for data pipelining

    • Tableau and Power BI for data visualization and consumer-facing dashboards.
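
In the spirit of the test-driven data quality practices mentioned above, the sketch below shows framework-free versions of the two checks dbt ships as generic tests (`not_null` and `unique`). Function names and the result format are assumptions for illustration; in practice these would typically be declared as dbt schema tests rather than hand-rolled.

```python
# Hedged sketch of test-driven data quality checks over row dictionaries.
# Check names and the report structure are illustrative assumptions.

def check_not_null(rows, column):
    """Fail if any row has a null value in the given column."""
    bad = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"check": f"not_null:{column}", "passed": not bad, "failing_rows": bad}


def check_unique(rows, column):
    """Fail if the column contains duplicate non-null values."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v is None:
            continue  # null-handling belongs to check_not_null
        if v in seen:
            dupes.add(v)
        else:
            seen.add(v)
    return {"check": f"unique:{column}", "passed": not dupes, "duplicates": sorted(dupes)}


def run_checks(rows, checks):
    """Run each check against the rows and collect the reports."""
    return [check(rows) for check in checks]
```

Emitting a structured report per check, instead of raising on the first failure, is what lets a scheduler or observability layer surface all failing assertions from one run.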

SE2

Advertising Services

Denver, CO

Locations: Hyderabad, Pune, Greater Noida