Sr. Backend Engineer - Analytics

Experience

3 - 8 years

Salary

40 - 45 Lacs

Posted: 1 hour ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Data Modelling: Skilled in designing data warehouse schemas (e.g., star and snowflake schemas), with experience in fact and dimension tables, as well as normalization and denormalization techniques (see the illustrative schema sketch after this list).
Data Warehousing & Storage Solutions: Proficient with platforms such as Snowflake, Amazon Redshift, Google BigQuery, and Azure Synapse Analytics.
ETL/ELT Processes: Expertise in ETL/ELT tools (e.g., Apache NiFi, Apache Airflow, Informatica, Talend, dbt) to facilitate data movement from source systems to the data warehouse.
SQL Proficiency: Advanced SQL skills for complex queries, indexing, and performance tuning.
Programming Skills: Strong in Python or Java for building custom data pipelines and handling advanced data transformations.
Data Integration: Experience with real-time data integration tools such as Apache Kafka, Apache Spark, AWS Glue, Fivetran, and Stitch.
Data Pipeline Management: Familiar with workflow automation tools (e.g., Apache Airflow, Luigi) to orchestrate and monitor data pipelines.
APIs and Data Feeds: Knowledgeable in API-based integrations, especially for aggregating data from distributed sources.
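
To make the schema-design and SQL expectations above concrete, here is a minimal sketch of a star schema (one fact table, two dimension tables) and a typical dashboard-style aggregation. It is illustrative only: it uses Python with sqlite3 purely as a stand-in for a warehouse, and every table and column name is hypothetical rather than part of the role's actual stack.

# Illustrative star-schema sketch; names are hypothetical, sqlite3 is a stand-in.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_name TEXT,
        region TEXT
    );
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,   -- e.g., 20240131
        calendar_date TEXT,
        month TEXT
    );
    CREATE TABLE fact_orders (
        order_id INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        order_amount REAL
    );
    -- Index the foreign keys that analytical queries join and filter on.
    CREATE INDEX idx_fact_orders_customer ON fact_orders(customer_key);
    CREATE INDEX idx_fact_orders_date ON fact_orders(date_key);
""")

# A typical dashboard-style aggregation: revenue by region and month.
rows = conn.execute("""
    SELECT c.region, d.month, SUM(f.order_amount) AS revenue
    FROM fact_orders f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY c.region, d.month
    ORDER BY revenue DESC;
""").fetchall()
print(rows)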
Responsibilities -

Design and implement analytical platforms that provide insightful dashboards to customers.

Develop and maintain data warehouse schemas, such as star schemas, fact tables, and dimensions, to support efficient querying and data access.

Oversee data propagation processes from source databases to warehouse-specific databases/tools, ensuring data accuracy, reliability, and timeliness.
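
As a purely illustrative sketch of such a propagation workflow, the snippet below shows how an extract/transform/load sequence might be orchestrated with Apache Airflow (one of the tools named in the description, assuming a recent 2.x release). The DAG id, schedule, and task bodies are placeholders, not details of the actual pipeline.

# Hypothetical Airflow DAG for source-to-warehouse propagation (placeholders throughout).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull incremental rows from the source database (placeholder).
    print("extracting changed rows from the source system")


def transform():
    # Apply business transformations and conform dimensions (placeholder).
    print("transforming rows into fact/dimension shape")


def load():
    # Upsert into the warehouse tables (placeholder).
    print("loading rows into the warehouse")


with DAG(
    dag_id="source_to_warehouse_propagation",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",  # adjust to the required data freshness
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Enforce ordering so accuracy and timeliness checks run in sequence.
    extract_task >> transform_task >> load_task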

Ensure the architectural design is extensible and scalable to adapt to future needs.


Requirements -

  • Qualification: B.E/B.Tech/M.E/M.Tech/PhD from a tier 1 engineering institute, with relevant work experience at a top technology company.
  • 3+ years of backend and infrastructure experience, with a strong track record in development, architecture, and design.
  • Hands-on experience with large-scale databases, high-scale messaging systems, and real-time job queues.
  • Experience navigating and understanding large-scale systems, complex codebases, and architectural patterns.
  • Proven experience in building high-scale data platforms.
  • Strong expertise in data warehouse schema design (star schema, fact tables, dimensions).
  • Experience with data movement, transformation, and integration tools for data propagation across systems.
  • Ability to evaluate and implement best practices in data architecture for scalable solutions.

  • Nice to have:
    • Experience with Google Cloud, Django, Postgres, Celery, Redis.
    • Some experience with AI infrastructure and operations.

Level Ai

Software Development

Mountain View, California
