Experience: 2 - 5 years

Salary: 17 - 18 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Key Responsibilities

  • Design, build, and maintain scalable data pipelines to ingest, process, and transform data from multiple retail sources (POS systems, e-commerce platforms, inventory databases, customer touchpoints).
  • Develop and manage ETL/ELT workflows supporting both batch and real-time data processing.
  • Create and optimize data warehousing solutions to consolidate data from disparate systems into unified repositories.
  • Implement data quality checks, validation rules, and monitoring to ensure accuracy and reliability.
  • Collaborate with data analysts, data scientists, and backend engineers to understand data requirements.
  • Build well-structured data models to support analytics, reporting, and ML use cases (demand forecasting, customer segmentation, BI).
  • Optimize query performance and manage indexing strategies for high-volume retail transaction data.
  • Implement data governance practices and ensure compliance with data privacy and security regulations.
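The pipeline and data-quality responsibilities above follow a common extract-validate-load pattern. A minimal sketch in Python using only the standard library — the table name, records, and validation rules here are hypothetical illustrations, not part of the role:

```python
import sqlite3
from datetime import datetime

# Hypothetical raw POS records; in practice these would be ingested from a
# source system (POS export, e-commerce API, inventory database).
raw_rows = [
    {"order_id": "A1", "sku": "SKU-9", "qty": "2", "price": "499.00", "ts": "2024-05-01T10:15:00"},
    {"order_id": "A2", "sku": "SKU-9", "qty": "-1", "price": "499.00", "ts": "2024-05-01T11:02:00"},
    {"order_id": "A3", "sku": "SKU-4", "qty": "1", "price": "129.50", "ts": "2024-05-01T12:30:00"},
]

def validate(row):
    """Data-quality rules: positive quantity, non-negative price, parseable timestamp."""
    try:
        qty, price = int(row["qty"]), float(row["price"])
        datetime.fromisoformat(row["ts"])
    except (ValueError, KeyError):
        return False
    return qty > 0 and price >= 0

def run_pipeline(rows, conn):
    """Transform valid rows, load them into a staging table, return (loaded, rejected)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_sales "
        "(order_id TEXT, sku TEXT, qty INTEGER, revenue REAL, ts TEXT)"
    )
    good = [r for r in rows if validate(r)]
    conn.executemany(
        "INSERT INTO stg_sales VALUES (?, ?, ?, ?, ?)",
        [(r["order_id"], r["sku"], int(r["qty"]),
          int(r["qty"]) * float(r["price"]), r["ts"]) for r in good],
    )
    return len(good), len(rows) - len(good)

conn = sqlite3.connect(":memory:")
loaded, rejected = run_pipeline(raw_rows, conn)
print(loaded, rejected)  # 2 rows loaded, 1 rejected by the quality checks
```

In a production setting the same validate-then-load structure would typically be orchestrated as tasks in a tool such as Airflow or dbt, with rejected rows routed to a quarantine table for monitoring.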

Required Qualifications

Experience

  • 2-5 years of experience in data engineering, analytics engineering, or related roles.
  • Proven experience building and maintaining data pipelines.


Technical Skills

  • Strong proficiency in Python for data processing and pipeline development.
  • Solid experience with SQL and relational databases (PostgreSQL, MySQL, or similar).
  • Hands-on experience with ETL/ELT tools such as Apache Airflow, dbt, or Luigi.
  • Experience with data warehouses (Snowflake, Redshift, BigQuery, Azure Synapse).
  • Working knowledge of data processing frameworks (Spark, Pandas, or similar).
  • Experience with cloud platforms (AWS, Azure, or GCP) and their data services.
  • Strong understanding of data modeling, schema design, and dimensional modeling.
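The dimensional-modeling and indexing skills listed above center on star schemas: fact tables keyed to dimension tables. A minimal sketch using Python's built-in SQLite — the schema and sample data are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical star schema: a sales fact table referencing a product dimension.
conn.executescript("""
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku         TEXT NOT NULL,
    category    TEXT NOT NULL
);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    qty         INTEGER NOT NULL,
    revenue     REAL NOT NULL
);
-- Index the foreign key: high-volume fact tables are usually filtered
-- and joined through their dimensions.
CREATE INDEX idx_fact_sales_product ON fact_sales(product_key);
""")

conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "SKU-9", "electronics"), (2, "SKU-4", "grocery")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(10, 1, 2, 998.0), (11, 2, 1, 129.5), (12, 1, 1, 499.0)])

# Typical analytics query: revenue by category, resolved through the dimension.
rows = conn.execute("""
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY total_revenue DESC
""").fetchall()
print(rows)  # [('electronics', 1497.0), ('grocery', 129.5)]
```

The same star-schema shape carries over directly to warehouses such as Snowflake, Redshift, or BigQuery; only the DDL dialect and physical tuning (clustering, distribution keys) change.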


Soft Skills

  • Strong analytical and problem-solving skills.
  • Excellent communication skills for working with non-technical stakeholders.
  • Ability to work independently as well as collaboratively in agile teams.


Nice to Have

  • Experience with streaming platforms (Kafka, Kinesis).
  • Knowledge of containerization and orchestration tools (Docker, Kubernetes).
  • Familiarity with infrastructure as code (Terraform).
  • Experience with data visualization tools (Tableau, Power BI, Looker).
  • Exposure to AI/ML data pipelines and feature engineering.
  • Understanding of vector databases and RAG architectures for generative AI.
  • Prior experience in retail, e-commerce, or supply chain analytics.
