Data Acquisition Engineer - Prism

Experience: 3 - 8 years

Salary: 2 - 6 Lacs

Posted: 1 week ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Role: Data Acquisition Engineer - Prism

Location: Bangalore

About ClickPost

Founded in 2015, ClickPost is Asia's leading Logistics Intelligence Platform, trusted by brands like Walmart, Nykaa, Meesho, and Adidas to improve post-purchase customer experience at scale. We've been profitable for the last five years and continue to grow rapidly, solving complex problems at the intersection of logistics, data, and technology.
Our culture is built on ownership, transparency, continuous learning, and speed. We operate with small, high-impact teams where engineers have the autonomy to design, build, and scale systems that directly power our products.

About Prism

Prism is ClickPost's Quick Commerce Intelligence Platform, built to help brands win on platforms such as Blinkit, Zepto, and Swiggy Instamart. Prism transforms fragmented, hyperlocal marketplace data into a unified intelligence layer, providing actionable insights on availability, coverage, pricing, and inventory across geographies.

Role Overview

As a Data Acquisition Engineer for Prism, you will be responsible for building and scaling the data ingestion systems that power our intelligence platform. You will design robust, high-throughput data collection frameworks that operate across dynamic, location-aware environments and deliver fresh, reliable data at scale.
This role sits at the intersection of backend engineering, data engineering, and large-scale automation, with a strong emphasis on system reliability and data quality.

Key Responsibilities

Data Acquisition & Automation

Design, build, and maintain scalable data acquisition pipelines for collecting hyperlocal marketplace data.
Simulate location-based user journeys to capture geo-specific availability, pricing, and catalog information.
Handle dynamic, JavaScript-heavy environments using modern browser automation and scraping frameworks (see the sketch after this list).
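As a rough illustration of the location-aware automation described above, here is a minimal sketch using Playwright's Python API to pin a browsing session to a specific geolocation. The URL, coordinates, and overall flow are hypothetical placeholders, not details of ClickPost's actual pipelines.

```python
# Minimal sketch: simulate a geo-pinned browsing session with Playwright.
# The URL and coordinates below are hypothetical placeholders.
from playwright.sync_api import sync_playwright

# Hypothetical location to emulate (Bangalore-area coordinates).
LOCATION = {"latitude": 12.9716, "longitude": 77.5946}

def fetch_geo_page(url: str) -> str:
    """Open a page with geolocation pinned to LOCATION and return its HTML."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        # Grant the geolocation permission so the site sees the spoofed position.
        context = browser.new_context(
            geolocation=LOCATION,
            permissions=["geolocation"],
        )
        page = context.new_page()
        page.goto(url, wait_until="networkidle")
        html = page.content()
        browser.close()
        return html

if __name__ == "__main__":
    # Hypothetical endpoint; a real pipeline would iterate over many locations.
    print(len(fetch_geo_page("https://example.com/store-listing")))
```

A production crawler would layer proxy rotation, concurrency, and persistence on top of this, but the geolocation-pinned context is the core of simulating a hyperlocal user journey.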

Scalability & Reliability

Build distributed crawling systems capable of ingesting millions of data points daily while meeting strict freshness SLAs.
Implement monitoring, alerting, and automated recovery mechanisms to ensure system stability (see the sketch after this list).
Proactively identify and resolve failures caused by platform changes, rate limits, or infrastructure issues.
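One common building block for the automated recovery mentioned above is a retry loop with exponential backoff around each fetch, plus a hook for alerting once retries are exhausted. The sketch below is a hedged example; the thresholds, logger name, and alerting behaviour are assumptions, not ClickPost specifics.

```python
# Minimal sketch: retry with exponential backoff and a simple alert hook.
import logging
import random
import time

import requests

log = logging.getLogger("prism.acquisition")  # hypothetical logger name

def fetch_with_recovery(url: str, max_attempts: int = 5, base_delay: float = 1.0) -> requests.Response:
    """Fetch a URL, backing off exponentially on failures and rate limits."""
    for attempt in range(1, max_attempts + 1):
        try:
            resp = requests.get(url, timeout=10)
            if resp.status_code == 429:  # rate limited: treat as retryable
                raise requests.HTTPError("429 Too Many Requests", response=resp)
            resp.raise_for_status()
            return resp
        except requests.RequestException as exc:
            if attempt == max_attempts:
                # Hypothetical alert hook; a real system might page on-call here.
                log.error("giving up on %s after %d attempts: %s", url, attempt, exc)
                raise
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.5)
            log.warning("attempt %d for %s failed (%s); retrying in %.1fs",
                        attempt, url, exc, delay)
            time.sleep(delay)
```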

Data Quality & Normalization

Develop robust parsing and validation layers to standardize unstructured data from multiple sources (see the sketch after this list).
Ensure data accuracy, completeness, and consistency before it reaches analytics and customer-facing dashboards.
Continuously improve data integrity checks and anomaly detection.
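As one possible shape for the parsing and validation layer described above, the sketch below normalizes raw listing dicts into a typed record and rejects obviously bad rows before they reach downstream analytics. The field names and validation rules are illustrative assumptions.

```python
# Minimal sketch: normalize raw scraped listings into validated records.
# Field names and validation rules are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ListingRecord:
    sku: str
    store_id: str
    price: float            # assumed to be in INR
    in_stock: bool
    city: Optional[str] = None

def normalize_listing(raw: dict) -> ListingRecord:
    """Validate and standardize one raw listing; raise ValueError on bad data."""
    sku = str(raw.get("sku", "")).strip()
    store_id = str(raw.get("store_id", "")).strip()
    if not sku or not store_id:
        raise ValueError(f"missing sku/store_id in {raw!r}")

    try:
        price = float(str(raw.get("price", "")).replace("₹", "").replace(",", ""))
    except ValueError:
        raise ValueError(f"unparseable price in {raw!r}")
    if price < 0:
        raise ValueError(f"negative price in {raw!r}")

    in_stock = str(raw.get("availability", "")).lower() in {"in stock", "available", "true", "1"}
    city = (raw.get("city") or "").strip().title() or None
    return ListingRecord(sku=sku, store_id=store_id, price=price, in_stock=in_stock, city=city)
```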

Collaboration & Impact

Work closely with Product Managers and Data Scientists to translate raw data into meaningful business metrics.
Enable insights such as availability trends, coverage gaps, and potential sales impact through reliable data foundations.

Tech Stack

  • Languages & Frameworks: Python, Scrapy, BeautifulSoup, Selenium / Playwright
  • Data & Storage: SQL / NoSQL databases
  • Geospatial: GeoJSON, latitude/longitude handling, spatial indexing (H3 preferred)
  • Infrastructure: AWS or GCP
  • Distributed Systems: Celery, Kafka, Redis (or equivalent) (see the sketch below)
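To connect a couple of the stack items above, here is a hedged sketch of a Celery task that crawls one store page and tags the result with an H3 cell for spatial indexing. The broker URL, endpoint, and resolution are assumptions; the H3 call shown is the v4 name (latlng_to_cell), which was geo_to_h3 in v3.

```python
# Minimal sketch: a Celery task that tags crawl output with an H3 cell.
# Broker URL, endpoint, and resolution are hypothetical choices.
import h3
import requests
from celery import Celery

app = Celery("prism_acquisition", broker="redis://localhost:6379/0")

# Resolution 9 gives roughly neighbourhood-sized hexagons, a plausible
# starting point for hyperlocal availability data (an assumption here).
H3_RESOLUTION = 9

@app.task(bind=True, max_retries=3, default_retry_delay=60)
def crawl_store(self, store_id: str, lat: float, lng: float) -> dict:
    """Crawl one store page and return a record keyed by an H3 cell."""
    try:
        resp = requests.get(f"https://example.com/stores/{store_id}", timeout=10)
        resp.raise_for_status()
    except requests.RequestException as exc:
        # Let Celery reschedule the task using its default retry delay.
        raise self.retry(exc=exc)

    cell = h3.latlng_to_cell(lat, lng, H3_RESOLUTION)  # h3 v4; geo_to_h3 in v3
    return {"store_id": store_id, "h3_cell": cell, "html_bytes": len(resp.text)}
```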

Requirements

Must Have

  • 3+ years of experience building production-grade data acquisition or scraping systems using Python.
  • Strong hands-on experience with browser automation tools and modern scraping frameworks.
  • Experience working with location-based or geospatial data extraction.
  • Solid understanding of HTTP, networking fundamentals, and large-scale automation challenges.
  • Proven ability to design systems that handle high concurrency and frequent change.
  • Strong focus on data quality, validation, and reliability.

Good to Have

  • Experience with distributed task queues or streaming systems.
  • Familiarity with proxy management strategies and request routing at scale.
  • Exposure to monitoring, alerting, and incident response systems.

What We Look For

  • Ownership mindset: You take responsibility end-to-end, from design to production stability.
  • Problem-solver: You're comfortable navigating ambiguity and adapting to fast-changing systems.
  • Execution-focused: You value pragmatic solutions and iterative improvement over perfection.

Why Join ClickPost & Prism

High Impact: Build the data backbone for a platform shaping how brands operate in India's fast-growing quick commerce ecosystem.
Technical Depth: Work on complex, real-world data engineering problems at scale.
Growth & Learning: Collaborate with experienced engineers and leaders in a fast-moving SaaS environment.

