We are building a Unified Data Platform to serve 20+ product modules, enabling ingestion, processing, and retrieval of data at scale to power analytics, machine learning, and other intelligent systems. In this role, you will work cross-functionally on initiatives to modernize and scale the data platform across these product modules.
About the Role
- You will lead the design, architecture, and development of the Unified Data Platform.
- You will work closely with senior engineering leaders, mentor senior and junior engineers, solve problems with product teams, and partner with customer success teams.
- You will actively participate in building MVPs and POCs across data ingestion (streaming and batch), transformation, and analytics.
- Author functional and non-functional specifications and design documents.
- Design and implement infrastructure automation using Helm, Terraform, and similar technologies.
- Perform peer reviews of specifications, designs, and code.
- Work alongside Site Reliability Engineers and cross-functional teams to diagnose and troubleshoot production performance issues.
About You
- Bachelor's or Master's degree in Computer Science or a related field.
- 15+ years of hands-on backend software development experience, most of it in the data platform space involving distributed systems and microservices.
- Proven expertise in building low-latency, highly concurrent microservices for ingestion and analytics.
- Deep exposure to both batch (Spark) and stream processing (Flink, Spark Streaming, Kafka Streams).
- Deep exposure to data lake storage backed by S3, GCS, or NFS.
- Deep exposure to OLAP datastores such as Pinot, Druid, and ClickHouse.
- Must have worked with open-source technologies in the data platform space.
- Familiarity with document datastores such as MongoDB.
- Familiarity with relational databases such as PostgreSQL and MySQL.
- Exposure to scaling and operating production data systems at 10K+ QPS.
- Deep exposure to JVM-based languages such as Java, Scala, and Kotlin.
- Experience building custom DSL-based ingestion or query abstractions.
- Deep understanding of Kubernetes infrastructure and deployment, observability, and alerting.
- Open-source contributions to major Apache projects in the data space (e.g., Apache Spark or Kafka) are a big plus.
Work Location
- Bangalore. The successful candidate is expected to be in the Bangalore office 3x/week.
What You Will Have at Harness
- Experience building a transformative product
- End-to-end ownership of your projects
- Competitive salary
- Comprehensive healthcare benefits
- Flexible work schedule
- Quarterly Harness TGIF-Off (4 days)
- Paid Time Off and Parental Leave
- Monthly, quarterly, and annual social and team building events
- Monthly internet reimbursement