Posted: 4 days ago | On-site | Full Time
We're seeking an Engineering Manager for Data & Platform who thrives at the intersection of real-time data architecture and people leadership. Reporting directly to our CTO, you'll lead a high-performing team of engineers while driving the technical vision for our real-time streaming platform that powers critical trading infrastructure.
Your team will be responsible for building the backbone of our trading ecosystem, from scalable data ingestion systems to sophisticated trading modules, including PAMM (Percentage Allocation Management Module), MAMM (Multi-Account Manager Module), Copy Trading platforms, and Introducing Broker systems. This is where data engineering meets financial trading at scale.
• Lead and scale a specialized data & platform engineering team of talented developers
• Drive architecture decisions for real-time streaming platforms and trading infrastructure
• Spearhead the development of PAMM, MAMM, Introducing Broker, and related systems (a minimal allocation sketch follows this list)
• Design and implement scalable data ingestion pipelines handling high-volume trade and market data
• Collaborate cross-functionally with Product, QA, and Support teams
• Implement and optimize platform reliability metrics and data quality standards
• Mentor engineers in distributed systems, stream processing, and financial domain expertise
• Champion best practices in real-time data processing, monitoring, and incident response
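
For context on the allocation logic a PAMM module performs, here is a minimal Java sketch of the pro-rata idea. The class name, investor IDs, and rounding rules are illustrative assumptions, not a description of our production system: it simply splits a master-account profit across investor accounts in proportion to their equity.

import java.math.BigDecimal;
import java.math.RoundingMode;
import java.util.LinkedHashMap;
import java.util.Map;

/** Hypothetical pro-rata allocation, the core idea behind a PAMM module. */
public class PammAllocationSketch {

    /** Splits a master-account profit across investors in proportion to their equity. */
    static Map<String, BigDecimal> allocate(Map<String, BigDecimal> investorEquity,
                                            BigDecimal masterProfit) {
        BigDecimal totalEquity = investorEquity.values().stream()
                .reduce(BigDecimal.ZERO, BigDecimal::add);
        Map<String, BigDecimal> allocations = new LinkedHashMap<>();
        for (Map.Entry<String, BigDecimal> entry : investorEquity.entrySet()) {
            BigDecimal allocation = entry.getValue()
                    .divide(totalEquity, 8, RoundingMode.HALF_UP)  // investor's share of the pool
                    .multiply(masterProfit)
                    .setScale(2, RoundingMode.HALF_UP);            // round to cents
            allocations.put(entry.getKey(), allocation);
        }
        return allocations;
    }

    public static void main(String[] args) {
        Map<String, BigDecimal> equity = new LinkedHashMap<>();
        equity.put("INV-001", new BigDecimal("60000"));
        equity.put("INV-002", new BigDecimal("30000"));
        equity.put("INV-003", new BigDecimal("10000"));
        // A 5,000 profit splits 3,000 / 1,500 / 500 for a 60/30/10 equity split.
        System.out.println(allocate(equity, new BigDecimal("5000")));
    }
}

A production module would additionally reconcile rounding residue (per-investor amounts may not sum exactly to the master profit), apply fees and high-water marks, and handle multi-currency equity.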
Experience That Matters:
• Industry Depth: 10+ years of progressive software engineering experience with significant focus on data platforms, streaming systems, or high-throughput distributed architectures
• Data Platform Expertise: 5+ years building and scaling real-time data platforms, streaming architectures, or event-driven systems, with a deep understanding of data consistency, backpressure handling, and fault tolerance patterns
• Team Leadership: 4+ years successfully managing and growing engineering teams of 8-10 developers, with experience hiring specialized data engineers and platform architects during rapid scaling phases
• System Complexity: Proven track record of architecting and delivering mission-critical data infrastructure that processes high-volume, low-latency data streams with strict reliability and accuracy requirements
• Financial Domain Impact: Experience working with trading systems, financial data feeds, risk management platforms, or other latency-sensitive financial infrastructure where data accuracy and timing are paramount
• Streaming & Event Processing: Expert-level experience with Apache Kafka, Apache Flink, and modern stream processing frameworks, including complex event processing, windowing operations, and exactly-once semantics (see the windowed-aggregation sketch after this list)
• Data Engineering Mastery: Deep expertise in building data pipelines, ETL/ELT processes, data modeling for real-time analytics, and handling both structured and unstructured data at scale
• Java & JVM Ecosystem: Advanced proficiency in Java (11+) and JVM performance tuning, with experience building high-throughput applications that handle thousands of transactions per second
• Cloud-Native Data Architecture: Extensive hands-on experience with open-source (FOSS) data stacks as well as cloud data platforms, containerization of stateful services, and cloud-native data storage solutions
• Database & Storage Systems: Expert knowledge of both SQL and NoSQL databases, time-series databases, distributed caching, and storage optimization for high-frequency data
• Microservices & APIs: Strong experience in designing event-driven microservices, implementing robust API gateways, handling service mesh complexity, and managing inter-service communication patterns in distributed data systems
• Platform Reliability: Deep understanding of observability frameworks, distributed tracing, metrics collection, alerting strategies, and building self-healing systems that maintain high availability
• AI-Assisted Development: Experience leveraging modern AI coding tools (GitHub Copilot, Cursor, ChatGPT, Claude, Tabnine) to accelerate development workflows, improve code quality, and enhance team productivity while maintaining security and best practices
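
To make the streaming expectations concrete, here is a minimal Kafka Streams sketch of a tumbling-window aggregation with exactly-once processing. It assumes a recent Kafka Streams release (3.x), a broker at localhost:9092, and a hypothetical "trades" topic keyed by symbol; treat it as an illustration, not a reference implementation.

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.TimeWindows;

import java.time.Duration;
import java.util.Properties;

/** Tumbling-window trade-volume aggregation per symbol, with exactly-once processing. */
public class TradeVolumeWindows {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "trade-volume-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Exactly-once semantics for the consume-transform-produce cycle.
        props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE_V2);

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("trades", Consumed.with(Serdes.String(), Serdes.Double())) // key = symbol, value = trade size
                .groupByKey(Grouped.with(Serdes.String(), Serdes.Double()))
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofSeconds(10))) // 10-second tumbling windows
                .reduce(Double::sum)                                               // total volume per symbol per window
                .toStream()
                .foreach((windowedSymbol, volume) -> System.out.printf(
                        "%s window@%d -> volume %.2f%n",
                        windowedSymbol.key(), windowedSymbol.window().start(), volume));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

A real pipeline would also configure grace periods for late events, schema-aware serdes, and monitoring around rebalances and state stores; the role covers those concerns, not just the happy path shown here.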
Leadership & Process:
• Agile for Data Teams: Deep expertise in Agile/Scrum methodologies adapted for data and platform teams, with experience managing complex dependencies between data pipelines, platform services, and downstream consumers
• Data Engineering Culture: Proven ability to establish engineering best practices specific to data platforms, including comprehensive testing strategies for streaming applications, data quality validation, schema evolution management, and effective incident response for data outages
• Metrics-Driven Platform Management: Experience implementing platform-specific metrics such as data processing latency, throughput rates, error rates, data quality scores, SLA adherence for data availability, and consumer satisfaction metrics (a minimal instrumentation sketch follows this list)
• Quality & Reliability: Strong focus on building reliability into data systems through automated testing, chaos engineering, data validation frameworks, and collaborative practices that ensure data accuracy and system resilience
• Cross-Functional Process Optimization: Track record of identifying bottlenecks in data workflows and implementing solutions that improve data delivery speed, reduce pipeline complexity, and enable faster time-to-insight for business stakeholders
• Mentorship-Driven Development: Strong background in pair programming, code mentoring, guided development sessions, and creating structured learning paths that help junior developers grow into senior contributors and future technical leads
• Data-Driven Customer Focus: Genuine passion for understanding how data serves end-users and translating business requirements into scalable data solutions, with experience gathering feedback from traders, analysts, and business users
• Platform Ownership & Accountability: Demonstrated history of taking end-to-end ownership of data platforms and trading systems, from initial architecture through production operations, with a "you build it, you run it" mentality for critical infrastructure
• Innovation with Reliability: Ability to balance cutting-edge data technologies with the stability requirements of financial systems, making thoughtful decisions about when to adopt new streaming technologies versus proven solutions
• Cross-Domain Collaboration: Natural ability to build consensus between engineering, trading, risk management, compliance, and business intelligence teams while maintaining technical standards and data governance requirements
• Continuous Learning Mindset: Passionate about staying current with the evolving data engineering landscape, financial technology trends, and regulatory requirements, while creating an environment where team members experiment with new tools and methodologies
• Bias for Action with Precision: Comfortable making architectural decisions with incomplete information while maintaining the precision and accuracy required for financial data processing, and skilled at balancing innovation speed with regulatory compliance
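
As a sketch of what "metrics-driven" means in practice, the snippet below instruments a processing loop with Micrometer. The metric names (pipeline.process.latency, pipeline.records.in), the simulated workload, and the in-memory registry are illustrative assumptions; in production these meters would typically be exported to Prometheus/Grafana or a similar backend.

import io.micrometer.core.instrument.Counter;
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Timer;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

import java.time.Duration;
import java.util.concurrent.ThreadLocalRandom;
import java.util.concurrent.TimeUnit;

/** Minimal latency/throughput instrumentation for a record-processing loop. */
public class PipelineMetricsSketch {
    public static void main(String[] args) {
        MeterRegistry registry = new SimpleMeterRegistry(); // swap for a Prometheus registry in production

        Timer processLatency = Timer.builder("pipeline.process.latency")
                .description("Per-record processing time")
                .publishPercentiles(0.5, 0.95, 0.99)   // SLA-relevant percentiles
                .register(registry);

        Counter recordsIn = Counter.builder("pipeline.records.in")
                .description("Records ingested")
                .register(registry);

        // Simulate a batch of records flowing through the pipeline.
        for (int i = 0; i < 1_000; i++) {
            recordsIn.increment();
            long micros = ThreadLocalRandom.current().nextLong(50, 500); // stand-in for real work
            processLatency.record(Duration.ofNanos(micros * 1_000));
        }

        System.out.printf("ingested=%.0f, mean latency=%.1f µs, max=%.1f µs%n",
                recordsIn.count(),
                processLatency.mean(TimeUnit.MICROSECONDS),
                processLatency.max(TimeUnit.MICROSECONDS));
    }
}

The same pattern extends naturally to the error-rate, data-quality, and SLA-adherence metrics mentioned above: each becomes a named meter with alerting thresholds attached in the monitoring backend.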
Domain & Industry:
• Fintech Trading Systems: Deep understanding of trading infrastructure, market data feeds, order management systems, risk management platforms, and the unique challenges of building real-time financial data processing systems
• PAMM/MAMM Experience: Direct experience building or maintaining Percentage Allocation Management Modules, Multi-Account Manager systems, Copy Trading platforms, or Introducing Broker infrastructure
• High-Frequency Data Processing: Experience with ultra-low-latency systems, real-time market data processing, algorithmic trading support, or other performance-critical applications where microseconds matter (see the latency-measurement sketch after this list)
• Regulatory & Compliance: Understanding of financial services regulations (MiFID II, EMIR, Dodd-Frank), data governance requirements, audit trails, and compliance reporting for trading platforms
• Global Trading Infrastructure: Experience building systems that handle multiple asset classes, serve international markets, manage complex regulatory requirements across jurisdictions, and handle diverse market data formats
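
Because "microseconds matter" is hard to evaluate without numbers, here is a small, hypothetical sketch of how per-tick latency is commonly measured and summarized with HdrHistogram. The simulated workload and the percentiles reported are assumptions chosen for illustration only.

import org.HdrHistogram.Histogram;

import java.util.concurrent.ThreadLocalRandom;

/** Records per-tick processing latency in nanoseconds and reports tail percentiles. */
public class TickLatencySketch {
    public static void main(String[] args) {
        // Track values up to 1 second with 3 significant digits of precision.
        Histogram latencies = new Histogram(1_000_000_000L, 3);

        for (int i = 0; i < 100_000; i++) {
            long start = System.nanoTime();
            simulateTickProcessing();
            latencies.recordValue(System.nanoTime() - start);
        }

        System.out.printf("p50=%d ns, p99=%d ns, p99.9=%d ns, max=%d ns%n",
                latencies.getValueAtPercentile(50.0),
                latencies.getValueAtPercentile(99.0),
                latencies.getValueAtPercentile(99.9),
                latencies.getMaxValue());
    }

    // Stand-in for real order-book or market-data work.
    private static void simulateTickProcessing() {
        long busyNanos = ThreadLocalRandom.current().nextLong(1_000, 20_000);
        long end = System.nanoTime() + busyNanos;
        while (System.nanoTime() < end) { /* busy-wait to mimic CPU-bound work */ }
    }
}

In a trading system the interesting part is the tail (p99 and above), which is why histograms rather than averages drive the reliability and latency targets this role owns.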