Auxo AI is scaling its Lakehouse Engineering Team. If your Dremio and data architecture foundations are not rock solid, this role is not for you. We need someone who can design, build, optimize, and govern large-scale Dremio lakehouse systems, not someone who has only "worked around it."

What You'll Own
- Architect and implement Dremio-based lakehouse systems across cloud platforms
- Define strategies for ingestion, curation, and semantic layers for analytics and AI
- Optimize reflections, caching, and query performance with precision
- Integrate data from APIs, JDBC, Parquet/Delta, and S3/ADLS/GCS
- Enforce security, lineage, RBAC, and governance
- Enable governed self-service analytics for BI and data science teams
- Build scalable standards for deployment, monitoring, and optimization
- Work with engineering, analytics, and product teams to deliver fast, reliable data access

Non-Negotiable Requirements
(If you don't meet these, don't apply.)
- 5+ years of experience in data architecture / data engineering
- 3+ years of hands-on experience with Dremio
- Proven experience designing end-to-end lakehouse architectures
- Strong SQL optimization, data modeling, and query tuning
- Deep knowledge of S3 / ADLS / GCS, Parquet, Delta, and Iceberg
- Experience with Airflow, dbt, Kafka, Spark, or similar pipeline tooling
- Strong governance skills: lineage, RBAC, enterprise security
- Excellent communicator with strong documentation discipline
- Must come from a cloud-native / analytics / enterprise-modernization environment

Nice-to-Have
- Experience integrating Tableau, Power BI, or Looker
- Exposure to Snowflake, Databricks, or BigQuery

Pre-Screening Questions (Mandatory)
(Submit answers or your application will be skipped.)
1. How many years of hands-on experience do you have with Dremio?
2. Preferred location: Mumbai / Bengaluru / Hyderabad / Gurugram?
3. Are you comfortable with 3 days work-from-office (WFO)?
4. Are you OK with video on during virtual interviews?

A CV attachment is mandatory.

Contact to Apply
Phone: 7396612220
Email: akjobplacements@gmail.com

Job Type: Full-time
Pay: ₹3,500,000.00 - ₹4,500,000.00 per year
Work Location: In person