About Exponentia.ai
Exponentia.ai is a fast-growing AI-first technology services company, partnering with enterprises to shape and accelerate their journey to AI maturity. With a presence across the US, UK, UAE, India, and Singapore, we bring together deep domain knowledge, cloud-scale engineering, and cutting-edge artificial intelligence to help our clients transform into agile, insight-driven organizations.
We are proud partners with global technology leaders such as Databricks, Microsoft, AWS, and Qlik, and have been consistently recognized for innovation, delivery excellence, and trusted advisory services.
Awards & Recognitions:
- Innovation Partner of the Year - Databricks
- Digital Impact Award, UK (TMT Sector)
- Rising Star - APJ Databricks Partner Awards
- Qlik's Most Enabled Partner - APAC
With a team of 450+ AI engineers, data scientists, and consultants, we are on a mission to redefine how work is done by combining human intelligence with AI agents to deliver exponential outcomes.
About the Role:
We are seeking a skilled Data Engineer with hands-on experience in Databricks to design, develop, and optimize scalable data pipelines and data infrastructure. The role involves working with cloud-based data platforms, building efficient ETL/ELT workflows, and enabling data-driven decision-making across the organization.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using Databricks, Spark, and cloud-native tools (a minimal PySpark sketch follows this list).
- Build and optimize scalable data architectures including data lakes, Delta Lake, and cloud warehouses.
- Develop and maintain Databricks notebooks, jobs, workflows, and clusters.
- Work with Delta Lake format for versioned, reliable, and high-performance data storage.
- Collaborate with data analysts, data scientists, and business teams to understand data needs.
- Ensure data quality, integrity, and governance with automated testing and monitoring.
- Apply best practices for data modeling, partitioning, and performance tuning.
- Utilize cloud services (AWS/Azure/GCP) integrated with Databricks.
- Implement CI/CD pipelines for Databricks jobs and notebooks.
- Troubleshoot and resolve data processing, performance, and cluster-related issues.
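To give a concrete flavor of the first few responsibilities, here is a minimal PySpark sketch of a batch ETL step that lands raw files in a partitioned Delta table. It is an illustrative sketch only; the paths, column names, and target table are hypothetical, not part of any actual client codebase.

```python
# Minimal batch ETL sketch: raw CSV files -> cleaned, partitioned Delta table.
# All paths, column names, and table names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest raw files from a (hypothetical) landing zone.
raw = spark.read.option("header", "true").csv("/mnt/raw/orders/")

# Basic cleansing: deduplicate, fix types, derive a partition column.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write to Delta Lake, partitioned by date for pruning and faster scans.
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.orders")
)
```

In practice a pipeline like this would run as a scheduled Databricks job or workflow, with data-quality checks and monitoring wrapped around it.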
Ideal Candidate Profile
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Strong proficiency in SQL and experience with relational and NoSQL databases.
- Hands-on experience with Databricks, including notebooks, workflows, jobs, clusters, and Delta Lake (see the upsert sketch after this list).
- Strong experience with Apache Spark (PySpark/Scala).
- Solid programming experience in Python or Scala.
- Experience with cloud platforms such as:
  - Azure: Azure Databricks, Data Lake, Synapse, ADF
  - AWS: Databricks on AWS, S3, Redshift, Glue
  - GCP: Dataproc/BigQuery (an added advantage)
- Knowledge of data warehousing concepts and data modeling.
- Experience with ETL/ELT tools such as Airflow, ADF, dbt, or AWS Glue.
- Familiarity with version control (Git) and CI/CD practices.
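As a concrete illustration of the Delta Lake skills above, the sketch below shows an upsert (MERGE) into a Delta table plus a time-travel query. It assumes the delta-spark package that ships with the Databricks runtime; the table and column names are hypothetical.

```python
# Upsert (MERGE) sketch against a Delta table, plus a time-travel read.
# Table and column names are hypothetical; assumes delta-spark, which is
# bundled with the Databricks runtime.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.table("staging.order_updates")  # hypothetical staging table
target = DeltaTable.forName(spark, "analytics.orders")

# Update matching rows and insert new ones: the standard Delta upsert pattern.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Delta versions every write, so earlier states stay queryable (time travel).
v0 = spark.sql("SELECT * FROM analytics.orders VERSION AS OF 0")
```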
Good to have
- Experience with real-time streaming technologies (Kafka, Kinesis, or Pub/Sub); see the streaming sketch after this list.
- Knowledge of Unity Catalog, Lakehouse architecture, and Databricks SQL.
- Understanding of DevOps, containerization (Docker), and Kubernetes.
- Exposure to MLflow or machine learning workflows within Databricks.
- Experience with BI tools (Power BI, Tableau, Looker).
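For the streaming item above, here is a hedged Structured Streaming sketch that reads a Kafka topic into a Delta table. The broker address, topic, checkpoint path, and table name are all hypothetical placeholders.

```python
# Streaming sketch: Kafka topic -> Delta table via Structured Streaming.
# Broker, topic, checkpoint path, and table name are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                     # hypothetical topic
    .load()
    # Kafka delivers bytes; cast the payload to a string for downstream parsing.
    .select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
)

query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders")  # exactly-once bookkeeping
    .outputMode("append")
    .toTable("analytics.order_events")  # hypothetical target table
)
query.awaitTermination()
```

The checkpoint location is what lets Structured Streaming recover from failures without duplicating writes into the Delta table.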
Why Join Exponentia.ai?
- Innovate with Purpose: Opportunity to create pioneering AI solutions in partnership with leading cloud and data platforms
- Shape the Practice: Build a marquee capability from the ground up with full ownership
- Work with the Best: Collaborate with top-tier talent and learn from industry leaders in AI
- Global Exposure: Be part of a high-growth firm operating across the US, UK, UAE, India, and Singapore
- Continuous Growth: Access to certifications, tech events, and partner-led innovation labs
- Inclusive Culture: A supportive and diverse workplace that values learning, initiative, and ownership
Ready to build the future of AI with us?
- Apply now and become a part of a next-gen tech company that's setting benchmarks in enterprise AI solutions.