We are looking for an experienced Senior Data Engineer to join our team in Bengaluru, Hosur, or Ahmedabad to help us build scalable, reliable, and secure data analytics solutions.

Skills Required:
- 5+ years of experience in data engineering and Microsoft Azure.
- Experience implementing Data Lakes with technologies such as Azure Data Factory (ADF), PySpark, Databricks, ADLS, and Azure SQL Database.
- A solid foundation and working knowledge of the Azure full stack, Event Hubs, and Stream Analytics.
- A passion for writing high-quality code that is modular, scalable, and free of bugs, with strong debugging skills in SQL, Python, or Scala/Java.
- Enthusiasm for collaborating with stakeholders across the organization and taking complete ownership of deliverables.
- Experience with big data technologies such as Hadoop, Spark, Databricks, Airflow, and Kafka.
- Solid understanding of file formats such as Delta Lake, Avro, Parquet, JSON, and CSV.
- Good knowledge of designing and building REST APIs, with hands-on experience on Data Lake/Lakehouse projects.
- Experience supporting BI and Data Science teams in consuming data in a secure and governed manner.
- Certifications such as Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Data Engineer are a valuable addition.

Responsibilities:
- Build and learn about a variety of analytics solutions and platforms (data lakes, modern data platforms, data fabric solutions, etc.) using open source, big data, and cloud technologies on Microsoft Azure.
- Design and build scalable, metadata-driven data ingestion pipelines for batch and streaming datasets.
- Conceptualize and execute high-performance data processing for structured and unstructured data, including data harmonization.
- Schedule, orchestrate, and validate pipelines; design exception handling and log monitoring for debugging.
- Ideate with peers on tech stack and tooling decisions.
- Interact and collaborate with multiple teams (vendors, consultants, Data Science, and Dev teams) and various stakeholders to meet deadlines and bring analytical solutions to life.
- Understand complex architectures and work comfortably across multiple teams.
- Work with delivery teams to build, automate, and deploy cloud solutions on Microsoft Azure.
- Monitor production, staging, test, and development environments for a wide range of applications in an agile, dynamic organization.
- Be well-versed in security best practices.
- Take a customer-focused approach and understand customer problems.
We are seeking an experienced Databricks Solution Architect to lead technical solutioning, architecture design, presales, and customer engagements for data engineering, analytics, and AI workloads on the Databricks platform. The ideal candidate will work closely with sales, delivery, and customer teams to craft high-impact solutions, develop proposals, conduct workshops, and support end-to-end presales cycles.

Key Responsibilities:

Solution Architecture:
- Architect end-to-end data solutions leveraging the Databricks Lakehouse Platform for data engineering, streaming, analytics, ML, and governance.
- Define best practices for data ingestion, transformation, optimization, Delta Lake design, cluster configuration, and performance tuning.
- Design ETL/ELT pipelines, data models, data quality frameworks, and cost-efficient architectures.
- Integrate Databricks with Azure/AWS/GCP cloud services (ADF, ADB, S3, ADLS, IAM, Dataflow, BigQuery, Redshift, etc.).
- Guide internal teams and clients during PoCs/POVs and ensure technical success.
- Conduct capability presentations, roadmap discussions, and architecture review sessions.
- Build reusable accelerators, frameworks, and solution kits for presales and delivery.
- Stay updated on the Databricks roadmap, new features (Unity Catalog, DBSQL, GenAI, Delta Live Tables, Mosaic AI, governance), and industry trends.

Required Skills:
- 7-12 years of experience in data engineering, analytics, or cloud architecture.
- 3-5 years of hands-on experience with Databricks and Delta Lake.
- Strong understanding of:
  - PySpark / Spark SQL
  - Delta Lake optimization and medallion architecture
  - MLflow and Feature Store (good to have)
  - Databricks Jobs, DLT, Workflows, and Unity Catalog
  - Databricks SQL / DBSQL Warehouse
  - CI/CD for Databricks (Repos, Git integration)

Presales Competencies:
- Strong presentation, whiteboarding, and client-facing communication skills.
- Experience preparing proposals, RFP/RFI responses, and solution blueprints.
- Ability to define pricing, licensing, and sizing for Databricks workloads.
- Experience scoping PoCs/POVs and managing presales cycles.