Unifi Data Management Services

Job openings at Unifi Data Management Services
Data & AI Senior Technical Architect | Chennai | 10 - 15 years | INR 20.0 - 35.0 Lacs P.A. | Remote | Full Time

Job Summary
We are seeking a highly skilled Senior Technical Architect with expertise in Databricks, Apache Spark, and modern data engineering architectures. The ideal candidate will have a strong grasp of Generative AI and RAG pipelines and a keen interest in (or working knowledge of) Agentic AI systems. This individual will lead the architecture, design, and implementation of scalable data platforms and AI-powered applications for our global clients. This high-impact role requires technical leadership, cross-functional collaboration, and a passion for solving complex business challenges with data and AI.

Key Responsibilities
- Lead the architecture, design, and deployment of scalable data solutions using Databricks and the medallion architecture (an illustrative sketch follows this posting).
- Guide technical teams in building batch and streaming data pipelines using Spark, Delta Lake, and MLflow.
- Collaborate with clients and internal stakeholders to understand business needs and translate them into robust data and AI architectures.
- Design and prototype Generative AI applications using LLMs, RAG pipelines, and vector stores.
- Provide thought leadership on the adoption of Agentic AI systems in enterprise environments.
- Mentor data engineers and solution architects across multiple projects.
- Ensure adherence to security, governance, performance, and reliability best practices.
- Stay current with emerging trends in data engineering, MLOps, GenAI, and agent-based systems.

Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical discipline.
- 10+ years of experience in data architecture, data engineering, or software architecture roles.
- 5+ years of hands-on experience with Databricks, including Spark SQL, Delta Lake, Unity Catalog, and MLflow.
- Proven experience in designing and delivering production-grade data platforms and pipelines.
- Exposure to LLM frameworks (OpenAI, Hugging Face, LangChain, etc.) and vector databases (FAISS, Weaviate, etc.).
- Strong understanding of cloud platforms (Azure, AWS, or GCP), particularly in the context of Databricks deployment.
- Knowledge of, or interest in, Agentic AI frameworks and multi-agent system design is highly desirable.

Technical Skills
- Databricks (incl. Spark, Delta Lake, MLflow, Unity Catalog)
- Python, SQL, PySpark
- GenAI tools and libraries (LangChain, OpenAI, etc.)
- CI/CD and DevOps for data
- REST APIs, JSON, and data serialization formats
- Cloud services (Azure/AWS/GCP)

Soft Skills
- Strong communication and stakeholder management skills
- Ability to lead and mentor diverse technical teams
- Strategic thinking with a bias for action
- Comfort with ambiguity and iterative development
- Client-first mindset and consultative approach
- Excellent problem-solving and analytical skills

Preferred Certifications
- Databricks Certified Data Engineer / Architect
- Cloud certifications (Azure/AWS/GCP)
- Certifications in AI/ML, NLP, or GenAI frameworks are a plus
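To make the medallion pattern referenced above concrete, here is a minimal, illustrative PySpark sketch of a bronze-to-silver refinement step on Delta Lake. All paths, the event_id and event_ts columns, and the schema are hypothetical, and the sketch assumes an environment where the Delta format is available (as it is on Databricks); a production pipeline would typically register these as Unity Catalog tables and run as a scheduled or streaming job rather than a one-off script.

```python
# Hypothetical bronze -> silver medallion step on Delta Lake.
# Paths, column names, and schema are placeholders for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw JSON events as-is, adding only ingestion metadata.
raw = (
    spark.read.json("/landing/events/")              # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
raw.write.format("delta").mode("append").save("/lake/bronze/events")

# Silver: deduplicate, drop incomplete records, and conform types
# for downstream consumers.
bronze = spark.read.format("delta").load("/lake/bronze/events")
silver = (
    bronze
    .filter(F.col("event_id").isNotNull())           # assumes an event_id column
    .dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_ts")) # assumes an event_ts column
)
silver.write.format("delta").mode("overwrite").save("/lake/silver/events")
```

A gold layer would follow the same pattern, aggregating the silver table into business-level marts consumed by BI and ML workloads.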

Azure Solution Architect | Chennai | 8 - 13 years | INR 20.0 - 30.0 Lacs P.A. | Remote | Full Time

Job Summary:
We are seeking a highly skilled Azure Solution Architect to design, implement, and oversee cloud-based solutions on Microsoft Azure. The ideal candidate will have a deep understanding of cloud architecture, a strong technical background, and the ability to align Azure capabilities with business needs. You will lead the architecture and design of scalable, secure, and resilient Azure solutions across multiple projects.

Role & Responsibilities:
- Design end-to-end data architectures on Azure using Microsoft Fabric, Azure Data Lake (ADLS Gen2), Azure SQL/Synapse, and Power BI.
- Lead the implementation of data integration and orchestration pipelines using Azure Data Factory and Fabric Data Pipelines.
- Architect Lakehouse/Data Warehouse solutions for both batch and real-time processing, ensuring performance, scalability, and cost optimization.
- Establish data governance, lineage, and cataloging frameworks using Microsoft Purview and other observability tools.
- Enable data quality, classification, and privacy controls aligned with compliance and regulatory standards.
- Drive adoption of event-driven data ingestion patterns using Event Hubs, Event Grid, or Stream Analytics (an illustrative sketch follows this posting).
- Provide architectural oversight on reporting and visualization solutions using Power BI integrated with Fabric datasets and models.
- Define architecture standards, data models, and reusable components to accelerate project delivery.
- Collaborate with data stewards, business stakeholders, and engineering teams to define functional and non-functional requirements.
- Support CI/CD, infrastructure as code, and DevOps for data pipelines using Azure DevOps or GitHub Actions.
- Lead Proofs of Concept (PoCs) and performance evaluations for emerging Azure data services and tools.
- Monitor system performance, data flow, and health using Azure Monitor and Fabric observability capabilities.

Required Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- 5+ years of experience as a data architect or solution architect in cloud data environments.
- 3+ years of hands-on experience designing and implementing data solutions on Microsoft Azure.
- Strong hands-on expertise with:
  - Azure Data Factory
  - Microsoft Fabric (Data Engineering, Data Warehouse, Real-Time Analytics, Power BI)
  - Azure Data Lake (ADLS Gen2), Azure SQL, and Synapse Analytics
  - Power BI for enterprise reporting and data modeling
- Experience with data governance and cataloging tools, ideally Microsoft Purview.
- Proficiency in data modeling techniques (dimensional, normalized, or data vault).
- Strong understanding of security, RBAC, data encryption, Key Vault, and privacy requirements in Azure.

Preferred Qualifications:
- Microsoft Certified: Azure Solutions Architect Expert (AZ-305) or Azure Enterprise Data Analyst Associate (DP-500).
- Hands-on experience with end-to-end Microsoft Fabric implementation.
- Familiarity with the medallion architecture, Delta Lake, and modern lakehouse principles.
- Experience in Agile/Scrum environments and stakeholder engagement across business and IT.
- Strong communication skills, with the ability to explain complex concepts to both technical and non-technical audiences.
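As a small illustration of the event-driven ingestion pattern listed in the responsibilities above, the sketch below publishes a JSON payload to Azure Event Hubs using the azure-eventhub Python SDK (v5). The connection string, hub name, and payload are placeholders; in production the secret would come from Azure Key Vault, and a Stream Analytics, Event Grid, or Fabric Real-Time Analytics consumer would process the stream downstream.

```python
# Minimal Event Hubs producer sketch using the azure-eventhub v5 SDK.
# Connection string, hub name, and payload are hypothetical placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "<event-hubs-namespace-connection-string>"  # from Key Vault in practice
EVENTHUB_NAME = "telemetry"                                  # hypothetical hub name

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR,
    eventhub_name=EVENTHUB_NAME,
)

with producer:
    # Batch events to stay within Event Hubs message-size limits.
    batch = producer.create_batch()
    for reading in [{"device": "sensor-01", "value": 21.4}]:  # sample payload
        batch.add(EventData(json.dumps(reading)))
    producer.send_batch(batch)
```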

PMO Analyst | Chennai | 10 - 15 years | INR 15.0 - 25.0 Lacs P.A. | Remote | Full Time

Job Summary:
We are seeking a highly organized and detail-oriented PMO Analyst to support enterprise programs by coordinating with project managers, maintaining project documentation, updating governance materials (e.g., SteerCo decks), and ensuring timely updates in project tracking systems such as Azure DevOps and Microsoft Project. This role is critical in driving project transparency, alignment, and reporting accuracy across key initiatives.

Key Responsibilities:
- Collaborate with project managers to gather weekly updates on project progress, risks, and milestones.
- Maintain and update Steering Committee decks, status reports, and dashboards with current project data.
- Manage and continuously update project plans (MPP files), timelines, and delivery schedules.
- Structure and maintain work items in Azure DevOps, including creation and tracking of Epics, Features, User Stories, and Tasks.
- Consolidate inputs for monthly and quarterly reviews, providing clear visibility into program health.
- Track and flag delays, risks, and dependencies, coordinating resolutions with project owners.
- Ensure project documentation (RACI, RAID logs, charters, etc.) is current and accessible.
- Assist in resource planning, capacity tracking, and sprint/iteration reviews when needed.
- Support PMO governance processes, audit readiness, and compliance tracking.

Qualifications & Skills:
- Bachelor's degree in Business, Information Technology, or a related field.
- 10+ years of experience in a project coordination or PMO role within large transformation or technology programs.
- Proficiency with Microsoft Project (MPP), PowerPoint, Excel, and Azure DevOps.
- Strong understanding of Agile and Waterfall project management methodologies.
- Excellent communication and stakeholder management skills.
- Attention to detail and the ability to work under tight deadlines.
- PMP or PMI-ACP certification (preferred).

Preferred Tools Knowledge:
- Azure DevOps
- MS Project (MPP)
- Confluence / SharePoint
- Power BI (basic reporting)
- Jira (optional)

Full Stack Developer with Java or Node.js Experience | Chennai | 5 - 8 years | INR 12.0 - 22.0 Lacs P.A. | Remote | Full Time

Job Summary:
We are seeking a highly skilled Full Stack Developer with over 5 years of hands-on experience in both front-end and back-end development, specializing in Java or Node.js. The ideal candidate has strong expertise in Node.js, Angular, and React, and is comfortable working across the entire stack, from designing and building RESTful APIs and backend services to crafting responsive, user-centric front-end applications. This is a purely technical position requiring strong programming skills.

Key Responsibilities:
- Design, develop, test, and maintain end-to-end web applications.
- Develop robust backend logic using Java, Spring Boot, and RESTful APIs.
- Create responsive and dynamic user interfaces using Angular or React.
- Strong hands-on technical experience with Node.js, Angular, and React.
- In-depth knowledge of front-end technologies such as HTML, CSS, and JavaScript frameworks.
- Develop user-friendly interfaces and implement complex UI/UX designs.
- Knowledge of Docker, Kubernetes, and CI/CD pipelines.
- Good hands-on experience with cloud services and infrastructure (AWS, Azure, or GCP).
- Good understanding of database technologies such as SQL, NoSQL, and MongoDB.
- Follow Agile/Scrum development practices and contribute to sprint planning.
- Ability to work in a dynamic environment, with excellent organizational, interpersonal, written, and verbal communication skills.
- Excellent analytical and problem-solving abilities.
- Ability to contribute effectively as an individual contributor.
- Exposure to GitLab, SonarQube, SonarLint, etc.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience with JavaScript/TypeScript development.
- Strong hands-on experience with Node.js and frameworks like Express.js.
- Deep expertise in Angular (v10+), including RxJS, forms, routing, and the component lifecycle.
- Solid understanding of REST APIs, HTTP, and asynchronous programming.
- Experience with Git, CI/CD pipelines, and containerization (Docker).
- Knowledge of database design and querying (SQL and/or NoSQL).