Experience: 5 to 8 Yrs (must have lead experience)
Mode of Work: WFO only
Location: Chennai
Immediate joiners only can apply.

Required Skills & Experience:
- 5–7 years of professional experience in software development.
- Expertise in .NET Core / .NET Framework, C#, and Web APIs.
- Strong front-end development experience using ReactJS (including hooks and state management).
- Proficiency in MySQL or other relational databases; ability to design schemas, optimize queries, and handle DB migrations (see the sketch after this posting).
- Experience in leading small to mid-sized development teams.
- Excellent problem-solving skills and a strong grasp of software engineering principles.
- Effective communication skills to interact with both internal teams and external customers.

Key Responsibilities:
- Lead the design, development, and deployment of scalable web applications using .NET (C#), React, and MySQL.
- Guide and mentor a team of developers, ensuring adherence to coding best practices and architectural standards.
- Collaborate with product owners, QA, and other cross-functional teams to ensure timely delivery of high-quality software.
- Participate in architecture discussions and propose scalable and maintainable solutions.
- Act as the primary technical point of contact for the customer and represent the engineering team in all technical discussions.
- Review code, resolve technical roadblocks, and ensure continuous improvement in the development process.
- Support DevOps practices such as CI/CD, monitoring, and cloud deployment, preferably in Azure.
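The query-optimization expectation above can be pictured with a minimal sketch, not part of the original posting: it assumes a hypothetical `orders` table and local credentials, uses the `mysql-connector-python` package, and simply compares EXPLAIN output before and after adding an index.

```python
# Illustrative sketch only -- hypothetical table and credentials, not from the posting.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app_user", password="secret", database="shop"  # assumed credentials
)
cur = conn.cursor()

# Inspect the execution plan of a lookup on a hypothetical orders table.
cur.execute("EXPLAIN SELECT * FROM orders WHERE customer_id = %s", (42,))
print(cur.fetchall())  # 'type: ALL' here would indicate a full table scan

# Add an index so the same lookup can use an index range scan instead.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
conn.commit()

cur.execute("EXPLAIN SELECT * FROM orders WHERE customer_id = %s", (42,))
print(cur.fetchall())  # expect 'type: ref' using idx_orders_customer

cur.close()
conn.close()
```

The same before/after EXPLAIN comparison is a reasonable way to evidence the "optimize queries" skill in an interview or code review.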
Key Responsibilities:
- Hands-on experience designing and developing Microsoft Power Platform solutions: Power Apps (especially model-driven) and Power Automate.
- Design and configure Dataverse tables, relationships, forms, views, charts, and dashboards to support business processes and data management within model-driven apps.
- Implement business rules, workflows, and business process flows using Power Automate (formerly Microsoft Flow) to automate tasks and streamline operations.
- Extend model-driven app functionality through client scripting (JavaScript), web resources, and Power Apps Component Framework (PCF) controls for advanced UI/UX and custom behavior.
- Lead the design and development of complex Power Apps solutions (both canvas and model-driven) to meet advanced business requirements.
- Expert proficiency in Power Apps development (canvas and model-driven), including advanced Power Fx, the component framework (PCF), and ALM practices.
- Demonstrated ability to create solutions with various data sources such as Dataverse, SharePoint, SQL, Azure components, etc.
- Develop custom business logic and integrations (using advanced Power Fx, Azure Functions, custom connectors, etc.) to extend Power Platform capabilities.
- Configure and optimize Microsoft Dataverse schemas; manage data flows between Power Apps and external databases.
- Ensure solutions are secure, scalable, and compliant with best practices (including governance, authentication, and data validation).
- Integrate Power Apps with other Microsoft services (e.g., SharePoint, Dynamics 365, Power BI) and external systems using custom connectors and APIs (see the sketch after this posting).
- Extensive experience with Power Automate (cloud flows, UI flows) and proven ability to integrate Power Platform with other systems and services.
- Create and maintain comprehensive documentation, including design specifications, technical guides, and user manuals.

Job Type: Contractual / Temporary
Pay: ₹60,000.00 - ₹100,000.00 per month
Schedule: Day shift, fixed shift
Work Location: In person
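For the API-integration bullet above, a minimal illustrative sketch, assuming a hypothetical Dataverse environment URL and an Azure AD bearer token (neither comes from the posting), of reading rows through the standard Dataverse Web API with Python:

```python
# Illustrative sketch only -- hypothetical environment URL and token, not from the posting.
import requests

env_url = "https://yourorg.crm.dynamics.com"   # assumed Dataverse environment URL
token = "<azure-ad-bearer-token>"              # assumed token acquired via Azure AD / MSAL

headers = {
    "Authorization": f"Bearer {token}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

# Query the standard 'accounts' table, selecting two columns and filtering by city.
resp = requests.get(
    f"{env_url}/api/data/v9.2/accounts",
    headers=headers,
    params={"$select": "name,telephone1", "$filter": "address1_city eq 'Chennai'"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json().get("value", []):
    print(row["name"], row.get("telephone1"))
```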
Job Role: Data Engineer (Full time / Contract)
Experience: 2 to 4 Yrs
Mode of Work: WFO only
Location: Chennai

Job description:
Key Skills: SQL, ETL Tools, ADF, ADB, Reporting Tools

Key Requirements: The day-to-day development activities will need knowledge of the below concepts.
- Expert-level knowledge of RDBMS (SQL Server) with a clear understanding of SQL query writing, object creation and management, and performance and optimisation of DB/DWH operations.
- Good understanding of transactional and dimensional data modelling, star schema, facts/dimensions, and relationships.
- Good understanding of ETL concepts and exposure to tools such as Azure Data Factory, Azure Databricks, and Airflow.
- In-depth expertise in Azure Data Factory and Databricks, including building scalable data pipelines, orchestrating complex workflows, implementing dynamic and parameterized pipelines (see the sketch after this posting), and optimizing Spark-based data transformations for large-scale integrations.
- Hands-on experience with Databricks Unity Catalog for centralized data governance, fine-grained access control, auditing, and managing data assets securely across multiple workspaces.
- Should have worked on at least one development lifecycle of one of the below:
  - End-to-end ETL project (involving the above-mentioned ETL tools)
- Ability to write and review test cases, test code, and validate code.
- Good understanding of SDLC practices such as source control, version management, and usage of Azure DevOps and CI/CD practices.

Project context:
- Should have the skill to fully understand the context and use case of a project and have a personal vision for it; plays the role of interfacing with the customer directly on a daily basis.
- Should be able to converse with functional users and convert requirements into tangible processes/models and documentation in available templates.
- Should be able to provide consultative options to the customer on the best way to execute projects.
- Should have a good understanding of project dynamics: scoping, setting estimates, setting timelines, working around timelines in case of exceptions, etc.

Preferred skills:
- Knowledge of Python is a bonus.
- Knowledge of SSIS is a bonus.
- Knowledge of Azure DevOps and source control/repos is good to have.

Job Type: Contractual / Temporary
Pay: ₹600,000.00 - ₹2,000,000.00 per year

Application Question(s):
- Are you an immediate joiner?
- Are you ready to work from the office at the Chennai location?
- How many years of experience do you have in total?
- How many years of experience do you have as a Data Engineer?
- How many years of experience do you have in Azure Databricks?

Work Location: In person
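One common shape of the "dynamic and parameterized pipelines" requirement above is an ADF pipeline passing run-time parameters into a Databricks notebook. Below is a minimal notebook sketch, assuming hypothetical lake paths, column names, and table names; `dbutils` and `spark` are supplied by the Databricks runtime, so this is not standalone Python.

```python
# Illustrative Databricks notebook sketch only -- hypothetical paths and table names.
from pyspark.sql.functions import lit

# Parameters passed from an ADF "Databricks Notebook" activity (or a job) arrive as widgets.
dbutils.widgets.text("load_date", "2024-01-01")
dbutils.widgets.text("source_path", "abfss://raw@yourlake.dfs.core.windows.net/sales")

load_date = dbutils.widgets.get("load_date")
source_path = dbutils.widgets.get("source_path")

# Read only the requested date's partition, deduplicate, and append to a Delta table.
df = spark.read.parquet(f"{source_path}/load_date={load_date}")
(
    df.dropDuplicates(["order_id"])          # 'order_id' is an assumed business key
      .withColumn("load_date", lit(load_date))
      .write.format("delta")
      .mode("append")
      .saveAsTable("curated.sales_orders")   # assumed target table
)
```

The same notebook can then be reused across dates and sources simply by changing the pipeline parameters, which is what makes the pipeline "dynamic".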
Job Role: Data Engineer (Full time / Contract)
Experience: 2 to 6 Yrs
Mode of Work: WFO only
Location: Chennai

Job description:
Key Skills: SQL, ETL Tools, ADF, ADB, SSIS, Reporting Tools

Key Requirements: The day-to-day development activities will need knowledge of the below concepts.
- Expert-level knowledge of RDBMS (SQL Server) with a clear understanding of SQL query writing, object creation and management, and performance and optimisation of DB/DWH operations.
- Good understanding of transactional and dimensional data modelling, star schema, facts/dimensions, and relationships (see the star-schema sketch after this posting).
- Good understanding of ETL concepts and exposure to tools such as Azure Data Factory, Azure Databricks, and Airflow.
- In-depth expertise in Azure Data Factory and Databricks, including building scalable data pipelines, orchestrating complex workflows, implementing dynamic and parameterized pipelines, and optimizing Spark-based data transformations for large-scale integrations.
- Hands-on experience with Databricks Unity Catalog for centralized data governance, fine-grained access control, auditing, and managing data assets securely across multiple workspaces.
- Should have worked on at least one development lifecycle of one of the below:
  - End-to-end ETL project (involving the above-mentioned ETL tools)
- Ability to write and review test cases, test code, and validate code.
- Good understanding of SDLC practices such as source control, version management, and usage of Azure DevOps and CI/CD practices.

Project context:
- Should have the skill to fully understand the context and use case of a project and have a personal vision for it; plays the role of interfacing with the customer directly on a daily basis.
- Should be able to converse with functional users and convert requirements into tangible processes/models and documentation in available templates.
- Should be able to provide consultative options to the customer on the best way to execute projects.
- Should have a good understanding of project dynamics: scoping, setting estimates, setting timelines, working around timelines in case of exceptions, etc.

Preferred skills:
- Knowledge of Python is a bonus.
- Knowledge of SSIS is a bonus.
- Knowledge of Azure DevOps and source control/repos is good to have.
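To make the star-schema expectation concrete, a minimal Spark SQL sketch, assuming hypothetical `fact_sales`, `dim_date`, and `dim_customer` tables (none of these names come from the posting):

```python
# Illustrative sketch only -- hypothetical fact/dimension tables; 'spark' is provided by Databricks.
monthly_revenue = spark.sql("""
    SELECT d.calendar_month,
           c.customer_segment,
           SUM(f.net_amount) AS revenue
    FROM   curated.fact_sales   f               -- fact table, grain: one row per order line
    JOIN   curated.dim_date     d ON f.date_key     = d.date_key
    JOIN   curated.dim_customer c ON f.customer_key = c.customer_key
    GROUP BY d.calendar_month, c.customer_segment
""")
monthly_revenue.show()
```

The point of the star shape is that the fact table carries measures and surrogate keys only, while descriptive attributes live in the dimensions, so reporting queries stay as simple joins like this one.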
Role: ML Ops Support Engineer

*Job description:*
We are looking for a skilled MLOps Support Engineer to join our team. This role involves monitoring and managing ML model operational pipelines in AzureML and MLflow, with an emphasis on automation, integration validation, and CI/CD pipeline management. The ideal candidate will be technically sound in Python, Azure CLI, and MLOps tools, and capable of ensuring stability and reliability in model deployment lifecycles.

*Objectives of the role:*
- Support and monitor MLOps pipelines in AzureML and MLflow
- Manage CI/CD pipelines for model deployment and updates
- Handle model registry processes, ensuring best practices for versioning and tracking (see the sketch after this posting)
- Perform testing and validation of integrated endpoints to ensure non-functional stability
- Automate monitoring and upkeep of ML pipelines to relieve the data science team
- Troubleshoot and resolve pipeline and integration-related issues

*Responsibilities:*
- Support production ML pipelines using AzureML and MLflow
- Configure and manage model versioning and the registry lifecycle
- Automate alerts, monitoring tasks, and routine pipeline operations
- Validate REST API endpoints for ML models
- Implement CI/CD workflows for ML deployments
- Document and troubleshoot operational issues related to ML services
- Collaborate with data scientists and platform teams to ensure delivery continuity

*Required Skills & Qualifications:*
- Proficiency in AzureML, MLflow, and Databricks
- Strong command of Python
- Experience with Azure CLI and scripting
- Good understanding of CI/CD practices in MLOps
- Knowledge of model registry management and deployment validation
- 3–5 years of relevant experience in MLOps environments

Skills that are good to have, but not mandatory:
- Exposure to monitoring tools (e.g., Azure Monitor, Prometheus)
- Experience with REST API testing (e.g., Postman)
- Familiarity with Docker/Kubernetes in ML deployments
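The model-registry objective above typically maps onto MLflow's registry API. Here is a minimal Python sketch, assuming a hypothetical model name and leaving the run ID as a placeholder; the tracking target is assumed to be a Databricks workspace.

```python
# Illustrative sketch only -- hypothetical model name; run ID left as a placeholder.
import mlflow
from mlflow.tracking import MlflowClient

mlflow.set_tracking_uri("databricks")  # assumed: tracking against a Databricks workspace

# Register the model logged under an existing run as a new version of "churn-classifier".
model_uri = "runs:/<run_id>/model"
version = mlflow.register_model(model_uri, "churn-classifier")

# Promote that version and leave an audit note on it.
client = MlflowClient()
client.transition_model_version_stage(
    name="churn-classifier", version=version.version, stage="Staging"
)
client.update_model_version(
    name="churn-classifier", version=version.version,
    description="Validated by MLOps support; endpoint smoke tests passed.",
)
```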
As a skilled MLOps Support Engineer, you will be responsible for monitoring and managing ML model operational pipelines in AzureML and MLflow. Your primary focus will be on automation, integration validation, and CI/CD pipeline management to ensure stability and reliability in model deployment lifecycles.

Your objectives in this role include supporting and monitoring MLOps pipelines in AzureML and MLflow, managing CI/CD pipelines for model deployment and updates, handling model registry processes, performing testing and validation of integrated endpoints, automating monitoring and upkeep of ML pipelines, and troubleshooting and resolving pipeline and integration-related issues.

In your day-to-day responsibilities, you will support production ML pipelines using AzureML and MLflow; configure and manage model versioning and the registry lifecycle; automate alerts, monitoring tasks, and routine pipeline operations; validate REST API endpoints for ML models (a sketch of such a check follows this posting); implement CI/CD workflows for ML deployments; document and troubleshoot operational issues related to ML services; and collaborate with data scientists and platform teams to ensure delivery continuity.

To excel in this role, you should possess proficiency in AzureML, MLflow, and Databricks; a strong command of Python; experience with Azure CLI and scripting; a good understanding of CI/CD practices in MLOps; knowledge of model registry management and deployment validation; and at least 3–5 years of relevant experience in MLOps environments. While not mandatory, it would be beneficial to have exposure to monitoring tools such as Azure Monitor and Prometheus, experience with REST API testing tools such as Postman, and familiarity with Docker/Kubernetes in ML deployments.
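As one way to picture the endpoint-validation duty described above, a minimal smoke-test sketch with Python requests, assuming a hypothetical scoring URI, key, payload shape, and latency budget (none of these specifics come from the posting):

```python
# Illustrative sketch only -- hypothetical scoring URI, key, and payload.
import requests

scoring_uri = "https://my-endpoint.eastus.inference.ml.azure.com/score"  # assumed endpoint URI
api_key = "<endpoint-key>"                                               # assumed key from the endpoint's auth settings

payload = {"input_data": [[5.1, 3.5, 1.4, 0.2]]}  # shape depends on the deployed model's signature

resp = requests.post(
    scoring_uri,
    json=payload,
    headers={"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"},
    timeout=30,
)

# Simple non-functional checks: the endpoint answers quickly and with a 2xx status.
assert resp.status_code == 200, f"Endpoint returned {resp.status_code}: {resp.text}"
assert resp.elapsed.total_seconds() < 2, "Latency above the assumed 2s smoke-test budget"
print(resp.json())
```

A check like this is easy to schedule from a CI/CD pipeline or a monitoring job so endpoint regressions surface before the data science team is paged.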
We are searching for a Data Engineer with strong experience in Databricks and the ability to join immediately. The ideal candidate should have 4 to 6 years of experience and must work from office (WFO) in Chennai. Key skills required for this role include proficiency in SQL, ETL tools, ADF, ADB, and reporting tools.

Key Requirements:
- Expert-level knowledge of RDBMS (SQL Server) with the ability to write SQL queries, create and manage objects, and optimize DB/DWH operations.
- Good understanding of transactional and dimensional data modelling, star schema, facts/dimensions, and relationships.
- Familiarity with ETL concepts and experience with tools like Azure Data Factory, Azure Databricks, and Airflow.
- In-depth expertise in Azure Data Factory and Databricks, including building scalable data pipelines, orchestrating workflows, implementing dynamic and parameterized pipelines, and optimizing Spark-based data transformations for large-scale integrations.
- Hands-on experience with Databricks Unity Catalog for centralized data governance, access control, auditing, and managing data assets securely across multiple workspaces (see the governance sketch after this posting).
- Experience in at least one development lifecycle of an end-to-end ETL project involving the mentioned ETL tools.
- Ability to write and review test cases, test code, and validate code.
- Good understanding of SDLC practices like source control, version management, and use of Azure DevOps and CI/CD practices.

Preferred Skills:
- Knowledge of Python is a bonus.
- Knowledge of SSIS is a bonus.
- Familiarity with Azure DevOps and source control/repos is advantageous for this role.
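The Unity Catalog requirement above centers on fine-grained access control and auditing. A minimal sketch, assuming a hypothetical `main.curated` schema and an `analysts` group, run from a Databricks workspace where `spark` is provided and Unity Catalog is enabled:

```python
# Illustrative Databricks sketch only -- hypothetical catalog, schema, table, and group names.

# Fine-grained access control: analysts may read the curated schema but not write to it.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.curated TO `analysts`")
spark.sql("GRANT SELECT ON TABLE main.curated.sales_orders TO `analysts`")

# Auditing: current grants on the table can be reviewed the same way.
spark.sql("SHOW GRANTS ON TABLE main.curated.sales_orders").show()
```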
You should have 3-4 years of overall experience with at least 2 years of recent experience in Boomi, demonstrating a strong understanding of EAI concepts. Your expertise should include working with various Boomi EDI connectors such as SFTP, HTTP, and web services. Additionally, you should be proficient in developing using SOA integration methodologies and protocols like REST, SOAP, XML, and JSON. Experience with other EAI tools like webMethods, Tibco, and MuleSoft is also desired.

Your responsibilities will include working on Boomi integration patterns involving file, web API, messaging, and database integrations. You should be familiar with Boomi MDM data models and API, as well as Boomi Flow for workflow automation and integration. Monitoring integration activities and handling data-related issues will be part of your day-to-day tasks.

In this role, you will be expected to perform daily production support, monthly maintenance activities, and troubleshooting of user-identified problems within the Boomi application. You will manage activities across EAI environments, including development, QA, and production environments. Installation, configuration, patching, upgrades, and providing guidance on technical and functional issues will also be within your scope of work.

Excellent customer-facing skills and strong written and oral communication are crucial for effectively translating requirements into system flows, data flows, and data mappings. You should have a solid understanding of ETL concepts and experience working with large volumes of data.

Key Skills: SOA integration, ETL concepts, EDI connectors, data flows, XML, HTTP, EAI concepts, API, MuleSoft, integration, SFTP, Boomi, EDI, workflow automation, data models, Tibco, webMethods, MDM, integration patterns, web services, data mappings, Oracle SQL, JSON, production support, SOAP, integration monitoring, REST, SQL.
Preferred experience: 3-5 years

Primary skills:
* Java/Spring Framework -> Expert
* Apache Camel -> Practitioner
* Apache Kafka -> Novice
* AWS -> Practitioner

Responsibilities:
* Understand integration workflows, architectures, development process, deployment process, and support process.
* Understand the business flow (e-commerce). Able to suggest business solutions to the Product Owner. Able to document business requirements in technical integration documents whenever required.
* Develop/deliver/support integration modules/services (API services, integration adapters on the existing platform: AWS cloud, AWS API Gateway, etc.).
* Develop unit test and integration test cases to make sure the integration flow works as required.
* Monitor integration workflows and perform analysis of incidents, defects, bugs, and issues in the integration area.
* Good knowledge of software development practices; able to apply design principles to code.
* Good sense of urgency; able to prioritize work appropriately. Understand and adopt changes quickly and reasonably.
* Willing to work in a team; able to communicate efficiently and concisely.
* Enjoy optimizing everything from how your code is compiled to how it scales across servers to provide the best end-user experience.
* Able to coach others and initiate innovative ideas (senior role).
* Provide needed reports (status report, incident report, etc.).

Qualifications:
* Strong in the Java programming language and Java frameworks (Spring, Apache Camel, etc.).
* Good experience in the software integration area (middle and senior level), or willing to learn software integration.
* Experience in event messaging, including Apache Kafka, JMS, Apache ActiveMQ, RabbitMQ, AWS SQS, AWS Kinesis, etc.
* Experience in Git, AWS Cloud, and other AWS services.
* Good experience in developing web services, both REST API and SOAP API, and API security (certificates, OAuth2, basic authentication, etc.).
* Experience in using ELK or another application log management tool (Splunk).
* Able to influence and drive projects to meet key milestones and overcome challenges; comfortable working without routine supervision.

Competencies:
* Ability to handle multiple assignments concurrently.
* Good interpersonal skills, confidence, and the ability to interact professionally with people at all levels.
* Motivated, flexible, and with a can-do approach.
* Team player with a commitment to achieving team goals.
* A disciplined and conscientious approach.
* Ability to prioritise work and deliver.
* Keen to learn and develop proficiency.
* Strong communication skills in English (written and verbal).

Educational Qualification: A minimum of an undergraduate degree in computer science or related fields, or an advanced university degree in the same/similar fields.

Job Types: Full-time, Contractual / Temporary
Contract length: 4 months
Pay: ₹600,000.00 - ₹1,000,000.00 per year
Benefits: Provident Fund

Application Question(s):
- What is your total experience?
- Your current CTC?
- Expected CTC?
- Notice period if currently working, or LWD if serving NP, or immediate joiner?
- Interested in a contractual role or a C2H role?
- Interested in working from the Chennai location from the first day of joining?

Experience:
- Java: 3 years (Required)
- Spring Boot: 2 years (Required)

Work Location: In person