Gurgaon, Haryana, India
INR 15.0 - 20.0 Lacs P.A.
On-site
Full Time
Evaluate the data governance framework and Power BI environment, provide recommendations for enhancing data quality and discoverability, and optimize Power BI performance.
Key Responsibilities:
- Review PowerShell (PS), SSIS, Batch Scripts, and C# (.NET 3.0) codebases for data processes
- Assess complexity of trigger migration across Active Batch (AB), Synapse, ADF, and Azure Databricks (ADB)
- Define and propose transitions in the use of Azure SQL DW, SQL DB, and Data Lake (DL)
- Analyze data patterns for optimization, including raw-to-consumption loading and elimination of intermediate zones (e.g., staging/application zones)
- Understand and implement requirements for external tables (Lakehouse); see the sketch after this listing
- Ensure the quality of deliverables within project timelines
- Develop understanding of the equity market domain
- Collaborate with domain experts and stakeholders to define business rules and logic
- Maintain continuous communication with global stakeholders
- Troubleshoot complex issues across development, test, UAT, and production environments
- Coordinate end-to-end project delivery and manage client queries
- Ensure adherence to SLA/TAT and perform quality checks
- Work independently as well as collaboratively in cross-functional teams
Required Skills and Experience:
- B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field
- 7+ years of experience in data and cloud architecture, working with client stakeholders
- Strong knowledge of Power BI, Data Governance, Azure Data Factory, Azure Data Lake, and Databricks
- Experience in reviewing PowerShell, SSIS, Batch Scripts, and .NET-based codebases
- Familiarity with data optimization patterns and architecture transitions in Azure
- Project management and team leadership experience within agile environments
- Strong organizational, analytical, and communication skills
- Ability to deliver high-quality results to internal and external stakeholders
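To make the raw-to-consumption and Lakehouse external-table responsibilities above concrete, here is a minimal, hedged PySpark sketch. It assumes a Databricks (or Synapse Spark) session with ADLS Gen2 credentials already configured; the storage account, container, schema, table, and column names are illustrative only and not taken from the role.

```python
# Minimal sketch: load raw files straight into a curated Delta location and register an
# external (unmanaged) table over it, so consumers query data in place rather than
# copying it through intermediate staging/application zones.
# Assumes a Databricks or Synapse Spark session with storage access already configured;
# the account, container, schema, table, and column names below are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/equities/trades/"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/equities/trades/"

# Raw-to-consumption load: read raw files once, write curated Delta data directly.
(spark.read.format("parquet").load(raw_path)
      .dropDuplicates(["trade_id"])
      .write.format("delta").mode("overwrite").save(curated_path))

# External table: metadata in the metastore, data stays at the lake path.
spark.sql("CREATE SCHEMA IF NOT EXISTS consumption")
spark.sql(f"""
    CREATE TABLE IF NOT EXISTS consumption.equity_trades
    USING DELTA
    LOCATION '{curated_path}'
""")
```

Downstream tools such as Power BI can then read consumption.equity_trades directly, which is one way to retire a separate staging zone; the exact pattern would depend on the client's workspace and governance setup.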
Gurgaon, Haryana, India
INR 10.0 - 16.0 Lacs P.A.
On-site
Full Time
The Control-M Specialist will be responsible for designing, building, and maintaining enterprise job schedules using BMC Control-M to support the Momentum platform. The role involves close collaboration with application owners and infrastructure teams to ensure reliable, auditable, and optimized batch processing.
Key Responsibilities
- Design and implement Control-M job definitions for Momentum platform processes (see the sketch after this listing)
- Manage job dependencies, conditions, and calendar-based scheduling
- Monitor job executions, troubleshoot failures, and coordinate resolutions
- Collaborate with teams to create runbooks, track SLAs, and tune performance
- Participate in change control processes and production deployments
Required Skills
- B.E./B.Tech in Computer Science or a related field
- 4+ years of hands-on experience with BMC Control-M
- Strong knowledge of job flow design, alert management, and batch processing best practices
- Experience in SQL and PowerShell scripting
- Familiarity with enterprise platforms (finance, regulatory, or ERP systems)
- Working knowledge of MFT (Managed File Transfer) solutions
- Experience in supporting mission-critical schedules and environments
- Boomi ETL experience is an added advantage
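As a rough illustration of the job-definition work described above, below is a hedged sketch of the Control-M Automation API "jobs-as-code" style, written as Python that emits the JSON. The folder, job, host, server, and run-as names are invented for illustration, and the exact schema fields vary by Control-M version, so treat this as a starting point rather than a definitive template.

```python
# Minimal sketch of a Control-M "jobs-as-code" folder built in Python and written to JSON
# for the Control-M Automation API. All names and paths are illustrative assumptions.
import json

folder = {
    "MomentumDailyBatch": {
        "Type": "Folder",
        "ControlmServer": "ctm-prod",          # assumed Control-M server name
        "ExtractPositions": {
            "Type": "Job:Command",
            "Command": "/opt/momentum/bin/extract_positions.sh",
            "RunAs": "svc_momentum",
            "Host": "momentum-app01",
            "When": {"FromTime": "0200", "ToTime": "0500"},
        },
        "LoadWarehouse": {
            "Type": "Job:Command",
            "Command": "/opt/momentum/bin/load_warehouse.sh",
            "RunAs": "svc_momentum",
            "Host": "momentum-app01",
        },
        # Dependency: run LoadWarehouse only after ExtractPositions completes.
        "DailyFlow": {"Type": "Flow", "Sequence": ["ExtractPositions", "LoadWarehouse"]},
    }
}

with open("momentum_daily_batch.json", "w") as fh:
    json.dump(folder, fh, indent=2)

# Validate and deploy with the Automation API CLI, assuming it is installed and logged in:
#   ctm build  momentum_daily_batch.json
#   ctm deploy momentum_daily_batch.json
```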
Gurgaon, Haryana, India
INR 15.0 - 24.0 Lacs P.A.
On-site
Full Time
The Engagement Manager (EM) is responsible for the success of a client program. This includes shaping the delivery model, overseeing technology and process investments, and serving as the main escalation point for all program-related concerns. The EM will also lead operational review meetings with client stakeholders to ensure alignment with strategic goals.
Key Responsibilities
- Evaluate and optimize delivery models, technology stack, and process investments
- Act as the highest escalation point for unresolved program-level issues
- Conduct regular operational reviews with client leadership
- Ensure the program aligns with client strategic objectives and outcomes
- Develop and manage program plans with clear timelines, milestones, and resource allocations
- Monitor program performance and address deviations
- Collaborate with cross-functional teams for effective program integration
- Provide consistent updates and reports to senior management and stakeholders
- Facilitate workshops and training sessions to improve team performance
- Manage program budgets and ensure efficient resource utilization
- Identify and mitigate program risks
- Build and maintain strong client relationships
- Drive continuous improvement initiatives for better delivery and outcomes
Desired Skills and Experience
- Proven experience in engagement or program management in a global delivery environment
- Strong leadership, strategic thinking, and decision-making abilities
- Excellent verbal and written communication skills
- Effective stakeholder management capabilities
- Ability to perform under high-pressure, fast-paced conditions
- Experience in Fixed Income Markets is a strong advantage
- Strong analytical skills to evaluate program performance
- Proficiency with project management tools such as MS Project and JIRA
- Expertise in risk management and mitigation
- Understanding of financial principles and budgeting
- Ability to lead collaborative and cross-functional teams
- Flexibility to adapt to evolving client needs and market trends
Gurgaon, Haryana, India
INR 8.0 - 13.0 Lacs P.A.
On-site
Full Time
The Lead MLOps Engineer will be responsible for leading technology initiatives aimed at improving business value and outcomes in the areas of digital marketing and commercial analytics through the adoption of Artificial Intelligence (AI) enabled solutions. The role works with cross-functional teams across AI projects to operationalize data science models into deployed, scalable solutions that deliver business value. They should be inquisitive and bring an innovative mindset to work every day, researching, proposing, and implementing MLOps process improvements, solution ideas, and ways of working that are more agile, lean, and productive. The role provides leadership and technical expertise in operationalizing machine learning models, bridging the gap between data science and IT operations. Key responsibilities include designing, implementing, and optimizing MLOps infrastructure, building CI/CD pipelines for ML models, and ensuring the security and scalability of ML systems.
Key Responsibilities
- Architect & Deploy: Design and manage scalable ML infrastructure on Azure (AKS), leveraging Infrastructure as Code principles
- Automate & Accelerate: Build and optimize CI/CD pipelines with GitHub Actions for seamless software, data, and model delivery
- Engineer Performance: Develop efficient and reliable data pipelines using Python and distributed computing frameworks
- Ensure Reliability: Implement solutions for deploying and maintaining ML models in production (see the sketch after this listing)
- Collaborate & Innovate: Partner with data scientists and engineers to continuously enhance existing MLOps capabilities
Key Competencies:
- Experience: A minimum of 5 years of experience in software engineering, data science, or a related field, with experience in MLOps
- Education: A bachelor's or master's degree in Computer Science / Engineering
- Soft Skills: Strong analytical and problem-solving skills, excellent communication and collaboration skills, and the ability to work in a fast-paced environment
- Azure & AKS: Deep hands-on experience
- IaC & CI/CD: Mastery of Terraform/Bicep and GitHub Actions
- Data Engineering: Advanced Python and Spark for complex pipelines
- ML Operations: Proven ability in model serving and monitoring
- Problem Solver: Adept at navigating complex technical challenges and delivering solutions
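The model serving and monitoring responsibility above can be illustrated with a small, hedged sketch: a Python service that loads a registered model from an MLflow registry and exposes prediction and health endpoints, the kind of container one might run on AKS behind a CI/CD pipeline. The registry URI, model name ("demand-forecast"), and feature fields are assumptions for illustration, not details from the role.

```python
# Minimal sketch: serve a registered ML model over HTTP with FastAPI + MLflow.
# Assumes an MLflow registry is reachable and a model named "demand-forecast" has a
# version in the Production stage; model and field names are illustrative only.
import pandas as pd
import mlflow.pyfunc
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = mlflow.pyfunc.load_model("models:/demand-forecast/Production")


class ScoringRequest(BaseModel):
    records: list[dict]  # one dict of feature values per row


@app.get("/health")
def health() -> dict:
    # Liveness/readiness probe for AKS; OK once the process and model are loaded.
    return {"status": "ok"}


@app.post("/predict")
def predict(request: ScoringRequest) -> dict:
    frame = pd.DataFrame(request.records)
    predictions = model.predict(frame)
    # A production version would also log inputs and outputs here for drift monitoring.
    return {"predictions": list(predictions)}
```

Run locally with `uvicorn app:app` before containerizing; a GitHub Actions workflow would typically build the image, run tests, and roll it out to AKS.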
Gurgaon, Haryana, India
INR 12.0 - 17.0 Lacs P.A.
On-site
Full Time
Drive coordination across teams for Control-M job migration, documentation, and production rollout; serve as a liaison between technical teams (Control-M, Infrastructure) and Momentum stakeholders.
Desired Skills and Experience
Essential skills:
- 5+ years in technical coordination or project analyst roles
- Familiarity with Control-M or other enterprise schedulers
- Experience managing change control, documentation, and cross-team communication
- Strong organizational and communication skills
Education:
- B.E./B.Tech in Computer Science or a related field
Key Responsibilities
- Manage the Control-M conversion plan, timelines, and dependencies
- Facilitate requirements capture for new or migrated jobs
- Maintain runbooks, job documentation, and change control records
- Provide regular status updates, issue tracking, and escalation support
Key Metrics
- Control-M & Change Control Management
- IT Project Management
Behavioral Competencies
- Good communication (verbal and written)
- Experience in managing client stakeholders
Gurgaon, Haryana, India
INR 15.0 - 25.0 Lacs P.A.
On-site
Full Time
The Cloud Data Architect will lead client engagements, guiding stakeholders toward optimized, cloud-native data architectures. This role will be pivotal in defining modernization strategies, designing future-state data platforms, and integrating Microsoft Fabric solutions.
Key Responsibilities:
- Lead client interviews and workshops to understand current and future data needs
- Conduct technical reviews of Azure infrastructure including Databricks, Synapse Analytics, and Power BI
- Design scalable and optimized architecture solutions with a focus on Microsoft Fabric integration
- Define and refine data governance frameworks including cataloguing, lineage, and quality standards (see the sketch after this listing)
- Deliver strategic and actionable project outputs in line with client expectations
- Evaluate and ensure the quality and accuracy of deliverables
- Collaborate with business and domain stakeholders to capture and implement business logic
- Manage end-to-end project delivery, including coordination with client and internal teams
- Communicate effectively with global stakeholders across various channels
- Troubleshoot and resolve complex issues across dev, test, UAT, and production environments
- Ensure quality checks and adherence to Service Level Agreements and Turnaround Times
Required Skills and Experience:
- Bachelor's or Master's degree in Computer Science, Finance, Information Systems, or a related field
- Minimum 7 years of experience in Data and Cloud architecture roles
- Proven experience engaging with client stakeholders and leading solution architecture
- Deep expertise in the Azure Data Platform: Synapse, Databricks, Azure Data Factory, Azure SQL, Power BI
- Strong knowledge of data governance best practices including data quality, cataloguing, and lineage
- Familiarity with Microsoft Fabric and its integration into enterprise environments
- Experience creating modernization roadmaps and designing target architectures
- Excellent verbal and written communication skills
- Strong analytical, organizational, and problem-solving abilities
- Self-starter capable of working independently and in team environments
- Experience delivering projects in agile development environments
- Project management and team leadership capabilities
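The governance bullet above (quality, cataloguing, lineage) can be grounded with a small example. Below is a hedged PySpark sketch of rule-based data-quality checks over a curated table; the table, column names, and rules are assumptions for illustration, and a real engagement would likely lean on tooling such as Microsoft Purview, Unity Catalog, or a dedicated data-quality framework rather than hand-rolled checks.

```python
# Minimal sketch: lightweight data-quality checks on a curated table, the kind of rule a
# governance framework might codify. Table, columns, and rules are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.table("consumption.equity_trades")   # assumed curated table name

total = df.count()
checks = {
    "trade_id_not_null": df.filter(F.col("trade_id").isNull()).count() == 0,
    "no_duplicate_trade_ids": df.groupBy("trade_id").count().filter("count > 1").count() == 0,
    "prices_positive": df.filter(F.col("price") <= 0).count() == 0,
    "non_empty_table": total > 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Fail loudly so downstream consumers and catalog/lineage entries can flag the table.
    raise ValueError(f"Data quality checks failed: {failed}")
print(f"All {len(checks)} checks passed on {total} rows.")
```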