6 Job openings at Mitra AI
Senior Data Engineer

Gurugram, Haryana, India

0 years

Not disclosed

On-site

Full Time

JOB SPECIFIC DUTIES & RESPONSIBILITIES
● Design, develop, and optimize data pipelines for processing large volumes of structured and unstructured data using Apache Spark and AWS technologies.
● Develop APIs and microservices using container frameworks such as OpenShift, Docker, and Kubernetes.
● Work with diverse data formats such as Parquet, ORC, and CSV.
● Leverage data streaming and messaging platforms, including Apache Kafka, for real-time data processing.
● Build scalable solutions on AWS using services such as Elasticsearch/OpenSearch.
● Implement big data querying using tools such as Presto or Trino.
● Collaborate with platform and vendor deployment teams to ensure seamless integration of data sources.
● Work closely with data architects to provision and support sustainable infrastructure patterns.
● Contribute to data access strategies and data modeling in alignment with architectural principles.
● Communicate technical concepts effectively to non-technical stakeholders and vice versa.
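As a rough illustration of the pipeline work described above, the sketch below is a minimal PySpark job that reads Parquet data from S3, applies a simple aggregation, and publishes the result to a Kafka topic. The bucket, topic, broker address, and column names are placeholders, not details from this posting, and the Kafka connector package is assumed to be available on the cluster.

```python
# Minimal, illustrative PySpark sketch: read Parquet from S3, aggregate,
# publish to Kafka. All paths, topics, and column names are placeholders.
# Assumes the spark-sql-kafka connector package is available.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-pipeline-example")
    .getOrCreate()
)

# Read structured data in Parquet format (hypothetical S3 path).
orders = spark.read.parquet("s3a://example-bucket/raw/orders/")

# Simple cleanup/aggregation step.
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "customer_id")
    .agg(F.sum("amount").alias("daily_amount"))
)

# Publish the aggregated rows to a Kafka topic as JSON messages.
(
    daily_totals
    .select(F.to_json(F.struct(*daily_totals.columns)).alias("value"))
    .write
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("topic", "orders.daily-totals")
    .save()
)
```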

SAS Solution Designer

India

15 years

Not disclosed

On-site

Contractual

We are seeking a highly experienced SAS Solution Designer to join our team in a solution engineering lead capacity. This role requires in-depth knowledge of SAS technologies, cloud-based platforms, and data solutions. The ideal candidate will be responsible for end-to-end solution design aligned with enterprise architecture standards and business objectives, providing technical leadership across squads and development teams. Mitra AI is currently looking for experienced SAS Solution Designers who are based in India and are open to relocating. This is a hybrid opportunity in Sydney, Australia.

JOB SPECIFIC DUTIES & RESPONSIBILITIES
Own and define the end-to-end solution architecture for data platforms, ensuring alignment with business objectives, enterprise standards, and architectural best practices.
Design reliable, stable, and scalable SAS-based solutions that support long-term operational effectiveness.
Lead solution engineers and Agile squads to ensure the delivery of high-quality, maintainable data solutions.
Collaborate independently with business and technical stakeholders to understand requirements and translate them into comprehensive technical designs.
Provide high-level estimates for proposed features and technical initiatives to support business planning and prioritization.
Conduct and participate in solution governance forums to secure approval for data designs and strategies.
Drive continuous improvement by identifying technical gaps and implementing best practices, emerging technologies, and enhanced processes.
Facilitate work breakdown sessions and actively participate in Agile ceremonies such as sprint planning and backlog grooming.
Ensure quality assurance through rigorous code reviews, test case validation, and enforcement of coding and documentation standards.
Troubleshoot complex issues by performing root cause analysis, log reviews, and coordination with relevant teams for resolution.
Provide mentoring and coaching to solution engineers and technical leads to support skills growth and consistency in solution delivery.

REQUIRED COMPETENCIES AND SKILLS
Deep expertise in SAS technologies and the SAS ecosystem.
Strong proficiency in cloud-based technologies and data platforms (e.g., Azure, Hadoop, Teradata).
Solid understanding of RDBMS, ETL/ELT tools (e.g., Informatica), and real-time data streaming.
Ability to work across relational and NoSQL databases and integrate with various data and analytics tools.
Familiarity with BI and reporting tools such as Tableau and Power BI.
Experience guiding Agile delivery teams, supporting full-stack solution development through DevOps and CI/CD practices.
Capability to define and implement secure, scalable, and performant data solutions.
Strong knowledge of metadata management, reference data, and data lineage concepts.
Ability to communicate effectively with both technical and non-technical stakeholders.
Problem-solving mindset with attention to detail and an emphasis on delivering high-quality solutions.

REQUIRED EXPERIENCE AND QUALIFICATIONS
15+ years of experience in solution design and development roles, including leadership responsibilities.
Strong exposure to SAS and enterprise data platforms in the financial services industry.
Prior experience working within risk, compliance, or credit risk domains is highly desirable.
Practical experience with Agile methodologies and DevOps principles.
Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
Experience working in cross-functional teams with a focus on business alignment and technology delivery.

Rocket UniData/ UniVerse Engineer

India

2 years

Not disclosed

On-site

Contractual

About the Role:
We are seeking an experienced Rocket UniData/UniVerse Engineer to join our team in supporting and modernizing critical business applications built on the Rocket MultiValue platform. The ideal candidate will have a deep understanding of UniData and UniVerse environments, with a proven track record in developing, maintaining, and optimizing complex legacy systems.

Key Responsibilities:
Design, develop, maintain, and optimize applications built on the Rocket UniData and UniVerse platforms
Analyze the existing codebase and perform system improvements, bug fixes, and enhancements
Work closely with business analysts and stakeholders to understand requirements and deliver solutions
Support data integration between MultiValue databases and modern platforms (e.g., via REST APIs, ODBC, or flat file interfaces)
Participate in modernization efforts including migration, documentation, and system re-architecture
Provide production support, including troubleshooting and performance tuning
Maintain high-quality standards in code and documentation

Required Skills & Experience:
2+ years of hands-on experience with Rocket UniData and UniVerse
Good knowledge of PickBasic/UniBasic, Procs, Paragraphs, and TCL
Experience with MultiValue data modeling, indexes, and file structures
Familiarity with Rocket tools (SB/XA, Web DE, U2 Toolkit) is a plus
Experience integrating UniData/UniVerse with external systems
Understanding of modern development practices (version control, DevOps)
Excellent problem-solving, debugging, and analytical skills
Strong communication and documentation capabilities

Nice to Have:
Exposure to modernization tools (e.g., Rocket API, Python connectors, AI-based modernization frameworks)
Knowledge of legacy-to-modern migration strategies
Experience working in Agile/Scrum environments
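The integration responsibility above lists ODBC as one bridge between MultiValue databases and modern platforms. The sketch below is a minimal, hedged pyodbc script that pulls records through a U2 ODBC data source and writes a flat-file extract; the DSN, credentials, file name, and field names are hypothetical and depend entirely on the actual ODBC configuration.

```python
# Illustrative sketch only: pull records from a UniData/UniVerse file exposed
# through a U2 ODBC DSN and write them out as a flat CSV extract.
# The DSN, credentials, table name, and field names are placeholders.
import csv
import pyodbc

# Connect through an ODBC data source configured for the U2 server.
conn = pyodbc.connect("DSN=UNIDATA_EXAMPLE;UID=report_user;PWD=secret")
cursor = conn.cursor()

# MultiValue files are typically exposed as tables/views via the ODBC layer.
cursor.execute("SELECT CUSTOMER_ID, NAME, BALANCE FROM CUSTOMER")

with open("customer_extract.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["customer_id", "name", "balance"])
    for row in cursor.fetchall():
        writer.writerow(list(row))

cursor.close()
conn.close()
```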

Fenergo Developers

India

5 years

Not disclosed

On-site

Contractual

Key Responsibilities:
Collaborate with Business Analysts to translate business requirements into scalable technical solutions.
Configure and customize Fenergo UI, BRE rules, and workflows to streamline KYC and client onboarding processes.
Develop and integrate Fenergo solutions with APIs and technologies like Elasticsearch, JSON, XML, and SQL databases.
Conduct code reviews, mentor junior developers, and ensure adherence to coding standards.
Troubleshoot and resolve technical issues to meet project deadlines and quality expectations.
Contribute to continuous improvement of development processes and tools.

Qualifications:
5+ years of software development experience, with at least 2 years focused on Fenergo configuration and development.
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
Experience with cloud platforms (e.g., Azure, AWS) or containerization (e.g., Docker).
Certification in Fenergo or related financial compliance technologies.

Technical & Domain Skills:
Proven track record in delivering KYC or client onboarding solutions in financial services.
Proficiency in C#, ASP.NET (MVC or Core), Web API, Entity Framework, and LINQ.
Strong expertise in SQL databases (e.g., MSSQL, Oracle, PostgreSQL), including schema design, queries, and optimization.
Experience with JSON, XML, Elasticsearch, and modern UI frameworks (e.g., Angular, React, or Vue.js).
Familiarity with front-end technologies like HTML, CSS, JavaScript, and Bootstrap.
Knowledge of Fenergo-specific features: UI configuration, BRE, API utilization, FDIM integration.
Proficiency with development tools such as Visual Studio, Git, and CI/CD pipelines.
Agile mindset with a focus on delivering high-quality, scalable solutions.
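As one hedged illustration of the Elasticsearch/JSON integration work listed above, the sketch below queries an Elasticsearch index over its standard `_search` REST endpoint using Python's requests library. The host, index name, and field names are invented placeholders, and nothing here reflects any Fenergo-specific API.

```python
# Illustrative sketch only: search a hypothetical client-onboarding index via
# Elasticsearch's standard _search REST endpoint. Host, index, and field names
# are placeholders, not details of a real Fenergo integration.
import requests

ES_URL = "http://localhost:9200/onboarding-cases/_search"

query = {
    "query": {
        "bool": {
            "must": [{"match": {"status": "pending_kyc"}}],
            "filter": [{"range": {"created_at": {"gte": "now-7d/d"}}}],
        }
    },
    "size": 10,
}

resp = requests.post(ES_URL, json=query, timeout=10)
resp.raise_for_status()

# Print a short summary of each matching case document.
for hit in resp.json()["hits"]["hits"]:
    doc = hit["_source"]
    print(doc.get("case_id"), doc.get("status"))
```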

Senior Solutions Architect

India

8 - 10 years

Not disclosed

On-site

Full Time

The candidate should be open to relocating to Sri Lanka.

We are seeking an experienced Senior Architect with deep expertise in the Finacle Core Banking System (CBS), including Finacle 10x, and strong hands-on experience with Oracle 19c database management. The ideal candidate will lead the architectural design, implementation, and integration of Finacle-based banking solutions, ensuring alignment with business goals and technology standards. This role demands strong leadership, technical knowledge, and the ability to work across multiple teams to deliver scalable, secure, and highly available banking solutions that meet stringent disaster recovery requirements.

Roles & Responsibilities:
Lead the architecture design, customization, and integration of the Finacle Core Banking System (including Finacle 10x) and its modules
Define solution architecture and ensure alignment with the bank's overall IT strategy
Collaborate with business stakeholders, IT teams, and vendors to translate business requirements into robust technical solutions
Provide technical leadership and mentoring to the development and implementation teams
Drive Finacle upgrades, patches, and performance tuning, including Finacle 10x features and enhancements
Oversee the design of interfaces between Finacle CBS and other banking and third-party systems
Manage and optimize Oracle 19c database instances used by Finacle for performance, availability, and security
Architect and implement high availability (HA) and disaster recovery (DR) strategies for Finacle environments to ensure business continuity and regulatory compliance
Ensure adherence to security, compliance, and regulatory requirements
Conduct technical feasibility studies, impact analysis, and risk assessments
Establish best practices, standards, and architectural governance frameworks for Finacle implementation
Manage and resolve complex technical issues and provide expert guidance
Stay current with industry trends, Finacle updates (including Finacle 10x), Oracle database advancements, and emerging banking technologies

Required Skills and Competencies:
Core Banking Modules:
Deposits Management: Savings, current, fixed deposits, recurring deposits, and interest calculations
Loans & Advances: Retail, corporate, mortgage, and SME loan lifecycle management
Payments & Transfers: Domestic/international payments, RTGS, NEFT, IMPS, and internal transfers
Treasury Management: Foreign exchange, money markets, securities, and investments
Channels & Customer Interaction:
Internet Banking: Online customer account management and transactions
Mobile Banking: Mobile app and wallet integration
ATM & POS: Integration with ATM and POS systems
Contact Center: Customer service and CRM interfaces
Trade Finance & Cash Management:
Letters of Credit, Bank Guarantees, Bill Collections, and trade finance operations
Corporate cash management and liquidity solutions
Financial Accounting & GL:
General Ledger and financial accounting for regulatory reporting
Risk & Compliance:
Anti-Money Laundering (AML), Know Your Customer (KYC), and fraud detection
Finacle 10x Specific Enhancements:
API-first architecture, microservices, and digital banking capabilities
Integration & Middleware:
Finacle API Gateway, Enterprise Service Bus (ESB), and third-party middleware

Required Skills, Qualifications and Experience:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
Minimum of 8-10 years of experience in banking IT, with at least 5 years working extensively on the Finacle Core Banking System, including Finacle 10x
Proven experience as a Solution Architect or Technical Architect on large-scale Finacle implementations
Strong expertise in the Finacle CBS modules listed above
Hands-on experience with Oracle 19c database management, performance tuning, and troubleshooting
Demonstrated experience designing and implementing high availability (HA) and disaster recovery (DR) architectures for Finacle environments
Experience with Finacle integration tools, APIs, and database design
Deep understanding of banking processes, regulatory requirements, and compliance standards
Proficient in system design, SDLC, and Agile methodologies
Excellent problem-solving, communication, and stakeholder management skills
Ability to work under pressure and manage multiple priorities effectively
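As a small, hedged illustration of the Oracle 19c management side of this role, the sketch below uses the python-oracledb driver to connect to a database and read basic health information from standard dynamic performance views. The connection details and service name are placeholders, Finacle itself is not involved, and querying v$ views requires appropriate privileges.

```python
# Illustrative sketch only: connect to an Oracle 19c instance with the
# python-oracledb driver and read basic instance/session information from
# standard dynamic performance views. Connection details are placeholders,
# and SELECT on v$ views requires suitable privileges.
import oracledb

conn = oracledb.connect(
    user="monitor_user",
    password="example_password",
    dsn="dbhost.example.com:1521/EXAMPLEPDB",  # hypothetical service name
)

with conn.cursor() as cur:
    # Instance name, status, and startup time.
    cur.execute("SELECT instance_name, status, startup_time FROM v$instance")
    print(cur.fetchone())

    # Count of active sessions, a rough health indicator.
    cur.execute("SELECT COUNT(*) FROM v$session WHERE status = 'ACTIVE'")
    active_sessions, = cur.fetchone()
    print("active sessions:", active_sessions)

conn.close()
```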

Platform Lead - Snowflake

India

6 years

Not disclosed

On-site

Contractual

We are seeking a skilled and proactive Platform Lead with strong Snowflake expertise and AWS cloud exposure to lead the implementation and operational excellence of a scalable, multitenant modern data platform for a leading US-based marketing agency serving nonprofit clients. This role requires hands-on experience in managing Snowflake environments, supporting data pipeline orchestration, enforcing platform-level standards, and ensuring observability, performance, and security across environments. You will collaborate with architects, engineers, and DevOps teams to operationalize the platform's design and drive its long-term stability and scalability in a cloud-native ecosystem.

Job Specific Duties & Responsibilities:
Lead the technical implementation and stability of the multitenant Snowflake data platform across dev, QA, and prod environments
Design and manage schema isolation, role-based access control (RBAC), masking policies, and a cost-optimized Snowflake architecture for multiple nonprofit tenants
Implement and maintain CI/CD pipelines for dbt, Snowflake objects, and metadata-driven ingestion processes using GitHub Actions or similar tools
Develop and maintain automation accelerators for data ingestion, schema validation, error handling, and onboarding new clients at scale
Collaborate with architects and data engineers to ensure seamless integration with source CRMs, ByteSpree connectors, and downstream BI/reporting layers
Monitor and optimize the performance of Snowflake workloads (e.g., query tuning, warehouse sizing, caching strategy) to ensure reliability and scalability
Establish and maintain observability and monitoring practices across data pipelines, ingestion jobs, and platform components (e.g., error tracking, data freshness, job status dashboards)
Manage infrastructure-as-code (IaC), configuration templates, and version control practices across the data stack
Ensure robust data validation, quality checks, and observability mechanisms are in place across all platform services
Support incident response, pipeline failures, and technical escalations in production, coordinating across engineering and client teams
Contribute to data governance compliance by implementing platform-level policies for PII, lineage tracking, and tenant-specific metadata tagging

Required Skills, Experience & Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field
6+ years of experience in data engineering or platform delivery, including 3+ years of hands-on Snowflake experience in production environments
Proven expertise in building and managing multitenant data platforms, including schema isolation, RBAC, and masking policies
Solid knowledge of CI/CD practices for data projects, with experience guiding pipeline implementations using tools like GitHub Actions
Hands-on experience with dbt, SQL, and metadata-driven pipeline design for large-scale ingestion and transformation workloads
Strong understanding of AWS cloud services relevant to data platforms (e.g., S3, IAM, Lambda, CloudWatch, Secrets Manager)
Experience optimizing Snowflake performance, including warehouse sizing, caching, and cost control strategies
Familiarity with setting up observability frameworks, monitoring tools, and data quality checks across complex pipeline ecosystems
Proficient in infrastructure-as-code (IaC) concepts and managing configuration/versioning across environments
Awareness of data governance principles, including lineage, PII handling, and tenant-specific metadata tagging
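To make the RBAC and masking-policy duties above concrete, the sketch below uses the Snowflake Python connector to create a per-tenant role, grant it schema access, and attach a simple masking policy to a PII column. Every role, database, schema, table, and column name is an invented placeholder, not the client's actual tenant design.

```python
# Illustrative sketch only: per-tenant RBAC and a column masking policy in
# Snowflake, issued through the official Python connector. Role, schema, and
# column names are invented placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="platform_admin",
    password="example_password",
    role="SECURITYADMIN",
    warehouse="ADMIN_WH",
)
cur = conn.cursor()

statements = [
    # Tenant-scoped role and schema access (schema isolation per tenant).
    "CREATE ROLE IF NOT EXISTS TENANT_ACME_ANALYST",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE TENANT_ACME_ANALYST",
    "GRANT USAGE ON SCHEMA ANALYTICS.TENANT_ACME TO ROLE TENANT_ACME_ANALYST",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.TENANT_ACME "
    "TO ROLE TENANT_ACME_ANALYST",
    # Mask donor emails for everyone except a privileged role.
    """
    CREATE MASKING POLICY IF NOT EXISTS ANALYTICS.TENANT_ACME.EMAIL_MASK
      AS (val STRING) RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
        ELSE '*** MASKED ***'
      END
    """,
    "ALTER TABLE ANALYTICS.TENANT_ACME.DONORS "
    "MODIFY COLUMN EMAIL SET MASKING POLICY ANALYTICS.TENANT_ACME.EMAIL_MASK",
]

for stmt in statements:
    cur.execute(stmt)

cur.close()
conn.close()
```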
