Epsilon Technologies Group

3 Job openings at Epsilon Technologies Group
AI / Machine Learning Engineer (India) – Mid to Senior | India | 4 years | Salary not disclosed | Remote | Full Time

Company Description
Epsilon Technologies Group (Epsilon) is a specialist capital markets technology solutions and services firm helping financial institutions modernize their platforms, risk management, and operations. We're expanding our capabilities in Artificial Intelligence and Machine Learning to enhance our products, improve internal productivity, and deliver new services to clients. We're looking for a hands-on AI / ML Engineer who can design, build, and deploy real-world AI solutions. This is a foundational role in shaping our AI infrastructure and enabling scalable, secure, and innovative services.

What You'll Do
- Build and maintain end-to-end ML pipelines, from data ingestion and preparation to model training, deployment, and monitoring (MLOps best practices).
- Manage and optimize secure AI experimentation environments (AWS Bedrock, Azure AI, or client-hosted).
- Translate proofs-of-concept into working prototypes and production-ready solutions.
- Work across a variety of AI/ML approaches, including Natural Language Processing (NLP), Retrieval-Augmented Generation (RAG), AI assistants / agentic automation, and general machine learning models (a minimal retrieval sketch follows this posting).
- Ensure governance, compliance, and data security are embedded in all AI development and deployment activities.
- Collaborate with business analysts, consultants, and developers to deliver practical solutions aligned with client and internal needs.

What We're Looking For
- 2–4 years' experience in AI/ML engineering, data science, or MLOps, with a strong hands-on delivery record.
- Proficiency in cloud platforms (AWS, Azure) and ML frameworks (TensorFlow, PyTorch, Hugging Face, LangChain, or equivalents).
- Solid experience with data engineering, APIs, and deploying models into production environments.
- Strong problem-solving mindset; able to work independently in a small, agile team.
- Contributions to open-source projects or research publications in AI (preferred).
- Financial services knowledge is a plus, but not required; strong technical depth and adaptability are the priority.

Preferred Educational Background
- B.Tech / BE in Computer Science or Information Technology (with specialization in AI/ML).
- MBA (Finance) with a strong quantitative and AI/ML focus also considered.

Why Join Us
- Be part of a specialist capital markets technology solutions and services firm.
- Work on diverse projects spanning internal productivity, client services, and software product innovation.
- Small, entrepreneurial environment where your work will directly impact our roadmap and client offerings.
- Opportunity to build a long-term career as we expand our AI services and capabilities globally.

Location
Chennai area (preferred), with a remote option available. Willingness to work the UK shift. Work closely with our development team in India and global stakeholders.
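For illustration, here is a minimal sketch of the retrieval step in a Retrieval-Augmented Generation (RAG) flow of the kind this role would build. It assumes the Hugging Face sentence-transformers package; the model name, document chunks, and downstream LLM call are placeholders rather than Epsilon's actual stack:

```python
# Minimal RAG retrieval sketch (illustrative only).
# Assumes the sentence-transformers package; chunks, model, and query are placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer

chunks = [
    "Epsilon builds capital markets technology for balance sheet management.",
    "MLOps covers data ingestion, training, deployment, and monitoring.",
    "RAG grounds LLM answers in retrieved reference documents.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")            # embedding model (assumed choice)
doc_emb = model.encode(chunks, normalize_embeddings=True)  # (n_chunks, dim), unit-normalized

query = "How does retrieval-augmented generation work?"
q_emb = model.encode([query], normalize_embeddings=True)[0]

scores = doc_emb @ q_emb                                   # cosine similarity via dot product
top = [chunks[i] for i in np.argsort(scores)[::-1][:2]]    # keep the 2 most relevant chunks

prompt = "Answer using only this context:\n" + "\n".join(top) + f"\n\nQuestion: {query}"
print(prompt)  # in practice, sent to a hosted LLM to generate a grounded answer
```

In a production version of this, the chunks would typically live in a vector store, the prompt would be sent to a hosted model (for example via AWS Bedrock or Azure AI, as named above), and each stage would be monitored as part of the MLOps pipeline.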

Data Architect (India) | India | 15 years | Salary not disclosed | Remote | Full Time

About Epsilon Technologies Group
Epsilon is a capital markets technology solutions and services firm helping financial institutions manage everything on their balance sheet with a scalable and innovative product line as well as our highly experienced and independent consulting services. Epsilon is looking for a hands-on Data Architect who will design, build, and deploy data-centered solutions spanning data sourcing, validation, transformation, analysis, and presentation. This pivotal role will actively engage with both onshore and offshore teams. To find out more, visit https://www.epsilontg.com

About the Role
We're seeking an experienced Data Architect to design, build, and maintain enterprise-scale data solutions. This role combines strategic data architecture with hands-on technical delivery, requiring a candidate who can translate business objectives into robust, scalable data platforms while maintaining the highest standards of data governance.

What You Will Do
- Design and implement end-to-end data delivery solutions from acquisition through analysis, visualization, and decision support.
- Take data from acquisition, scoping, and preparation through validation and transformation, and on to analysis, visualization, insights, and decision-support models.
- Design and implement scalable ETL/ELT processes using tools such as Matillion, Fivetran, SAP BI, Trifacta, and custom SQL and Python scripts.
- Analyze existing architecture diagrams, data dictionaries, lineage documentation, and system integration maps to quickly and thoroughly assess and document existing data infrastructure, pipelines, and dependencies across complex enterprise environments.
- Lay out the foundations of data architecture to meet business objectives, project budgets and timelines, and the client's technology goals and constraints.
- Lead data discovery, scoping, model development, and presentation efforts.
- Present architectural designs for peer and client review, responding thoughtfully to challenges and feedback.
- Incorporate regulatory compliance, data governance, and data quality standards into all designs.
- Optimize effectiveness and efficiency considering data volumes, refresh frequency, and platform capabilities across the end-to-end data stack.
- Design and implement materialized views in Snowflake to pre-compute and store complex aggregations, joins, and transformations that are frequently accessed by business users and reporting systems.
- Leverage Snowflake Dynamic Tables for automated, continuously refreshed data pipelines that maintain up-to-date aggregated datasets without manual orchestration (a minimal sketch of both follows this posting).

What We Are Looking For
- 10–15 years of progressive experience in data architecture and business intelligence, with a strong hands-on record of moving from strategy to successful implementation.
- Proven ability to architect and implement data warehouses, data lakes, data marts, operational data stores, data hubs, and transactional systems, as well as data migration, re-platforming, and major upgrade projects.
- Proficiency in Amazon AWS and/or Microsoft Azure cloud platforms to implement cloud data services, security configurations, and cost-optimization strategies.
- Programming languages and development tools: Python, Java, Shell, .NET, and JavaScript.
- Analytical tools: Snowpark, PySpark.
- Data platforms: Snowflake, SQL Server, Oracle, OLAP, OLTP, Cube.
- Data analysis and visualization: Power BI, Qlik, SAP Web Intelligence.
- Workload orchestration and automation: Tidal.
- ETL/ELT tools: Matillion, Fivetran, SAP BI.
- Creating data designs and performing modeling using the above tools.
- Fluency with processing JSON, XML, CSV, and other data formats.
- Solid experience configuring and deploying integrated solutions into production environments with DevOps tools such as Git, Bitbucket, and Jenkins.
- Strong problem-solving mindset, a consultative approach synthesizing multiple perspectives, the ability to work independently and lead others in an agile team, and methodical execution.
- Effective communicator who seeks clarity on important questions, grasps the big picture, and tailors content to the audience.
- Promoter of successful ideas and approaches within the team.
- Preferred experience in the Financial Services, Capital Markets, and/or Risk Management industries.

Why Join Us
- Shape the future of capital markets technology by combining scalable, innovative products with deep domain expertise to solve complex client challenges.
- Work at the intersection of innovation and experience: leverage modern cloud platforms, advanced analytics tools, and emerging technologies while applying a deep understanding of capital markets to deliver solutions that truly work in production environments.
- Lead across a global team of talented professionals: provide technical leadership and architectural guidance across distributed teams, mentor others, influence technical direction, and build solutions at enterprise scale.
- Own the full stack, from data sourcing and validation through transformation, analysis, and presentation, carrying the architectural vision through to production deployment and measurable business outcomes.
- Work directly with clients across the financial services and technology domains, where every engagement expands your expertise and strengthens your professional portfolio.
- Develop deep financial services knowledge and technical innovation alongside experienced professionals who understand both the business complexity of capital markets and the technical sophistication required to build world-class solutions.

Location and Working Hours
Remote, India. Availability to work UK hours (13:30–22:30 IST) is required. Occasional work after hours and on weekends is required to complete assigned projects, support production deployments, and assist clients with production issues.
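As a hedged illustration of the Snowflake items above, the sketch below creates a materialized view for a frequently accessed aggregation and a dynamic table that Snowflake refreshes automatically against a target lag, issued through a Snowpark session. The connection parameters, warehouse, and table and column names (TRADES, notional, and so on) are hypothetical, not Epsilon's schema:

```python
# Illustrative Snowflake sketch; connection details, names, and lag are assumptions.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "ANALYTICS_WH", "database": "MARKETS", "schema": "CURATED",
}
session = Session.builder.configs(connection_parameters).create()

# Pre-computed aggregation for reporting users (hypothetical TRADES table).
session.sql("""
    CREATE OR REPLACE MATERIALIZED VIEW DAILY_NOTIONAL_MV AS
    SELECT trade_date, desk, SUM(notional) AS total_notional
    FROM TRADES
    GROUP BY trade_date, desk
""").collect()

# Continuously refreshed pipeline step: Snowflake re-materializes this table
# whenever the 1-hour target lag is exceeded, with no external orchestration.
session.sql("""
    CREATE OR REPLACE DYNAMIC TABLE POSITION_SNAPSHOT
    TARGET_LAG = '1 hour'
    WAREHOUSE = ANALYTICS_WH
    AS
    SELECT book, instrument_id, SUM(quantity) AS net_position
    FROM TRADES
    GROUP BY book, instrument_id
""").collect()

session.close()
```

The design trade-off sketched here is the one the posting describes: the materialized view serves hot, repeated aggregations to reporting tools, while the dynamic table keeps a derived dataset current without a separate scheduler.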

Data Architect (India) | India | 10–15 years | Salary not disclosed | Remote | Full Time

About Epsilon Technologies Group
Epsilon is a capital markets technology solutions and services firm helping financial institutions manage everything on their balance sheet with a scalable and innovative product line as well as our highly experienced and independent consulting services. Epsilon is looking for a hands-on Data Architect who will design, build, and deploy data-centered solutions spanning data sourcing, validation, transformation, analysis, and presentation. This pivotal role will actively engage with both onshore and offshore teams. To find out more, visit https://www.epsilontg.com

About the Role
We're seeking an experienced Data Architect to design, build, and maintain enterprise-scale data solutions. This role combines strategic data architecture with hands-on technical delivery, requiring a candidate who can translate business objectives into robust, scalable data platforms while maintaining the highest standards of data governance.

What You Will Do
- Design and implement end-to-end data delivery solutions from acquisition through analysis, visualization, and decision support.
- Take data from acquisition, scoping, and preparation through validation and transformation, and on to analysis, visualization, insights, and decision-support models (a minimal PySpark sketch of this flow follows this posting).
- Design and implement scalable ETL/ELT processes using tools such as Matillion, Fivetran, SAP BI, Trifacta, and custom SQL and Python scripts.
- Analyze existing architecture diagrams, data dictionaries, lineage documentation, and system integration maps to quickly and thoroughly assess and document existing data infrastructure, pipelines, and dependencies across complex enterprise environments.
- Lay out the foundations of data architecture to meet business objectives, project budgets and timelines, and the client's technology goals and constraints.
- Lead data discovery, scoping, model development, and presentation efforts.
- Present architectural designs for peer and client review, responding thoughtfully to challenges and feedback.
- Incorporate regulatory compliance, data governance, and data quality standards into all designs.
- Optimize effectiveness and efficiency considering data volumes, refresh frequency, and platform capabilities across the end-to-end data stack.
- Design and implement materialized views in Snowflake to pre-compute and store complex aggregations, joins, and transformations that are frequently accessed by business users and reporting systems.
- Leverage Snowflake Dynamic Tables for automated, continuously refreshed data pipelines that maintain up-to-date aggregated datasets without manual orchestration.

What We Are Looking For
- 10–15 years of progressive experience in data architecture and business intelligence, with a strong hands-on record of moving from strategy to successful implementation.
- Proven ability to architect and implement data warehouses, data lakes, data marts, operational data stores, data hubs, and transactional systems, as well as data migration, re-platforming, and major upgrade projects.
- Proficiency in Amazon AWS and/or Microsoft Azure cloud platforms to implement cloud data services, security configurations, and cost-optimization strategies.
- Programming languages and development tools: Python, Java, Shell, .NET, and JavaScript.
- Analytical tools: Snowpark, PySpark.
- Data platforms: Snowflake, SQL Server, Oracle, OLAP, OLTP, Cube.
- Data analysis and visualization: Power BI, Qlik, SAP Web Intelligence.
- Workload orchestration and automation: Tidal.
- ETL/ELT tools: Matillion, Fivetran, SAP BI.
- Creating data designs and performing modeling using the above tools.
- Fluency with processing JSON, XML, CSV, and other data formats.
- Solid experience configuring and deploying integrated solutions into production environments with DevOps tools such as Git, Bitbucket, and Jenkins.
- Strong problem-solving mindset, a consultative approach synthesizing multiple perspectives, the ability to work independently and lead others in an agile team, and methodical execution.
- Effective communicator who seeks clarity on important questions, grasps the big picture, and tailors content to the audience.
- Promoter of successful ideas and approaches within the team.
- Preferred experience in the Financial Services, Capital Markets, and/or Risk Management industries.

Why Join Us
- Shape the future of capital markets technology by combining scalable, innovative products with deep domain expertise to solve complex client challenges.
- Work at the intersection of innovation and experience: leverage modern cloud platforms, advanced analytics tools, and emerging technologies while applying a deep understanding of capital markets to deliver solutions that truly work in production environments.
- Lead across a global team of talented professionals: provide technical leadership and architectural guidance across distributed teams, mentor others, influence technical direction, and build solutions at enterprise scale.
- Own the full stack, from data sourcing and validation through transformation, analysis, and presentation, carrying the architectural vision through to production deployment and measurable business outcomes.
- Work directly with clients across the financial services and technology domains, where every engagement expands your expertise and strengthens your professional portfolio.
- Develop deep financial services knowledge and technical innovation alongside experienced professionals who understand both the business complexity of capital markets and the technical sophistication required to build world-class solutions.

Location and Working Hours
Remote, India. Availability to work UK hours (13:30–22:30 IST) is required. Occasional work after hours and on weekends is required to complete assigned projects, support production deployments, and assist clients with production issues.
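To make the validate-transform-analyze flow above concrete, here is a minimal PySpark sketch (PySpark is listed among the analytical tools). The input file, columns, and output path are illustrative assumptions rather than an actual Epsilon pipeline:

```python
# Minimal PySpark sketch of a validate -> transform -> aggregate step.
# File path, column names, and output location are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trade-etl-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("trades.csv")  # hypothetical CSV extract

# Validation: keep rows with a parseable notional and a non-null trade date.
clean = (
    raw.withColumn("notional", F.col("notional").cast("double"))
       .filter(F.col("notional").isNotNull() & F.col("trade_date").isNotNull())
)

# Transformation / aggregation feeding a downstream Power BI or Qlik layer.
daily = clean.groupBy("trade_date", "desk").agg(F.sum("notional").alias("total_notional"))

daily.write.mode("overwrite").parquet("curated/daily_notional")  # illustrative output path
spark.stop()
```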