Noida
INR Not disclosed
Remote
Full Time
· Experience: 3–5 years in automation testing, with hands-on experience in Cypress, Selenium, or Playwright using JavaScript, or Selenium with Java.
· API Testing: Proven experience with API testing using tools like Rest Assured, Postman, or Karate.
· Performance Testing: Exposure to tools like JMeter.
· Security Testing: Familiarity with OWASP tools (preferred).
· Programming Skills: Strong understanding of programming concepts, frameworks, and debugging techniques.
· Methodologies: In-depth knowledge of software testing methodologies, best practices, and quality assurance principles.
· CI/CD Integration: Experience with continuous integration and continuous delivery pipelines.
· Education: Bachelor's degree in Computer Science, Engineering, or a related field.
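The API-testing requirement above usually boils down to asserting on status codes and response payloads. Here is a minimal sketch of that idea in Python; the field names and values are invented for illustration, and in practice this role would use Rest Assured, Postman, or Karate rather than hand-rolled checks:

```python
# Hedged sketch: the kind of API response validation an automation
# engineer might script. The endpoint fields and payload below are
# hypothetical, not taken from any specific product.

def validate_response(status_code, body, required_fields):
    """Return a list of problems found in an API response."""
    problems = []
    if status_code != 200:
        problems.append(f"expected HTTP 200, got {status_code}")
    for field in required_fields:
        if field not in body:
            problems.append(f"missing field: {field}")
    return problems

# Simulated response, as a mocked API call might return it.
resp_body = {"id": 42, "status": "active"}
issues = validate_response(200, resp_body, ["id", "status", "created_at"])
print(issues)  # one problem: created_at is missing
```

In a real suite the same assertions would live inside a test framework (pytest, RSpec, Karate scenarios) and run in the CI/CD pipeline the posting mentions.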
Noida, Uttar Pradesh, India
Not disclosed
On-site
Full Time
Position: Senior Software Engineer (RoR)
Location: Noida
Experience: 3 to 6 years

Key Responsibilities:
- Design, develop, and maintain scalable web applications using Ruby on Rails.
- Write clean, efficient, and well-tested code following best practices.
- Design and implement RESTful APIs for seamless integration with frontend applications and third-party services.
- Optimize application performance by identifying bottlenecks and improving query efficiency.
- Collaborate with frontend developers, designers, and product managers to deliver a seamless user experience.
- Implement security best practices to ensure application robustness.
- Automate testing and deployment pipelines using CI/CD tools.
- Participate in code reviews, ensuring high code quality and adherence to best practices.
- Work with databases like PostgreSQL, MySQL, or NoSQL solutions.
- Troubleshoot and resolve production issues effectively.
- Document technical designs, workflows, and processes on Jira and Confluence for team-wide visibility.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Ruby on Rails developer, with a strong portfolio of past projects.

Required skills:
- Strong knowledge of Object-Oriented Programming (OOP) and MVC architecture.
- Experience with SQL databases (PostgreSQL/MySQL) and writing optimized queries.
- Proficiency in RESTful API development and third-party API integrations.
- Hands-on experience with Ruby, RoR, JavaScript, HTML, CSS, and frontend frameworks like React or Vue.js.
- Knowledge of background job processing using Sidekiq, Resque, or Delayed Job.
- Familiarity with caching strategies (Redis, Memcached).
- Experience with testing frameworks (RSpec, MiniTest) and test-driven development (TDD).
- Familiarity with containerization (Docker) and cloud platforms (AWS, GCP, Azure).
- Exposure to Agile methodologies and tools like JIRA, Trello, or Asana.
- Experience with CI/CD pipelines for automated deployment.
- Strong problem-solving skills and the ability to work in a collaborative team.

Soft skills:
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication and the ability to articulate complex technical concepts to non-technical stakeholders.
- Leadership capabilities with experience mentoring and guiding junior developers.
- Adaptability to work in Agile/Scrum environments and deliver under tight deadlines.

Good-to-have skills:
- Knowledge of microservices architecture.
- Experience in DevOps practices for infrastructure management.
- Contributions to open-source projects or personal RoR projects.
Noida, Uttar Pradesh, India
Not disclosed
On-site
Full Time
About the company: Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators. Veersa is known for providing innovative solutions using technology and data science to its client base and is the preferred innovation partner to its clients. Veersa’s rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR; backend technologies such as Java Spring Boot and Node.js; and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture. Veersa has two major business lines – Veersalabs: an in-house R&D and product development platform, and Veersa tech consulting: technical solutions delivery for clients. Veersa’s customer base includes large US Healthcare software vendors, Pharmacy chains, Payers, providers, and Hospital chains. Though Veersa’s focus geography is North America, Veersa also provides product engineering expertise to a few clients in Australia and Singapore. About the Role: We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you’ll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. 
As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems. Responsibilities: Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter, IICS, etc., and MDM solutions using Informatica MDM. Develop and maintain batch and real-time data integration workflows. Collaborate with data architects, business analysts, and stakeholders to understand data requirements. Perform data profiling, data quality assessments, and master data matching/merging. Implement governance, stewardship, and metadata management practices. Optimize the performance of Informatica MDM Hub, IDD, and associated components. Write complex SQL queries and stored procedures as needed. Senior Software Engineer – Additional Responsibilities: Lead design discussions and code reviews; mentor junior engineers. Architect scalable data integration solutions using Informatica and complementary tools. Drive adoption of best practices in data modeling, governance, and engineering. Work closely with cross-functional teams to shape the data strategy. Required Qualifications: Software Engineer: Bachelor’s degree in Computer Science, Information Systems, or related field. 2–4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules). Strong SQL and data modeling skills. Familiarity with ETL concepts, REST APIs, and data integration tools. Understanding of data governance and quality frameworks. Senior Software Engineer: Bachelor’s or Master’s in Computer Science, Data Engineering, or related field. 4+ years of experience in Informatica MDM, with at least 2 years in a lead role. Proven track record of designing scalable MDM solutions in large-scale environments. Strong leadership, communication, and stakeholder management skills. 
Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus. Preferred Skills (Nice to Have): Experience with other Informatica products (IDQ, PowerCenter). Exposure to cloud MDM platforms or cloud data integration tools. Agile/Scrum development experience. Knowledge of industry-standard data security and compliance practices. JD For Data Engineer: About the Role We are seeking a highly skilled and experienced Senior/Lead Data Engineer to join our growing Data Engineering Team. In this critical role, you will design, architect, and develop cutting-edge multi-tenant SaaS data solutions hosted on Azure Cloud. Your work will focus on delivering robust, scalable, and high-performance data pipelines and integrations that support our enterprise provider and payer data ecosystem. This role is ideal for someone with deep experience in ETL/ELT processes, data warehousing principles, and real-time and batch data integrations. As a senior member of the team, you will also be expected to mentor and guide junior engineers, help define best practices, and contribute to the overall data strategy. Key Responsibilities Architect and implement scalable data integration and data pipeline solutions using Azure cloud services. Design, develop, and maintain ETL/ELT processes, including data extraction, transformation, loading, and quality checks. Collaborate with business and technical stakeholders to understand data requirements and translate them into technical solutions. Develop and manage data flows, data mappings, and data quality & validation rules across multiple tenants and systems. Implement best practices for data modeling, metadata management, and data governance. Configure, maintain, and monitor integration jobs to ensure high availability and performance. Lead code reviews, mentor data engineers, and help shape engineering culture and standards. 
Stay current with emerging technologies and recommend tools or processes to improve the team's effectiveness. Required Qualifications Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field. 7+ years of experience in data engineering, with a strong focus on Azure-based solutions. Proven experience in designing and implementing real-time and batch data integrations. Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, Databricks, or similar technologies. Strong understanding of data warehousing principles, ETL/ELT methodologies, and data pipeline architecture. Proficiency in SQL, Python, or Scala for data processing and pipeline development. Familiarity with data quality, metadata management, and data validation frameworks. Strong problem-solving skills and the ability to communicate complex technical concepts clearly. Preferred Qualifications Experience with multi-tenant SaaS data solutions. Background in healthcare data, especially provider and payer ecosystems. Familiarity with DevOps practices, CI/CD pipelines, and version control systems (e.g., Git). Experience mentoring and coaching other engineers in technical and architectural decision-making.
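The master data matching/merging responsibility in this posting can be sketched in a few lines. This is a hypothetical illustration only: real Informatica MDM deployments define match rules and survivorship declaratively in the MDM Hub, and the normalization key and "latest update wins" rule below are invented assumptions:

```python
# Hedged sketch of a master-data match/merge pass: records that agree on a
# normalized match key are merged, keeping the most recently updated record.
# The match rule (lowercased name + zip) and the survivorship strategy are
# illustrative assumptions, not any tool's actual configuration.
from collections import defaultdict

def normalize_key(record):
    # Hypothetical match rule: case-insensitive name plus zip code.
    return (record["name"].strip().lower(), record["zip"])

def merge(records):
    groups = defaultdict(list)
    for r in records:
        groups[normalize_key(r)].append(r)
    masters = []
    for group in groups.values():
        # Survivorship: the latest update wins.
        masters.append(max(group, key=lambda r: r["updated"]))
    return masters

sources = [
    {"name": "Acme Clinic ", "zip": "10001", "updated": 1, "phone": "111"},
    {"name": "acme clinic", "zip": "10001", "updated": 2, "phone": "222"},
    {"name": "Bay Pharmacy", "zip": "94105", "updated": 1, "phone": "333"},
]
print(len(merge(sources)))  # 2 master records
```

The two "Acme Clinic" variants collapse into one master record carrying the newer phone number, which is the essence of the match/merge and survivorship work the role describes.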
Noida
INR Not disclosed
On-site
Full Time
About the Role: We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you’ll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems. Responsibilities: Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter, IICS, etc., and MDM solutions using Informatica MDM. Develop and maintain batch and real-time data integration workflows. Collaborate with data architects, business analysts, and stakeholders to understand data requirements. Perform data profiling, data quality assessments, and master data matching/merging. Implement governance, stewardship, and metadata management practices. Optimize the performance of Informatica MDM Hub, IDD, and associated components. Write complex SQL queries and stored procedures as needed. Senior Software Engineer – Additional Responsibilities: Lead design discussions and code reviews; mentor junior engineers. Architect scalable data integration solutions using Informatica and complementary tools. Drive adoption of best practices in data modeling, governance, and engineering. Work closely with cross-functional teams to shape the data strategy. Required Qualifications: Software Engineer: Bachelor’s degree in Computer Science, Information Systems, or related field. 2–4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules). Strong SQL and data modeling skills. Familiarity with ETL concepts, REST APIs, and data integration tools. Understanding of data governance and quality frameworks. 
Senior Software Engineer: Bachelor’s or Master’s in Computer Science, Data Engineering, or related field. 4+ years of experience in Informatica MDM, with at least 2 years in a lead role. Proven track record of designing scalable MDM solutions in large-scale environments. Strong leadership, communication, and stakeholder management skills. Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus. Preferred Skills (Nice to Have): Experience with other Informatica products (IDQ, PowerCenter). Exposure to cloud MDM platforms or cloud data integration tools. Agile/Scrum development experience. Knowledge of industry-standard data security and compliance practices. Job Type: Full-time Pay: ₹219,797.43 - ₹1,253,040.32 per year Benefits: Health insurance Schedule: Day shift Ability to commute/relocate: Noida, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Preferred) Experience: Informatica: 3 years (Required) Data warehouse: 3 years (Required) Work Location: In person
Noida, Uttar Pradesh
INR Not disclosed
On-site
Full Time
About the Role: We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you’ll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems. Responsibilities: Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter, IICS, etc., and MDM solutions using Informatica MDM. Develop and maintain batch and real-time data integration workflows. Collaborate with data architects, business analysts, and stakeholders to understand data requirements. Perform data profiling, data quality assessments, and master data matching/merging. Implement governance, stewardship, and metadata management practices. Optimize the performance of Informatica MDM Hub, IDD, and associated components. Write complex SQL queries and stored procedures as needed. Senior Software Engineer – Additional Responsibilities: Lead design discussions and code reviews; mentor junior engineers. Architect scalable data integration solutions using Informatica and complementary tools. Drive adoption of best practices in data modeling, governance, and engineering. Work closely with cross-functional teams to shape the data strategy. Required Qualifications: Software Engineer: Bachelor’s degree in Computer Science, Information Systems, or related field. 2–4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules). Strong SQL and data modeling skills. Familiarity with ETL concepts, REST APIs, and data integration tools. Understanding of data governance and quality frameworks. 
Senior Software Engineer: Bachelor’s or Master’s in Computer Science, Data Engineering, or related field. 4+ years of experience in Informatica MDM, with at least 2 years in a lead role. Proven track record of designing scalable MDM solutions in large-scale environments. Strong leadership, communication, and stakeholder management skills. Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus. Preferred Skills (Nice to Have): Experience with other Informatica products (IDQ, PowerCenter). Exposure to cloud MDM platforms or cloud data integration tools. Agile/Scrum development experience. Knowledge of industry-standard data security and compliance practices. Job Type: Full-time Pay: ₹219,797.43 - ₹1,253,040.32 per year Benefits: Health insurance Schedule: Day shift Ability to commute/relocate: Noida, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Preferred) Experience: Informatica: 3 years (Required) Data warehouse: 3 years (Required) Work Location: In person
India
Not disclosed
On-site
Full Time
About the company: Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators. Veersa is known for providing innovative solutions using technology and data science to its client base and is the preferred innovation partner to its clients. Veersa’s rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR; backend technologies such as Java Spring Boot and Node.js; and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture. Veersa has two major business lines – Veersalabs: an in-house R&D and product development platform, and Veersa tech consulting: technical solutions delivery for clients. Veersa’s customer base includes large US Healthcare software vendors, Pharmacy chains, Payers, providers, and Hospital chains. Though Veersa’s focus geography is North America, Veersa also provides product engineering expertise to a few clients in Australia and Singapore. We are seeking a CFO for Veersa Technologies: someone who has sharp commercial acumen, is business-oriented, and can drive financial strategies, manage risks, and ensure compliance while also partnering with the CEO and board to achieve company goals. 
The ideal candidate will bring strong leadership, analytical, and communication skills; experience in financial planning, budgeting, and forecasting; and a proven ability to work with other departments to align financial decisions with overall business objectives. Key Responsibilities: Strategic Leadership: Develop and implement financial strategies that support the company's growth objectives and maximize shareholder value. Financial Planning and Analysis (FP&A): Oversee budgeting, forecasting, and financial reporting, ensuring accurate and timely information for decision-making. Risk Management: Identify, assess, and mitigate financial risks, ensuring compliance with regulatory requirements. Compliance: Ensure adherence to all relevant financial regulations and accounting standards. Stakeholder Relations: Manage relationships with internal stakeholders and external auditors. Team Leadership: Build, motivate, and develop a high-performing finance team. Data-Driven Decision Making: Leverage data and analytics to inform strategic decisions and improve financial performance. Operational Efficiency: Drive efficiency in financial processes and systems. Technology Adoption: Identify and implement technology solutions to streamline financial operations. M&A Expertise: Experience in mergers and acquisitions. Key Requirements: Growth hacker: Understands the physics of growth and drives a culture of hyper-growth. Culture of stewardship: Every dollar counts; aggressively invest and maniacally take out waste. Analytical Skills: Strong analytical and problem-solving abilities are essential. Tech first: Organization-wide leader of digitization and digital-worker hyper-productivity. Proficiency in financial software, accounting systems, data analytics, and AI tools is expected. Communication Skills: Excellent communication and interpersonal skills are needed to effectively collaborate with stakeholders. 
Leadership Skills: Ability to lead and motivate a team, as well as to influence decision-making at all levels of the organization. Nurturer of top-flight talent: Engage, groom, challenge, and grow a world-class team across the organization. Business Acumen: A strong understanding of business operations and financial markets is necessary. Strategic Thinking: The ability to think strategically and to develop and implement long-term financial plans. Qualification: Education: CA/CFA/MBA - Finance. Experience: A minimum of 20 years in financial management, with at least 5 years in a head of finance or CFO role, ideally in a fast-growth environment in the IT industry.
Sadar, Uttar Pradesh, India
Not disclosed
On-site
Full Time
About The Company Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators. Veersa is known for providing innovative solutions using technology and data science to its client base and is the preferred innovation partner to its clients. Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR; backend technologies such as Java Spring Boot and Node.js; and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture. Veersa has two major business lines – Veersalabs: an in-house R&D and product development platform, and Veersa tech consulting: technical solutions delivery for clients. Veersa's customer base includes large US Healthcare software vendors, Pharmacy chains, Payers, providers, and Hospital chains. Though Veersa's focus geography is North America, Veersa also provides product engineering expertise to a few clients in Australia and Singapore. About the Role: We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you'll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. 
Responsibilities: As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems. Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter, IICS, etc., and MDM solutions using Informatica MDM. Develop and maintain batch and real-time data integration workflows. Collaborate with data architects, business analysts, and stakeholders to understand data requirements. Perform data profiling, data quality assessments, and master data matching/merging. Implement governance, stewardship, and metadata management practices. Optimize the performance of Informatica MDM Hub, IDD, and associated components. Write complex SQL queries and stored procedures as needed. Senior Software Engineer – Additional Responsibilities: Lead design discussions and code reviews; mentor junior engineers. Architect scalable data integration solutions using Informatica and complementary tools. Drive adoption of best practices in data modeling, governance, and engineering. Work closely with cross-functional teams to shape the data strategy. Required Qualifications: Software Engineer: Bachelor's degree in Computer Science, Information Systems, or related field. 2–4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules). Strong SQL and data modeling skills. Familiarity with ETL concepts, REST APIs, and data integration tools. Understanding of data governance and quality frameworks. Senior Software Engineer: Bachelor's or Master's in Computer Science, Data Engineering, or related field. 4+ years of experience in Informatica MDM, with at least 2 years in a lead role. Proven track record of designing scalable MDM solutions in large-scale environments. Strong leadership, communication, and stakeholder management skills. 
Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus. Preferred Skills (Nice to Have): Experience with other Informatica products (IDQ, PowerCenter). Exposure to cloud MDM platforms or cloud data integration tools. Agile/Scrum development experience. Knowledge of industry-standard data security and compliance practices.
Noida
INR 6.71452 - 22.18397 Lacs P.A.
On-site
Full Time
Job Title: ETL Lead – Azure Data Factory (ADF) Department: Data Engineering / Analytics Employment Type: Full-time Experience Level: 5+ years About the Role We are seeking an experienced ETL Lead with strong expertise in Azure Data Factory (ADF) to lead and oversee data ingestion and transformation projects across the organization. The role demands a mix of technical proficiency and leadership to design scalable data pipelines, manage a team of engineers, and collaborate with cross-functional stakeholders to ensure reliable data delivery. Key Responsibilities Lead the design, development, and implementation of end-to-end ETL pipelines using Azure Data Factory (ADF). Architect scalable ingestion solutions from multiple structured and unstructured sources (e.g., SQL Server, APIs, flat files, cloud storage). Define best practices for ADF pipeline orchestration, performance tuning, and cost optimization. Mentor, guide, and manage a team of ETL engineers, ensuring high-quality deliverables and adherence to project timelines. Work closely with business analysts, data modelers, and source system owners to understand and translate data requirements. Establish data quality checks, monitoring frameworks, and alerting mechanisms. Drive code reviews, CI/CD integration (using Azure DevOps), and documentation standards. Own delivery accountability across multiple ingestion and data integration workstreams. Required Qualifications Bachelor's or Master's degree in Computer Science, Data Engineering, or related discipline. 5+ years of hands-on ETL development experience, with 3+ years of experience in Azure Data Factory. Deep understanding of data ingestion, transformation, and warehousing best practices. Strong SQL skills and experience with cloud-native data storage (ADLS Gen2, Blob Storage). Proficiency in orchestrating complex data flows, parameterized pipelines, and incremental data loads. Experience in handling large-scale data migration or modernization projects. 
Preferred Skills Familiarity with modern data platforms like Azure Synapse, Snowflake, Databricks. Exposure to Azure DevOps pipelines for CI/CD of ADF pipelines and linked services. Understanding of data governance, security (RBAC), and compliance requirements. Experience leading Agile teams and sprint-based delivery models. Excellent communication, leadership, and stakeholder management skills. Job Type: Full-time Pay: ₹671,451.97 - ₹2,218,396.67 per year Benefits: Health insurance Schedule: Day shift Application Question(s): 5+ years of hands-on ETL development experience, with 3+ years of experience in Azure Data Factory. Experience: ETL: 5 years (Required) Azure: 3 years (Required) Work Location: In person
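The "parameterized pipelines and incremental data loads" requirement above usually refers to the high-water-mark pattern: each run pulls only the rows changed since the last recorded watermark, then advances the watermark. A minimal Python sketch of the control logic follows; the column names are hypothetical, and in ADF this is typically implemented with a Lookup activity feeding a parameterized source query rather than in application code:

```python
# Hedged sketch of watermark-based incremental loading. The "modified_at"
# column and the integer timestamps are illustrative assumptions.

def incremental_load(source_rows, watermark):
    """Return rows newer than the watermark, plus the new watermark."""
    new_rows = [r for r in source_rows if r["modified_at"] > watermark]
    # Advance the mark only if something new arrived.
    new_mark = max((r["modified_at"] for r in new_rows), default=watermark)
    return new_rows, new_mark

rows = [
    {"id": 1, "modified_at": 10},
    {"id": 2, "modified_at": 25},
    {"id": 3, "modified_at": 30},
]
batch, mark = incremental_load(rows, watermark=20)
print(len(batch), mark)  # 2 30
```

Persisting the watermark (in a control table or pipeline variable) between runs is what makes the load restartable and idempotent, which is the property the posting's "performance, cost, and scalability" bullet depends on.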
India
INR 18.5304 - 18.5304 Lacs P.A.
On-site
Full Time
About the Role: We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you’ll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems. Responsibilities: Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter, IICS, etc., and MDM solutions using Informatica MDM. Develop and maintain batch and real-time data integration workflows. Collaborate with data architects, business analysts, and stakeholders to understand data requirements. Perform data profiling, data quality assessments, and master data matching/merging. Implement governance, stewardship, and metadata management practices. Optimize the performance of Informatica MDM Hub, IDD, and associated components. Write complex SQL queries and stored procedures as needed. Senior Software Engineer – Additional Responsibilities: Lead design discussions and code reviews; mentor junior engineers. Architect scalable data integration solutions using Informatica and complementary tools. Drive adoption of best practices in data modeling, governance, and engineering. Work closely with cross-functional teams to shape the data strategy. Required Qualifications: Software Engineer: Bachelor’s degree in Computer Science, Information Systems, or related field. 2–4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules). Strong SQL and data modeling skills. Familiarity with ETL concepts, REST APIs, and data integration tools. Understanding of data governance and quality frameworks. 
Senior Software Engineer: Bachelor’s or Master’s in Computer Science, Data Engineering, or related field. 4+ years of experience in Informatica MDM, with at least 2 years in a lead role. Proven track record of designing scalable MDM solutions in large-scale environments. Strong leadership, communication, and stakeholder management skills. Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus. Preferred Skills (Nice to Have): Experience with other Informatica products (IDQ, PowerCenter). Exposure to cloud MDM platforms or cloud data integration tools. Agile/Scrum development experience. Knowledge of industry-standard data security and compliance practices. Job Type: Full-time Pay: Up to ₹1,853,040.32 per year Benefits: Flexible schedule Health insurance Life insurance Provident Fund Schedule: Day shift Supplemental Pay: Performance bonus Yearly bonus Application Question(s): What is your notice period? Education: Bachelor's (Preferred) Experience: Informatica: 4 years (Preferred) Location: Noida H.O, Noida, Uttar Pradesh (Preferred) Work Location: In person
Noida H.O., Noida, Uttar Pradesh
INR Not disclosed
On-site
Full Time
About the Role: We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you’ll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems. Responsibilities: Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter, IICS, etc., and MDM solutions using Informatica MDM. Develop and maintain batch and real-time data integration workflows. Collaborate with data architects, business analysts, and stakeholders to understand data requirements. Perform data profiling, data quality assessments, and master data matching/merging. Implement governance, stewardship, and metadata management practices. Optimize the performance of Informatica MDM Hub, IDD, and associated components. Write complex SQL queries and stored procedures as needed. Senior Software Engineer – Additional Responsibilities: Lead design discussions and code reviews; mentor junior engineers. Architect scalable data integration solutions using Informatica and complementary tools. Drive adoption of best practices in data modeling, governance, and engineering. Work closely with cross-functional teams to shape the data strategy. Required Qualifications: Software Engineer: Bachelor’s degree in Computer Science, Information Systems, or related field. 2–4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules). Strong SQL and data modeling skills. Familiarity with ETL concepts, REST APIs, and data integration tools. Understanding of data governance and quality frameworks. 
Senior Software Engineer:
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
4+ years of experience in Informatica MDM, with at least 2 years in a lead role.
Proven track record of designing scalable MDM solutions in large-scale environments.
Strong leadership, communication, and stakeholder management skills.
Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus.

Preferred Skills (Nice to Have):
Experience with other Informatica products (IDQ, PowerCenter).
Exposure to cloud MDM platforms or cloud data integration tools.
Agile/Scrum development experience.
Knowledge of industry-standard data security and compliance practices.

Job Type: Full-time
Pay: Up to ₹1,853,040.32 per year
Benefits: Flexible schedule, Health insurance, Life insurance, Provident Fund
Schedule: Day shift
Supplemental Pay: Performance bonus, Yearly bonus
Application Question(s): What is your notice period?
Education: Bachelor's (Preferred)
Experience: Informatica: 4 years (Preferred)
Location: Noida H.O., Noida, Uttar Pradesh (Preferred)
Work Location: In person
Noida, Uttar Pradesh, India
Not disclosed
On-site
Full Time
Job Title: ETL Ingestion Engineer (Azure Data Factory)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 2–5 years
Headcount: ETL Ingestion Engineer - 1 | Sr. ETL Ingestion Engineer - 1

About the Role
We are looking for a talented Data Engineer with hands-on experience in Azure Data Factory (ADF) to join our Data Engineering team. You will be responsible for building, orchestrating, and maintaining robust data ingestion pipelines from various source systems into our data lake and data warehouse environments.

Key Responsibilities
Design and implement scalable data ingestion pipelines using Azure Data Factory (ADF).
Extract data from a variety of sources such as SQL Server, flat files, APIs, and cloud storage.
Develop ADF pipelines and data flows to support both batch and incremental loads.
Ensure data quality, consistency, and reliability throughout the ETL process.
Optimize ADF pipelines for performance, cost, and scalability.
Monitor pipeline execution, troubleshoot failures, and ensure data availability meets SLAs.
Document pipeline logic, source-target mappings, and operational procedures.

Required Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
2+ years of experience in ETL development and data pipeline implementation.
Strong hands-on experience with Azure Data Factory (ADF), including linked services, datasets, pipelines, and triggers.
Proficiency in SQL and in working with structured and semi-structured data (CSV, JSON, Parquet).
Experience with Azure storage systems (ADLS Gen2, Blob Storage) and data movement.
Familiarity with job monitoring and logging mechanisms in Azure.

Preferred Skills
Experience with Azure Data Lake, Synapse Analytics, or Databricks.
Exposure to Azure DevOps for CI/CD in data pipelines.
Understanding of data governance, lineage, and compliance requirements (GDPR, HIPAA, etc.).
Knowledge of RESTful APIs and API-based ingestion.
Noida, Uttar Pradesh, India
Not disclosed
On-site
Full Time
About the Role:
As a QA Lead Automation, you will be responsible for leading the automation testing efforts, managing a team of automation testers, and collaborating with cross-functional teams to deliver high-quality software.

Responsibilities:
Lead and mentor an automation test team, providing guidance and support in automation best practices and techniques.
Develop and maintain the automation testing strategy, including selecting appropriate tools and frameworks.
Design, develop, and maintain automated test scripts for various software applications and features.
Collaborate with developers, QA leads, and product managers to understand testing requirements and translate them into effective automated test scripts.
Execute and monitor automated test suites, analyze results, and report defects to the development team.
Oversee the end-to-end automation testing process, from test planning and design to execution and reporting.
Continuously enhance the automation framework to improve test coverage, efficiency, and maintainability.
Stay updated with industry trends and best practices in automation testing and integrate them into our testing processes.
Collaborate with cross-functional teams to ensure timely delivery of high-quality software releases.
Identify opportunities for process improvement and drive initiatives to enhance the overall quality assurance process.
Maintain clear and organized documentation of automated test scripts, processes, and test results.

Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
5 years of experience in automation testing, with 2-3 years in a leadership or lead role.
Proven experience with automation testing tools and frameworks, such as Cypress/Protractor with JavaScript, or Selenium with Java.
Real-time project experience in Cypress/Protractor or another JavaScript-based framework, and good working experience with the Visual Studio Code IDE.
Strong programming, framework design, and debugging skills.
In-depth understanding of software testing methodologies, best practices, and quality assurance principles.
Sound understanding and implementation experience of the Page Object Model and app action scripts.
Solid knowledge of the Cypress library.
Experience with continuous integration and continuous delivery (CI/CD) pipelines.
Excellent problem-solving skills and attention to detail.
Effective communication and leadership skills.
Ability to collaborate effectively with cross-functional teams.
Greater Kolkata Area
Not disclosed
On-site
Full Time
About The Company
Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators, and is known for providing innovative solutions using technology and data science to its client base, serving as the preferred innovation partner to its clients. Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR, backend technologies such as Java Spring Boot and NodeJS, and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture. Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa tech consulting, which delivers technical solutions for clients. Veersa's customer base includes large US healthcare software vendors, pharmacy chains, payers, providers, and hospital chains. Though Veersa's focus geography is North America, it also provides product engineering expertise to a few clients in Australia and Singapore.

About The Role
We are seeking a highly skilled and experienced Senior/Lead Data Engineer to join our growing Data Engineering Team. In this critical role, you will design, architect, and develop cutting-edge multi-tenant SaaS data solutions hosted on Azure Cloud. Your work will focus on delivering robust, scalable, and high-performance data pipelines and integrations that support our enterprise provider and payer data ecosystem.
This role is ideal for someone with deep experience in ETL/ELT processes, data warehousing principles, and real-time and batch data integrations. As a senior member of the team, you will also be expected to mentor and guide junior engineers, help define best practices, and contribute to the overall data strategy.

Key Responsibilities
Architect and implement scalable data integration and data pipeline solutions using Azure cloud services.
Design, develop, and maintain ETL/ELT processes, including data extraction, transformation, loading, and quality checks.
Collaborate with business and technical stakeholders to understand data requirements and translate them into technical solutions.
Develop and manage data flows, data mappings, and data quality & validation rules across multiple tenants and systems.
Implement best practices for data modeling, metadata management, and data governance.
Configure, maintain, and monitor integration jobs to ensure high availability and performance.
Lead code reviews, mentor data engineers, and help shape engineering culture and standards.
Stay current with emerging technologies and recommend tools or processes to improve the team's effectiveness.

Required Qualifications
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
7+ years of experience in data engineering, with a strong focus on Azure-based solutions.
Proven experience in designing and implementing real-time and batch data integrations.
Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, Databricks, or similar technologies.
Strong understanding of data warehousing principles, ETL/ELT methodologies, and data pipeline architecture.
Proficiency in SQL, Python, or Scala for data processing and pipeline development.
Familiarity with data quality, metadata management, and data validation frameworks.
Strong problem-solving skills and the ability to communicate complex technical concepts clearly.
Preferred Qualifications
Experience with multi-tenant SaaS data solutions.
Background in healthcare data, especially provider and payer ecosystems.
Familiarity with DevOps practices, CI/CD pipelines, and version control systems (e.g., Git).
Experience mentoring and coaching other engineers in technical and architectural decision-making.
Noida, Uttar Pradesh
INR Not disclosed
On-site
Full Time
About the Role: We are seeking a highly skilled and experienced Data Engineer to join our growing data team. This role is ideal for professionals with 6 to 10 years of experience in data engineering, with a strong foundation in SQL, Databricks, Spark SQL, PySpark, and BI tools like Power BI or Tableau. As a Data Engineer, you will be responsible for building scalable data pipelines, optimizing data processing workflows, and enabling insightful reporting and analytics across the organization.

Key Responsibilities:
Design and develop robust, scalable data pipelines using PySpark and Databricks.
Write efficient SQL and Spark SQL queries for data transformation and analysis.
Work closely with BI teams to enable reporting through Power BI or Tableau.
Optimize performance of big data workflows and ensure data quality.
Collaborate with business and technical stakeholders to gather and translate data requirements.
Implement best practices for data integration, processing, and governance.

Required Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
6–10 years of experience in data engineering or a similar role.
Strong experience with SQL, Spark SQL, and PySpark.
Hands-on experience with Databricks for big data processing.
Proven experience with BI tools such as Power BI and/or Tableau.
Strong understanding of data warehousing and ETL/ELT concepts.
Good problem-solving skills and the ability to work in cross-functional teams.

Nice to Have:
Experience with cloud data platforms (Azure, AWS, or GCP).
Familiarity with CI/CD pipelines and version control tools (e.g., Git).
Understanding of data governance, security, and compliance standards.
Exposure to data lake architectures and real-time streaming data pipelines.
Job Type: Full-time
Pay: ₹586,118.08 - ₹1,894,567.99 per year
Benefits: Health insurance
Schedule: Day shift
Application Question(s):
Strong experience with SQL, Spark SQL, and PySpark (at least 5 years of hands-on experience).
Proven experience with BI tools such as Power BI and/or Tableau (at least 5 years).
Location: Noida, Uttar Pradesh (Required)
Work Location: In person
Noida, Uttar Pradesh, India
Not disclosed
On-site
Full Time
About The Company
Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators, and is known for providing innovative solutions using technology and data science to its client base, serving as the preferred innovation partner to its clients. Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR, backend technologies such as Java Spring Boot and NodeJS, and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture. Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa tech consulting, which delivers technical solutions for clients. Veersa's customer base includes large US healthcare software vendors, pharmacy chains, payers, providers, and hospital chains. Though Veersa's focus geography is North America, it also provides product engineering expertise to a few clients in Australia and Singapore.

Position: SE/Senior Data Engineer (with SQL, Python, Airflow, Bash)

About The Role
We are seeking a highly skilled and experienced Senior/Lead Data Engineer to join our growing Data Engineering Team. In this critical role, you will design, architect, and develop cutting-edge multi-tenant SaaS data solutions hosted on Azure Cloud.
Your work will focus on delivering robust, scalable, and high-performance data pipelines and integrations that support our enterprise provider and payer data ecosystem. This role is ideal for someone with deep experience in ETL/ELT processes, data warehousing principles, and real-time and batch data integrations. As a senior member of the team, you will also be expected to mentor and guide junior engineers, help define best practices, and contribute to the overall data strategy. We are specifically looking for someone with strong hands-on experience in SQL, Python, and ideally Airflow and Bash scripting.

Key Responsibilities
Architect and implement scalable data integration and data pipeline solutions using Azure cloud services.
Design, develop, and maintain ETL/ELT processes, including data extraction, transformation, loading, and quality checks using tools like SQL, Python, and Airflow.
Build and automate data workflows and orchestration pipelines; knowledge of Airflow or equivalent tools is a plus.
Write and maintain Bash scripts for automating system tasks and managing data jobs.
Collaborate with business and technical stakeholders to understand data requirements and translate them into technical solutions.
Develop and manage data flows, data mappings, and data quality & validation rules across multiple tenants and systems.
Implement best practices for data modeling, metadata management, and data governance.
Configure, maintain, and monitor integration jobs to ensure high availability and performance.
Lead code reviews, mentor data engineers, and help shape engineering culture and standards.
Stay current with emerging technologies and recommend tools or processes to improve the team's effectiveness.

Required Qualifications
B.Tech or B.E. degree in Computer Science, Information Systems, or a related field.
3+ years of experience in data engineering, with a strong focus on Azure-based solutions.
Proficiency in SQL and Python for data processing and pipeline development.
Experience in developing and orchestrating pipelines using Airflow (preferred) and writing automation scripts using Bash.
Proven experience in designing and implementing real-time and batch data integrations.
Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, Databricks, or similar technologies.
Strong understanding of data warehousing principles, ETL/ELT methodologies, and data pipeline architecture.
Familiarity with data quality, metadata management, and data validation frameworks.
Strong problem-solving skills and the ability to communicate complex technical concepts clearly.

Preferred Qualifications
Experience with multi-tenant SaaS data solutions.
Background in healthcare data, especially provider and payer ecosystems.
Familiarity with DevOps practices, CI/CD pipelines, and version control systems (e.g., Git).
Experience mentoring and coaching other engineers in technical and architectural decision-making.
Itanagar, Arunachal Pradesh, India
Not disclosed
On-site
Full Time
About The Company
Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators, and is known for providing innovative solutions using technology and data science to its client base, serving as the preferred innovation partner to its clients. Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR, backend technologies such as Java Spring Boot and NodeJS, and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture. Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa tech consulting, which delivers technical solutions for clients. Veersa's customer base includes large US healthcare software vendors, pharmacy chains, payers, providers, and hospital chains. Though Veersa's focus geography is North America, it also provides product engineering expertise to a few clients in Australia and Singapore.

About The Role
We are seeking a highly skilled and experienced Senior/Lead Data Engineer to join our growing Data Engineering Team. In this critical role, you will design, architect, and develop cutting-edge multi-tenant SaaS data solutions hosted on Azure Cloud. Your work will focus on delivering robust, scalable, and high-performance data pipelines and integrations that support our enterprise provider and payer data ecosystem.
This role is ideal for someone with deep experience in ETL/ELT processes, data warehousing principles, and real-time and batch data integrations. As a senior member of the team, you will also be expected to mentor and guide junior engineers, help define best practices, and contribute to the overall data strategy. We are specifically looking for someone with strong hands-on experience in SQL, Python, and ideally Airflow and Bash scripting.

Key Responsibilities
Architect and implement scalable data integration and data pipeline solutions using Azure cloud services.
Design, develop, and maintain ETL/ELT processes, including data extraction, transformation, loading, and quality checks using tools like SQL, Python, and Airflow.
Build and automate data workflows and orchestration pipelines; knowledge of Airflow or equivalent tools is a plus.
Write and maintain Bash scripts for automating system tasks and managing data jobs.
Collaborate with business and technical stakeholders to understand data requirements and translate them into technical solutions.
Develop and manage data flows, data mappings, and data quality & validation rules across multiple tenants and systems.
Implement best practices for data modeling, metadata management, and data governance.
Configure, maintain, and monitor integration jobs to ensure high availability and performance.
Lead code reviews, mentor data engineers, and help shape engineering culture and standards.
Stay current with emerging technologies and recommend tools or processes to improve the team's effectiveness.

Required Qualifications
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
3+ years of experience in data engineering, with a strong focus on Azure-based solutions.
Proficiency in SQL and Python for data processing and pipeline development.
Experience in developing and orchestrating pipelines using Airflow (preferred) and writing automation scripts using Bash.
Proven experience in designing and implementing real-time and batch data integrations.
Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, Databricks, or similar technologies.
Strong understanding of data warehousing principles, ETL/ELT methodologies, and data pipeline architecture.
Familiarity with data quality, metadata management, and data validation frameworks.
Strong problem-solving skills and the ability to communicate complex technical concepts clearly.

Preferred Qualifications
Experience with multi-tenant SaaS data solutions.
Background in healthcare data, especially provider and payer ecosystems.
Familiarity with DevOps practices, CI/CD pipelines, and version control systems (e.g., Git).
Experience mentoring and coaching other engineers in technical and architectural decision-making.
Noida
INR 5.86118 - 18.94568 Lacs P.A.
On-site
Full Time
Data Engineer (with SQL, Python, Airflow, Bash)

About the Role
We are seeking a highly skilled and experienced Senior/Lead Data Engineer to join our growing Data Engineering Team. In this critical role, you will design, architect, and develop cutting-edge multi-tenant SaaS data solutions hosted on Azure Cloud. Your work will focus on delivering robust, scalable, and high-performance data pipelines and integrations that support our enterprise provider and payer data ecosystem. This role is ideal for someone with deep experience in ETL/ELT processes, data warehousing principles, and real-time and batch data integrations. As a senior member of the team, you will also be expected to mentor and guide junior engineers, help define best practices, and contribute to the overall data strategy. We are specifically looking for someone with strong hands-on experience in SQL, Python, and ideally Airflow and Bash scripting.

Key Responsibilities
Architect and implement scalable data integration and data pipeline solutions using Azure cloud services.
Design, develop, and maintain ETL/ELT processes, including data extraction, transformation, loading, and quality checks using tools like SQL, Python, and Airflow.
Build and automate data workflows and orchestration pipelines; knowledge of Airflow or equivalent tools is a plus.
Write and maintain Bash scripts for automating system tasks and managing data jobs.
Collaborate with business and technical stakeholders to understand data requirements and translate them into technical solutions.
Develop and manage data flows, data mappings, and data quality & validation rules across multiple tenants and systems.
Implement best practices for data modeling, metadata management, and data governance.
Configure, maintain, and monitor integration jobs to ensure high availability and performance.
Lead code reviews, mentor data engineers, and help shape engineering culture and standards.
Stay current with emerging technologies and recommend tools or processes to improve the team's effectiveness.

Required Qualifications
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
3+ years of experience in data engineering, with a strong focus on Azure-based solutions.
Proficiency in SQL and Python for data processing and pipeline development.
Experience in developing and orchestrating pipelines using Airflow (preferred) and writing automation scripts using Bash.
Proven experience in designing and implementing real-time and batch data integrations.
Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, Databricks, or similar technologies.
Strong understanding of data warehousing principles, ETL/ELT methodologies, and data pipeline architecture.
Familiarity with data quality, metadata management, and data validation frameworks.
Strong problem-solving skills and the ability to communicate complex technical concepts clearly.

Preferred Qualifications
Experience with multi-tenant SaaS data solutions.
Background in healthcare data, especially provider and payer ecosystems.
Familiarity with DevOps practices, CI/CD pipelines, and version control systems (e.g., Git).
Experience mentoring and coaching other engineers in technical and architectural decision-making.
Job Type: Full-time
Pay: ₹586,118.08 - ₹1,894,567.99 per year
Benefits: Health insurance
Schedule: Day shift
Application Question(s):
3+ years of experience in data engineering, with a strong focus on Azure-based solutions (Mandatory)
Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, Databricks, or similar technologies (min. 3 years)
Experience in developing and orchestrating pipelines using Airflow (preferred) and writing automation scripts using Bash (min. 3 years)
Are you from the Delhi NCR location? (Mandatory)
Experience:
Airflow: 3 years (Required)
Bash (Unix shell): 3 years (Required)
SQL: 3 years (Required)
Work Location: In person
Sadar, Uttar Pradesh, India
Not disclosed
On-site
Full Time
About The Company
Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators, and is known for providing innovative solutions using technology and data science to its client base, serving as the preferred innovation partner to its clients. Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR, backend technologies such as Java Spring Boot and NodeJS, and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture. Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa tech consulting, which delivers technical solutions for clients. Veersa's customer base includes large US healthcare software vendors, pharmacy chains, payers, providers, and hospital chains. Though Veersa's focus geography is North America, it also provides product engineering expertise to a few clients in Australia and Singapore.

About The Role
We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you'll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives.
As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems.

Responsibilities
Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter and IICS, and MDM solutions using Informatica MDM.
Develop and maintain batch and real-time data integration workflows.
Collaborate with data architects, business analysts, and stakeholders to understand data requirements.
Perform data profiling, data quality assessments, and master data matching/merging.
Implement governance, stewardship, and metadata management practices.
Optimize the performance of Informatica MDM Hub, IDD, and associated components.
Write complex SQL queries and stored procedures as needed.

Senior Software Engineer – Additional Responsibilities
Lead design discussions and code reviews; mentor junior engineers.
Architect scalable data integration solutions using Informatica and complementary tools.
Drive adoption of best practices in data modeling, governance, and engineering.
Work closely with cross-functional teams to shape the data strategy.

Required Qualifications
Software Engineer:
Bachelor's degree in Computer Science, Information Systems, or a related field.
2 to 4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules).
Strong SQL and data modeling skills.
Familiarity with ETL concepts, REST APIs, and data integration tools.
Understanding of data governance and quality frameworks.
Senior Software Engineer:
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
4+ years of experience in Informatica MDM, with at least 2 years in a lead role.
Proven track record of designing scalable MDM solutions in large-scale environments.
Strong leadership, communication, and stakeholder management skills.
Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus.

Preferred Skills (Nice to Have)
Experience with other Informatica products (IDQ, PowerCenter).
Exposure to cloud MDM platforms or cloud data integration tools.
Agile/Scrum development experience.
Knowledge of industry-standard data security and compliance practices.
Noida, Uttar Pradesh, India
Not disclosed
On-site
Full Time
Job Description
Position: Technical Lead (.NET)
Location: Noida
Experience: 6 to 10 years

Key Responsibilities
Design, develop, and maintain scalable web applications using .NET Core, ASP.NET MVC, and WCF.
Implement front-end solutions using modern frameworks like Angular or React.
Develop, optimize, and maintain data access layers using Entity Framework.
Build and deploy applications on Azure Cloud, leveraging services such as Azure Functions, Azure App Services, and Azure SQL.
Work with VB.NET for legacy system maintenance and migration.
Collaborate with cross-functional teams to define, design, and ship new features.
Ensure code quality through automated tests, code reviews, and adherence to best practices.
Troubleshoot, debug, and upgrade existing systems as needed.
Document technical designs, workflows, and processes on Jira and Confluence for team-wide accessibility.

Team Lead Responsibilities
Technical Leadership: Guide and mentor the development team, ensuring best practices in coding, architecture, and performance optimization.
Project Oversight: Oversee project timelines, technical implementation, and delivery to ensure high-quality releases.
Collaboration: Work closely with product managers, designers, and stakeholders to align development efforts with business goals.
Code Reviews & Standards: Conduct regular code reviews to maintain high coding standards and improve overall system design.
Team Development: Identify skill gaps within the team and provide necessary training and mentorship.
Issue Resolution: Act as an escalation point for complex technical challenges and production issues.
Agile Execution: Lead Agile/Scrum ceremonies, ensuring smooth sprint planning, stand-ups, and retrospectives.

Skills and Qualifications
Technical Skills:
Backend Development: Proficiency in .NET Core, ASP.NET MVC, C#, WCF, and RESTful APIs.
Legacy System Support: Strong hands-on experience in VB.NET for maintaining and refactoring older applications.
· Frontend Development: Expertise in modern front-end frameworks like Angular or React, including state management tools (e.g., Redux or NgRx).
· Database Management: Hands-on experience with Entity Framework, SQL Server, and writing optimized queries for performance.
· Cloud Expertise: Strong experience in Azure Cloud Services, including Azure App Services, Azure Functions, Azure Storage, Azure SQL, and Azure DevOps.
· Architecture: Knowledge of microservices architecture, design patterns, and event-driven systems.
· DevOps and CI/CD: Experience with tools like Azure DevOps, Git, Jenkins, and implementing CI/CD pipelines.
· Testing: Familiarity with automated testing frameworks such as NUnit, MSTest, Jasmine, or Karma for both backend and frontend testing.
· Version Control: Proficiency with Git and branching strategies.
· Containerization: Knowledge of Docker and Kubernetes for containerized application development and deployment.

Soft Skills
· Strong analytical and problem-solving skills with attention to detail.
· Excellent communication and the ability to articulate complex technical concepts to non-technical stakeholders.
· Leadership capabilities with experience mentoring and guiding junior developers.
· Adaptability to work in Agile/Scrum environments and deliver under tight deadlines.

Additional Skills
· WCF service development and ASP.NET Web Services.
· Strong debugging skills.

(ref:hirist.tech)
Noida, Uttar Pradesh, India
Not disclosed
On-site
Full Time
About The Company

Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators. Veersa is known for providing innovative solutions using technology and data science to its client base and is the preferred innovation partner to its clients. Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR; backends such as Java Spring Boot and NodeJS; and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture. Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa tech consulting, which delivers technical solutions for clients. Veersa's customer base includes large US healthcare software vendors, pharmacy chains, payers, providers, and hospital chains. Though Veersa's focus geography is North America, it also provides product engineering expertise to a few clients in Australia and Singapore.

Position : SE/Senior Data Engineer (with SQL, Python, Airflow, Bash)

About The Role

We are seeking a highly skilled and experienced Senior/Lead Data Engineer to join our growing Data Engineering Team. In this critical role, you will design, architect, and develop cutting-edge multi-tenant SaaS data solutions hosted on Azure Cloud.
Your work will focus on delivering robust, scalable, and high-performance data pipelines and integrations that support our enterprise provider and payer data ecosystem. This role is ideal for someone with deep experience in ETL/ELT processes, data warehousing principles, and real-time and batch data integrations. As a senior member of the team, you will also be expected to mentor and guide junior engineers, help define best practices, and contribute to the overall data strategy. We are specifically looking for someone with strong hands-on experience in SQL and Python, and ideally Airflow and Bash scripting.

Key Responsibilities
· Architect and implement scalable data integration and data pipeline solutions using Azure cloud services.
· Design, develop, and maintain ETL/ELT processes, including data extraction, transformation, loading, and quality checks, using tools like SQL, Python, and Airflow.
· Build and automate data workflows and orchestration pipelines; knowledge of Airflow or equivalent tools is a plus.
· Write and maintain Bash scripts for automating system tasks and managing data jobs.
· Collaborate with business and technical stakeholders to understand data requirements and translate them into technical solutions.
· Develop and manage data flows, data mappings, and data quality and validation rules across multiple tenants and systems.
· Implement best practices for data modeling, metadata management, and data governance.
· Configure, maintain, and monitor integration jobs to ensure high availability and performance.
· Lead code reviews, mentor data engineers, and help shape engineering culture and standards.
· Stay current with emerging technologies and recommend tools or processes to improve the team's effectiveness.

Required Qualifications
· B.Tech or B.E. degree in Computer Science, Information Systems, or a related field.
· 3+ years of experience in data engineering, with a strong focus on Azure-based solutions.
· Proficiency in SQL and Python for data processing and pipeline development.
· Experience developing and orchestrating pipelines using Airflow (preferred) and writing automation scripts using Bash.
· Proven experience designing and implementing real-time and batch data integrations.
· Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, Databricks, or similar technologies.
· Strong understanding of data warehousing principles, ETL/ELT methodologies, and data pipeline architecture.
· Familiarity with data quality, metadata management, and data validation frameworks.
· Strong problem-solving skills and the ability to communicate complex technical concepts clearly.

Preferred Qualifications
· Experience with multi-tenant SaaS data solutions.
· Background in healthcare data, especially provider and payer ecosystems.
· Familiarity with DevOps practices, CI/CD pipelines, and version control systems (e.g., Git).
· Experience mentoring and coaching other engineers in technical and architectural decision-making.
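For candidates unfamiliar with the extract/transform/load split this role centers on, a minimal sketch in plain Python follows. It is purely illustrative, not Veersa's actual pipeline: the provider records, the NPI validation rule, and the sqlite3 in-memory staging table are all invented for the example; a production pipeline would orchestrate steps like these as Airflow tasks against Azure-hosted storage.

```python
import sqlite3

# Hypothetical mini-ETL: extract raw provider records, transform (clean and
# validate), then load into a staging table. All names are illustrative.

def extract(raw_rows):
    """Extract: yield raw records (here, from an in-memory list)."""
    for row in raw_rows:
        yield row

def transform(rows):
    """Transform: normalize names and drop records failing a basic quality check."""
    for row in rows:
        name = row.get("name", "").strip().title()
        npi = row.get("npi", "")
        if name and npi.isdigit() and len(npi) == 10:  # NPIs are 10 digits
            yield (npi, name)

def load(records, conn):
    """Load: upsert validated records into the staging table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS providers (npi TEXT PRIMARY KEY, name TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO providers VALUES (?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    raw = [
        {"npi": "1234567890", "name": "  jane doe "},
        {"npi": "bad", "name": "x"},  # fails validation, dropped
    ]
    conn = sqlite3.connect(":memory:")
    load(transform(extract(raw)), conn)
    print(conn.execute("SELECT COUNT(*) FROM providers").fetchone()[0])  # → 1
```

In an Airflow deployment, each of these functions would typically become its own task so that failures are retried and monitored per stage rather than per pipeline run.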