We are looking for a talented Frontend Developer to contribute to our core product development. You will be involved in the full software development lifecycle, building scalable and user-friendly applications. Responsibilities Develop and maintain user-facing features using modern web technologies, such as React.js. Collaborate with designers and product managers to translate design mockups and user stories into responsive and engaging web applications. Optimize application performance and ensure cross-browser compatibility. Implement best practices and coding standards to ensure high-quality and maintainable code. Participate in code reviews and provide constructive feedback to improve code quality. Stay up-to-date with the latest industry trends and technologies to drive innovation in frontend development. Requirements Strong proficiency in HTML, CSS, and JavaScript. Experience in building web applications using React.js. Familiarity with RESTful APIs and integrating frontend applications with backend services. Understanding of responsive design principles and mobile-first development. Knowledge of version control systems, such as Git. Ability to work collaboratively in an Agile/Scrum development environment. Excellent problem-solving and communication skills. This job was posted by Sharan Mithran from The DataFlow Group.
Job Role: Dataflow is looking to hire an experienced full-stack developer with rich experience in developing Node.js and React.js or Angular applications on Amazon’s AWS platform for internal and external consumers. This highly responsible position involves using established work procedures to analyse, design, develop, implement, maintain, re-engineer and troubleshoot the applications, both legacy and new. Successful candidates should be focused on rapid, agile delivery of high-quality designs with an eye for the smaller details and a passion for over-delivering. In return you can expect a salary commensurate with your experience, and the freedom to grow your capabilities into leading-edge facets of technology. Duties and responsibilities: ● Working with DataFlow’s business analysis and project management team to fully comprehend requirements, to gather requirement stories and to develop solutions that accurately meet the design specs. ● Delivering innovative and well-constructed technology solutions that meet the needs of today but are envisioned for future use. ● Contributing experienced points of view to the remainder of the technology team. ● Developing the highest quality code with associated commentary and documentation. ● Ensuring that data and application security are considered at the very outset of development and through the lifecycle of deployment. ● Responding quickly to major incidents and outages, providing immediate workarounds where business is impacted. Key skills/requirements: ● Design, develop, and maintain high-quality full-stack applications using Node.js for the back end and React.js for the front end. ● Architect and implement microservices with a focus on scalability and performance. ● Develop and manage RESTful APIs using frameworks like Express.js or Meteor. ● Optimize application performance and troubleshoot production issues. ● Collaborate with cross-functional teams to define, design, and deliver new features. ● Ensure code quality by implementing best practices, including testing, documentation, and continuous integration/deployment (CI/CD). ● Work within Amazon AWS architecture, utilizing services such as Lambda, S3, and RDS. ● Manage and maintain relational databases such as Oracle, MySQL, PostgreSQL, or their AWS RDS equivalents. ● Document and test APIs using tools like Swagger or Postman. ● 14+ years of full-stack development experience using Node.js and React.js/Angular. ● Hands-on experience with microservices architecture. ● Strong problem-solving skills and attention to detail. ● Excellent communication skills in English. ● Experience in agile development methodologies. ● Build user-friendly and responsive interfaces using React.js. ● Ensure seamless integration between front-end and back-end services.
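For illustration only, here is a minimal sketch of the kind of AWS service interaction listed above. It is written in Python rather than the Node.js this role actually calls for, and the event fields follow the standard S3 put-notification shape; the bucket, key, and summary payload are assumptions, not details from the posting.

```python
# Hypothetical Lambda-style handler: fetch an uploaded document from S3 and
# return a small JSON summary. Assumes an S3 put-event trigger.
import json

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    # Standard S3 notification structure: Records[0].s3.bucket.name / object.key
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    obj = s3.get_object(Bucket=bucket, Key=key)
    body = obj["Body"].read()

    return {
        "statusCode": 200,
        "body": json.dumps({"key": key, "size_bytes": len(body)}),
    }
```

The same pattern translates directly to a Node.js handler using the AWS SDK, which is what the role itself would use.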
Role Overview: We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our robust data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential. Key Responsibilities: Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse. Architect and build batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements. Utilize and optimize a wide array of AWS data services, including but not limited to AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, AWS EMR, and others, to build and manage data pipelines. Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions. Ensure data quality, integrity, and security across all data pipelines and storage solutions. Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability. Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs. Implement data governance policies and best practices within the Data Lake and Data Warehouse environments. Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement. Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming. Required Qualifications: 10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development. Deep expertise in ETL Tools: Extensive hands-on experience with commercial or open-source ETL tools (Talend). Strong proficiency in Data Streaming Technologies: Proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar. Extensive AWS Data Services Experience: Proficiency with AWS S3 for data storage and management. Hands-on experience with AWS Glue for ETL orchestration and data cataloging. Strong knowledge of AWS Redshift for data warehousing and analytics. Familiarity with AWS Lake Formation for building secure data lakes. Good to have experience with AWS EMR for big data processing. Data Warehouse (DWH) Knowledge: Strong background in traditional data warehousing concepts, dimensional modeling (Star Schema, Snowflake Schema), and DWH design principles. Programming Languages: Proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation. Database Skills: Strong understanding of relational databases and NoSQL databases. Version Control: Experience with version control systems (e.g., Git). Problem-Solving: Excellent analytical and problem-solving skills with a keen eye for detail. Communication: Strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
Preferred Qualifications: Certifications in AWS Data Analytics or other relevant areas.
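As a purely illustrative sketch of the streaming-to-lake pattern this posting describes, the snippet below consumes JSON events from a Kafka topic and lands them in S3 as micro-batches. The topic, bucket, and key prefix are hypothetical, and it assumes the kafka-python and boto3 libraries.

```python
# Micro-batch ingestion sketch: Kafka topic -> newline-delimited JSON files in S3.
import json
import time

import boto3
from kafka import KafkaConsumer  # pip install kafka-python boto3

TOPIC = "applicant-events"        # assumed topic name
BUCKET = "example-data-lake-raw"  # assumed S3 bucket
BATCH_SIZE = 500

consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=False,
)
s3 = boto3.client("s3")

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= BATCH_SIZE:
        key = f"raw/applicant_events/{int(time.time())}.jsonl"
        body = "\n".join(json.dumps(r) for r in batch).encode("utf-8")
        s3.put_object(Bucket=BUCKET, Key=key, Body=body)  # land the micro-batch
        consumer.commit()  # commit offsets only after a successful write
        batch = []
```

Committing offsets only after the S3 write succeeds keeps the pipeline at-least-once; downstream jobs (Glue, Redshift loads) would deduplicate as needed.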
Key Responsibilities Data Analysis & Insight Generation: Conduct in-depth analysis of large and complex datasets to identify trends, patterns, and anomalies. Translate raw data into clear, concise, and actionable insights that address key business questions. Data Manipulation & Transformation: Expertly "slice and dice" data using advanced functions and features in Google Sheets or Microsoft Excel to prepare data for analysis, build models, and create reports. Reporting & Visualization: Develop and maintain comprehensive dashboards, reports, and presentations that effectively communicate analytical findings to various stakeholders, including senior leadership. Problem Solving: Proactively identify business problems and opportunities, and leverage data to propose solutions and strategies. Stakeholder Collaboration: Work closely with cross-functional teams (e.g., Marketing, Operations, Product, Finance) to understand their data needs and provide data-driven support for decision-making. Data Quality & Governance: Contribute to ensuring data accuracy, consistency, and integrity across various data sources. Methodology & Innovation: Continuously explore and implement new analytical techniques and tools to enhance the team's capabilities and efficiency. Qualifications Required Experience: Progressive experience in a data analytics role, with a proven track record of delivering impactful data-driven insights. Analytical Mindset: Demonstrated strong analytical and problem-solving skills, with an innate curiosity and ability to break down complex problems into manageable components. Data Manipulation Expertise: Exceptional hands-on proficiency with either Google Sheets or Microsoft Excel (advanced functions, pivot tables, VLOOKUP/XLOOKUP, conditional formatting, data validation, charting, etc.) is a must. Data Slicing & Dicing: Proven ability to effectively manipulate, transform, and analyze large volumes of data from various sources to uncover meaningful patterns and relationships. Communication Skills: Excellent verbal and written communication skills, with the ability to articulate complex analytical concepts to non-technical audiences clearly and concisely. Attention to Detail: Meticulous attention to detail and a commitment to data accuracy and integrity. Proactiveness: Self-motivated, proactive, and able to work independently as well as collaboratively within a team environment. Preferred Qualifications (Nice to Haves): Experience with SQL for data querying and extraction. Experience with statistical analysis and modeling (e.g., Python, R)
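By way of example, the following pandas sketch mirrors the kind of "slice and dice" pivot-table analysis described above. The file name and column names are hypothetical, and pandas is used here only as an illustration alongside the Google Sheets / Excel skills the posting asks for.

```python
# Illustrative slice-and-dice: average turnaround by region and case type,
# the pandas equivalent of an Excel pivot table (hypothetical columns).
import pandas as pd

df = pd.read_excel(
    "operations_export.xlsx",
    parse_dates=["submitted_at", "completed_at"],
)

# Clean and enrich before pivoting
df = df[df["status"].notna()]
df["turnaround_days"] = (df["completed_at"] - df["submitted_at"]).dt.days

summary = pd.pivot_table(
    df,
    index="region",
    columns="case_type",
    values="turnaround_days",
    aggfunc="mean",
).round(1)

print(summary)
summary.to_csv("turnaround_summary.csv")  # feed a dashboard or report
```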
About The Organisation DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions, and background screening and immigration compliance services that assist public and private organizations in mitigating risks to make informed, cost-effective decisions regarding their Applicants and Registrants. About The Role We're currently searching for an experienced business analyst to help guide our organization to the future. From researching progressive systems solutions to evaluating their impacts, the ideal candidate will be a detailed planner, expert communicator, and top-notch analyst. This person should also be wholly committed to the discovery and development of innovative solutions in an ever-changing digital landscape. Duties And Responsibilities Strategic Alignment : Collaborate closely with senior leadership (e.g., C-suite executives, Directors) to understand their strategic goals, key performance indicators (KPIs), and critical information needs. Requirements Elicitation & Analysis Facilitate workshops, interviews, and other elicitation techniques to gather detailed business requirements for corporate analytics dashboards. Analyze and document these requirements clearly, concisely, and unambiguously, ensuring alignment with overall business strategy. User Story & Acceptance Criteria Definition Translate high-level business requirements into detailed user stories with clear and measurable acceptance criteria for the development team. Data Understanding & Mapping Work with data owners and subject matter experts to understand underlying data sources, data quality, and data governance policies relevant to the dashboards. Collaborate with the development team on data mapping and transformation logic. Dashboard Design & Prototyping Collaboration Partner with UI/UX designers and the development team to conceptualize and prototype dashboard layouts, visualizations, and user interactions that effectively communicate key insights to senior stakeholders. Provide feedback and ensure designs meet business requirements and usability standards. Stakeholder Communication & Management Act as the central point of contact between senior leadership and the development team. Proactively communicate progress, challenges, and key decisions to all stakeholders. Manage expectations and ensure alignment throughout the project lifecycle. Prioritization & Backlog Management Work with stakeholders to prioritize dashboard development based on business value and strategic importance. Maintain and groom the product backlog, ensuring it reflects current priorities and requirements. Testing & Validation Support Support the testing phase by reviewing test plans, participating in user acceptance testing (UAT), and ensuring the delivered dashboards meet the defined requirements and acceptance criteria. Training & Documentation Develop and deliver training materials and documentation for senior users on how to effectively utilize the new dashboards and interpret the presented data. Continuous Improvement Gather feedback from users post-implementation and work with the development team to identify areas for improvement and future enhancements to the corporate analytics platform. Industry Best Practices Stay abreast of the latest trends and best practices in business intelligence, data visualization, and analytics. Project Management Develop and maintain project plans for agreed initiatives in collaboration with stakeholders.
Monitor project progress against defined timelines, prepare and present regular project status reports to stakeholders. Qualifications Bachelor's degree in Business Administration, Computer Science, Information Systems, Economics, Finance, or a related field. Minimum of 10+ years of experience as a Business Analyst, with a significant focus on business intelligence, data analytics, and dashboard development projects. Proven experience in leading requirements gathering and analysis efforts with senior leadership and executive stakeholders, and the ability to translate complex business requirements into clear and actionable technical specifications. Demonstrable experience in managing BI and dashboarding projects, including project planning, risk management, and stakeholder communication. Strong understanding of reporting, data warehousing concepts, ETL processes and data modeling principles. Excellent knowledge of data visualization best practices and principles of effective dashboard design. Experience working with common business intelligence and data visualization tools (e.g., Tableau, Power BI, Qlik Sense). Exceptional communication (written and verbal), presentation, and interpersonal skills, with the ability to effectively communicate with both business and technical audiences. Strong facilitation and negotiation skills to lead workshops and drive consensus among diverse stakeholder groups. Excellent analytical and problem-solving skills with keen attention to detail. Ability to work independently and manage multiple priorities in a fast-paced environment. Experience with Agile methodologies (e.g., Scrum, Kanban). (ref:hirist.tech)
About The Organisation DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions, and background screening and immigration compliance services that assist public and private organizations in mitigating risks to make informed, cost-effective decisions regarding their Applicants and Registrants. About The Role We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our robust data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential. Duties And Responsibilities Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse. Architect and build batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements. Utilize and optimize a wide array of AWS data services. Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions. Ensure data quality, integrity, and security across all data pipelines and storage solutions. Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability. Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs. Implement data governance policies and best practices within the Data Lake and Data Warehouse environments. Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement. Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming. Qualifications 10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development. Deep expertise in ETL Tools : Extensive hands-on experience with commercial ETL tools (Talend). Strong proficiency in Data Streaming Technologies : Proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar. Extensive AWS Data Services Experience : Proficiency with AWS S3 for data storage and management. Hands-on experience with AWS Glue for ETL orchestration and data cataloging. Familiarity with AWS Lake Formation for building secure data lakes. Good to have experience with AWS EMR for big data processing. Data Warehouse (DWH) Knowledge : Strong background in traditional data warehousing concepts, dimensional modeling (Star Schema, Snowflake Schema), and DWH design principles. Programming Languages : Proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation. Database Skills : Strong understanding of relational databases and NoSQL databases. Version Control : Experience with version control systems (e.g., Git). Problem-Solving : Excellent analytical and problem-solving skills with a keen eye for detail.
Communication : Strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. (ref:hirist.tech)
As a Senior ETL & Data Streaming Engineer at DataFlow Group, a global provider of Primary Source Verification solutions and background screening services, you will be a key player in the design, development, and maintenance of robust data pipelines. With over 10 years of experience, you will leverage your expertise in both batch ETL processes and real-time data streaming technologies to ensure efficient data extraction, transformation, and loading into our Data Lake and Data Warehouse. Your responsibilities will include designing and implementing highly scalable ETL processes using industry-leading tools, as well as architecting batch and real-time data streaming solutions with technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis. You will collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into effective pipeline solutions, ensuring data quality, integrity, and security across all storage solutions. Monitoring, troubleshooting, and optimizing existing data pipelines for performance, cost-efficiency, and reliability will be a crucial part of your role. Additionally, you will develop comprehensive documentation for all ETL and streaming processes, contribute to data governance policies, and mentor junior engineers to foster a culture of technical excellence and continuous improvement. To excel in this position, you should have 10+ years of progressive experience in data engineering, with a focus on ETL, ELT, and data pipeline development. Your deep expertise in ETL tools like Talend, proficiency in Data Streaming Technologies such as AWS Glue and Apache Kafka, and extensive experience with AWS data services like S3, Glue, and Lake Formation will be essential. Strong knowledge of traditional data warehousing concepts, dimensional modeling, programming languages like SQL and Python, and relational and NoSQL databases will also be required. If you are a problem-solver with excellent analytical skills, strong communication abilities, and a passion for staying updated on emerging technologies and industry best practices in data engineering, ETL, and streaming, we invite you to join our team at DataFlow Group and make a significant impact in the field of data management.
As a Senior ETL & Data Streaming Engineer at DataFlow Group, you will have the opportunity to utilize your extensive expertise in designing, developing, and maintaining robust data pipelines. With over 10 years of experience in the field, you will play a pivotal role in ensuring the scalability, fault-tolerance, and performance of our ETL processes. Your responsibilities will include architecting and building both batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis. You will collaborate closely with data architects, data scientists, and business stakeholders to translate data requirements into efficient pipeline solutions and ensure data quality, integrity, and security across all storage solutions. In addition to monitoring, troubleshooting, and optimizing existing data pipelines, you will be responsible for developing and maintaining comprehensive documentation for all ETL and streaming processes. Your role will involve implementing data governance policies and best practices within the Data Lake and Data Warehouse environments, as well as mentoring junior engineers to foster a culture of technical excellence and continuous improvement. To excel in this role, you should have a strong background in data engineering, with a focus on ETL, ELT, and data pipeline development. Your deep expertise in ETL tools, data streaming technologies, and AWS data services will be essential for success. Proficiency in SQL and at least one scripting language for data manipulation, along with strong database skills, will also be valuable assets in this position. If you are a proactive problem-solver with excellent analytical skills and strong communication abilities, this role offers you the opportunity to stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming. Join us at DataFlow Group and be part of a team dedicated to making informed, cost-effective decisions through cutting-edge data solutions.
About the organisation DataFlow Group, founded in 2007, is a global leader in Primary Source Verification (PSV), background screening, and immigration compliance solutions. The business works with a range of global public and private sector organisations to mitigate risk by validating credentials and detecting fraudulent documents, safeguarding communities and organisations worldwide. With over 160,000 issuing authorities across more than 200 countries, DataFlow is at the forefront of trust and transparency in talent verification. The mission is simple: Empower talent to navigate careers without borders, with trust and transparency. To learn more about DataFlow Group, please visit: https://www.dataflowgroup.com/. Role Context and Summary DataFlow Group is seeking a highly skilled and results-driven Quality Assurance Director to lead the end-to-end quality assurance function for the rollout and ongoing delivery of our new Apex Platform - a mission-critical system that supports primary source verification for professional credentials, licences and work experience. This role is instrumental in designing, implementing and governing the test strategy, quality assurance processes, and associated tooling that ensure the platform meets the highest standards of functionality, usability, performance and scalability. The ideal candidate will have deep experience in leading quality assurance and test teams, driving test automation, and building the testing components of CI/CD pipelines to support fast, iterative and high-quality delivery. Key responsibilities 1. Test Strategy & Governance Define and own the overall end-to-end quality assurance strategy, covering functional, non-functional, integration, regression, team capability, tooling strategy and test KPIs to measure the effectiveness of the test strategy. Develop and implement a test governance framework to ensure test traceability, coverage and quality control. 2. Tooling Strategy Define and implement a test tool strategy, selecting, configuring and managing the test tools and frameworks (e.g. Selenium, Playwright, Cypress, Postman, JMeter, Gitlab, Sonarqube, DevOps, Jenkins) so that they work together seamlessly in the CI pipeline. 3. CI/CD Design and implement the testing architecture within the CI/CD pipeline to support automated build, test and deployment cycles. Work closely with the engineering team to integrate automated tests (unit, API, UI, functional, and regression) into the CI/CD workflows. Establish shift-left testing practices, enabling earlier detection of defects in the SDLC. Build reusable test libraries and test automation suites to accelerate regression and release testing. 4. Team Leadership & Collaboration Lead a cross-functional team of test engineers, automation specialists and manual testers. Foster a culture of quality, continuous testing, and proactive risk identification. Work closely with Product Engineering and Business Operations to align on priorities and milestones. 5. Performance & Scalability Develop and refine the platform volumetrics and manage the benchmarking activities to establish a baseline. Plan and execute performance testing aligned to volumetric benchmarks, SLAs and peak scenarios, by managing an external vendor for this exercise. Validate platform stability and scalability through repeatable test cycles and proactive risk identification. Provide assurance on platform readiness for client migrations and high-volume activity.
6. Operational Excellence Develop and maintain test metrics and reporting dashboards to inform stakeholders of quality status, test progress and defect trends. Essential requirements and qualifications Minimum of 15+ years in the software development industry, with 3+ years as a Test/QA Manager. Proven experience with designing and running test strategies for complex platform rollouts. Deep knowledge of QA methodologies, Agile delivery and DevOps practices. Experience in building and maintaining automated test pipelines in AWS CI/CD environments. Hands-on experience with tools such as Selenium, Cypress, Playwright, JMeter, Gitlab, and Jenkins. Familiar with working within a Hyperscaler environment such as AWS, GCP or Azure. Ability to manage test planning, defect triage, and test sign-off for large-scale programs. Strong stakeholder communications and leadership skills. Experienced with API testing, microservices, and data migrations. ISTQB or other formal testing certifications.
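As a small illustration of the automated checks such a CI pipeline might run, the sketch below uses pytest and requests for a couple of API smoke tests. pytest is not named in the posting, and the base URL and endpoint paths are assumed placeholders, not real Apex Platform routes.

```python
# Illustrative API smoke tests of the kind a CI stage could run on every build.
import os

import pytest
import requests

# Assumed environment variable; a real pipeline would inject this per environment.
BASE_URL = os.environ.get("APEX_API_BASE_URL", "https://apex.example.com")


@pytest.mark.parametrize("path", ["/health", "/api/v1/verifications"])
def test_endpoint_is_up(path):
    resp = requests.get(f"{BASE_URL}{path}", timeout=10)
    assert resp.status_code == 200


def test_create_verification_requires_auth():
    # Unauthenticated writes should be rejected (401/403): a cheap shift-left check.
    resp = requests.post(f"{BASE_URL}/api/v1/verifications", json={}, timeout=10)
    assert resp.status_code in (401, 403)
```

Hooked into the pipeline (for example as a GitLab CI or Jenkins stage), failures here would block promotion before the heavier UI and regression suites run.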
DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions, background screening, and immigration compliance services. The company assists public and private organizations in mitigating risks to make informed decisions regarding their Applicants and Registrants. DataFlow adheres to the highest regulatory standards, including JCI, ISO, and GDPR. Headquartered in Dubai, DataFlow has offices in multiple regions globally and provides multilingual services tailored to local markets. The company utilizes proficient language partners to enhance linguistic capabilities for document processing. We are seeking a highly experienced and strategic Senior Manager to lead our global Legal, Compliance, and Contract Management functions. This role is pivotal in ensuring legal integrity, regulatory adherence, and effective contract lifecycle management. The ideal candidate will have a deep understanding of legal principles, compliance frameworks, and contract administration to provide expert guidance to senior leadership. **Duties And Responsibilities:** **Legal & Compliance:** - Provide expert legal advice to senior management on legal and compliance matters. - Anticipate and mitigate potential legal risks. - Develop and implement legal strategies aligned with business objectives. **Regulatory Compliance:** - Establish and maintain a comprehensive compliance program. - Monitor and interpret relevant laws and regulations. - Conduct internal audits and investigations for compliance. - Manage relationships with regulatory bodies. **Risk Management:** - Identify, assess, and mitigate legal and compliance risks. - Develop risk management policies and procedures. - Ensure effective risk reporting. **Training & Education:** - Provide training and education on legal and compliance matters to employees. **Liaison:** - Act as a liaison with external legal counsel and regulatory bodies. - Manage intellectual property portfolios. **Contract Management:** - Oversee the contract lifecycle from drafting to termination. - Maintain a centralized contract database. - Review and negotiate contract terms to protect company interests. - Monitor contract compliance and generate reports. - Develop contract templates. **Policy Development:** - Implement company-wide legal, compliance, and contract management policies. - Communicate policies to all employees. **Dashboard Development & Reporting:** - Create interactive dashboards to visualize key metrics. - Analyze data to identify trends and insights. - Track relevant KPIs for legal, compliance, and contract management. - Generate reports and presentations for management. - Automate data collection and reporting. **Requirements:** - Bachelor's degree in Law (LLB) required; Master's degree (LLM) preferred. - 8+ years of experience in legal, compliance, and contract management. - Strong understanding of relevant laws, regulations, and industry standards. - Experience in contract negotiation, drafting, and administration. - Proficiency in contract management tools. - Excellent communication, negotiation, and interpersonal skills. - Strong analytical and problem-solving abilities. - Ability to work independently and as part of a team. - Detail-oriented, organized, and experienced in compliance programs. - Knowledge of risk assessment methodologies and data privacy laws. - High ethical standards and integrity. *Note: This job description was sourced from iimjobs.com*,
As a Business Analyst at DataFlow Group, your role is crucial in guiding the organization towards the future by researching progressive system solutions, evaluating their impacts, and developing innovative solutions in the ever-changing digital landscape. You will collaborate closely with senior leadership to understand strategic goals, key performance indicators, and critical information needs. By facilitating workshops, interviews, and other elicitation techniques, you will gather detailed business requirements for corporate analytics dashboards and ensure alignment with the overall business strategy. Translating high-level business requirements into detailed user stories with clear acceptance criteria for the development team will be a key responsibility. You will work with data owners and subject matter experts to understand data sources, quality, and governance policies, collaborating on data mapping and transformation logic. Partnering with UI/UX designers and the development team, you will conceptualize dashboard layouts, visualizations, and user interactions that communicate key insights effectively to senior stakeholders. Act as the central point of contact between senior leadership and the development team, proactively communicating progress, challenges, and key decisions to all stakeholders throughout the project lifecycle. You will prioritize dashboard development based on business value and strategic importance, maintaining and grooming the product backlog to reflect current priorities and requirements. Supporting the testing phase, participating in user acceptance testing, and ensuring delivered dashboards meet defined requirements and acceptance criteria are crucial tasks. Developing and delivering training materials and documentation for senior users on utilizing new dashboards and interpreting data, gathering feedback post-implementation, and identifying areas for improvement are part of your responsibilities. Staying updated with industry best practices in business intelligence, data visualization, and analytics is essential. Your qualifications include a Bachelor's degree in Business Administration, Computer Science, Information Systems, Economics, Finance, or a related field, along with a minimum of 10+ years of experience as a Business Analyst focusing on business intelligence, data analytics, and dashboard development projects. Strong communication, presentation, and interpersonal skills are required to effectively communicate with both business and technical audiences. Additionally, experience with Agile methodologies and common business intelligence tools is preferred. In this role, you will play a vital part in shaping the future of DataFlow Group by leveraging your expertise in business analysis, data analytics, and dashboard development to drive strategic decision-making and innovation.
DataFlow Group, founded in 2007, is a global leader in Primary Source Verification (PSV), background screening, and immigration compliance solutions. The business collaborates with various global public and private sector organizations to mitigate risk by validating credentials and identifying fraudulent documents, thereby safeguarding communities and organizations worldwide. With a network of over 160,000 issuing authorities spanning across more than 200 countries, DataFlow Group upholds trust and transparency in talent verification. The mission at DataFlow Group is clear and concise: Empower talent to navigate careers without borders, fostering an environment of trust and transparency. For more information about DataFlow Group, please visit their website at: https://www.dataflowgroup.com/. DataFlow Group is currently looking for a highly skilled and results-driven Quality Assurance Director to take charge of the end-to-end quality assurance function for the implementation and continuous delivery of their new Apex Platform. This platform serves as a mission-critical system supporting primary source verification for professional credentials, licenses, and work experience. In this role, the Quality Assurance Director will play a pivotal role in formulating, executing, and overseeing the test strategy, quality assurance processes, and related tools to ensure that the platform adheres to the highest standards of functionality, usability, performance, and scalability. The ideal candidate should possess extensive experience in leading quality assurance and test teams, promoting test automation, and establishing the testing components of CI/CD pipelines to facilitate rapid, iterative, and high-quality delivery. Key Responsibilities: **Test Strategy & Governance** Define and take ownership of the comprehensive end-to-end quality assurance strategy encompassing functional, non-functional, integration, regression, team capability, tooling strategy, and test KPIs for evaluating the efficacy of the test strategy. Establish and implement a test governance framework to ensure test traceability, coverage, and quality control. **Tooling Strategy** Define and implement a test tool strategy by selecting, configuring, and managing test tools and frameworks (e.g., Selenium, Playwright, Cypress, Postman, JMeter, Gitlab, Sonarqube, DevOps, Jenkins) to seamlessly integrate in the CI pipeline. **CI/CD** Architect and implement the testing architecture within the CI/CD pipeline to support automated build, test, and deployment cycles. Collaborate closely with the engineering team to incorporate automated tests (unit, API, UI, functional, and regression) into the CI/CD workflows. Introduce shift left testing practices to enable early defect detection in the SDLC. **Team Leadership & Collaboration** Lead a cross-functional team comprising test engineers, automation specialists, and manual testers. Cultivate a culture of quality, continuous testing, and proactive risk identification. Engage with Product Engineering and Business Operations teams to align on priorities and milestones. **Performance & Scalability** Develop and refine platform volumetrics, oversee benchmarking activities to establish a baseline. Plan and conduct performance testing in line with volumetric benchmarks, SLAs, and peak scenarios by coordinating with an external vendor for this purpose. Verify platform stability and scalability through repeatable test cycles and proactive risk identification. 
Ensure platform readiness for client migrations and high-volume activities. **Operational Excellence** Create and maintain test metrics and reporting dashboards to update stakeholders on quality status, test progress, and defect trends. Essential Requirements and Qualifications: Minimum of 15+ years in the software development industry, with at least 3 years as a Test/QA Manager. Proven track record of designing and executing test strategies for complex platform rollouts. In-depth understanding of QA methodologies, Agile delivery, and DevOps practices. Hands-on experience with tools like Selenium, Cypress, Playwright, JMeter, Gitlab, and Jenkins. Familiarity with working in a Hyperscaler environment such as AWS, GCP, or Azure. Proficiency in managing test planning, defect triage, and test sign-off in large-scale programs. Strong communication and leadership skills with stakeholders. Experience with API testing, microservices, and data migrations. Possession of ISTQB or other formal testing certifications.
About The Role We are seeking a highly experienced and passionate Engineering Manager to lead and grow our software engineering team. In this critical role, you will be responsible for leading and mentoring a team of talented engineers, driving the technical vision for our products, and ensuring the successful delivery of high-quality, impactful solutions. Responsibilities Lead and mentor a high-performing team of software engineers, fostering a collaborative and innovative environment. Define and execute the technical vision for our products, ensuring they align with business objectives and industry best practices. Design, develop, and review technical architectures for complex software systems. Drive the adoption of agile methodologies and best practices within the engineering team. Collaborate closely with product managers, stakeholders, and other departments to understand business requirements and translate them into technical solutions. Ensure high code quality and maintainability through code reviews, design reviews, and continuous improvement processes. Drive efficiency and predictability in a fast-paced, agile environment. Foster a culture of innovation and continuous learning within the team. Mentor and groom senior tech talent and first-line managers. Build strong relationships with senior leadership while effectively advocating for the team's needs and priorities. Qualifications 12+ years of experience in software development and product engineering. Proven experience managing and leading multiple high-performing engineering teams. Strong technical architecture and design skills. Hands-on experience with software development, including coding, code reviews, and design reviews. Experience with agile development methodologies and best practices. Excellent communication, interpersonal, and presentation skills. Strong leadership and mentorship skills with the ability to motivate and inspire teams. Proven ability to build and maintain strong relationships with stakeholders across different departments. A passion for technology and a desire to build innovative and impactful solutions. Bonus Points Experience with cloud platforms (We are in AWS). Experience with building and scaling high-volume, high-availability systems. (ref:hirist.tech)
We are seeking a highly motivated and detail-oriented Associate Operations to perform the critical Primary Source Verification process. This role is important to our operations, ensuring the accuracy and timely completion of validation, verification, and research work for the applicants and for our clients. The ideal candidate will possess excellent communication skills, a keen eye for detail, and the ability to work effectively in a fast-paced environment. This position offers an excellent opportunity for individuals looking to start or build their career in operations and compliance. Duties and Responsibilities: ● Initiation of Checks: Accurately initiate the verification process within our system, ensuring all necessary information is correctly reviewed and validated. This is the critical first step in the entire verification lifecycle. ● Communication & Coordination: Effectively communicate with various external stakeholders such as issuing authorities (government agencies, educational institutions, previous employers, etc.) and vendors located across different regions to request and obtain necessary verification information. Coordinate and communicate with internal stakeholders such as the applicant assist team, insufficiency support, immediate supervisors, client delivery managers, etc. This requires clear and concise communication, both written and verbal. ● Quality Assurance: Conduct a thorough review of submitted documents to ensure accurate processing and flag to the applicant any documents that are incomplete, unclear, or tampered with in any way. This ensures the accuracy and integrity of our reports. ● Research & Analysis: Conduct detailed secondary research and analysis on issuing authorities and verification processes to stay up-to-date on requirements and best practices. This includes understanding the nuances of different verification sources and their processes. ● Process Improvement: Identify opportunities to improve the efficiency and effectiveness of the background verification process. Qualifications and Work Experience: ● Education: Graduate/3 Years Diploma Holder ● Experience: 1+ year of experience. ● Essential Skills: ○ Excellent Written & Spoken English: Must be able to communicate clearly and professionally, both verbally and in writing. Emphasis on writing short, clear, and error-free messages and sentences. ○ Detail Orientation: A strong ability to focus on details and identify even minor discrepancies or errors. A true "eye for detail" is essential. ○ Analytical Skills: Ability to analyze information from various sources and synthesize it into a coherent report. ○ Communication Skills: Ability to communicate effectively with a variety of stakeholders, including issuing authorities, vendors, and clients. ○ Computer Proficiency: Must be comfortable with using computers and working on multiple screens, using internal tools. ○ Adaptability: Ability to work in a fast-paced environment and adapt to changing priorities. ○ Open to Calling Profile: Comfortable making outbound calls to verification sources. Skills Focus: This role heavily emphasizes the following skills: ● Detail Orientation: Consistently and accurately processes information, minimizing errors. ● Eye for Detail: Proactively identifies discrepancies, inconsistencies, and errors in data. ● Identifying Errors: Quickly and accurately recognizes mistakes in information or processes. ● Writing Short and Clear Messages and Sentences: Communicates effectively and efficiently in writing, ensuring clarity and conciseness.
Job Types: Full-time, Fresher Pay: Up to ₹300,000.00 per year Benefits: Flexible schedule Work Location: In person
About The Organisation DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions, and background screening and immigration compliance services that assist public and private organizations in mitigating risks to make informed, cost-effective decisions regarding their Applicants and Registrants. DataFlow, with its best practices in this field such as a high level of data security, cutting-edge technology, rigorous processes, qualified research analysts, and a global network of over 100,000 issuing authorities, verifies professionals' credentials from the primary issuer of the document - regardless of its nature. We are seeking a highly skilled and experienced Senior Data Analyst to join our dynamic Operations team. In this pivotal role, you will be instrumental in leveraging data to optimize operational efficiency, identify bottlenecks, enhance service delivery, and uncover actionable insights that directly impact our operational excellence and client satisfaction. Duties And Responsibilities Data Analysis & Insight Generation : Conduct in-depth analysis of large and complex datasets to identify trends, patterns, and anomalies. Translate raw data into clear, concise, and actionable insights that address key business questions. Data Manipulation & Transformation : Expertly "slice and dice" data using advanced functions and features in Google Sheets or Microsoft Excel to prepare data for analysis, build models, and create reports. Reporting & Visualization : Develop and maintain comprehensive dashboards, reports, and presentations that effectively communicate analytical findings to various stakeholders, including senior leadership. Problem Solving : Proactively identify business problems and opportunities, and leverage data to propose solutions and strategies. Stakeholder Collaboration : Work closely with cross-functional teams (e.g., Marketing, Operations, Product, Finance) to understand their data needs and provide data-driven support for decision-making. Data Quality & Governance : Contribute to ensuring data accuracy, consistency, and integrity across various data sources. Methodology & Innovation : Continuously explore and implement new analytical techniques and tools to enhance the team's capabilities and efficiency. Qualifications Required Experience : Progressive experience in a data analytics role, with a proven track record of delivering impactful data-driven insights. Analytical Mindset : Demonstrated strong analytical and problem-solving skills, with an innate curiosity and ability to break down complex problems into manageable components. Data Manipulation Expertise : Exceptional hands-on proficiency with either Google Sheets or Microsoft Excel (advanced functions, pivot tables, VLOOKUP/XLOOKUP, conditional formatting, data validation, charting, etc.) is a must. Data Slicing & Dicing : Proven ability to effectively manipulate, transform, and analyze large volumes of data from various sources to uncover meaningful patterns and relationships. Communication Skills : Excellent verbal and written communication skills, with the ability to articulate complex analytical concepts to non-technical audiences clearly and concisely. Attention to Detail : Meticulous attention to detail and a commitment to data accuracy and integrity. Proactiveness : Self-motivated, proactive, and able to work independently as well as collaboratively within a team environment. Preferred Qualifications (Nice To Haves) Experience with SQL for data querying and extraction. Experience with statistical analysis and modeling (e.g., Python, R) (ref:hirist.tech)
We are looking for a talented Full Stack Developer to contribute to our core product development. You will be involved in the full software development lifecycle, building scalable and user-friendly applications.
Responsibilities
Develop and maintain backend components.
Utilize Node.js for backend development and frameworks like Express.js (Node.js) or Django.
Design and interact with databases such as MySQL, MongoDB, or PostgreSQL.
Build and consume RESTful APIs and understand microservices.
Work with cloud platforms, preferably AWS.
Participate in an Agile/Scrum environment.
Requirements
Strong proficiency in Node.js or Java.
Strong knowledge of backend frameworks (Express.js or Django) and databases (MySQL, MongoDB, or PostgreSQL).
Familiarity with RESTful APIs and AWS.
Experience with Git and collaborative development.
Excellent problem-solving and communication skills.
This job was posted by Sharan Mithran from The DataFlow Group.
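As a hedged illustration of the RESTful backend work this posting describes, here is a minimal single-file sketch using Django (one of the two frameworks named above). The routes, payloads, and file name are invented; a real service would back the endpoints with a MySQL/PostgreSQL or MongoDB data layer, authentication, and tests.

```python
# app.py -- minimal single-file Django sketch of a JSON REST endpoint.
# Illustrative only: routes and payloads are hypothetical.
import sys

from django.conf import settings
from django.core.management import execute_from_command_line
from django.http import JsonResponse
from django.urls import path

settings.configure(
    DEBUG=True,
    SECRET_KEY="dev-only-not-for-production",
    ROOT_URLCONF=__name__,
    ALLOWED_HOSTS=["*"],
)

def health(request):
    # Simple liveness probe for load balancers or monitoring.
    return JsonResponse({"status": "ok"})

def list_applicants(request):
    # A real implementation would query the database via the ORM;
    # canned data keeps the sketch self-contained.
    return JsonResponse({"applicants": [{"id": 1, "name": "Sample Applicant"}]})

urlpatterns = [
    path("health/", health),
    path("api/applicants/", list_applicants),
]

if __name__ == "__main__":
    # Run with: python app.py runserver 8000
    execute_from_command_line(sys.argv)
```

An equivalent Express.js handler would follow the same shape (route, handler, JSON response); the single-file form is used here only to keep the example self-contained.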
We are searching for a highly skilled and seasoned Senior ETL & Data Streaming Engineer with over 10 years of experience to take on a crucial role in the design, development, and maintenance of our robust data pipelines. The ideal candidate will possess in-depth expertise in batch ETL processes as well as real-time data streaming technologies, along with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is a must.
Your responsibilities will include designing, developing, and implementing highly scalable, fault-tolerant, and performant ETL processes using leading ETL tools to extract, transform, and load data from diverse source systems into our Data Lake and Data Warehouse. You will also be tasked with architecting and constructing batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to facilitate immediate data ingestion and processing requirements. Furthermore, you will need to leverage and optimize various AWS data services such as AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, AWS EMR, and others to develop and manage data pipelines. Collaboration with data architects, data scientists, and business stakeholders to comprehend data requirements and translate them into efficient data pipeline solutions is a key aspect of the role. It will also be essential for you to ensure data quality, integrity, and security across all data pipelines and storage solutions, as well as monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability. Additionally, you will be responsible for developing and maintaining comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs, and implementing data governance policies and best practices within the Data Lake and Data Warehouse environments. As a mentor to junior engineers, you will contribute to fostering a culture of technical excellence and continuous improvement. Staying updated on emerging technologies and industry best practices in data engineering, ETL, and streaming will also be expected.
Required Qualifications:
- 10+ years of progressive experience in data engineering, focusing on ETL, ELT, and data pipeline development.
- Extensive hands-on experience with commercial or open-source ETL tools (Talend).
- Proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
- Proficiency with AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, and potentially AWS EMR.
- Strong background in traditional data warehousing concepts, dimensional modeling, and DWH design principles.
- Proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
- Strong understanding of relational databases and NoSQL databases.
- Experience with version control systems (e.g., Git).
- Excellent analytical and problem-solving skills with attention to detail.
- Strong verbal and written communication skills for conveying complex technical concepts to diverse audiences.
Preferred Qualifications:
- Certifications in AWS Data Analytics or related areas.
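For orientation, the sketch below shows one small batch ETL step of the kind this role covers: extract a CSV from an S3 landing zone, apply a light transform with pandas, and load the result back to S3 as Parquet. It is a hedged illustration, not DataFlow's actual pipeline: bucket names, keys, and columns are invented, and a production job would typically run inside AWS Glue, Talend, or an orchestrator with proper error handling, logging, and schema validation.

```python
# Hedged batch-ETL sketch: S3 CSV -> pandas transform -> S3 Parquet.
# Bucket names, keys, and columns are hypothetical; requires boto3, pandas, pyarrow.
import io

import boto3
import pandas as pd

RAW_BUCKET = "example-raw-zone"          # hypothetical landing-zone bucket
CURATED_BUCKET = "example-curated-zone"  # hypothetical curated-zone bucket

def run_etl(raw_key: str, curated_key: str) -> None:
    s3 = boto3.client("s3")

    # Extract: stream the raw CSV object from the landing zone.
    obj = s3.get_object(Bucket=RAW_BUCKET, Key=raw_key)
    df = pd.read_csv(obj["Body"])

    # Transform: light cleansing and a derived audit column (illustrative only).
    df = df.dropna(subset=["record_id"])
    df["processed_at"] = pd.Timestamp.utcnow()

    # Load: write Parquet to the curated zone for downstream Redshift/Athena use.
    buffer = io.BytesIO()
    df.to_parquet(buffer, index=False)
    s3.put_object(Bucket=CURATED_BUCKET, Key=curated_key, Body=buffer.getvalue())

if __name__ == "__main__":
    run_etl("incoming/cases.csv", "cases/cases.parquet")
```

A streaming variant of the same step would consume records from Apache Kafka or AWS Kinesis instead of reading a file, but the extract-transform-load structure stays the same.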
DataFlow is looking for an ambitious, energetic, and highly skilled Senior Manager for its Equivalence Services function. This critical role combines team leadership with high-level operational and project coordination. The individual will foster close working relationships with internal and external stakeholders to ensure timely delivery. The ideal candidate will possess deep expertise in international education systems and qualifications, including academic, technical, professional, and vocational, and a proven track record of applying operational leadership to complex credentialing challenges while ensuring the highest standards of service delivery. DataFlow offers a flexible and collaborative working environment with a multi-national, multi-cultural team spread across the globe. The role works closely with all key internal and external stakeholders and carries high visibility.
Job Summary:
The primary focus of this position is to ensure all academic and vocational credential evaluations and associated functions are completed on time while maintaining quality, speed, and integrity. This is a critical position requiring strong coordination, analytical skills, and a process-oriented mindset. The role requires answering complex queries and resolving escalations, ensuring our services maintain the highest standards of quality and consistency.
Duties and responsibilities:
Review and evaluate academic, technical, and vocational credentials against the UAE/global education system, using UAE/global rules and regulations.
Provide expert clarification and support to the team on all aspects of the equivalency process, acting as their go-to resource for the most challenging questions.
Lead and manage a team of operations professionals and drive operational excellence by improving and automating our workflows.
Coordinate the daily operational workflow to ensure evaluations are completed smoothly and meet all quality and speed targets (SLAs).
Use data to find and fix bottlenecks, improving the process with automation and new technology.
Apply expert, hands-on research skills to resolve difficult cases, using these as opportunities to mentor the team and enhance standards.
Work closely with other teams, such as Applicant Support and Technology, to give customers a smooth experience.
Stay abreast of changes in global education systems, accreditation standards, and international treaties related to academic, technical, and vocational qualification recognition.
Ensure the team follows all client regulations, industry standards, and data privacy regulations.
Skills:
Understanding of different types of qualification credentials, credential evaluation, regulatory procedures, and education systems
Organizational and project management skills, with the ability to manage multiple priorities
Strong data analytical and problem-solving skills
Process-oriented mindset with attention to detail and structured thinking
Collaborative, able to work independently under pressure, with strong time-management skills
Internal and External Stakeholder Management
Excellent interpersonal and communication skills
Excellent written and verbal communication skills
A passion for continuous improvement and innovation
Qualification and Experience Requirements:
Minimum Bachelor's degree required; a degree in Education or a related field is preferred.
10-12 years of experience in education service delivery.
Prior experience in academic or vocational education credentialing or education regulatory domains is preferred.
Knowledge of qualification frameworks of different countries and credential databases is an advantage. Willing to work extended hours when required to meet deadlines.