Role & responsibilities
- Author realistic and complex IT tasks and their corresponding multi-step execution plans in clear, natural language, accurately mimicking real-world enterprise IT operations across the ITAM, ITSM, and ITOM domains.
- Conduct rigorous, in-depth validation of created IT tasks and plans, critically reviewing them to ensure they reflect real-world IT team operations, processes, and inherent complexities, including adherence to ITIL best practices.
- Act as a central resource for IT process expertise, offering guidance and insights on complex scenarios, industry best practices, and potential edge cases to the entire data team.
- Identify and document discrepancies, inconsistencies, or gaps in IT process descriptions, proposing solutions for continuous data quality improvement.
- Contribute to the development of comprehensive guidelines and standards for IT task and plan creation, ensuring consistency and scalability of our datasets.

Preferred candidate profile
- Bachelor's degree in Engineering, Computer Science, Information Technology, or equivalent.
- Deep understanding of IT service management (ITIL v3/v4), IT domains (ITSM, ITOM, ITAM), and enterprise IT processes, ideally with an ITIL Foundation certification or higher.
- Proven experience in IT process analysis, documentation, and improvement, or a similar role requiring a detailed understanding of IT workflows.
- Experience with IT Service Management platforms such as ServiceNow, Jira, or similar.
- Foundational understanding of scripting (e.g., Python) and JSON, with an ability to comprehend structured data formats such as DSL/DSQL, to collaborate effectively with AI Data Engineers on text-to-JSON conversion and data pipeline processes.
- Exceptional attention to detail and a meticulous approach to data accuracy and validation.
- Strong analytical and problem-solving skills, with the ability to dissect complex IT scenarios and identify critical elements.
- Excellent written and verbal communication skills in English, with the ability to articulate complex IT concepts clearly and concisely.

Nice to have
- Prior experience in a data-intensive role or working closely with data science/AI teams.
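To illustrate the text-to-JSON conversion mentioned in the profile above, here is a minimal sketch of how a natural-language IT task might be handed off as structured data. All field names ("task", "domain", "steps", and so on) are hypothetical illustrations, not an actual production schema.

```python
import json

# A natural-language IT task and a hypothetical structured plan for it.
task_text = "Provision a laptop for a new hire on the retail store team."

plan = {
    "task": task_text,
    "domain": "ITAM",
    "steps": [
        {"order": 1, "system": "ServiceNow", "action": "Create hardware request ticket"},
        {"order": 2, "system": "Asset DB", "action": "Reserve a laptop from available stock"},
        {"order": 3, "system": "Active Directory", "action": "Create user account and group memberships"},
        {"order": 4, "system": "ServiceNow", "action": "Close ticket after handover confirmation"},
    ],
}

# Serialize to JSON -- the format handed to the data pipeline.
as_json = json.dumps(plan, indent=2)
print(as_json)
```

The point of the exercise is that each prose step becomes a discrete, machine-readable record naming the system acted on and the action taken.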
Introduction
We are looking for an experienced L2 IT Analyst with a strong background in the retail sector. In this role, you will be a Subject Matter Expert (SME) for real-world retail IT operations. Your primary responsibility will be to provide in-depth technical support and to help build and refine the datasets that power our revolutionary AI agent, Skyfall. This is a unique opportunity to leverage your retail IT expertise to shape the future of AI in the enterprise. The ideal candidate will have hands-on experience with the major tools and technologies that power retail companies.

Key Responsibilities
- Dataset Creation: Author realistic and complex IT datasets, including simulated user tickets and the corresponding multi-step resolution plans. These will accurately mimic real-world L2-level IT issues found in retail environments across the ITSM, ITOM, and ITAM domains.
- Dataset Review and Validation: Conduct rigorous, in-depth validation of datasets created by our AI models. Critically review them to ensure technical accuracy, adherence to ITIL best practices, and precise reflection of real-world retail IT operations and their inherent complexities.
- Subject Matter Expertise: Act as a central resource and internal point of contact for our product and engineering teams on all things related to IT. Provide crucial domain context, offer guidance on complex scenarios and industry best practices, and clarify technical nuances to ensure accurate data conversion and model development.
- Process Improvement and Analysis: Identify and document discrepancies, inconsistencies, or gaps in IT process descriptions.

Required Skills and Qualifications
- A bachelor's degree in Engineering, Computer Science, Information Technology, or a related field.
- Proven experience (at least 2 years) in an L2 or higher IT role within the retail industry, with a deep understanding of L2-and-above support processes, IT process analysis, and documentation.
- Experience with IT Service Management (ITSM) platforms like ServiceNow or Jira, with a focus on analyzing workflows and ticket data.
- Deep understanding of core retail IT systems and workflows, including hands-on experience with tools for:
  - ERP: SAP S/4HANA or Oracle NetSuite
  - CRM: Salesforce Sales Cloud or Microsoft Dynamics 365
  - POS systems: Shopify POS, NCR SelfServ kiosks, or similar
  - Supply Chain Management (SCM): Blue Yonder or Oracle SCM Cloud
  - HRMS/WFM: Workday or UKG
- A strong grasp of ITIL v3/v4 frameworks and best practices, particularly concerning incident, problem, and change management processes.
- Exceptional analytical and problem-solving skills, with the ability to deconstruct complex IT scenarios into logical, sequential steps for dataset creation.
- Excellent written and verbal communication skills in English, with the ability to articulate complex IT concepts clearly and concisely to product and engineering teams.

Nice to Have
- Foundational understanding of scripting (e.g., Python) and JSON
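As a concrete sketch of the "simulated user tickets and multi-step resolution plans" this role authors, a single dataset record might look like the following. Every field name and value here is a hypothetical illustration, not a real schema or real incident data.

```python
# A hypothetical simulated L2 retail IT ticket with its resolution plan.
ticket = {
    "ticket_id": "INC-004217",  # made-up identifier
    "domain": "ITSM",
    "summary": "POS terminal at store 112 fails to sync end-of-day sales",
    "priority": "P2",
    "resolution_plan": [
        "Verify network connectivity between the POS terminal and the store server",
        "Check the POS sync service logs for failed batch uploads",
        "Restart the sync service and re-trigger the end-of-day batch",
        "Confirm sales totals reconcile in the ERP, then close the ticket",
    ],
}

def validate_ticket(t: dict) -> bool:
    """Basic completeness check for a simulated ticket: all required
    fields present and a genuinely multi-step (>= 2 steps) plan."""
    required = {"ticket_id", "domain", "summary", "priority", "resolution_plan"}
    return required.issubset(t) and len(t["resolution_plan"]) >= 2

print(validate_ticket(ticket))  # True
```

Simple completeness checks like this are one way the validation responsibility above can be made repeatable rather than purely manual.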
Key Responsibilities

Synthetic Data Generation & Quality Assurance
- Design and implement scalable synthetic data generation systems to support model training
- Develop and maintain data quality validation pipelines ensuring synthetic data meets training requirements
- Build automated testing frameworks for synthetic data generation workflows
- Collaborate with ML teams to optimize synthetic data for model performance

APIs & Integration
- Develop and maintain REST API integrations across multiple enterprise platforms
- Implement robust data exchange, transformation, and synchronisation logic between systems
- Ensure error handling, retries, and monitoring for all integration workflows

Data Quality & Testing
- Implement automated data validation and testing frameworks for ETL and synthetic data workflows
- Translate data quality feedback from stakeholders into pipeline or generation process improvements
- Proactively monitor and maintain data consistency across systems

Multi-System Integration & MCP Development
- Build and maintain tool registries for Model Context Protocol (MCP) integration across multiple enterprise systems
- Develop robust APIs for multi-system communication through MCP frameworks
- Design and implement workflows that coordinate multi-system interactions
- Ensure reliable data flow and error handling across distributed system architectures

Cross-Functional Collaboration & Production Integration
- Partner with domain specialists to translate plan execution feedback into actionable insights
- Work closely with Product Managers to align synthetic data generation with business requirements
- Collaborate with Core Engineering teams to ensure seamless production deployment
- Establish feedback mechanisms between synthetic data systems and production environments

Required Qualifications

Technical Skills
- Programming: proficiency in Python; TypeScript optional
- Data Engineering: experience with data engineering frameworks and libraries (Pandas, Apache Airflow, Prefect)
- APIs & Integration: strong background in REST APIs and system integration
- Databases: experience with relational and NoSQL databases (PostgreSQL, MongoDB)
- Cloud Platforms: hands-on experience with AWS/GCP/Azure

Experience Requirements
- 2+ years of experience building production-scale data pipelines and orchestration systems
- Demonstrated success in cross-functional collaboration in technical environments

Preferred Qualifications
- Familiarity with managing Kubernetes-based production workloads and workflow orchestration (Argo)
- Familiarity with containerisation and orchestration tools such as Docker and Kubernetes
- Familiarity with synthetic or large-scale data generation
- Background in enterprise software integration
- Experience with Model Context Protocol (MCP) or similar orchestration frameworks
- Knowledge of automated testing frameworks for data pipelines

What We Offer
- Lots of learning: many systems are being built from the ground up, with no existing references or open-source projects to rely on. This will be a first not just for you, but for the industry as well.
- Opportunity to work at the forefront of enterprise-scale synthetic data generation
- Collaborative environment with product teams, engineering, and domain specialists
- Competitive compensation and comprehensive benefits
- Professional development opportunities in cutting-edge data engineering and Kubernetes orchestration

Team Structure
You'll report to the AI Engineering Lead and work closely with:
- ML Engineers developing foundation models
- Product Managers defining business requirements
- Product Specialists providing domain expertise
- Backend Engineers handling production infrastructure

This role offers significant impact on our data capabilities and the opportunity to shape how we generate and utilize synthetic data for training enterprise systems.
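The "error handling, retries, and monitoring" expectation for integration workflows can be sketched in a few lines. This is a minimal, generic pattern with made-up helper names and backoff parameters, not a description of the actual integration stack; a simulated flaky call stands in for a real REST request.

```python
import time

def with_retries(call, attempts=3, base_delay=0.1):
    """Invoke `call()`; on failure, retry with exponential backoff.
    `attempts` and `base_delay` are illustrative defaults."""
    last_error = None
    for attempt in range(attempts):
        try:
            return call()
        except Exception as exc:  # production code would catch narrower errors
            last_error = exc
            time.sleep(base_delay * (2 ** attempt))  # 0.1s, 0.2s, 0.4s, ...
    raise last_error

# Simulated flaky integration call: fails twice, then succeeds.
state = {"calls": 0}

def flaky_sync():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("upstream unavailable")
    return {"status": "synced"}

result = with_retries(flaky_sync)
print(result, state["calls"])  # {'status': 'synced'} 3
```

In a real pipeline the same wrapper would also emit a metric or log line per failed attempt, which is where the monitoring requirement comes in.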
Your Role
As a Business Analyst - IT Operations Data, you will play a pivotal role in creating the foundational data layer that powers our AI platform. You'll work closely with IT domain experts, data engineers, and external consultants to define enterprise-grade use cases, map real-world execution plans, and translate business processes into structured data. Your work will directly shape how our AI learns to interact with complex IT systems across sectors, including retail.

Responsibilities
- Translate real-world IT operations into structured formats: Collaborate with domain consultants to document multi-step execution flows for common and complex IT tasks in environments such as retail, financial services, and large enterprise IT.
- Drive consultant collaboration and execution: Manage and coordinate work with external IT consultants and SMEs. Ensure delivery of high-quality, domain-rich content and maintain clear workflows and expectations to meet project goals and timelines.
- Work across domains: Capture detailed task breakdowns involving ticketing, provisioning, incident response, monitoring, asset lifecycle, and system integrations, especially as they appear in operationally intense domains like retail.
- Model retail systems and workflows: Identify and document IT processes related to retail-specific platforms such as POS systems, workforce scheduling, inventory management, customer support platforms, and store operations.
- Define system-level specifications: Partner with engineers to model how tasks execute across systems such as ServiceNow, Jira, Active Directory, SCCM, POS software, ERP systems (e.g., SAP), and cloud environments.
- Create and validate structured task plans: Develop clear, reusable templates that define triggers, personas, systems involved, approvals, and automation patterns for common enterprise IT tasks.
- Contribute to AI training data: Prepare labeled use cases and structured task flows to help train and validate AI models, ensuring clarity and real-world alignment.
- Maintain process consistency and data quality: Establish internal standards for how IT workflows are modeled, and validate all outputs for consistency, completeness, and realism.

Must-have Requirements
- 4-7 years of experience as a Business Analyst, IT Process Analyst, or System Analyst in IT operations or supporting enterprise applications.
- Direct experience sourcing, managing, or working closely with external consultants or SMEs to deliver technical documentation or process maps.
- Demonstrated experience supporting retail business systems such as POS, inventory, order management, CRM, or e-commerce platforms (e.g., Oracle Retail, SAP IS-Retail, Manhattan Associates).
- Hands-on experience with enterprise platforms like ServiceNow, Jira, or BMC for IT operations or support ticketing.
- Strong grasp of ITIL principles and operational IT processes across the ITSM, ITOM, and ITAM domains.
- Bachelor's degree in Information Systems, Engineering, Computer Science, or a related field.
- Ability to understand and contribute to structured formats like JSON or YAML for data modeling.
- Strong attention to detail, especially in capturing system behavior, task flows, dependencies, and integrations.

Nice-to-Have Requirements
- Prior experience working closely with AI, ML, or data engineering teams.
- Experience documenting IT workflows or standard operating procedures for large-scale environments, particularly retail or multi-location businesses.
- Exposure to data annotation, ontology, or taxonomy design in enterprise systems.
- Familiarity with scripting or automation tools (Python, PowerShell, or RPA platforms).
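The "reusable templates that define triggers, personas, systems involved, approvals, and automation patterns" described above can be sketched as a small structured record plus a completeness check. Every field name and value here is a hypothetical assumption for illustration, not an actual internal standard.

```python
# Hypothetical reusable task-plan template fields (illustrative only).
TEMPLATE_FIELDS = ["trigger", "persona", "systems", "approvals", "automation", "steps"]

password_reset_plan = {
    "trigger": "ServiceNow ticket: store associate locked out of the POS terminal",
    "persona": "L1 Service Desk Agent",
    "systems": ["ServiceNow", "Active Directory", "POS software"],
    "approvals": [],  # routine task: no approval step required
    "automation": "candidate for full automation",
    "steps": [
        "Verify the requester's identity per store security policy",
        "Reset the account password in Active Directory",
        "Confirm the associate can log in to the POS terminal",
        "Resolve the ServiceNow ticket with resolution notes",
    ],
}

def missing_fields(plan: dict) -> list:
    """Return template fields absent from a plan (empty list = complete)."""
    return [f for f in TEMPLATE_FIELDS if f not in plan]

print(missing_fields(password_reset_plan))  # []
```

Validating every authored plan against the template this way is one simple mechanism for the "consistency, completeness, and realism" standard above.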