About Us:
CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences and Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI is positioned to meet customers where they are in their data monetization journey.

Our Values:
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.

Equal Opportunity Statement:
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, colour, religion, gender, gender identity or expression, sexual orientation, or national origin. We are dedicated to providing equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/

What we are looking for:
We are looking for a Backend Engineer with expertise in Python, FastAPI, and PostgreSQL to join our team. The ideal candidate will have strong product development experience, knowledge of scalable architectures, and hands-on exposure to CI/CD pipelines.

Key Responsibilities:
- Design, develop, and maintain high-performance backend services using Python and FastAPI.
- Optimize and manage PostgreSQL databases for scalability and efficiency.
- Build and integrate RESTful APIs and GraphQL endpoints.
- Implement asynchronous processing with Celery, RabbitMQ, or Kafka.
- Ensure security, performance, and reliability of backend systems.
- Collaborate with DevOps teams to set up CI/CD pipelines and containerized deployments.
- Write unit and integration tests to ensure code quality.

Must-Have Skills:
- Strong experience with Python and FastAPI (or Flask/Django).
- Proficiency in PostgreSQL, including query optimization and indexing.
- Experience with Docker, Kubernetes, and CI/CD tools (GitHub Actions, Jenkins).
- Knowledge of authentication and authorization (OAuth, JWT).
- Hands-on experience with microservices architecture and API development.
- Strong debugging, problem-solving, and system design skills.

Good to Have:
- Experience with Redis, RabbitMQ, Kafka, or Elasticsearch.
- Familiarity with AWS, Azure, or GCP for cloud deployments.
- Knowledge of Infrastructure as Code (Terraform, Ansible).
- Exposure to serverless architectures.

Non-Technical/Behavioral Competencies Required:
- Written communication, technical articulation, listening, and presentation skills (8/10 minimum).
- Good conflict management skills.
- Superior persuasion and negotiation skills.
- Demonstrated effective task prioritization, time management, and internal/external stakeholder management skills.
- A quick learner, self-starter, go-getter, and team player.
- Experience working under stringent deadlines in a matrix organization structure.
- Demonstrated appreciable Organizational Citizenship Behavior (OCB) in past organizations.
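The authentication skills listed above (OAuth, JWT) reduce to a small amount of mechanics; as a hedged illustration only, here is a minimal standard-library sketch of how an HS256-signed JWT is assembled and verified. The secret and claims are invented for the example; a real service would use a maintained library such as PyJWT.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(claims: dict, secret: bytes) -> str:
    """Build header.payload.signature with an HMAC-SHA256 signature."""
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = f"{b64url(json.dumps(header).encode())}.{b64url(json.dumps(claims).encode())}"
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"

def verify_jwt(token: str, secret: bytes):
    """Return the claims dict if the signature checks out, else None."""
    try:
        head_b64, payload_b64, sig_b64 = token.split(".")
    except ValueError:
        return None
    expected = hmac.new(secret, f"{head_b64}.{payload_b64}".encode(), hashlib.sha256).digest()
    # constant-time comparison to avoid timing side channels
    if not hmac.compare_digest(b64url(expected), sig_b64):
        return None
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

A token signed with one secret fails verification under any other, which is the property an API gateway or FastAPI dependency would check on each request.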
We are seeking a dynamic and experienced Human Resources Specialist to join our team in a global capacity. This role involves direct collaboration with the CFO and the US team, making it an excellent opportunity for professionals with strong expertise in HR operations and systems. A key requirement is a robust understanding of HRMS platforms, specifically KEKA.

Key Responsibilities:
Global HR Operations:
- Manage HR functions across multiple regions, ensuring compliance with local labor laws and global policies.
- Collaborate with the CFO and US team to align HR strategies with business goals.
HRMS Management:
- Maintain and optimize HR processes using KEKA HRMS, including employee records, payroll integration, and performance management systems.
- Provide training and support to team members on KEKA functionality.
Talent Acquisition and Management:
- Lead global recruitment initiatives, from sourcing to onboarding.
- Develop and execute employee engagement and retention strategies.
Policy Development and Compliance:
- Create and implement HR policies and procedures to meet international standards.
- Ensure adherence to employment regulations and global best practices.
Performance Management:
- Design and implement performance evaluation processes.
- Work with leaders to identify training needs and career development plans.
Reporting and Analysis:
- Generate HR analytics and reports for the CFO and leadership team.
- Use data to provide insights and recommendations on workforce trends.

Qualifications:
- Bachelor's or Master's degree in Human Resources, Business Administration, or a related field.
- Proven experience in a global HR role, preferably with direct reporting to senior leadership.
- Strong expertise in KEKA HRMS is mandatory.
- Excellent understanding of global labor laws and HR best practices.
- Exceptional interpersonal and communication skills.
- Ability to manage multiple priorities in a fast-paced, dynamic environment.
What We Offer:
- A global platform to showcase your HR expertise.
- Direct interaction and collaboration with senior leadership and international teams.
- Opportunities for professional growth and development.
Objective:
CLOUDSUFI is seeking a hands-on Delivery Lead of Client Services who will be responsible for all client interfaces within the assigned scope. He/she will work together with technical leads/architects to create an execution plan in consultation with customer stakeholders and drive its execution with the team with respect to people, process, and structure. Key KPIs for this role are Gross Margin, Customer Advocacy (NPS), ESAT (Employee Satisfaction), Penetration (Net New), and Target Revenue Realization.

Location: Noida, India.

Key Responsibilities:
- Develop and own the vision, strategy, and roadmap for the account.
- Participate in business reviews with executive leadership.
- Own weekly and monthly dashboards and reporting packages to business and leadership.
- Actively mine new opportunities within the account by cross-selling and up-selling.
- Take a customer-centric approach, which includes understanding the customer's expectations and priorities.
- Review the capacity and timeline of deliverables on an ongoing basis.
- Identify and address technical and operational challenges or risks during project execution, and develop action plans for mitigating and averting identified risks.
- Assess the team's hurdles in their development cycles and provide critical thinking.
- Assign tasks, track deliverables, and remove impediments for team members.
- Create and maintain delivery best practices, and audit the implementation of processes and best practices in the organization.
- Contribute to the creation of a knowledge repository and reusable templates, reports, and dashboards.
- Support the development of others and self with effective feedback, recognition, and coaching.
- Solid track record of successfully leading sizable teams, preferably in offshore delivery setups.
- Established success in achieving business goals using analytics-driven solutions.
- Past experience collaborating with international stakeholders and directing diverse teams.

Key Qualifications:
- Education: BTech / BE / BS / MS / MBA (Tier 1).
- Professional experience: 12+ years.
- A minimum of 10 years of professional experience in the software/product development domain.
- 3+ years of experience as a Technical Product/Program Manager, with a track record of leading sizable teams of 30+ members.
- A product mindset, providing improvement points to enhance features and customer experience.
- Thorough understanding of the DevOps philosophy, having driven it in past projects.
- Must have performed product/program delivery for a leading cloud partner of size more than USD 5M annually and grown the account through delivery excellence.
- Hands-on experience with agile project management.
- Must be conversant with agile ceremonies, i.e., daily Scrum calls, retrospectives, customer demos, and velocity calculation.

Technical Proficiency:
- Expertise in managing software development projects around Java, open-source technologies, and data integration products.
- Experience with GCP is mandatory.
- Experience with Java, Scala, Maven, test coverage, code review, and quality deliverables with automation.
- Product development experience on HDFS, with programming experience in Spark.
- Should be well versed in current technology trends in IT solutions, e.g., cloud platform development, DevOps, low-code solutions, and intelligent automation.
- Good to have: Git, Bitbucket, microservices.
- Desired certifications (mandatory): Certified GCP Solution Architect and/or TOGAF Enterprise Architect.

Leadership & Management:
- Must have experience working with customers across North America and Europe.
- Written communication, technical articulation, listening, and presentation skills (8/10 minimum).
- Good conflict management and superior persuasion and negotiation skills.
- Demonstrated effective task prioritization, time management, and internal/external stakeholder management skills.
- A quick learner, self-starter, go-getter, and team player.
- Comfortable and flexible operating in a matrix structure.
- Demonstrated appreciable Organizational Citizenship Behaviour (OCB) in past organizations.
- Proven experience in refining processes to enhance operational efficiency.
- Robust project management skills with an emphasis on punctual delivery and quality assurance.
What are we looking for:
We are seeking a highly skilled and experienced Senior DevOps Engineer to join our team. The ideal candidate will have extensive expertise in modern DevOps tools and practices, particularly in managing CI/CD pipelines, infrastructure as code, and cloud-native environments. This role involves designing, implementing, and maintaining robust, scalable, and efficient infrastructure and deployment pipelines to support our development and operations teams.

Required Skills and Experience:
- 7+ years of experience in DevOps, infrastructure automation, or related fields.
- Advanced expertise in Terraform for infrastructure as code.
- Solid experience with Helm for managing Kubernetes applications.
- Proficiency with GitHub for version control, repository management, and workflows.
- Extensive experience with Kubernetes for container orchestration and management.
- In-depth understanding of Google Cloud Platform (GCP) services and architecture.
- Strong scripting and automation skills (e.g., Python, Bash, or equivalent).
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities in agile development environments.

Preferred Qualifications:
- Experience with other CI/CD tools (e.g., Jenkins, GitLab CI/CD).
- Knowledge of additional cloud platforms (e.g., AWS, Azure).
- Certification in Kubernetes (CKA/CKAD) or Google Cloud (GCP Professional DevOps Engineer).

Behavioral Competencies:
- Must have worked with US/Europe-based clients in onsite/offshore delivery models.
- Very good verbal and written communication, technical articulation, listening, and presentation skills.
- Proven analytical and problem-solving skills.
- A collaborative mindset for cross-functional teamwork.
- Passion for solving complex problems.
- Demonstrated effective task prioritization, time management, and internal/external stakeholder management skills.
- A quick learner, self-starter, go-getter, and team player.
- Experience working under stringent deadlines in a matrix organization structure.
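The scripting and automation skills mentioned above often come down to small, reusable utilities. As an illustrative sketch only (the function names and parameters are hypothetical, not from the posting), a retry-with-exponential-backoff wrapper of the kind used when polling a deployment or health endpoint until it becomes ready:

```python
import time

def retry(fn, attempts=5, base_delay=0.5, sleep=time.sleep):
    """Call fn until it succeeds, doubling the wait after each failure.

    Raises the last exception if all attempts fail. `sleep` is injectable
    so tests can run without real delays.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))

# Hypothetical usage: poll a rollout check that may fail transiently.
# retry(lambda: check_rollout("api-server"), attempts=8)
```

Injecting the sleep function is a small design choice that keeps the helper unit-testable, which matters when such utilities end up inside CI/CD pipelines.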
What are we looking for:
Primary Skills: ABAP, OOPS, RFC, BAPI, IDoc, ALE, OData
Secondary Skills: SAP HANA, SAP BASIS

Required Experience:
- Must have prior experience developing integration solutions around SAP ECC and SAP S/4HANA using SAP RFC, BAPI, IDoc, and OData.
- Must have a good understanding of IDoc creation and posting, and of doing the necessary ALE configuration.
- Must have a good understanding of coding fundamentals in ABAP and ABAP OO.
- Should have very good debugging skills, including system debugging.
- Should be familiar with SAP authorization profiles, roles, and objects.
- Should have strong fundamentals in OOAD and design principles.
- Should have experience creating and modifying SAP Transport Requests, e.g., releasing TRs and adding new objects to them.
- Should have an understanding of application areas like MM, PP, FI, etc.
- Should have good hands-on experience with software engineering tools, viz. JIRA, Confluence, Bitbucket, SVN, etc.
- Should be well versed in current technology trends in SAP and the overall industry, e.g., S/4HANA, OData, cloud platform development and deployment, DevOps, low-code solutions, and intelligent automation.

Job Responsibilities:
- Understand the entire code base and the architectural considerations and design related to the product features assigned.
- Develop new features in the product using OOAD methodology.
- Maintain current functionality by working on defects raised internally and externally.
- Work closely with the BASIS team to identify SAP Notes and apply them to SAP servers for resolution.
- Explore new areas for integrating SAP with third-party applications.
- Assist in the creation of the SAP objects and configuration required by the QA and internal development teams.
- Create and maintain coding best practices and do peer code/solution reviews.
- Participate in daily Scrum calls, Scrum planning, retrospective, and demo meetings.
- Bring out technical/design/architectural challenges and risks during execution, and develop action plans for mitigating and averting identified risks.
- Comply with development processes, documentation templates, and tools prescribed by CLOUDSUFI and/or its clients.
- Work with other teams and architects in the organization and assist them with technical issues, demos, POCs, and proposal writing for prospective clients.
- Contribute to the creation of a knowledge repository, reusable assets/solution accelerators, and IP.
- Provide feedback to junior developers and be a coach and mentor for them.
- Provide training sessions on the latest technologies and topics to other employees in the organization.
- Participate in organization development activities from time to time: interviews, CSR/employee engagement activities, participation in business events/conferences, and implementation of new policies, systems, and procedures as decided by the management team.
Behavioral Competencies:
- Must have worked with US/Europe-based clients in onsite/offshore delivery models.
- Very good verbal and written communication, technical articulation, listening, and presentation skills.
- Proven analytical and problem-solving skills.
- A collaborative mindset for cross-functional teamwork.
- Passion for solving complex problems.
- Demonstrated effective task prioritization, time management, and internal/external stakeholder management skills.
- A quick learner, self-starter, go-getter, and team player.
- Experience working under stringent deadlines in a matrix organization structure.
We're looking for a techno-functional consultant with hands-on experience integrating Cerner with surrounding systems, preferably using an iPaaS platform, and strong knowledge of Cerner access governance (roles, privileges, and security models). You'll translate clinical and operational needs into robust integrations, design secure access models, and ensure compliance with healthcare regulations.

Key Responsibilities:
Integration (iPaaS & Interfaces):
- Lead design and implementation of integrations between Cerner and EMPI/IDM, ADT/registration, lab/diagnostics, pharmacy, billing/RCM, CRM, care management, and analytics platforms.
- Build, monitor, and optimize integrations via HL7 v2.x (e.g., ADT, ORM/ORU, RDE), FHIR R4 (Patient, Encounter, Observation, Medication*, Order/Request*, Practitioner, Appointment, DocumentReference, etc.), and C-CDA where applicable.
- Deliver integrations on iPaaS platforms with resilient patterns: idempotency, retries, dead-letter queues, and alerting.
- Work with interface engines (e.g., Mirth Connect/NextGen, Cloverleaf, Corepoint), API gateways, and secure transport (mTLS, VPN, SFTP).
- Understanding of canonical data mappings, code sets, and transformations (ICD-10, SNOMED CT, LOINC, RxNorm).
- Establish CI/CD, test harnesses, and simulators/mocks for interfaces; lead system, integration, and performance testing.

Minimum Qualifications:
- 7+ years in healthcare IT, with 3-5+ years focused on Cerner and EHR integrations.
- Proven delivery of HL7 v2 and FHIR R4 integrations and at least one production-grade iPaaS implementation.
- Hands-on with interface engines (Mirth/Cloverleaf/Corepoint), API design (REST, OAuth2/OIDC), and data mapping.
- Practical experience designing/maintaining Cerner access models (roles, privileges, templates) and running access reviews.
- Strong understanding of IGA concepts (RBAC/ABAC, SoD, certifications, access requests, JML).
- Excellent communication with clinical and technical stakeholders; strong documentation skills.

Preferred Qualifications:
- Experience integrating Cerner with other enterprise applications using iPaaS platforms like Workato/MuleSoft.
- Background integrating with EMPI/MDM, LIS, PACS/VNA, PBM, payer APIs (X12/EDI exposure a plus), and data platforms (EDW, Lakehouse).
- Familiarity with Cerner solutions (e.g., Millennium, PowerChart, CareAware, HealtheIntent) and operational workflows.
- Knowledge of "break-glass" workflows, emergency access logging, and audit/reporting practices.
- Exposure to SailPoint or Oracle Access Governance.
- Understanding of healthcare vocabularies and terminology services; experience with test data de-identification for PHI.
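HL7 v2.x messages like the ADT feeds above are pipe-delimited segments, and much interface work starts with pulling identifiers out of them. A minimal, hedged sketch (the sample message is fabricated, and a production feed would go through a proper interface engine such as Mirth rather than hand-rolled parsing):

```python
def parse_segments(message: str) -> dict:
    """Index HL7 v2 segments by their three-letter ID (first occurrence wins)."""
    segments = {}
    for line in message.strip().splitlines():
        fields = line.split("|")
        segments.setdefault(fields[0], fields)
    return segments

def patient_id(message: str) -> str:
    """Return the first component of PID-3, the patient identifier list."""
    pid = parse_segments(message)["PID"]
    # field index 3 is PID-3; components within a field are ^-separated
    return pid[3].split("^")[0]

# Fabricated two-segment ADT^A01 fragment for illustration:
SAMPLE = (
    "MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|202501010830||ADT^A01|12345|P|2.5\n"
    "PID|1||MRN00123^^^HOSP^MR||DOE^JANE"
)
```

Even this toy version shows why idempotency matters: the message control ID in MSH (here `12345`) is what a resilient pipeline would key on to deduplicate replays.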
Seeking a detail-oriented and motivated Coupa Administrator to support the ongoing maintenance, configuration, and optimization of our Coupa platform. This role will focus on supporting day-to-day operations, troubleshooting issues, implementing configuration changes, and assisting with integrations to ensure Coupa continues to meet business needs. This position requires hands-on Coupa administration experience, strong analytical and problem-solving skills, and the ability to work effectively with a distributed team.

- Full-time contractual role (3-6 months), 8 hours a day.
- Shift (tentative): 2pm-11pm.
- 3+ years of experience.

Qualifications & Experience:
- Bachelor's degree in Finance, Business, Information Systems, or a related field.
- 3-5 years of experience as a Coupa Administrator or in a similar Coupa support role.
- Strong understanding of Coupa functionalities and configurations.
- Hands-on experience with CLMA (Contract Lifecycle Management Applications).
- Familiarity with Navan or other travel & expense systems.
- Experience with Procure-to-Pay (P2P), Invoicing, or Expense Management modules.
- Knowledge of system integrations and data flows between ERP and Coupa (NetSuite, Workday, or similar).
- Good communication skills and ability to collaborate with global teams.
- Strong analytical and problem-solving abilities.
- Attention to detail with a commitment to accuracy and quality.
- Exposure to the Coupa Analytics module is a plus.
We are seeking a talented and passionate Data Engineer to join our growing data team. In this role, you will be responsible for building, maintaining, and optimizing our data pipelines and infrastructure on Google Cloud Platform (GCP). The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and a passion for turning raw data into actionable insights. You will work closely with data scientists, analysts, and other engineers to support a variety of data-driven initiatives.

Responsibilities:
- Design, develop, and maintain scalable and reliable data pipelines using Dataform or DBT.
- Build and optimize data warehousing solutions on Google BigQuery.
- Develop and manage data workflows using Apache Airflow.
- Write complex and efficient SQL queries for data extraction, transformation, and analysis.
- Develop Python-based scripts and applications for data processing and automation.
- Collaborate with data scientists and analysts to understand their data requirements and provide solutions.
- Implement data quality checks and monitoring to ensure data accuracy and consistency.
- Optimize data pipelines for performance, scalability, and cost-effectiveness.
- Contribute to the design and implementation of data infrastructure best practices.
- Troubleshoot and resolve data-related issues.
- Stay up-to-date with the latest data engineering trends and technologies, particularly within the Google Cloud ecosystem.

Qualifications:
- Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
- 3-4 years of experience in a Data Engineer role.
- Strong expertise in SQL (preferably with BigQuery SQL).
- Proficiency in Python programming for data manipulation and automation.
- Hands-on experience with Google Cloud Platform (GCP) and its data services.
- Solid understanding of data warehousing concepts and ETL/ELT methodologies.
- Experience with Dataform or DBT for data transformation and modeling.
- Experience with workflow management tools such as Apache Airflow.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Google Cloud Professional Data Engineer certification.
- Knowledge of data modeling techniques (e.g., dimensional modeling, star schema).
- Familiarity with Agile development methodologies.
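Data quality checks of the kind described above are often small, composable assertions run over rows before they reach the warehouse. A hedged pure-Python sketch (the column names are invented for the example; in practice these checks would typically live in DBT tests or an Airflow task):

```python
def profile(rows, column):
    """Return null rate and distinct count for one column across dict rows."""
    values = [r.get(column) for r in rows]
    nulls = sum(v is None for v in values)
    return {
        "null_rate": nulls / len(values) if values else 0.0,
        "distinct": len({v for v in values if v is not None}),
    }

def check_unique_key(rows, key):
    """Return the duplicated key values, a common pre-load quality gate."""
    seen, dupes = set(), set()
    for r in rows:
        v = r[key]
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)
```

Failing the pipeline when `check_unique_key` returns anything is the simplest form of the "accuracy and consistency" monitoring the role calls for.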
About Us:
CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building products and solutions for the Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and make better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.

Role & Responsibilities:
Role: AI/Senior AI Engineer
Location: Noida, Delhi/NCR
Experience: 3-8 years
Education: BTech / BE / MCA / MSc Computer Science

Job Summary:
We are seeking a highly innovative and skilled AI Engineer to join our AI CoE for the Data Integration Project. The ideal candidate will be responsible for designing, developing, and deploying intelligent assets and AI agents that automate and optimize various stages of the data ingestion and integration pipeline. This role requires expertise in machine learning, natural language processing (NLP), knowledge representation, and cloud platform services, with a strong focus on building scalable and accurate AI solutions.

Key Responsibilities:
- LLM-based Auto-schematization: Develop and refine LLM-based models and techniques for automatically inferring schemas from diverse unstructured and semi-structured public datasets and mapping them to a standardized vocabulary.
- Entity Resolution & ID Generation AI: Design and implement AI models for highly accurate entity resolution, matching new entities with existing IDs and generating unique, standardized IDs for newly identified entities.
- Automated Data Profiling & Schema Detection: Develop AI/ML accelerators for automated data profiling, pattern detection, and schema detection to understand data structure and quality at scale.
- Anomaly Detection & Smart Imputation: Create AI-powered solutions for identifying outliers, inconsistencies, and corrupt records, and for intelligently filling missing values using machine learning algorithms.
- Multilingual Data Integration AI: Develop AI assets for accurately interpreting, translating (leveraging automated tools with human-in-the-loop validation), and semantically mapping data from diverse linguistic sources, preserving meaning and context.
- Validation Automation & Error Pattern Recognition: Build AI agents to run comprehensive data validation tool checks, identify common error types, suggest fixes, and automate common error corrections.
- Knowledge Graph RAG/RIG Integration: Integrate Retrieval Augmented Generation (RAG) and Retrieval Augmented Indexing (RIG) techniques to enhance querying capabilities and facilitate consistency checks within the Knowledge Graph.
- MLOps Implementation: Implement and maintain MLOps practices for the lifecycle management of AI models, including versioning, deployment, monitoring, and retraining on a relevant AI platform.
- Code Generation & Documentation Automation: Develop AI tools for generating reusable scripts, templates, and comprehensive import documentation to streamline development.
- Continuous Improvement Systems: Design and build learning systems, feedback loops, and error analytics mechanisms to continuously improve the accuracy and efficiency of AI-powered automation over time.

Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related quantitative field.
- Proven experience (e.g., 3+ years) as an AI/ML Engineer, with a strong portfolio of deployed AI solutions.
- Strong expertise in Natural Language Processing (NLP), including experience with Large Language Models (LLMs) and their applications in data processing.
- Proficiency in Python and relevant AI/ML libraries (e.g., TensorFlow, PyTorch, scikit-learn).
- Hands-on experience with cloud AI/ML services.
- Understanding of knowledge representation, ontologies (e.g., Schema.org, RDF), and knowledge graphs.
- Experience with data quality, validation, and anomaly detection techniques.
- Familiarity with MLOps principles and practices for model deployment and lifecycle management.
- Strong problem-solving skills and an ability to translate complex data challenges into AI solutions.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Experience with data integration projects, particularly with large-scale public datasets.
- Familiarity with knowledge graph initiatives.
- Experience with multilingual data processing and AI.
- Contributions to open-source AI/ML projects.
- Experience in an Agile development environment.

Benefits:
- Opportunity to work on a high-impact project at the forefront of AI and data integration.
- Contribute to solidifying a leading data initiative's role as a foundational source for grounding Large Models.
- Access to cutting-edge cloud AI technologies.
- Collaborative, innovative, and fast-paced work environment.
- Significant impact on data quality and operational efficiency.
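As a hedged illustration of the anomaly detection and smart imputation responsibilities above: the simplest baselines are statistical, not learned. This sketch flags z-score outliers and fills missing values with the observed mean (thresholds and data are invented; a production system would use learned models and domain-aware imputation):

```python
import statistics

def zscore_outliers(values, threshold=3.0):
    """Indices of points more than `threshold` std deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

def impute_missing(values):
    """Fill None entries with the mean of the observed values."""
    observed = [v for v in values if v is not None]
    fill = statistics.fmean(observed)
    return [fill if v is None else v for v in values]
```

Note one caveat that motivates the ML approach: with population standard deviation on small samples, even extreme points yield modest z-scores, so the threshold has to be tuned to the data volume.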
We are looking for Technical Deployment Coordinators to support the rollout of our Lumos Deployment initiative, which includes the launch of the AppStore platform across multiple business units. These roles will be instrumental in executing deployment tasks, supporting system integrations, and ensuring smooth onboarding of users and systems. The coordinators will work closely with IT and Systems Administration teams and utilize tools such as Lumos, Okta, and our custom Access Management Portal (AMP).

Responsibilities
- Assist with the rollout and configuration of the Lumos AppStore platform across departments.
- Support user onboarding, access provisioning, and troubleshooting during deployment.
- Execute tasks from the Lumos Deployment checklist, including system validation, testing, and documentation.
- Collaborate with Systems Administration and IT teams to ensure alignment with existing infrastructure.
- Monitor deployment progress and escalate issues as needed.
- Maintain clear documentation of deployment activities and user support interactions.

Skills and Experience Needed
- 2+ years of experience in IT systems administration or technical support roles.
- Familiarity with Lumos, Okta, and the Access Management Portal (AMP).
- Experience with Microsoft 365, Azure Active Directory, and workflow automation tools such as Power Automate or Okta Workflows.
- Strong troubleshooting and communication skills.
- Ability to work independently and manage multiple deployment tasks.
- Prior experience supporting SaaS rollouts or system integrations is preferred.
We are seeking a highly skilled and innovative AI Integration Specialist to join our dynamic team. This pivotal role is responsible for designing, developing, and implementing robust and scalable integration solutions on the Workato platform, with a strong focus on leveraging its AI capabilities to automate complex business processes and connect disparate applications. You will play a crucial part in enhancing our operational efficiency, enabling seamless data flow, and driving digital transformation initiatives across the organization.

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field, or equivalent practical experience.
• 6+ years of hands-on experience designing, developing, and deploying integrations on the Workato platform.
• Strong understanding of Integration Platform as a Service (iPaaS) concepts and enterprise application integration (EAI) patterns.
• Demonstrated experience with Workato's AI capabilities, including building solutions with AI by Workato, IDP, or Agentic AI.
• Proficiency with various APIs (REST, SOAP), webhooks, and data formats (JSON, XML, CSV).
• Experience with scripting languages (e.g., Ruby, Python, JavaScript) for custom logic within Workato recipes is a plus.
• Solid understanding of database concepts and experience with SQL/NoSQL databases.
• Excellent problem-solving, analytical, and troubleshooting skills with keen attention to detail.
• Strong communication and interpersonal skills, with the ability to collaborate effectively with technical and non-technical stakeholders.
• Workato certifications (e.g., Automation Pro I, II, III, Integration Developer) are desirable.
• Experience with Agile/Scrum methodologies is a plus.

Preferred Skills (Nice to Have):
• Experience with other integration platforms (e.g., MuleSoft, Boomi, Zapier).
• Familiarity with cloud platforms (AWS, Azure, GCP).
• Knowledge of microservices architecture.
• Experience with version control systems (e.g., Git).
About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and enable better decisions. We combine deep experience in business processes with cutting-edge infrastructure and cloud services, and we partner with our customers to monetize their data and make enterprise data dance.

Our Values
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners, and the community.

Equal Opportunity Statement
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/.

Job Summary:
We are seeking a highly innovative and skilled AI Engineer to join our AI Center of Excellence (CoE) for the Data Integration Project. The ideal candidate will design, develop, and deploy intelligent assets and AI agents that automate and optimize stages of the data ingestion and integration pipeline. This role requires expertise in machine learning, natural language processing (NLP), knowledge representation, and cloud platform services, with a strong focus on building scalable and accurate AI solutions.

Key Responsibilities:
- LLM-based Auto-schematization: Develop and refine LLM-based models and techniques for automatically inferring schemas from diverse unstructured and semi-structured public datasets and mapping them to a standardized vocabulary.
- Entity Resolution & ID Generation AI: Design and implement AI models for highly accurate entity resolution, matching new entities with existing IDs and generating unique, standardized IDs for newly identified entities.
- Automated Data Profiling & Schema Detection: Develop AI/ML accelerators for automated data profiling, pattern detection, and schema detection to understand data structure and quality at scale.
- Anomaly Detection & Smart Imputation: Create AI-powered solutions for identifying outliers, inconsistencies, and corrupt records, and for intelligently filling missing values using machine learning algorithms.
- Multilingual Data Integration AI: Develop AI assets for accurately interpreting, translating (leveraging automated tools with human-in-the-loop validation), and semantically mapping data from diverse linguistic sources, preserving meaning and context.
- Validation Automation & Error Pattern Recognition: Build AI agents to run comprehensive data validation checks, identify common error types, suggest fixes, and automate common error corrections.
- Knowledge Graph RAG/RIG Integration: Integrate Retrieval Augmented Generation (RAG) and Retrieval Augmented Indexing (RIG) techniques to enhance querying capabilities and facilitate consistency checks within the Knowledge Graph.
- MLOps Implementation: Implement and maintain MLOps practices for the lifecycle management of AI models, including versioning, deployment, monitoring, and retraining on a relevant AI platform.
- Code Generation & Documentation Automation: Develop AI tools for generating reusable scripts, templates, and comprehensive import documentation to streamline development.
- Continuous Improvement Systems: Design and build learning systems, feedback loops, and error analytics mechanisms to continuously improve the accuracy and efficiency of AI-powered automation over time.
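To give candidates a concrete flavor of the data-profiling step that precedes auto-schematization, here is a minimal, self-contained sketch. The field names, sample records, and the simple "union of observed types" heuristic are illustrative assumptions only, not the project's actual pipeline; in practice an LLM and human-in-the-loop review would refine the inferred schema.

```python
# Minimal sketch: profile heterogeneous records to infer a field -> type
# mapping, the kind of signal an auto-schematization step starts from.
from collections import defaultdict

def infer_schema(records):
    """Map each field name to its observed type, or 'mixed' if inconsistent."""
    observed = defaultdict(set)
    for rec in records:
        for key, value in rec.items():
            observed[key].add(type(value).__name__)
    # A field with one observed type gets that type; fields with mixed
    # types are flagged for downstream review (e.g., human-in-the-loop).
    return {k: (next(iter(v)) if len(v) == 1 else "mixed")
            for k, v in observed.items()}

# Hypothetical sample rows with an intentional type inconsistency:
rows = [
    {"country": "FR", "population": 67000000},
    {"country": "DE", "population": "83,000,000"},  # string, not int
]
print(infer_schema(rows))  # {'country': 'str', 'population': 'mixed'}
```

The "mixed" flag is where anomaly detection and smart imputation (described above) would take over, normalizing or correcting the offending values before the schema is mapped to a standardized vocabulary.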
Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related quantitative field.
- Proven experience (e.g., 3+ years) as an AI/ML Engineer, with a strong portfolio of deployed AI solutions.
- Strong expertise in Natural Language Processing (NLP), including experience with Large Language Models (LLMs) and their applications in data processing.
- Proficiency in Python and relevant AI/ML libraries (e.g., TensorFlow, PyTorch, scikit-learn).
- Hands-on experience with cloud AI/ML services.
- Understanding of knowledge representation, ontologies (e.g., Schema.org, RDF), and knowledge graphs.
- Experience with data quality, validation, and anomaly detection techniques.
- Familiarity with MLOps principles and practices for model deployment and lifecycle management.
- Strong problem-solving skills and the ability to translate complex data challenges into AI solutions.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Experience with data integration projects, particularly with large-scale public datasets.
- Familiarity with knowledge graph initiatives.
- Experience with multilingual data processing and AI.
- Contributions to open-source AI/ML projects.
- Experience in an Agile development environment.

Benefits:
- Opportunity to work on a high-impact project at the forefront of AI and data integration.
- Contribute to solidifying a leading data initiative's role as a foundational source for grounding Large Models.
- Access to cutting-edge cloud AI technologies.
- Collaborative, innovative, and fast-paced work environment.
- Significant impact on data quality and operational efficiency.