Apply directly at request@tytantech.com with the job title in the subject line.

Company Description
Tytan Technology Inc. is a leading provider of technology solutions, specializing in AI & Machine Learning, Data Science, Cloud Solutions, Product Development, and Digital Transformation. Our flagship product, TytanAMP, streamlines insurance agency operations, while our Intelligent Document Processing (IDP) Accelerator enhances document automation and data extraction. We leverage Google Cloud, Azure, AWS, and modern frameworks to deliver tailored analytics, process automation, and enterprise application development services. We serve insurance agencies, financial services, small businesses, and enterprises, offering innovative, scalable, and future-ready solutions that transform operations and enhance customer experiences.

Role Description
This is a contract remote role for a Senior Next.js Developer with Chinese language proficiency. The developer will:
- Lead the migration of an enterprise application from the Next.js pages router to the app router.
- Convert API endpoints into server actions where applicable.
- Provide ticket-based office-hours support.
- Implement periodic feature upgrades and enhancements.
- Collaborate closely with client teams in Singapore.

Qualifications
- Strong skills in front-end development and JavaScript (Next.js / React / TypeScript).
- Experience with back-end web development and API integration.
- Proficiency in modern frameworks such as Redux.js.
- Excellent written and verbal communication skills in English and Chinese (Mandarin).
- Ability to work independently and remotely, with overlap with Singapore business hours.
- Experience in cloud-based development (Azure, AWS, or GCP) is a plus.
- Bachelor’s degree in Computer Science, Software Engineering, or a related field.

📩 To apply, send your CV to request@tytantech.com with the subject line: "Senior Next.js Developer – Chinese Proficiency".
Send your resume to request@tytantech.com with the subject line “Application – QA Tester” or apply directly on LinkedIn.

We are seeking a Quality Assurance Tester to join our client supporting their Drilling suite of applications. Under the guidance of the Team Lead, you will execute and evaluate software tests across functionality, architecture, and security. This role blends QA expertise with upstream oil & gas domain knowledge.

Responsibilities
- Provide drilling and completions engineering support to the team.
- Contribute as an active software tester on Agile teams.
- Develop and execute test cases based on acceptance criteria.
- Validate functionality, workflows, and end-user requirements.
- Identify, document, and track bugs through resolution.
- Verify that development changes do not impact associated areas of functionality.
- Participate in sprint planning, ensuring QA requirements are covered.
- Maintain test environments and contribute to automation frameworks.

Required Skills & Experience
- Bachelor’s or Master’s degree in Petroleum, Geophysics, Geology, or Chemical Engineering.
- 2+ years in QA/testing (Oil & Gas upstream applications preferred).
- Good understanding of upstream applications (EDT preferred).
- Knowledge of well engineering concepts: drilling techniques, fluids management, completions, well performance.
- Familiarity with engineering tools (WellCat, WellPlan, Compass) is a plus.
- Exposure to automation frameworks and test tools.
- Strong knowledge of SDLC/STLC and multiple test types (smoke, functional, regression, UAT, performance).
- Hands-on experience with test/defect management tools (Azure DevOps, TFS, JIRA).
- Agile methodology experience.
- ISTQB certification preferred.
- Excellent communication skills.
Send your resume to request@tytantech.com with the subject line “Application – AI Engineer” or apply directly on LinkedIn.

Key Responsibilities:
- Lead the day-to-day development of AI solutions, including writing production-grade code and building scalable systems. The candidate should have strong hands-on design and development experience across the full SDLC in Python and .NET, with deep expertise in generative AI.
- Interact directly with customers to understand requirements, propose solutions, and ensure successful delivery.
- Design and implement AI-driven features such as chatbots, recommendation engines, agentic AI, and automation tools (illustrative sketch below).
- Build secure and scalable AI pipelines that integrate with enterprise systems and cloud platforms.
- Evaluate and apply AI frameworks, libraries, and platforms aligned with business and technical needs.
- Select and implement machine learning algorithms, NLP models, and other Gen AI technologies.
- Ensure compliance with data privacy, security, and ethical standards in all AI/ML implementations.
- Stay current with advancements in Gen AI tools and recommend their adoption within engineering teams.
- Provide technical mentorship to developers and engineers working on AI initiatives.
- Maintain clear documentation and architecture diagrams for deployed AI solutions.

Qualifications:
- Proven experience as an AI Engineer, Machine Learning Engineer, or similar role, with a portfolio of delivered AI/Gen AI solutions.
- Proficiency in AI platforms and tools such as Azure OpenAI, Hugging Face, and the LangChain ecosystem (LangGraph, LangSmith, LangFuse).
- Strong programming skills in Python, C#, and Node.js, with experience in API development and system integration.
- Expertise in NLP, machine learning, deep learning, or computer vision.
- Experience with Azure and AWS cloud platforms and their AI/ML services.
- Familiarity with enterprise application development and full-stack technologies.
- Hands-on experience with chatbot development, automation frameworks, and AI-based decision systems.
- Knowledge of AI ethics, data governance, and security best practices.
- Excellent problem-solving and communication skills.
- Bachelor’s or Master’s degree in Computer Science, AI/ML, Data Science, or a related field.

Preferred Qualifications:
- Certifications in AI/ML from Azure or AWS.
- Experience with low-code/no-code AI integration tools.
- Understanding of DevOps practices for AI/ML model deployment.
- Experience in customer-facing roles, with the ability to translate business needs into technical solutions.
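For illustration only (not part of the role requirements): a minimal sketch of the kind of LangChain + Azure OpenAI chatbot chain referenced in the responsibilities above. The deployment name, API version, prompt, and environment variables are hypothetical placeholders, not Tytan's actual implementation.

```python
# Illustrative sketch only: a minimal LangChain + Azure OpenAI chat chain.
# Deployment name, API version, and prompt text are hypothetical placeholders.
import os

from langchain_openai import AzureChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Assumes AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY are set in the environment.
llm = AzureChatOpenAI(
    azure_deployment=os.environ.get("AZURE_OPENAI_DEPLOYMENT", "gpt-4o"),  # hypothetical deployment name
    api_version="2024-06-01",
    temperature=0,
)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant for enterprise support questions."),
    ("human", "{question}"),
])

# Compose prompt -> model -> plain-text output into a single runnable chain.
chain = prompt | llm | StrOutputParser()

if __name__ == "__main__":
    print(chain.invoke({"question": "Summarize our document-processing workflow."}))
```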
Send your resume to request@tytantech.com with the subject line “Application – Domain QA Tester | Oil & Gas” or apply directly on LinkedIn.

We are seeking a Domain QA Tester with a strong background in Petroleum Engineering and hands-on experience in the Oil & Gas upstream industry. The role involves validating drilling and completions workflows, ensuring product quality, and contributing directly to the testing of mission-critical engineering applications.

Responsibilities:
- Provide domain expertise and QA support in drilling and completions for Oil & Gas applications.
- Develop and execute test cases based on acceptance criteria and user workflows.
- Validate product functionality across upstream applications, ensuring compliance with industry practices.
- Participate in sprint planning and contribute QA inputs to story sizing and acceptance.
- Collaborate with SMEs to identify and define end-user workflows for testing.
- Document, track, and manage defects consistently using tools such as JIRA or Azure DevOps.
- Ensure test coverage across smoke, functional, integration, regression, performance, and UAT cycles.
- Maintain test environments and ensure seamless execution across SDLC/STLC phases.

Requirements:
- Bachelor’s or Master’s degree in Petroleum Engineering, Geophysics, Geology, or Chemical Engineering (must-have).
- 2+ years of QA/testing experience (Oil & Gas upstream applications strongly preferred).
- Good understanding of drilling, well construction, completions, fluids, mechanics, and performance monitoring.
- Exposure to Oil & Gas engineering applications such as WellCat, WellPlan, Compass (preferred).
- Familiarity with automation frameworks and QA tools (JIRA, Azure DevOps, TFS).
- Comfortable working in Agile/Scrum environments.
- Strong communication and collaboration skills.
- ISTQB Foundation Level certification is a plus.
Send your resume to request@tytantech.com with the subject line “Application – Solutions Engineer | Automation & Implementation” or apply directly on LinkedIn.

We are looking for a Solutions Engineer with over 5 years of experience who will play a pivotal role in implementing automation solutions and empowering clients to streamline their operations. This role requires a balance of hands-on technical expertise in Python programming and UI automation (e.g., UiPath, Automation Anywhere) and the ability to engage with clients, manage projects, and deliver impactful outcomes.

Responsibilities:
- Lead solution implementation and build tailored automation solutions.
- Collaborate with internal teams to design and deliver innovative, client-focused solutions.
- Conduct client meetings to capture requirements, define milestones, and ensure delivery alignment.
- Manage end-to-end project execution, including timelines, budgets, and resources.
- Troubleshoot technical issues, debug code, and ensure smooth feature deployment.
- Document processes, project plans, and technical specifications.
- Act as a technical leader, promoting knowledge sharing across teams.

Requirements:
- Bachelor’s degree in Engineering or a related field (preferred).
- Experience in Professional Services / Consulting roles, ideally in AI automation, data, or related domains.
- Proficiency in Python with a strong understanding of automation best practices.
- Knowledge of automation tools such as UiPath, Automation Anywhere, or Blue Prism is a plus.
- Knowledge of API integrations (illustrative sketch below); exposure to automation platforms (Blue Prism, MuleSoft, Databricks, Snowflake) is a plus.
- Strong project management skills with the ability to work in fast-paced, dynamic environments.
- Excellent communication and stakeholder management abilities.
- Proven ownership of end-to-end delivery in high-pressure or startup settings.

If you thrive at the intersection of technology and business, and enjoy solving complex problems through automation, we’d love to hear from you.
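For illustration only: a minimal sketch of the kind of Python API-integration step an automation workflow like those described above might include. The endpoint, token variable, and invoice fields are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal Python API-integration step for an
# automation workflow. The endpoint and token are hypothetical placeholders.
import os

import requests

BASE_URL = "https://api.example.com"  # hypothetical endpoint
TOKEN = os.environ.get("EXAMPLE_API_TOKEN", "")

def fetch_open_invoices(customer_id: str) -> list:
    """Pull open invoices for a customer so a downstream bot can process them."""
    resp = requests.get(
        f"{BASE_URL}/customers/{customer_id}/invoices",
        params={"status": "open"},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()  # surface HTTP errors instead of silently continuing
    return resp.json()

if __name__ == "__main__":
    for invoice in fetch_open_invoices("CUST-001"):
        print(invoice.get("id"), invoice.get("amount"))
```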
Send your resume to request@tytantech.com with the subject line “Application – Revenue Cloud Solutions Architect” or apply directly on LinkedIn.

We are seeking a highly experienced Revenue Cloud Solutions Architect with deep expertise in Salesforce, CPQ, and Revenue Cloud. The ideal candidate will bring over a decade of hands-on and leadership experience in architecting and delivering large-scale, enterprise-grade solutions. This role offers the opportunity to design and implement end-to-end Revenue Cloud solutions that optimize sales operations, quoting, and billing processes for global clients.

Responsibilities:
- Lead the architecture, design, and implementation of Salesforce Revenue Cloud and CPQ solutions.
- Translate complex business requirements into scalable, future-proof technical solutions.
- Collaborate with cross-functional teams, including Sales, Finance, and IT, to align solutions with business goals.
- Provide technical leadership across discovery, design, implementation, and deployment phases.
- Ensure best practices in system integration, data migration, and Salesforce platform governance.
- Develop solution roadmaps, technical documentation, and design standards.
- Mentor and guide developers, consultants, and administrators on Revenue Cloud and CPQ capabilities.
- Partner with stakeholders to drive adoption, continuous improvement, and measurable business impact.

Requirements:
- 10+ years of relevant Salesforce ecosystem experience, with at least 5 years focused on CPQ and Revenue Cloud.
- Proven track record as a Solutions Architect designing and implementing Revenue Cloud solutions at enterprise scale.
- Deep expertise in Salesforce CPQ, Billing, Revenue Recognition, and Subscription Management.
- Strong understanding of data modeling, integrations, and the Salesforce security model.
- Hands-on experience with Apex, Lightning, Flows, and APIs is a plus.
- Salesforce Revenue Cloud and CPQ certifications strongly preferred.
- Excellent communication skills with the ability to engage business and technical stakeholders.
- Strong problem-solving mindset and ability to balance innovation with practical execution.
Send your resume to request@tytantech.com with the subject line “Application – Fortran/MFC Developer” or apply directly on LinkedIn.

We are seeking a highly experienced MFC / C++ / Fortran Developer to maintain, optimize, and modernize mission-critical legacy applications. The ideal candidate has deep expertise in system-level programming, Windows desktop application development, and modernization of large, complex codebases. This role offers the opportunity to ensure stability of existing platforms while also driving incremental modernization and refactoring efforts.

Key Responsibilities
- Maintain, support, and enhance MFC-based Windows desktop applications.
- Work on high-performance C++ and Fortran modules for scientific, engineering, or computational systems.
- Refactor, modularize, and modernize legacy code to improve maintainability, performance, and scalability.
- Debug, profile, and optimize performance in memory- and CPU-intensive applications.
- Document system architecture, workflows, and maintenance procedures.
- Collaborate with stakeholders, business users, and cross-functional teams to understand requirements and deliver stable, reliable solutions.
- Contribute to migration roadmaps for legacy systems to modern architectures (where feasible).
- Support release management, testing, and deployment processes.

Requirements
- 10+ years of professional experience in MFC, C++, and Fortran development.
- Strong expertise in Windows desktop application development.
- Solid knowledge of system-level programming, multithreading, memory management, and performance optimization.
- Proven experience working with large legacy codebases and ensuring backward compatibility.
- Familiarity with software modernization approaches, including migration to newer platforms, APIs, or UI frameworks.
- Excellent debugging, troubleshooting, and problem-solving skills.
- Strong communication skills and ability to work directly with end users and stakeholders.

Preferred Qualifications
- Exposure to .NET, C#, or modern C++ standards (C++14/17/20).
- Experience with Fortran-to-C++ migration or interoperability.
- Familiarity with scientific computing, engineering applications, or financial modeling systems.
- Knowledge of version control (Git), build systems, and CI/CD pipelines.
- Experience working in regulated industries (pharma, aerospace, energy, or financial services).

Soft Skills
- Strong analytical and architectural thinking.
- Ability to balance short-term fixes with long-term modernization goals.
- Clear technical documentation and a knowledge-sharing mindset.
- Collaboration across distributed/global teams.
About the Role
We are looking for an experienced DevOps Engineer who will play a critical role in building, maintaining, and scaling our CI/CD pipelines, cloud deployments, and automation frameworks. This position is ideal for someone passionate about infrastructure as code, containerization, and modern DevOps practices.

Key Responsibilities
- Design, implement, and maintain CI/CD pipelines using Azure DevOps, Jenkins, or GitHub Actions.
- Manage and optimize Git branching strategies for collaborative development.
- Build and maintain containerized applications using Docker and Kubernetes.
- Implement and manage Infrastructure as Code (IaC) solutions with Terraform and Bicep.
- Write and maintain automation scripts using Python, Bash, and PowerShell (illustrative sketch below).
- Deploy and manage workloads on Azure and AWS cloud environments.
- Collaborate with cross-functional teams to ensure secure, reliable, and scalable deployments.
- Support and troubleshoot production issues across Windows and Linux environments.

Qualifications
- Strong experience with CI/CD tools (Azure DevOps, Jenkins, GitHub Actions).
- Hands-on expertise with Git workflows and branching strategies.
- Solid knowledge of containerization (Docker, Kubernetes).
- Experience with Infrastructure as Code (Terraform, Bicep).
- Strong scripting skills (Python, Bash, PowerShell).
- Cloud deployment experience in Azure and AWS.
- Working knowledge of both Windows and Linux operating systems.
- Good to have: Experience with Terraform & Ansible for automation and configuration management.
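For illustration only: a small Python automation script of the kind a CI/CD pipeline stage might run after a deployment. The health-check URL and retry settings are hypothetical placeholders.

```python
# Illustrative sketch only: a post-deployment smoke check a pipeline stage
# might invoke. The URL, retry count, and delay are hypothetical placeholders.
import sys
import time

import requests

HEALTH_URL = "https://app.example.com/healthz"  # hypothetical endpoint
RETRIES = 10
DELAY_SECONDS = 15

def wait_for_healthy() -> bool:
    """Poll the health endpoint until it returns 200 or retries are exhausted."""
    for attempt in range(1, RETRIES + 1):
        try:
            if requests.get(HEALTH_URL, timeout=10).status_code == 200:
                print(f"healthy after {attempt} attempt(s)")
                return True
        except requests.RequestException as exc:
            print(f"attempt {attempt}: {exc}")
        time.sleep(DELAY_SECONDS)
    return False

if __name__ == "__main__":
    # A non-zero exit code fails the pipeline stage that invoked the script.
    sys.exit(0 if wait_for_healthy() else 1)
```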
Please apply directly or send your resume to request@tytantech.com with the subject line: “Application – Dynamics 365 F&O Technical Consultant”

About the Role
We are seeking an experienced Dynamics 365 Finance & Operations Technical Consultant to join on a contract basis. The consultant will focus on customizations, X++ development, system integrations, and AP/AR automation, ensuring smooth upgrades and alignment with business needs. This role is ideal for a hands-on developer with strong knowledge of the D365 F&O technical stack and the ability to collaborate with functional teams.

Key Responsibilities
- Develop and maintain customizations in Dynamics 365 F&O using X++.
- Design and implement workflows (e.g., approvals for invoices, product receipts, sales orders).
- Build and enhance system integrations (AP/AR e-invoicing with GST, bank reconciliation automation using MT940 files, SharePoint, Power Automate).
- Manage ISV solution compatibility during D365 upgrades (.44 → .46 and beyond).
- Create and maintain custom fields, reports, and validations across multiple modules.
- Work closely with functional consultants to translate requirements into technical solutions.
- Ensure adherence to best practices and performance standards, minimizing unnecessary customizations.
- Provide technical input on when to use standard features vs. customization.
- Document technical designs and deliver clear handovers to internal teams.

Required Skills & Experience
- 5+ years of Dynamics 365 Finance & Operations technical development experience.
- Strong expertise in X++ development.
- Hands-on experience with modules: Accounts Payable, Accounts Receivable, General Ledger, Inventory Management, HRMS, Warehouse Management.
- Experience creating custom workflows, reports, and fields.
- Strong knowledge of ISV integrations (Docentric, Avalara, etc.).
- Prior work in AP/AR automation, e-invoicing, and bank reconciliation.
- Familiarity with Power Automate and SharePoint integrations.
- Ability to test and validate in Dev, Sandbox, and Production environments.

Contract Details
- Engagement Type: Contractor
- Duration: 6–12 months (extendable)
- Location: Remote
- Start Date: Immediate
Please apply directly or send your resume to request@tytantech.com with the subject line: “Application – Data Scientist / AI Engineer – Energy & Asset Intelligence”

About the Role
We are seeking a highly skilled Data Scientist / AI Engineer to join our growing AI & Analytics Team within a leading Oil & Gas private equity firm. This role is ideal for individuals who combine strong data science and software engineering expertise with a deep understanding of energy markets, operations, or asset optimization. You’ll work closely with our investment, engineering, and technology teams to develop advanced models and data-driven products that enhance decision-making across exploration, production, asset management, and capital allocation.

Key Responsibilities
- Design and implement AI and ML models that support investment analysis, production forecasting, equipment optimization, and market intelligence (illustrative sketch below).
- Build end-to-end data pipelines for data ingestion, transformation, and feature engineering using structured and unstructured datasets (production logs, financial data, IoT, etc.).
- Develop and deploy machine learning solutions (predictive, prescriptive, generative) leveraging modern frameworks (PyTorch, TensorFlow, Scikit-learn, LangChain, etc.).
- Collaborate with engineers, investment analysts, and domain experts to translate business challenges into analytical solutions.
- Work on GenAI and NLP-based applications for unstructured document analysis, deal evaluation, and portfolio reporting.
- Write production-grade, scalable code for ML pipelines and APIs (Python, FastAPI, Flask, etc.).
- Conduct exploratory data analysis (EDA), model validation, and performance tracking using statistical and visualization techniques.
- Contribute to cloud-based ML workflows (Azure, AWS, or GCP), including containerization (Docker) and orchestration (Kubernetes).
- Support internal innovation projects on Agentic AI, MLOps, and generative intelligence for operational insights.

Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Science, Petroleum Engineering, Applied Mathematics, or related fields.
- 3–7 years of hands-on experience in data science, AI engineering, or ML operations, preferably in the Oil & Gas, Energy, or Financial Services sectors.
- Proven ability to design and implement ML algorithms, time-series models, and optimization techniques.
- Strong programming skills in Python (Pandas, NumPy, Scikit-learn, PyTorch/TensorFlow).
- Experience with data querying and processing tools (SQL, PySpark, Databricks, or Snowflake).
- Knowledge of GenAI/LangChain/OpenAI API for automation or analytical applications.
- Familiarity with cloud platforms (Azure, AWS, or GCP) and CI/CD pipelines for ML deployment.
- Excellent analytical, problem-solving, and communication skills, with the ability to work cross-functionally in technical and investment teams.

Preferred Experience (Nice-to-Have)
- Exposure to energy trading analytics, asset performance modeling, or reservoir data analysis.
- Experience in developing AI copilots or internal chatbots using enterprise data.
- Understanding of financial modeling or risk analysis in private equity or investment environments.
- Experience integrating LLMs or knowledge graphs for document intelligence or portfolio insights.
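For illustration only: a minimal production-forecasting sketch using an Arps exponential decline curve, a simple stand-in for the modeling work described above. The monthly rates are synthetic and the fitted parameters are hypothetical.

```python
# Illustrative sketch only: fit an exponential decline curve to monthly well
# production and forecast forward. All data values are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def exponential_decline(t, qi, d):
    """Arps exponential decline: rate = initial rate * exp(-decline * time)."""
    return qi * np.exp(-d * t)

# Synthetic monthly oil rates (bbl/d) for the first year of production.
months = np.arange(12)
rates = np.array([980, 905, 840, 782, 730, 684, 640, 601, 565, 532, 501, 473])

# Fit the decline parameters to the observed history.
(qi_fit, d_fit), _ = curve_fit(exponential_decline, months, rates, p0=(1000, 0.05))

# Forecast the next 24 months from the fitted parameters.
forecast = exponential_decline(np.arange(12, 36), qi_fit, d_fit)
print(f"qi={qi_fit:.1f} bbl/d, D={d_fit:.4f}/month, month-36 rate={forecast[-1]:.1f} bbl/d")
```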
Please apply directly or send your resume to request@tytantech.com with the subject line: “Application – Data Engineer”

We are seeking an experienced Data Engineer with a strong background in building scalable data pipelines, architecting modern data platforms, and driving high-performance data transformations. The ideal candidate will have deep expertise in Microsoft Azure Synapse, Data Lake, SQL, and Spark, and will be capable of designing solutions aligned with modern data strategy and architecture principles.

Key Responsibilities
- Design and develop scalable, efficient, and reliable data pipelines for ingestion, transformation, and storage of structured and unstructured data.
- Architect and implement modern data lakehouse and data warehouse solutions using Azure Synapse Analytics and Azure Data Lake.
- Develop and optimize complex data transformation workflows leveraging Apache Spark for parallel processing and distributed computation (illustrative sketch below).
- Work closely with business and analytics teams to translate requirements into robust data models and reusable data components.
- Lead a multi-skilled, distributed data engineering team, providing technical guidance, code reviews, and best practices in data architecture and governance.
- Implement and enforce data quality, lineage, and performance optimization standards.
- Stay current with emerging data engineering tools, cloud-native services, and architectural patterns, and recommend adoption of best-fit technologies.

Requirements
- 5+ years of hands-on experience in data engineering, data transformation, and data architecture.
- Strong expertise in Azure Synapse Analytics, Azure Data Lake, and SQL-based data modeling.
- Deep understanding of Spark clusters, parallel data processing, and ETL optimization.
- Proven ability to handle complex data transformation scenarios and optimize data processing for scale and performance.
- Strong understanding of modern data architectures (e.g., lakehouse, data mesh, delta architecture).
- Experience working in agile environments and leading cross-functional data engineering teams.
- Excellent communication and documentation skills, with the ability to convey technical concepts to non-technical stakeholders.

Preferred Skills (Good to Have)
- Experience with Azure Data Factory, Databricks, or Fabric.
- Knowledge of CI/CD for data pipelines using Git and DevOps tools.
- Exposure to data governance, security, and compliance frameworks.
- Familiarity with Python, PySpark, or Scala for custom data transformations.
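For illustration only: a minimal PySpark transformation sketch of the kind described above. The lake paths, table schema, and column names are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal PySpark batch transformation.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-sales-rollup").getOrCreate()

# Read raw structured data from the lake (hypothetical path and schema).
orders = spark.read.parquet("abfss://raw@datalake.dfs.core.windows.net/orders/")

# Cleanse and aggregate: drop cancelled orders, roll up revenue per day and region.
daily_revenue = (
    orders
    .filter(F.col("status") != "cancelled")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.countDistinct("order_id").alias("orders"))
)

# Write the curated output back to the lake, partitioned for downstream queries.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@datalake.dfs.core.windows.net/daily_revenue/"
)
```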