Future Forward Technologies (FFT)

7 job openings at Future Forward Technologies (FFT)
Data Engineering Lead | Coimbatore, Tamil Nadu, India | 7 years | Salary not disclosed | Remote | Contractual

Data Engineering Lead | 7+ Years | Remote | Work Timings: 1:00 PM to 10:00 PM or 2:00 PM to 11:00 PM | Contract Duration: 3 months+

Job Description:
● Provide technical leadership on data engineering (acquisition, transformation, distribution) across multiple projects and squads
● Guide, nurture, and coach a team of data engineers, architects, and database administrators working in an agile, fast-paced environment
● Drive adoption and evolution of frameworks and products based on best-in-class data management technologies, such as ETL/ELT tools, open-source frameworks, workflow management tools, etc.
● Provide suggestions and recommendations on how to improve the overall data engineering architecture towards a scalable, cost-effective ecosystem

Your Deliverables:
● Design, implement, and manage data pipelines using our customer data platform, Treasure Data.
● Develop, test, and maintain custom workflows.
● Monitor and troubleshoot Airflow workflows to ensure optimal performance and reliability.
● Collaborate with data engineers and data scientists to understand data requirements and translate them into scalable workflows.
● Ensure data quality and integrity across all workflows.
● Document processes, workflows, and configurations.
● Interact with Redshift on a regular basis, ensuring pipelines are functioning as expected.

Your Qualities:
● Inherent curiosity and empathy for customer success
● Obsessive about solving customer problems
● Think long and act short
● Collaborative with your peers, partners, and your team
● Excited about the mission and milestones, not titles and hierarchies
● Nurture an environment of experimentation and learning

Your Expertise:
● Excellent understanding of enterprise data warehousing principles of ETL/ELT and the tools/technologies used to implement them
● Proven experience with Treasure Data, including designing and maintaining complex workflows
● Strong knowledge of SQL programming and databases
● Strong knowledge of Digdag files
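As context for the monitoring deliverables above ("ensuring pipelines are functioning as expected"), here is a minimal sketch of a pipeline freshness check. The table, the two-hour threshold, and the `fetch_latest_load` stub are illustrative assumptions, not part of the posting; in practice the stub would run a `SELECT MAX(loaded_at)` query against Redshift.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical stub: a real check would query Redshift, e.g.
# "SELECT MAX(loaded_at) FROM analytics.events".
def fetch_latest_load():
    return datetime.now(timezone.utc) - timedelta(minutes=30)

def check_freshness(max_lag=timedelta(hours=2)):
    """Return (is_fresh, lag) for the most recent pipeline load."""
    lag = datetime.now(timezone.utc) - fetch_latest_load()
    return lag <= max_lag, lag

fresh, lag = check_freshness()
print("fresh" if fresh else f"stale by {lag}")
```

A scheduled task like this would typically page or alert when `is_fresh` is false.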

Agentic AI Engineer | Coimbatore, Tamil Nadu, India | 6 years | Salary not disclosed | Remote | Contractual

Agentic AI Engineer | 6+ Years | Remote | Work Timings: 2:00 PM to 11:00 PM

Job Description:
● AI/ML Model Development & Deployment – focused on marketing and operations use cases
● Deep Learning – CNNs, RNNs, Transformers, attention mechanisms
● Generative AI – experience with OpenAI (GPT, DALL·E, Whisper) and Anthropic (Claude)
● Agentic AI Platforms – AutoGen, CrewAI, AWS Bedrock
● Multimodal AI – building agents using text, voice, and other inputs for automation
● Python Programming – proficient with NumPy, Pandas, Matplotlib, TensorFlow, PyTorch
● Traditional ML Techniques – supervised/unsupervised learning, PCA, feature engineering, model evaluation, hyperparameter tuning
● Data Analytics – predictive analysis, clustering, A/B testing, KPI monitoring
● MLOps & CI/CD – model versioning, deployment pipelines, monitoring
● Cloud Services – AWS (S3, Lambda, EC2, SageMaker, Bedrock), serverless architectures
● Development Tools – proficient in using Cursor for design, development, and code reviews
● Communication & Collaboration – strong communication skills and client engagement experience
● Domain Expertise – marketing-focused AI solutions (big plus)
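The Data Analytics bullet above mentions A/B testing; as a minimal illustration of what that involves, here is a two-proportion z-test implemented with only the standard library. The conversion counts are made-up numbers for the example, not data from the role.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Made-up example: variant B converts at 12% vs. 10% for variant A.
z = two_proportion_z(100, 1000, 120, 1000)
print(round(z, 2))  # |z| > 1.96 would be significant at the 5% level
```

With these sample sizes the lift is not yet significant, which is exactly the kind of judgment KPI monitoring has to surface.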

Azure Data Integrations Engineers | Coimbatore, Tamil Nadu, India | 7 years | Salary not disclosed | On-site | Contractual

Azure Data Integrations Engineers | 7+ Yrs | Work Timings: Standard IST | 2 positions

Description:
● Proven track record building production-grade pipelines using ADF, Synapse Pipelines, and Azure Data Lake Gen2
● Demonstrated expertise integrating with authenticated REST APIs (OAuth2, bearer tokens, pagination) and ingesting structured/semi-structured data (JSON, CSV)
● Consistently delivers in structured, async environments, with clean Jira hygiene, well-documented logic, and precise written communication

We are looking for an Azure Data Integration Engineer with deep hands-on experience building Azure-native data pipelines to support enterprise-scale data ingestion into a governed Azure Data Lake environment. This is a delivery-focused role responsible for building and maintaining production-ready pipelines that move structured and semi-structured data into central lake zones from internal and external systems. You will work closely with the engineering team to design pipeline logic, implement robust orchestration, and ensure production readiness through strong parameterization, observability, and documentation. The right candidate brings strong ownership, communicates with precision, and delivers consistently in asynchronous environments.

Responsibilities:
● Design and implement scalable ingestion pipelines using Azure Data Factory, Synapse Pipelines, and Azure Data Lake Storage Gen2
● Build modular, parameterized pipeline components, including linked services, datasets, and triggers, to support batch, file-based, and API-based ingestion patterns
● Integrate data from various sources using REST APIs (OAuth2, bearer tokens, pagination), SFTP, and structured file formats (CSV, JSON)
● Define ingestion logic across raw, curated, and conformed zones within a governed enterprise lakehouse model
● Establish and maintain monitoring, logging, and alerting using Azure-native observability tools
● Own the full pipeline lifecycle: design, development, deployment, documentation, and issue resolution
● Refactor and stabilize legacy pipelines where needed to improve maintainability and performance
● Document technical logic, runtime behavior, data mapping assumptions, and schema expectations using Confluence or Markdown
● Track tasks and sprint deliverables using tools like Jira, and participate in structured status updates and reviews
● Operate with high autonomy and accountability, consistently delivering work with minimal oversight and strong async communication

Qualifications:
● 7+ years of experience in data engineering, backend integration, or cloud pipeline development
● Strong expertise with Azure Data Factory (ADF) - pipelines, integration runtimes, parameterization, failure handling - as well as Synapse Pipelines and Azure Data Lake Gen2
● Proficiency in SQL and Python for data transformations, validation, and control logic
● Experience ingesting data from authenticated REST APIs, handling pagination, retries, and token management
● Familiarity with enterprise ingestion patterns: file lifecycle management, schema evolution, secure access layers
● Comfortable with Git-based development, pull requests, and CI/CD deployment workflows
● Able to maintain reliable pipeline observability: alerts, logging, monitoring, and root cause tracing
● Strong written communication skills and documentation habits
● Full-time availability during India Standard Time (IST)

Preferred:
● Experience with metadata frameworks, ingestion registries, or data cataloging
● Background working in distributed enterprise environments or regulated data contexts
● Microsoft certification (e.g., DP-203) is a plus
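The API-ingestion requirement above (bearer tokens, pagination, retries) can be sketched in a few lines of Python. This is a self-contained illustration: the in-memory `PAGES` data, the `fake_get` transport, and the token value are assumptions standing in for a real HTTP client and endpoint, which in this role would more likely be ADF's REST connector or `requests` in a pipeline activity.

```python
import time

# Hypothetical in-memory "API": three pages, keyed by page number.
PAGES = {1: {"data": [1, 2], "next": 2},
         2: {"data": [3, 4], "next": 3},
         3: {"data": [5], "next": None}}

def fake_get(url, page, headers):
    """Stand-in for an HTTP GET; validates the bearer token."""
    assert headers.get("Authorization") == "Bearer demo-token"
    return PAGES[page]

def ingest(url, token, max_retries=3):
    """Walk 'next' links, retrying each page on transient failure."""
    headers = {"Authorization": f"Bearer {token}"}
    rows, page = [], 1
    while page is not None:
        for attempt in range(max_retries):
            try:
                body = fake_get(url, page, headers)
                break
            except ConnectionError:
                time.sleep(2 ** attempt)   # exponential backoff
        else:
            raise RuntimeError(f"page {page} failed after {max_retries} tries")
        rows.extend(body["data"])
        page = body["next"]
    return rows

print(ingest("https://example.invalid/items", "demo-token"))
```

The same loop shape (follow `next` until exhausted, back off on transient errors, fail loudly after the retry budget) carries over to production pipelines regardless of the transport.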

Oracle EBS Technical Consultant | Coimbatore, Tamil Nadu, India | 7 years | Salary not disclosed | On-site | Contractual

Oracle EBS Technical Consultant | 7+ Yrs | Work Timings: UK Timezone

Key Responsibilities:
● Lead Oracle Cloud Success Navigator implementations and configuration
● Configure dashboards, metrics, and success measurement frameworks
● Execute end-to-end Oracle EBS and Fusion implementation projects
● Conduct business process analysis and system integrations
● Lead client workshops and provide strategic consulting guidance
● Mentor team members and ensure quality deliverables

Required Qualifications:
● 7+ years Oracle EBS (R12) and Fusion Cloud Applications experience
● Proven Oracle Cloud Success Navigator implementation experience
● Strong Oracle Cloud Infrastructure (OCI) and integration tools knowledge
● Experience with 3+ full-cycle Oracle implementations
● Deep understanding of Finance, Supply Chain, and HR modules
● Excellent client-facing and leadership skills
● Oracle certifications preferred

Preferred Experience:
● Oracle Cloud Success Navigator certification
● Big 4 or Oracle Partner consulting background

Python Full Stack Developer | Coimbatore, Tamil Nadu, India | 4 years | Salary not disclosed | Remote | Contractual

Position: Python Full Stack Developer
Location: Remote (India-based preferred)
Engagement: Full-time (8-9 hours/day)
Duration: 3 months (extendable by 6 months based on performance and outcomes)
Experience: 4-6 years (5 years of relevant experience preferred)

About Us: We are a small, fast-moving startup building AI agents for the trading sector, with tight deadlines and a strong culture of ownership. We are looking for passionate individual contributors who can think independently, handle end-to-end tasks, and collaborate in a small, focused team.

What You Will Do:
● Design and develop end-to-end features across backend and frontend.
● Integrate external APIs and transform data into internal formats.
● Build middleware services with authentication and clean architecture.
● Work with Python for backend services and data processing.
● Design and implement React + TypeScript frontend pages with responsive UI.
● Leverage AWS services for deployment and scaling.
● Use PostgreSQL for structured data storage and efficient querying.
● Apply knowledge of system design to plan scalable, maintainable solutions.
● Use AI tools and APIs to automate tasks and improve workflow efficiency.

Key Skills Required:
● Strong experience with Python for backend development with Django or Flask.
● Solid AWS knowledge for deploying and managing services, and awareness of pipelines.
● Proficiency with React and TypeScript for frontend development.
● Experience with REST APIs (including FastAPI): building, consuming, and securing them.
● Good understanding of PostgreSQL or similar relational databases.
● Ability to design clean, maintainable middleware and API integrations.
● Exposure to system design concepts for small-scale, production-ready systems.
● Familiarity with AI/ML tools (e.g., LLM APIs, automation frameworks) is a strong plus.

Who We Are Looking For:
● Passionate problem-solvers who take ownership of features end-to-end.
● Comfortable working in a small team with minimal hand-holding.
● Ready to work with tight deadlines.
● Self-driven, disciplined, and able to prioritize independently.
● Eager to experiment with AI tools to speed up and automate work.
● Strong communication and a collaborative mindset despite being remote.

Why Join Us:
● Direct impact on building AI-driven products for trading and finance.
● Opportunity to work on cutting-edge API integrations and automation.
● Ownership of features from system design to production.
● Tight-knit, passionate team with fast feedback cycles.
● Learning environment with exposure to modern AI, cloud, and web tech.
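The "middleware services with authentication" responsibility above can be sketched as a plain token-checking wrapper using only the standard library. The handler, header name, and token value are illustrative assumptions; a real build would hang this off Django/Flask/FastAPI middleware hooks as the skills list suggests.

```python
import json

def app(environ):
    """Toy handler returning a status and JSON body (stand-in for a real route)."""
    return 200, json.dumps({"status": "ok"})

def require_token(handler, expected="secret-token"):
    """Wrap a handler so requests must carry a Bearer token (illustrative)."""
    def wrapped(environ):
        auth = environ.get("HTTP_AUTHORIZATION", "")
        if auth != f"Bearer {expected}":
            return 401, json.dumps({"error": "unauthorized"})
        return handler(environ)
    return wrapped

secured = require_token(app)
print(secured({"HTTP_AUTHORIZATION": "Bearer secret-token"}))
print(secured({}))
```

The wrapper composes: stacking logging, auth, and rate-limiting decorators in this style is the "clean, maintainable middleware" pattern the listing asks for.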

Lead Automation Engineer | Coimbatore, Tamil Nadu | 8-12 years | Salary not disclosed (INR) | On-site | Full Time

As an Automation Architect/Lead at Infosys, you will be responsible for leading automation initiatives and providing technical solutions for automation challenges. With a focus on automation tools such as Selenium, Playwright, and Appium, along with proficiency in Java, TypeScript, TestNG, and Maven, you will drive the development of UI frameworks for automation. Your expertise in CI/CD and DevOps tools such as GitHub Actions, Azure DevOps, Azure Pipelines, Docker containers, and Selenium Grid will be essential in ensuring efficient and effective automation processes.

In addition to your technical skills, you will showcase your leadership abilities by managing teams and collaborating on agile methodologies. Your experience in API testing, cloud and mobile testing with tools like Perfecto, version control with GitHub, and build management with Maven will be instrumental in delivering high-quality automation solutions. Your strong analytical and problem-solving skills, along with excellent communication and customer-facing capabilities, will contribute to the success of automation projects.

Join us in this dynamic role, where your attention to detail and quality mindset will drive innovation and excellence in automation practices.

Databricks Developer | Maharashtra | 2-12 years | Salary not disclosed (INR) | On-site | Full Time

As a Databricks Developer at Future Forward Technologies (FFT), you will work on a contract basis, handling back-end web development, software development, programming, and object-oriented programming (OOP) tasks on a daily basis. Your primary skills will include proficiency in Databricks, SQL, Python, PySpark, and logical thinking. The role is based in Mumbai and requires working from the office (WFO).

At FFT, we are looking for individuals with a strong foundation in Computer Science and Software Development, along with expertise in back-end web development and OOP. Prior experience with the Databricks platform is a plus, but not mandatory. As a Databricks Developer, you should possess strong problem-solving and analytical skills, enabling you to work independently when needed.

We have multiple positions available at FFT: Architect (8-12 years experience), Senior Developer (4-6 years experience), and Developer (2-4 years experience). The Architect role specifically requires a hands-on approach to its responsibilities. A Bachelor's degree in Computer Science or a related field is required to be considered for this opportunity.

If you are passionate about leveraging your programming skills to develop scalable solutions and enhance efficiency, join us at Future Forward Technologies and be part of a dynamic team focused on delivering tailored applications that drive efficiency and productivity across devices and platforms.