We are looking for a highly skilled Fullstack Developer with React.js and Python scripting expertise to join our team. The ideal candidate will have experience building dynamic and interactive user interfaces using React.js while also leveraging Python for backend scripting, automation, or data processing. You will collaborate closely with our frontend and backend teams to deliver high-quality web applications and services.

Responsibilities:
- Develop and maintain responsive and interactive web applications using React.js.
- Implement state management solutions (e.g., Redux, Context API).
- Optimize applications for performance and scalability.
- Work with RESTful APIs and WebSockets for seamless frontend-backend communication.
- Develop reusable UI components using modern JavaScript (ES6+).
- Ensure cross-browser compatibility and mobile responsiveness.
- Write Python scripts for data processing, automation, and backend tasks.
- Work with APIs and integrate frontend applications with Python-based backends.
- Develop and maintain server-side scripts and microservices using Flask/Django/FastAPI.
- Perform ETL (Extract, Transform, Load) operations on datasets when needed.
- Automate workflows and optimize scripts for performance and efficiency.
- Collaborate with backend engineers, designers, and product managers.
- Follow best practices in coding, testing, and deployment.
- Use version control systems (Git, GitHub/GitLab/Bitbucket).
- Write unit and integration tests for React components and Python scripts.
- Troubleshoot and resolve bugs, performance issues, and security vulnerabilities.

Requirements:
- 4-6 years of experience in React.js development (mid-level fullstack web developer).
- Skills: React 18, JavaScript, HTML, CSS, SCSS, Python, Flask, RESTful APIs, PostgreSQL, Python scripting.
- Strong experience with Python scripting and backend frameworks (Flask).
- Familiarity with SQL and NoSQL databases (PostgreSQL, MongoDB, etc.).
- Experience working with RESTful APIs, GraphQL, and WebSockets.
- Knowledge of state management libraries (Redux, MobX, Context API).

This job was posted by Akshay Pawar from Exponentia.ai.
We are looking for a highly skilled Fullstack Developer with React.js and Python scripting expertise to join our team. The ideal candidate will have experience building dynamic and interactive user interfaces using React.js while also leveraging Python for backend scripting, automation, or data processing. You will collaborate closely with our frontend and backend teams to deliver high-quality web applications and services.

The core responsibilities for the job include the following:

Frontend Development (React.js):
- Develop and maintain responsive and interactive web applications using React.js.
- Implement state management solutions (e.g., Redux, Context API).
- Optimize applications for performance and scalability.
- Work with RESTful APIs and WebSockets for seamless frontend-backend communication.
- Develop reusable UI components using modern JavaScript (ES6+).
- Ensure cross-browser compatibility and mobile responsiveness.

Python Scripting and Backend Integration:
- Write Python scripts for data processing, automation, and backend tasks.
- Work with APIs and integrate frontend applications with Python-based backends.
- Develop and maintain server-side scripts and microservices using Flask/Django/FastAPI.
- Perform ETL (Extract, Transform, Load) operations on datasets when needed.
- Automate workflows and optimize scripts for performance and efficiency.

Collaboration and Best Practices:
- Collaborate with backend engineers, designers, and product managers.
- Follow best practices in coding, testing, and deployment.
- Use version control systems (Git, GitHub/GitLab/Bitbucket).
- Write unit and integration tests for React components and Python scripts.
- Troubleshoot and resolve bugs, performance issues, and security vulnerabilities.

Requirements:
- Must have 4-6 years of experience in React.js development.
- Skills: React 18, JavaScript, HTML, CSS, SCSS, Python, Flask, RESTful APIs, PostgreSQL, Python scripting.
- Strong experience with Python scripting and backend frameworks (Flask).
- Familiarity with SQL and NoSQL databases (PostgreSQL, MongoDB, etc.).
- Experience working with RESTful APIs, GraphQL, and WebSockets.
- Knowledge of state management libraries (Redux, MobX, Context API).

This job was posted by Swapna Kalvapu from Exponentia.ai.
The Qlik Sense Developer will be responsible for designing, developing, and deploying data visualizations and dashboards using Qlik Sense. You will need a combination of technical expertise in Qlik Sense, project planning skills, and the ability to collaborate with cross-functional teams to ensure successful project delivery. You are expected to have hands-on experience in data visualization, analytics, and translating business requirements into actionable insights by working closely with stakeholders. Proficiency in data modeling, integration, and data engineering principles will be crucial for supporting and driving data-driven decision-making processes effectively.

Key Responsibilities:
- Plan and execute the migration of existing QlikView applications to Qlik Sense.
- Analyze current QlikView reports and dashboards to understand data sources, data models, and user requirements.
- Develop and implement migration strategies that ensure minimal disruption to business operations.
- Design and develop Qlik Sense applications and dashboards that meet business requirements and enhance user experience.
- Optimize data models and visualization techniques to improve performance and efficiency.
- Ensure data integrity and accuracy throughout the migration process.
- Collaborate with cross-functional teams, including business analysts, data engineers, and IT support, to ensure successful migration and implementation.
- Provide technical support and troubleshooting for Qlik Sense applications post-migration.
- Train end-users and stakeholders on new Qlik Sense functionalities and best practices.
- Document migration processes, data models, and application designs for future reference and compliance.
- Prepare and present reports on migration progress, challenges, and solutions to management and stakeholders to ensure transparency and effective communication.
As a Senior Application Developer, you will be responsible for designing, developing, testing, deploying, and maintaining robust and scalable web or mobile applications. Your expertise in Python-based application development, both front-end and back-end, will be crucial for building full-stack applications.

Key Responsibilities:
- Build full-stack applications using backend frameworks such as Django, Flask, or FastAPI, and frontend libraries/tools such as React or Vue, or integration via REST/GraphQL APIs.
- Collaborate with cross-functional teams, including product managers, designers, QA, and DevOps, to ensure the accuracy, functionality, and security of the code you review.
- Participate in architectural discussions, contribute to strategic technology decisions, and mentor junior developers through peer code reviews.
- Stay current with emerging technologies and industry trends, leveraging simulation platforms or tools like Stimulate to model, test, or visualize application behavior under different conditions where applicable.
- Document development processes, system design, and application features.
- Maintain and improve legacy systems by implementing modern solutions.

If you have a proven track record in developing scalable applications across the full software development lifecycle and can translate business requirements into high-quality technical solutions, we encourage you to apply for this challenging and rewarding position.
About the Role: Resource & Staffing Manager

As a Resource & Staffing Manager, you will play a critical role in managing and optimizing the allocation of human resources across projects and teams. You will ensure that staffing aligns with business objectives, project requirements, and employee skillsets, while fostering a collaborative and inclusive work environment.

Key Responsibilities:
- Develop and implement resource allocation strategies to meet project demands and organizational goals.
- Manage staffing processes, including recruitment, onboarding, and workforce planning, ensuring optimal utilization of resources.
- Work closely with project managers, department heads, and HR teams to understand staffing needs and align resources accordingly.
- Identify skill gaps and coordinate training programs to enhance employee capabilities and readiness for future projects.
- Monitor resource performance and provide feedback to ensure alignment with project objectives and organizational standards.
- Analyze workforce trends and project pipelines to anticipate future staffing needs and proactively address challenges.
- Foster a supportive and inclusive workplace culture that values learning, initiative, and ownership.
- Stay updated on industry trends and best practices in resource management and staffing.
About the Role: Senior Presales Manager

As an Account Manager in the Data & AI space, you will be pivotal in managing customer relationships and driving business growth. You will collaborate closely with clients and internal teams to understand business challenges and demonstrate how our Data and AI solutions, including generative AI, can effectively address these challenges and contribute to their success.

Key Responsibilities:
- Revenue Management: Support revenue targets for assigned accounts, ensuring alignment with overall business objectives.
- Account Development: Assist in developing and executing account growth strategies focused on Data and AI solutions, including generative AI applications.
- Customer Relationship Management: Nurture existing customer relationships to identify upsell and cross-sell opportunities, acting as a trusted advisor to clients.
- New Business Opportunities: Identify and qualify new business opportunities through marketing efforts and referrals.
- Collaboration with Teams: Work closely with pre-sales, technical, and marketing teams to deliver tailored solutions that meet client needs.
- Sales Pipeline Tracking: Maintain and manage the sales pipeline, tracking deal progress and revenue forecasts.
- Market Awareness: Stay updated on industry trends, emerging AI technologies, and competitor activities within the Data & AI space.
- Product Knowledge: Develop a strong understanding of our Data and AI solutions to effectively communicate their value to clients.
- Sales Presentations: Assist in the development and delivery of sales presentations that showcase the value proposition of our solutions.
You are a highly skilled and motivated Talend Developer with 4-5 years of hands-on experience in designing, developing, and implementing data integration solutions using the Talend platform. Your strong understanding of data pipelines, ETL processes, and ability to work with large datasets will be crucial in your role. As a Talend Developer, you will be part of a dynamic team responsible for managing data workflows, data quality, and ensuring seamless integration of data across multiple systems.

Key Responsibilities:
- Design, develop, and implement data integration processes using Talend Studio (both Talend Open Studio and Talend Data Integration).
- Create and maintain ETL processes to extract, transform, and load data from various sources (e.g., databases, flat files, APIs) into target systems.
- Develop and manage complex data pipelines to support real-time and batch data integration.
- Collaborate with business and technical teams to translate data requirements into efficient, scalable solutions.
- Perform data cleansing, transformation, and data quality management.
- Integrate Talend with other tools and technologies such as databases, cloud platforms, and message queues.
- Troubleshoot and resolve issues related to ETL jobs, data synchronization, and performance to ensure high performance, scalability, and reliability of data integration solutions.
- Monitor and optimize ETL jobs and data workflows to meet performance and operational standards.
- Participate in code reviews, testing, and debugging activities to ensure code quality.
- Document technical specifications, processes, and procedures accurately.
- Stay updated with the latest developments in Talend and data integration best practices.
Exponentia.ai is a fast-growing AI-first technology services company, partnering with enterprises to shape and accelerate their journey to AI maturity. With a presence across the US, UK, UAE, India, and Singapore, we bring together deep domain knowledge, cloud-scale engineering, and cutting-edge artificial intelligence to help our clients transform into agile, insight-driven organizations. We are proud partners with global technology leaders such as Databricks, Microsoft, AWS, and Qlik, and have been consistently recognized for innovation, delivery excellence, and trusted advisory.

Awards & Recognitions:
- Innovation Partner of the Year – Databricks, 2024
- Digital Impact Award, UK – 2024 (TMT Sector)
- Rising Star – APJ Databricks Partner Awards 2023
- Qlik’s Most Enabled Partner – APAC

With a team of 450+ AI engineers, data scientists, and consultants, we are on a mission to redefine how work is done, by combining human intelligence with AI agents to deliver exponential outcomes. Learn more: www.exponentia.ai

Role Overview:
We are seeking a detail-oriented and analytical Data Analyst to join our team. The ideal candidate will be responsible for extracting, organizing, and analyzing data from various sources to support strategic decision-making. You will work with tools such as Python and Power BI to clean, transform, and visualize data, while ensuring high levels of data integrity and security.

Key Responsibilities:
- Extract and consolidate data from multiple sources, including Excel spreadsheets and PDF documents, with high accuracy and efficiency.
- Organize, clean, and validate data to ensure consistency, completeness, and readiness for analysis.
- Write and maintain Python scripts for data manipulation, automation, and integration across various platforms.
- Design, develop, and maintain interactive dashboards and reports in Power BI to visualize key performance indicators and business metrics.
- Collaborate with internal stakeholders to understand data requirements and deliver actionable insights through tailored reporting solutions.
- Analyze data to identify trends, patterns, and anomalies that can drive business decisions and process improvements.
- Ensure the integrity, confidentiality, and security of data throughout all stages of the data lifecycle.

Skills & Qualifications:
- Proficiency in Python for data manipulation, automation, and integration.
- Experience extracting data from Excel and PDF documents using tools like Tabula or Camelot.
- Strong skills in data cleaning, transformation, and validation.
- Hands-on experience with Power BI for building dynamic reports and dashboards.
- Basic knowledge of SQL and working with relational databases.
- Strong analytical skills with the ability to identify trends and anomalies.
- Excellent communication and collaboration skills with stakeholders.

Why Join Exponentia.ai?
- Innovate with Purpose: Opportunity to create pioneering AI solutions in partnership with leading cloud and data platforms.
- Shape the Practice: Build a marquee capability from the ground up with full ownership.
- Work with the Best: Collaborate with top-tier talent and learn from industry leaders in AI.
- Global Exposure: Be part of a high-growth firm operating across the US, UK, UAE, India, and Singapore.
- Continuous Growth: Access to certifications, tech events, and partner-led innovation labs.
- Inclusive Culture: A supportive and diverse workplace that values learning, initiative, and ownership.

Ready to build the future of AI with us? Apply now and become part of a next-gen tech company that is setting benchmarks in enterprise AI solutions.
Location: Mumbai
Experience: 5 to 10 years

Job Summary:
We are seeking a highly skilled Data Modeller with 5-10 years of experience in data analytics and data warehousing projects on the AWS platform. The ideal candidate will have a strong background in data modelling, business requirements analysis, and KPI development. This role requires expertise in creating data flow diagrams, business process flow diagrams, and data mapping from source to target systems. The candidate should also be proficient in analysing source data for data quality issues and designing conceptual, logical, and physical data models for dashboards and KPIs. Experience with star schema, snowflake schema, and data marts is essential.

Key Responsibilities:
- Gather and document business requirements for KPI development, dashboards, and reports.
- Create and maintain business process flow diagrams and data flow diagrams.
- Perform data mapping from source systems to target data models.
- Analyse source data in Excel or databases to identify and resolve data quality issues.
- Design conceptual data domains, logical data entities, and physical data models.
- Develop data warehouse models, including star schema, snowflake schema, and data marts.
- Collaborate with stakeholders, including business teams, data engineers, and developers, to ensure alignment between business needs and data architecture.
- Support data governance initiatives by defining metadata, data lineage, and data quality rules.
- Work on data analytics projects to enhance data-driven decision-making.
- Implement best practices for data modelling, ensuring scalability and performance optimization.
About the Role: Intern - Data Engineering

Exponentia.ai is launching the AI Talent Accelerator Program, a 6–12-month career accelerator designed for recent experienced graduates eager to build careers in Data Engineering, AI, and Cloud technologies. This program combines structured training with real-world enterprise projects, offering hands-on exposure to Databricks, AI, and cloud ecosystems. Graduates who successfully complete the program will have the opportunity to transition into a full-time Data Engineer role at Exponentia.ai.

Key Responsibilities:
- Design, build, and maintain data pipelines and ETL processes for enterprise projects.
- Work with structured and unstructured datasets to enable AI and analytics solutions.
- Collaborate with senior data engineers and mentors on real business use cases.
- Optimize data workflows for quality, scalability, and performance.
- Gain hands-on experience with platforms like Databricks, Azure, AWS, and MS Fabric.
- Support global clients in industries such as BFSI, CPG, and Manufacturing.
Role Overview:
As a highly skilled and motivated Talend Developer with 4-5 years of experience, you will be responsible for designing, developing, and implementing data integration solutions using the Talend platform. Your role will involve working with data pipelines, ETL processes, and managing large datasets to ensure seamless data integration across systems.

Key Responsibilities:
- Design, develop, and implement data integration processes using Talend Studio (both Talend Open Studio and Talend Data Integration).
- Create and maintain ETL processes for extracting, transforming, and loading data from various sources into target systems.
- Develop and manage complex data pipelines to support real-time and batch data integration.
- Collaborate with business and technical teams to understand data requirements and translate them into scalable solutions.
- Perform data cleansing, transformation, and data quality management tasks.
- Integrate Talend with other tools and technologies like databases, cloud platforms, and message queues.
- Troubleshoot and resolve issues related to ETL jobs, data synchronization, and performance.
- Ensure high performance, scalability, and reliability of data integration solutions.
- Monitor and optimize ETL jobs and data workflows to meet performance and operational standards.
- Participate in code reviews, testing, and debugging activities to maintain code quality.
- Document technical specifications, processes, and procedures.
- Stay updated with the latest trends and best practices in Talend and data integration.

Qualifications Required:
- 4-5 years of hands-on experience in designing and implementing data integration solutions using Talend.
- Strong understanding of data pipelines, ETL processes, and working with large datasets.
- Proficiency in collaborating with business and technical teams to deliver efficient data solutions.
- Experience in troubleshooting ETL job issues and optimizing data workflows.
- Excellent documentation skills to maintain technical specifications and procedures.
As a Market Research Intern, you will have the opportunity to work closely with the marketing and sales teams to gather, analyze, and interpret market data, enabling strategic decision-making and contributing to the growth of the business. This role is perfect for individuals passionate about market research and looking to build a career in technology consulting.

Key Responsibilities:
- Assist in collecting, analyzing, and interpreting large volumes of data from various sources to provide market insights.
- Support the development and execution of market research methodologies, including survey design and data collection techniques.
- Collaborate with marketing and sales teams to understand market needs and provide actionable recommendations.
- Work with lead generation software such as ZoomInfo and Seamless to gather relevant market information.
- Prepare detailed reports and presentations using Microsoft Excel and PowerPoint, summarizing key findings and market trends.
- Stay up to date with industry trends, competitive landscapes, and emerging technologies to contribute to ongoing projects.
- Maintain data accuracy, confidentiality, and integrity in all research activities.
As a Talend Technical Lead, you will collaborate with cross-functional teams to ensure seamless data integration and improve data pipeline efficiency. Your primary responsibilities will include designing, developing, and maintaining ETL processes using Talend Data Integration tools. Your expertise in data transformation, data warehousing, and performance optimization will be crucial for success in this role.

Key Responsibilities:
- Design, develop, and maintain ETL processes using Talend to support data integration, migration, and transformation.
- Develop efficient and scalable data pipelines from multiple sources, including structured and unstructured data.
- Provide technical guidance and expertise to the Talend development team, addressing complex technical challenges.
- Define and enforce best practices for Talend development, including coding standards and data quality checks.
- Optimize data workflows for performance, reliability, and scalability.
- Work closely with enterprise architects, data architects, and business stakeholders to understand data needs and ensure high-quality data delivery.
- Mentor and coach junior Talend developers, ensuring knowledge transfer and providing technical support.
- Act as a bridge between technical and non-technical teams to align on data integration goals.
- Ensure data quality and integrity through validation processes and best practices.
- Implement data governance, security, and compliance requirements.
- Stay updated with the latest advancements in Talend and data integration technologies.

Qualifications Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience in designing and implementing ETL processes using Talend Data Integration tools.
- Strong understanding of data transformation, data warehousing, and performance optimization concepts.
- Excellent communication skills and the ability to collaborate effectively with cross-functional teams.
- Experience in mentoring and coaching junior developers is a plus.
- Knowledge of data governance, security, and compliance practices.
Role Summary:
We are seeking an experienced Data Engineer to design, build, and optimise modern data pipelines and transformation workflows on cloud-based platforms. The role will focus on ingesting raw source data, structuring it into curated layers, and ensuring it is reliable, governed, and optimised for analytics and reporting.

Key Responsibilities:
- Design, develop, and maintain data ingestion pipelines using cloud ETL services (e.g., AWS Glue, DMS, Azure Data Factory, or equivalent).
- Transform and integrate source system data into structured formats across landing, curated, and reporting layers.
- Collaborate with Data Architects to implement canonical data models and conformed dimensions.
- Rebuild or migrate existing transformation logic from legacy BI/ETL tools into modern data pipelines.
- Support migration of historical datasets into cloud storage and analytics layers.
- Implement logging, monitoring, and exception handling to ensure reliability and auditability of pipelines.
- Work with BI and application engineers to ensure dashboards and workflows operate effectively on the curated data layer.
- Participate in data validation and reconciliation exercises against legacy outputs.
- Contribute to UAT cycles and provide timely fixes for defects.
- Document pipelines, data lineage, and transformation logic for long-term maintainability.
Role Overview:
As a Practice Lead for Data Engineering Solutions, your main responsibility will be to drive the design and implementation of scalable data solutions using cloud platforms such as Databricks, Azure, and AWS. You will lead a team of data engineers, set strategic directions for projects, and ensure adherence to best practices in data architecture, data management, and analytics. Apart from providing technical leadership, you will also manage relationships with OEM partners to enhance service offerings and stay aligned with evolving partner technologies.

Key Responsibilities:
- Lead the design and implementation of scalable data architectures using Databricks, Azure, and AWS.
- Develop and execute a data engineering strategy that is in line with organizational goals and client requirements.
- Manage OEM relationships to leverage partner technologies and improve data solutions.
- Collaborate with cross-functional teams to gather requirements and deliver data solutions that enhance decision-making processes.
- Mentor and guide a team of data engineers, fostering a culture of innovation and continuous improvement.
- Establish best practices for data governance, quality, and security across all data solutions.
- Stay up-to-date with industry trends and emerging technologies to drive innovation within the practice.
- Manage stakeholder relationships and provide technical leadership in client engagements.
- Oversee project delivery, ensuring timelines and budgets are met while maintaining high-quality standards.

Qualifications Required:
- Previous experience as a Practice Lead or similar role in data engineering.
- Proficiency in designing and implementing scalable data architectures using cloud platforms such as Databricks, Azure, and AWS.
- Strong understanding of data governance, quality, and security best practices.
- Excellent communication and leadership skills to mentor and guide a team effectively.
- Ability to manage relationships with OEM partners and drive innovation within the practice.
Role Overview:
As an intern in the Data Engineering team at Exponentia.ai, you will be part of the AI Talent Accelerator Program aimed at recent experienced graduates like yourself who are keen on developing careers in Data Engineering, AI, and Cloud technologies. This 6-12 month career accelerator program offers a blend of structured training and hands-on experience through enterprise projects, providing you with exposure to Databricks, AI, and cloud ecosystems. Upon successful completion of the program, you will have the opportunity to transition into a full-time Data Engineer role within the company.

Key Responsibilities:
- Design, construct, and manage data pipelines and ETL processes for various enterprise projects.
- Utilize both structured and unstructured datasets to facilitate AI and analytics solutions.
- Collaborate closely with senior data engineers and mentors to address real-world business challenges.
- Optimize data workflows to ensure quality, scalability, and performance efficiency.
- Acquire practical skills working with platforms such as Databricks, Azure, AWS, and MS Fabric.
- Provide support to global clients across industries like BFSI, CPG, and Manufacturing.

Qualifications Required:
- Recent experienced graduate interested in developing a career in Data Engineering, AI, or Cloud technologies.
- Strong understanding of data pipelines, ETL processes, and data manipulation techniques.
- Familiarity with platforms like Databricks, Azure, AWS, or willingness to learn.
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
- Eagerness to learn and adapt to new technologies and tools within the data engineering domain.