40.0 years
0 Lacs
pune/pimpri-chinchwad area
On-site
For more than 40 years, Accelya has been the industry’s partner for change, simplifying airline financial and commercial processes and empowering the air transport community to take better control of the future. Whether partnering with IATA on industry-wide initiatives or enabling digital transformation to simplify airline processes, Accelya drives the airline industry forward and proudly puts control back in the hands of airlines so they can move further, faster. Job Summary The Senior Specialist - Software Development (Artificial Intelligence) leads the design, development, and implementation of AI and machine learning solutions that address complex business challenges. This role requires expertise in AI algorithms, model development, and software engineering best practices. The individual will work closely with cross-functional teams to deliver intelligent systems that enhance business operations and decision-making. Key Responsibilities AI Solution Design & Development: Lead the development of AI-driven applications and platforms using machine learning, deep learning, and NLP techniques. Design, train, and optimize machine learning models using frameworks such as TensorFlow, PyTorch, Keras, or Scikit-learn. Implement advanced algorithms for supervised and unsupervised learning, reinforcement learning, and computer vision. Software Development & Integration: Develop scalable AI models and integrate them into software applications using languages such as Python, R, or Java. Build APIs and microservices to enable the deployment of AI models in cloud environments or on-premise systems. Ensure that AI models are integrated with back-end systems, databases, and other business applications. Data Management & Preprocessing: Collaborate with data scientists and data engineers to gather, preprocess, and analyze large datasets. Develop data pipelines to ensure the continuous availability of clean, structured data for model training and evaluation. Implement feature engineering techniques to enhance the accuracy and performance of machine learning models. AI Model Evaluation & Optimization: Regularly evaluate AI models using performance metrics (e.g., precision, recall, F1 score) and fine-tune them to improve accuracy. Perform hyperparameter tuning and cross-validation to ensure robust model performance. Implement methods for model explainability and transparency (e.g., LIME, SHAP) to ensure trustworthiness in AI decisions. AI Strategy & Leadership: Collaborate with business stakeholders to identify opportunities for AI adoption and develop project roadmaps. Provide technical leadership and mentorship to junior AI developers and data scientists, ensuring adherence to best practices in AI development. Stay current with AI trends and research, introducing innovative techniques and tools to the team. Security & Ethical Considerations: Ensure AI models comply with ethical guidelines, including fairness, accountability, and transparency. Implement security measures to protect sensitive data and AI models from vulnerabilities and attacks. Monitor the performance of AI systems in production, ensuring they operate within ethical and legal boundaries. Collaboration & Cross-Functional Support: Collaborate with DevOps teams to ensure AI models are deployed efficiently in production environments. Work closely with product managers, business analysts, and stakeholders to understand requirements and align AI solutions with business needs. 
Participate in Agile ceremonies, including sprint planning and retrospectives, to ensure timely delivery of AI projects. Continuous Improvement & Research: Conduct research and stay updated with the latest developments in AI and machine learning technologies. Evaluate new tools, libraries, and methodologies to improve the efficiency and accuracy of AI model development. Drive continuous improvement initiatives to enhance the scalability and robustness of AI systems. Required Skills & Qualifications Bachelor’s degree in Computer Science, Data Science, Artificial Intelligence, or related field. 5+ years of experience in software development with a strong focus on AI and machine learning. Expertise in AI frameworks and libraries (e.g., TensorFlow, PyTorch, Keras, Scikit-learn). Proficiency in programming languages such as Python, R, or Java, and familiarity with AI-related tools (e.g., Jupyter Notebooks, MLflow). Strong knowledge of data science and machine learning algorithms, including regression, classification, clustering, and deep learning models. Experience with cloud platforms (e.g., AWS, Google Cloud, Azure) for deploying AI models and managing data pipelines. Strong understanding of data structures, databases, and large-scale data processing technologies (e.g., Hadoop, Spark). Familiarity with Agile development methodologies and version control systems (Git). Preferred Qualifications Master’s or PhD in Artificial Intelligence, Machine Learning, or related field. Experience with natural language processing (NLP) techniques (e.g., BERT, GPT, LSTM, Transformer models). Knowledge of computer vision technologies (e.g., CNNs, OpenCV). Familiarity with edge computing and deploying AI models on IoT devices. Certification in AI/ML or cloud platforms (e.g., AWS Certified Machine Learning, Google Professional Data Engineer). What does the future of the air transport industry look like to you? Whether you’re an industry veteran or someone with experience from other industries, we want to make your ambitions a reality!
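For readers unfamiliar with the model evaluation and hyperparameter tuning work this listing describes, here is a minimal, hedged sketch using scikit-learn; the dataset is synthetic and the model and parameter grid are illustrative, not the employer's actual stack.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import precision_score, recall_score, f1_score

# Synthetic dataset standing in for a real business problem
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Hyperparameter tuning with cross-validation, as described in the responsibilities
grid = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    scoring="f1",
    cv=5,
)
grid.fit(X_train, y_train)

# Evaluate the tuned model with the metrics named in the listing
pred = grid.best_estimator_.predict(X_test)
print("precision:", precision_score(y_test, pred))
print("recall:   ", recall_score(y_test, pred))
print("f1:       ", f1_score(y_test, pred))
```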
Posted 2 weeks ago
5.0 years
0 Lacs
indore, madhya pradesh, india
On-site
Job Title: Quantitative Trader (Forex) Location: Indore, MP, India (Onsite only - Work from office) Experience Level: 3–5 years Employment Type: Full-time Role Overview Tecnomi, an innovative IT firm, is hiring a skilled Quantitative Trader to drive profitable, systematic forex strategies. You'll collaborate with the CTO, project owner, and dev team to develop, deploy, and refine live trading models that deliver real-world results. This onsite role offers hands-on involvement in cutting-edge AI integrations, risk management, and data-driven decision-making in a fast-paced, collaborative environment focused on long-term innovation and platform growth. Key Responsibilities Strategy Development & Execution: Design, test, and deploy quantitative trading strategies leveraging price data, volatility, sentiment indicators, and macroeconomic factors for live forex markets. Modeling & Backtesting: Create reliable forecasting models (e.g., time-series, LSTM/Transformer hybrids) with thorough backtesting, forward-testing, and optimization for profitability and robustness. Risk Management: Build and integrate risk controls (e.g., Value-at-Risk, drawdown limits, position sizing) to ensure compliance and minimize losses in volatile conditions. Data Integration: Source, clean, and analyze diverse data feeds (e.g., news APIs, order books, economic calendars) to enhance model inputs and trading signals. Mentorship & Knowledge Sharing: Mentor non-technical team members (including the project owner) on forex basics, quantitative trading concepts, and strategy insights through regular sessions. Collaboration & Documentation: Partner with developers to embed models into production systems; maintain detailed documentation of strategies, assumptions, and performance metrics for transparency and iteration. Required Qualifications 3–5 years of hands-on experience in quantitative trading or systematic strategy development, ideally in forex or similar markets. Proven track record with backtesting tools, statistical validation, and live strategy deployment. Deep knowledge of FX market dynamics, macroeconomic influences, and risk modeling. Strong communication and teaching abilities to guide non-experts effectively. Bachelor’s or Master’s degree in Quantitative Finance, Mathematics, Statistics, or a related field. Verifiable 3-year portfolio of past strategies, backtests, performance reports, or live trading results. Preferred Skills Experience trading multiple forex pairs or cross-asset strategies. Proficiency in Python or R (e.g., pandas, NumPy, statsmodels, scikit-learn) for rapid prototyping. Familiarity with Indian regulatory frameworks (e.g., RBI/FEMA). Exposure to platforms like QuantConnect, TradingView, or MLflow for strategy automation.
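As an illustration of the risk controls this posting mentions (Value-at-Risk and drawdown limits), here is a minimal Python sketch; the return series is simulated, not market data, and the confidence level is illustrative.

```python
import numpy as np
import pandas as pd

# Simulated daily returns for a forex strategy (illustrative only)
rng = np.random.default_rng(0)
returns = pd.Series(rng.normal(0.0002, 0.006, 750))

# Historical 1-day Value-at-Risk at 95% confidence
var_95 = -np.percentile(returns, 5)

# Maximum drawdown of the cumulative equity curve
equity = (1 + returns).cumprod()
drawdown = equity / equity.cummax() - 1
max_drawdown = drawdown.min()

print(f"1-day 95% VaR: {var_95:.4f}")
print(f"Max drawdown:  {max_drawdown:.4f}")
```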
Posted 2 weeks ago
0 years
0 Lacs
india
On-site
Neurosensory Artificial Intelligence Developer Description: We are looking for visionary professionals with strong technical skills in neurosensory artificial intelligence. Our goal is not limited to sound analysis but rather to the development of systems capable of perceiving and processing multiple human sensory stimuli (hearing, vision, touch, vibrations, etc.), in order to provide concrete solutions to the industrial ecosystem. The mission is for AI not only to classify data but also to interpret and understand environmental phenomena as a human would, supporting decision-making in critical contexts such as mining, security, predictive maintenance, and smart cities. What you will do: Develop and implement AI models inspired by human sensory perception. Work with techniques in audio, image, vibration, or other signal analysis. Apply deep learning architectures (CNN, RNN, LSTM, Transformers, autoencoders). Integrate IoT devices and sensors for real-time data collection. Transform data into interpretable and actionable information for industrial operations. What we need from you: Proven experience in AI or data science projects. Practical knowledge in signal processing (audio, image, vibration, etc.). Advanced use of AI frameworks (TensorFlow, PyTorch, Keras or equivalents). Strong programming skills in Python (C++ or others, desirable). Creativity to think outside the box and propose innovative solutions. Ability to collaborate within multidisciplinary teams (engineering, industry, data). (We do not require formal degrees: we value real-world experience, projects, and innovation capacity.)
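To make the signal-processing requirement concrete, here is a minimal sketch of turning a raw vibration signal into a spectrogram, the kind of time-frequency representation often fed to CNN-based classifiers; the signal and sampling rate are synthetic and illustrative.

```python
import numpy as np
from scipy.signal import spectrogram

# Synthetic vibration signal: a 50 Hz component plus noise (illustrative only)
fs = 1000  # sampling rate in Hz
rng = np.random.default_rng(2)
t = np.arange(0, 5, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(t.size)

# Time-frequency representation commonly used as input to deep learning models
f, times, Sxx = spectrogram(signal, fs=fs, nperseg=256)
print(Sxx.shape)  # (frequency bins, time frames)
```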
Posted 2 weeks ago
0.0 years
0 Lacs
hyderabad, telangana, india
On-site
Ready to build the future with AI? At Genpact, we don’t just keep up with technology - we set the pace. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what’s possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Manager, Data Scientist We are looking for a highly motivated and technically proficient Senior Data Scientist with deep expertise in time series forecasting, machine learning, natural language processing (NLP), and Python programming. This role is critical to enhancing our forecasting capabilities, driving product segmentation strategies, and supporting data-driven decision-making. The candidate will work closely with cross-functional teams including data engineers, analysts, and project managers to develop, test, and deploy scalable forecasting models and analytical solutions in a cloud-based environment. Responsibilities 1. Forecasting & Model Development . Design, develop, and implement advanced time series forecasting models (e.g., ARIMA, Prophet, LSTM, XGBoost, etc.) tailored to different product categories and business needs. . Evaluate and improve forecast accuracy by establishing robust metrics and conducting regular performance assessments throughout the sales cycle. . Run what-if scenarios and simulations to assess the impact of various business conditions on forecast outcomes. 2. Segmentation & Clustering . Collaborate with the Senior Data Scientist to perform segmentation and clustering of product identifiers using unsupervised learning techniques (e.g., K-means, DBSCAN, hierarchical clustering). . Analyze product behavior patterns to identify slow- and fast-moving items, and generate actionable insights for inventory and sales planning. 3. Data Extraction & Feature Engineering . Extract, clean, and transform data from multiple source systems (e.g., SQL databases, APIs, cloud storage) to support modeling and analysis. . Engineer relevant features and variables to enhance model performance and interpretability. 4. Model Evaluation & Deployment . Conduct comparative analysis of forecasting methods across different segments, tuning parameters and optimizing performance. . Work in close coordination with Data Engineers and cloud platform teams to ensure seamless deployment of models into production environments (e.g., AWS, Azure, GCP). . Monitor deployed models for drift, accuracy, and performance, and implement retraining pipelines as needed. 5. Collaboration & Communication .
Partner with cross-functional stakeholders to understand business requirements and translate them into analytical solutions. . Present findings, insights, and recommendations to both technical and non-technical audiences through reports, dashboards, and presentations. Qualifications we seek in you! Minimum Qualifications / Skills . Demonstrated hands-on experience in data science, with a strong focus on forecasting and predictive analytics. . Proficiency in Python and its data science ecosystem (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch, Statsmodels, etc.). . Strong understanding of time series analysis, machine learning algorithms, and NLP techniques. . Experience with data extraction and manipulation using SQL and/or cloud-based data tools. . Familiarity with cloud platforms (AWS, Azure, or GCP) and model deployment workflows. . Ability to work in a fast-paced, agile environment with shifting priorities and tight deadlines. . Excellent problem-solving, analytical thinking, and communication skills. Preferred Qualifications/ Skills . Master’s in Computer Science, Data Science, Statistics, or a related field Why join Genpact? . Lead AI-first transformation - Build and scale AI solutions that redefine industries . Make an impact - Drive change for global enterprises and solve business challenges that matter . Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills . Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace . Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build . Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
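As a hedged illustration of the product segmentation work described above, here is a minimal K-means sketch with scikit-learn; the product features are synthetic and the cluster count is arbitrary.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Illustrative product-level features: average weekly units, demand variability, unit price
rng = np.random.default_rng(1)
features = np.column_stack([
    rng.gamma(2.0, 50.0, 500),   # average weekly units sold
    rng.gamma(1.5, 10.0, 500),   # demand variability
    rng.uniform(5, 500, 500),    # unit price
])

# Scale features so no single dimension dominates the distance metric
X = StandardScaler().fit_transform(features)

# Segment products into slow-/fast-moving style clusters
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42).fit(X)
print(np.bincount(kmeans.labels_))  # number of products per segment
```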
Posted 2 weeks ago
3.0 years
0 Lacs
india
On-site
Note: Please do not apply if your salary expectations are higher than the provided salary range or if your experience is less than 3 years. If you have experience in the travel industry and have worked on hotel, car rental or ferry booking before, then we can negotiate the package. Company Description: Our company has been promoting Greece for the last 25 years through travel sites visited from all around the world, with 10 million visitors per year, such as www.greeka.com and www.ferriesingreece.com. Through the websites, we provide a range of travel services for a seamless holiday experience, such as online car rental reservations, ferry tickets, transfers, tours, etc. Role Description: We are seeking a highly skilled Artificial Intelligence / Machine Learning Engineer to join our dynamic team. You will work closely with our development team and QAs to deliver cutting-edge solutions that improve our candidate screening and employee onboarding processes. Major Responsibilities & Job Requirements include: • Develop and implement NLP/LLM Models. • Minimum of 3-4 years of experience as an AI/ML Developer or similar role, with demonstrable expertise in computer vision techniques. • Develop and implement AI models using Python, TensorFlow, and PyTorch. • Proven experience in computer vision, including fine-tuning OCR models (e.g., Tesseract, LayoutLMv3, EasyOCR, PaddleOCR, or custom-trained models). • Strong understanding and hands-on experience with RAG (Retrieval-Augmented Generation) architectures and pipelines for building intelligent Q&A, document summarization, and search systems. • Experience working with LangChain, LLM agents, and chaining tools to build modular and dynamic LLM workflows. • Familiarity with agent-based frameworks and orchestration of multi-step reasoning with tools, APIs, and external data sources. • Familiarity with Cloud AI Solutions, such as IBM, Azure, Google & AWS. • Work on natural language processing (NLP) tasks and create language models (LLMs) for various applications. • Design and maintain SQL databases for storing and retrieving data efficiently. • Utilize machine learning and deep learning techniques to build predictive models. • Collaborate with cross-functional teams to integrate AI solutions into existing systems. • Stay updated with the latest advancements in AI technologies, including ChatGPT, Gemini, Claude, and Big Data solutions. • Write clean, maintainable, and efficient code when required. • Handle large datasets and perform big data analysis to extract valuable insights. • Fine-tune pre-trained LLMs using specific types of data and ensure optimal performance. • Proficiency in cloud services from Amazon AWS. • Extract and parse text from CVs, application forms, and job descriptions using advanced NLP techniques such as Word2Vec, BERT, and GPT-NER. • Develop similarity functions and matching algorithms to align candidate skills with job requirements. • Experience with microservices, Flask, FastAPI, Node.js. • Expertise in Spark, PySpark for big data processing. • Knowledge of advanced techniques such as SVD/PCA, LSTM, NeuralProphet. • Apply debiasing techniques to ensure fairness and accuracy in the ML pipeline. • Experience in coordinating with clients to understand their needs and delivering AI solutions that meet their requirements. Qualifications: • Bachelor's or Master’s degree in Computer Science, Data Science, Artificial Intelligence, or a related field. • In-depth knowledge of NLP techniques and libraries, including Word2Vec, BERT, GPT, and others.
• Experience with database technologies and vector representation of data. • Familiarity with similarity functions and distance metrics used in matching algorithms. • Ability to design and implement custom ontologies and classification models. • Excellent problem-solving skills and attention to detail. • Strong communication and collaboration skills.
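As a simplified illustration of the candidate-job matching described above, here is a minimal TF-IDF and cosine-similarity sketch; a production system would more likely use contextual embeddings such as BERT, and the texts here are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative texts; real inputs would be parsed from CVs and job descriptions
job_description = "Python developer with experience in FastAPI, SQL, and machine learning"
cvs = [
    "5 years building ML pipelines in Python, FastAPI microservices and PostgreSQL",
    "Front-end engineer focused on React, TypeScript and CSS",
]

# Vectorize the job description together with the CVs
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([job_description] + cvs)

# Similarity of each CV to the job description; higher score = closer match
scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
print(scores)
```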
Posted 2 weeks ago
5.0 years
0 Lacs
india
On-site
AI Engineer, Cybersecurity Platform We are looking for a highly skilled and motivated AI Engineer with deep expertise in large language models (LLMs) and machine learning . In this role, you'll combine your technical knowledge with hands-on experience building intelligent, production-ready systems that improve cybersecurity investigations , prioritization, and response. You'll work at the intersection of LLM-driven automation , workflow orchestration, and classical ML models to improve how alerts are prioritized, classified, and contextualized. Your work will directly reduce analyst fatigue and enable faster, more effective decision-making. This is a key position where your work will directly influence the development of agentic AI systems, workflow automation, and recommendation engines within our cloud security platform. Key Responsibilities LLM Integration & Workflows: Build, fine-tune, and integrate large language models into existing systems. Develop agentic workflows for investigation, classification, and automated response. Apply techniques like Retrieval-Augmented Generation (RAG), prompt engineering, and fine-tuning. Machine Learning Development: Design, implement, and optimize ML models for prioritization, ranking, clustering, anomaly detection, and classification. You'll apply both classical models (AR, ARIMA, SARIMA) and modern architectures (XGBoost, LSTM, DeepAR, Temporal Fusion Transformer). Data Preparation & Feature Engineering: Collect, preprocess, and transform structured and unstructured data, including logs, text, and access patterns. Engineer features to maximize model interpretability and performance. Model Training & Deployment: Train and evaluate models using rigorous metrics (precision, recall, AUC, F1, etc.). Optimize hyperparameters and deploy ML and LLM models at scale into production with strong monitoring, drift detection, and observability. Collaboration: Work closely with data scientists, ML engineers, security researchers, and software teams to build end-to-end solutions. Document models, workflows, and pipelines for reproducibility and knowledge sharing. Requirements A Bachelor’s or Master’s degree in Computer Science, AI/ML, Data Science, or a related field. 5+ years of experience in ML/AI, with at least 3 years deploying production-grade systems. Strong knowledge of machine learning algorithms for classification, clustering, ranking, and anomaly detection. Proficiency with LLM frameworks and APIs (OpenAI, Hugging Face, LangChain, LlamaIndex). Hands-on experience building workflow automation with LLMs and integrating them into applications. Solid programming skills in Python (PyTorch, TensorFlow, scikit-learn). Knowledge of NLP tasks such as text classification, summarization, and semantic search. Experience with cloud platforms (AWS, GCP, Azure), containerization (Docker, Kubernetes), and MLOps best practices. Strong problem-solving, analytical, and communication skills. The ability to thrive in a fast-paced, evolving startup environment. Contact: Write to shruthi.s@careerxperts.com to explore this opportunity. #MachineLearning #AILeadership #NLPJobs #TransformerModels #FoundationModels #AIAgents #EthicalAI
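To illustrate the alert-prioritization and anomaly-detection theme of this role, here is a minimal Isolation Forest sketch; the per-alert features are synthetic and the contamination rate is an assumption.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Illustrative per-alert features: event count, bytes transferred, distinct destinations
rng = np.random.default_rng(7)
normal = rng.normal(loc=[20, 1e4, 3], scale=[5, 2e3, 1], size=(500, 3))
suspicious = rng.normal(loc=[200, 5e5, 40], scale=[20, 5e4, 5], size=(5, 3))
X = np.vstack([normal, suspicious])

# Unsupervised anomaly scoring to help rank alerts for analysts
model = IsolationForest(contamination=0.01, random_state=42).fit(X)
scores = model.decision_function(X)  # lower score = more anomalous
print("most anomalous rows:", np.argsort(scores)[:5])
```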
Posted 2 weeks ago
3.0 years
0 Lacs
greater kolkata area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. Those in artificial intelligence and machine learning at PwC will focus on developing and implementing advanced AI and ML solutions to drive innovation and enhance business processes. Your work will involve designing and optimising algorithms, models, and systems to enable intelligent decision-making and automation. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Responsibilities: Position responsibilities and expectations · Designing and building analytical/DL/ML algorithms using Python, R and other statistical tools. · Strong data representation and lucid presentation (of analysis/modelling output) using Python, R Markdown, PowerPoint, Excel etc. · Ability to learn new scripting languages or analytics platforms. Technical Skills required (must have) · Hands-on exposure to Generative AI (design and development of GenAI applications in production). · Strong understanding of RAG, vector databases, LangChain and multimodal AI applications. · Strong understanding of deploying and optimizing AI applications in production. · Strong knowledge of statistical and data mining techniques like Linear & Logistic Regression analysis, Decision trees, Bagging, Boosting, Time Series and Non-parametric analysis. · Strong knowledge of DL & Neural Network Architectures (CNN, RNN, LSTM, Transformers etc.) · Strong knowledge of SQL and R/Python and experience with distributed data/computing tools/IDEs. · Experience in advanced Text Analytics (NLP, NLU, NLG). · Strong hands-on experience of end-to-end statistical model development and implementation. · Understanding of LLMOps, MLOps for scalable ML development. · Basic understanding of DevOps and deployment of models into production (PyTorch, TensorFlow etc.). · Expert-level proficiency in algorithm-building languages like SQL, R and Python and data visualization tools like Shiny, Qlik, Power BI etc.
· Exposure to Cloud Platform (Azure or AWS or GCP) technologies and services like Azure AI / SageMaker / Vertex AI, AutoML, Azure Index, Azure Functions, OCR, OpenAI, storage, scaling etc. Technical Skills required (Any one or more) · Experience in video/image analytics (Computer Vision) · Experience in IoT/machine logs data analysis · Exposure to data analytics platforms like Domino Data Lab, c3.ai, H2O, Alteryx or KNIME · Expertise in Cloud analytics platforms (Azure, AWS or Google) · Experience in Process Mining with expertise in Celonis or other tools · Proven capability in using Generative AI services like OpenAI, Google (Gemini) · Understanding of agentic AI frameworks (LangGraph, AutoGen etc.) · Understanding of fine-tuning for pre-trained models like GPT, LLaMA, Claude etc. using LoRA, QLoRA and PEFT techniques. · Proven capability in building customized models from open-source distributions like Llama, Stable Diffusion. Mandatory skill sets: AI chatbots, Data structures, GenAI object-oriented programming, IDE, API, LLM Prompts, Streamlit Preferred skill sets: AI chatbots, Data structures, GenAI object-oriented programming, IDE, API, LLM Prompts, Streamlit Years of experience required: 3-6 Years Education qualification: BE, B. Tech, M. Tech, M. Stat, Ph.D., M.Sc. (Stats / Maths) Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Bachelor of Technology, MBA (Master of Business Administration) Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Generative AI Optional Skills Accepting Feedback, Active Listening, AI Implementation, Analytical Thinking, C++ Programming Language, Communication, Complex Data Analysis, Creativity, Data Analysis, Data Infrastructure, Data Integration, Data Modeling, Data Pipeline, Data Quality, Deep Learning, Embracing Change, Emotional Regulation, Empathy, GPU Programming, Inclusion, Intellectual Curiosity, Java (Programming Language), Learning Agility, Machine Learning {+ 25 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
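As a minimal, library-agnostic illustration of the retrieval step in the RAG pipelines mentioned above, here is a toy sketch; the documents, embedding vectors, and query are invented, and a real system would use an embedding model and a vector database rather than hard-coded vectors.

```python
import numpy as np

# Toy in-memory "vector store"; a real deployment would use a vector database
documents = {
    "doc1": "Invoices are processed within 30 days of receipt.",
    "doc2": "The data retention policy keeps logs for 12 months.",
}
# Hypothetical embeddings; in practice these come from an embedding model
doc_vectors = {"doc1": np.array([0.9, 0.1, 0.0]), "doc2": np.array([0.1, 0.8, 0.2])}
query_vector = np.array([0.85, 0.15, 0.05])

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Retrieve the most relevant document to ground the LLM prompt (the "R" in RAG)
best = max(doc_vectors, key=lambda k: cosine(query_vector, doc_vectors[k]))
prompt = f"Answer using this context:\n{documents[best]}\n\nQuestion: How quickly are invoices processed?"
print(prompt)
```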
Posted 2 weeks ago
3.0 years
0 Lacs
greater kolkata area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. Those in artificial intelligence and machine learning at PwC will focus on developing and implementing advanced AI and ML solutions to drive innovation and enhance business processes. Your work will involve designing and optimising algorithms, models, and systems to enable intelligent decision-making and automation. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Responsibilities: Position responsibilities and expectations · Designing and building analytical/DL/ML algorithms using Python, R and other statistical tools. · Strong data representation and lucid presentation (of analysis/modelling output) using Python, R Markdown, PowerPoint, Excel etc. · Ability to learn new scripting languages or analytics platforms. Technical Skills required (must have) · Hands-on exposure to Generative AI (design and development of GenAI applications in production). · Strong understanding of RAG, vector databases, LangChain and multimodal AI applications. · Strong understanding of deploying and optimizing AI applications in production. · Strong knowledge of statistical and data mining techniques like Linear & Logistic Regression analysis, Decision trees, Bagging, Boosting, Time Series and Non-parametric analysis. · Strong knowledge of DL & Neural Network Architectures (CNN, RNN, LSTM, Transformers etc.) · Strong knowledge of SQL and R/Python and experience with distributed data/computing tools/IDEs. · Experience in advanced Text Analytics (NLP, NLU, NLG). · Strong hands-on experience of end-to-end statistical model development and implementation. · Understanding of LLMOps, MLOps for scalable ML development. · Basic understanding of DevOps and deployment of models into production (PyTorch, TensorFlow etc.). · Expert-level proficiency in algorithm-building languages like SQL, R and Python and data visualization tools like Shiny, Qlik, Power BI etc.
· Exposure to Cloud Platform (Azure or AWS or GCP) technologies and services like Azure AI / SageMaker / Vertex AI, AutoML, Azure Index, Azure Functions, OCR, OpenAI, storage, scaling etc. Technical Skills required (Any one or more) · Experience in video/image analytics (Computer Vision) · Experience in IoT/machine logs data analysis · Exposure to data analytics platforms like Domino Data Lab, c3.ai, H2O, Alteryx or KNIME · Expertise in Cloud analytics platforms (Azure, AWS or Google) · Experience in Process Mining with expertise in Celonis or other tools · Proven capability in using Generative AI services like OpenAI, Google (Gemini) · Understanding of agentic AI frameworks (LangGraph, AutoGen etc.) · Understanding of fine-tuning for pre-trained models like GPT, LLaMA, Claude etc. using LoRA, QLoRA and PEFT techniques. · Proven capability in building customized models from open-source distributions like Llama, Stable Diffusion. Mandatory skill sets: AI chatbots, Data structures, GenAI object-oriented programming, IDE, API, LLM Prompts, Streamlit Preferred skill sets: AI chatbots, Data structures, GenAI object-oriented programming, IDE, API, LLM Prompts, Streamlit Years of experience required: 3-6 Years Education qualification: BE, B. Tech, M. Tech, M. Stat, Ph.D., M.Sc. (Stats / Maths) Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering, MBA (Master of Business Administration) Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Generative AI Optional Skills Accepting Feedback, Active Listening, AI Implementation, Analytical Thinking, C++ Programming Language, Communication, Complex Data Analysis, Creativity, Data Analysis, Data Infrastructure, Data Integration, Data Modeling, Data Pipeline, Data Quality, Deep Learning, Embracing Change, Emotional Regulation, Empathy, GPU Programming, Inclusion, Intellectual Curiosity, Java (Programming Language), Learning Agility, Machine Learning {+ 25 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
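Since Streamlit appears in the mandatory skill set above, here is a minimal sketch of a Streamlit front end that could sit in front of a RAG pipeline; the app title and placeholder logic are illustrative only.

```python
# save as app.py and run with: streamlit run app.py
import streamlit as st

st.title("Document Q&A (demo)")

question = st.text_input("Ask a question about the uploaded policy document")

if question:
    # Placeholder answer; a real app would call a retrieval step and an LLM here
    st.write(f"You asked: {question}")
    st.info("Hook this input up to your retrieval and LLM layer to generate answers.")
```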
Posted 2 weeks ago
5.0 years
0 Lacs
chennai, tamil nadu, india
On-site
Company: Datacrew.ai Location: Chennai (Onsite) Overview Datacrew.ai is seeking a skilled Data Scientist (Demand Forecasting) to join our growing analytics team. In this role, you will design, deploy, and optimize demand prediction models across industries such as food and retail. By leveraging machine learning, time series analysis, and external domain factors, you will deliver actionable insights that drive smarter business decisions. Key Responsibilities Develop and enhance forecasting models using time series, statistical, and ML techniques for SKU- and branch-level demand prediction. Partner with stakeholders to define requirements and identify relevant data sources (historical sales, calendar, promotions, external datasets). Perform feature engineering to integrate seasonality, holidays, promotions, weather, and external drivers into models. Conduct thorough data preprocessing, validation, and analysis to ensure robust and reliable forecasts. Collaborate with data engineering and business teams to deploy models into production and automate recurring forecasts. Track and evaluate model performance using MAE, RMSE, MAPE, and refine models based on results. Prepare documentation, dashboards, and presentations for technical and business audiences. Stay current on advances in time series forecasting, ML, and demand analytics. Required Skills & Experience Bachelor’s/Master’s in Data Science, Statistics, Computer Science, Operations Research, or related field. 4–5 years’ experience in forecasting/analytics (retail, QSR, FMCG, or manufacturing preferred). Hands-on expertise in: Time series models: ARIMA/SARIMA, Exponential Smoothing, Prophet. Machine learning models: XGBoost, LightGBM, Random Forest, Neural Networks for time series. Proficiency in Python (pandas, scikit-learn, statsmodels, Prophet, etc.); SQL is a plus. Experience in feature engineering with holiday/event calendars and external data. Familiarity with visualization tools (Tableau, Power BI, matplotlib, seaborn). Strong problem-solving, detail orientation, and business translation skills. Excellent communication and collaboration abilities. Preferred Qualifications Experience deploying forecasting models in production/cloud environments. Knowledge of retail/restaurant analytics at SKU and store/branch level. Exposure to big data tools, cloud platforms (AWS, GCP, Azure), and workflow automation. Experience with deep learning models for time series (LSTM, GRU).
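As a hedged example of the SARIMA-style forecasting and MAE/MAPE tracking listed above, here is a minimal statsmodels sketch on synthetic monthly demand; the series, model order, and holdout window are illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly demand for one SKU with yearly seasonality (illustrative only)
rng = np.random.default_rng(3)
t = np.arange(72)
demand = 200 + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 10, 72)
series = pd.Series(demand, index=pd.date_range("2019-01-01", periods=72, freq="MS"))

train, test = series[:-12], series[-12:]  # hold out the final year
model = SARIMAX(train, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12)).fit(disp=False)
forecast = model.forecast(steps=12)

# Accuracy metrics commonly tracked for demand forecasts
mae = np.mean(np.abs(test - forecast))
mape = np.mean(np.abs((test - forecast) / test)) * 100
print(f"MAE: {mae:.1f} units, MAPE: {mape:.1f}%")
```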
Posted 2 weeks ago
5.0 years
0 Lacs
navi mumbai, maharashtra, india
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Lead Data Scientist Job Description For Lead Data Scientist Who is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all. Overview Finicity, a Mastercard company, is leading the Open Banking Initiative to increase the Financial Health of consumers and businesses. The Data Science and Analytics team is looking for a Data Scientist II. The Data Science team works on Intelligent Decisioning; Financial Certainty; Attribute, Feature, and Entity Resolution; Verification Solutions and much more. Join our team to make an impact across all sectors of the economy by consistently innovating and problem-solving. The ideal candidate is passionate about leveraging data to provide high quality customer solutions. Also, the candidate is a strong technical leader who is extremely motivated, intellectually curious, analytical, and possesses an entrepreneurial mindset. Role Develops machine-learning models to monitor open banking transactions in order to glean insights from the data and create data science algorithms to detect data anomalies observed in fraudulent transactions. Manipulates large data sets and applies various technical and statistical analytical techniques (e.g., OLS, multinomial logistic regression, LDA, clustering, segmentation) to draw insights from large datasets. Applies various machine learning (e.g., SVM, Random Forest, XGBoost, LightGBM, CatBoost) and deep learning techniques (e.g., LSTM, RNN, Transformer) to solve analytical problem statements. Design and implement machine learning models for a number of financial applications including but not limited to: Transaction Classification, Temporal Analysis, Risk modeling from structured and unstructured data. Measure, validate, implement, monitor and improve performance of both internal and external facing machine learning models. Propose creative solutions to existing challenges that are new to the company, the financial industry and to data science. Present technical problems and findings to business leaders internally and to clients succinctly and clearly.
Leverage best practices in machine learning and data engineering to develop scalable solutions. Identify areas where resources fall short of needs and provide thoughtful and sustainable solutions to benefit the team. Be a strong, confident, and excellent writer and speaker, able to communicate your analysis, vision and roadmap effectively to a wide variety of stakeholders. Test current cutting-edge AI technologies to enhance data science modeling work. All About You: 5-7 years in data science/machine learning model development and deployments. Exposure to financial transactional structured and unstructured data, transaction classification, risk evaluation and credit risk modeling is a plus. A strong understanding of NLP, Statistical Modeling, Visualization and advanced Data Science techniques/methods. AI experience is a plus. Gain insights from text, including non-language tokens, and use the thought process of annotations in text analysis. Solve problems that are new to the company, the financial industry and to data science. SQL/database experience is preferred. Strong Python programming background/experience. Experience with Kubernetes, Containers, Docker, REST APIs, Event Streams or other delivery mechanisms. Familiarity with relevant technologies (e.g. TensorFlow, scikit-learn, Pandas, etc.). Familiarity with the Databricks Platform. Strong desire to collaborate and ability to come up with creative solutions. Additional Finance and FinTech experience preferred. Bachelor’s or Master’s Degree in Computer Science, Information Technology, Engineering, Mathematics, Statistics. Corporate Security Responsibility Every person working for, or on behalf of, Mastercard is responsible for information security. All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that the successful candidate for this position must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. NOTE: Candidates go through a thorough screening and interview process.
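To illustrate the transaction-classification work mentioned above, here is a minimal scikit-learn pipeline sketch combining text and numeric features; the transactions and categories are invented and do not reflect Mastercard's actual models or data.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Illustrative labelled transactions; real data would come from open banking feeds
df = pd.DataFrame({
    "description": ["UBER TRIP 1234", "WHOLE FOODS MARKET", "SHELL GASOLINE", "NETFLIX.COM",
                    "LYFT RIDE", "TRADER JOES", "CHEVRON", "SPOTIFY"],
    "amount": [18.5, 86.2, 45.0, 15.99, 14.0, 64.3, 52.1, 9.99],
    "category": ["transport", "groceries", "fuel", "subscriptions",
                 "transport", "groceries", "fuel", "subscriptions"],
})

# Combine character n-gram text features with a scaled numeric amount feature
features = ColumnTransformer([
    ("text", TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5)), "description"),
    ("num", StandardScaler(), ["amount"]),
])
clf = Pipeline([("features", features), ("model", LogisticRegression(max_iter=1000))])
clf.fit(df[["description", "amount"]], df["category"])

# Classify an unseen transaction
print(clf.predict(pd.DataFrame({"description": ["EXXON FUEL"], "amount": [40.0]})))
```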
Posted 2 weeks ago
5.0 years
0 Lacs
indore, madhya pradesh, india
On-site
Job Title: Quantitative Analyst (Forex) Location: Indore, MP, India (Onsite only - Work from office) Experience Level: 2–5 years Employment Type: Full-time Role Overview Tecnomi is seeking a talented Quantitative Analyst to join our team full-time and play a key role in building, validating, and optimizing systematic forex trading strategies. You will work closely with the project owner and development team to develop actionable models, implement risk frameworks, and contribute to data-driven insights. This full-time position offers the opportunity to drive innovative solutions in a collaborative environment, with a focus on long-term project impact. Key Responsibilities Strategy Development: Design, test, and optimize quantitative trading strategies using price data, volatility measures, sentiment indicators, and macroeconomic variables. Modeling & Backtesting: Build robust forecasting models (e.g., time-series, LSTM/Transformer hybrids) and perform rigorous backtesting to ensure reliability. Risk Frameworks: Develop and integrate risk-control tools (e.g., Value-at-Risk, drawdown limits, exposure thresholds) while aligning with compliance best practices. Data Integration: Aggregate and analyze multi-source data (news feeds, order-book snapshots, economic calendars) for model inputs and enhancements. Mentorship: Provide ongoing guidance and knowledge-sharing sessions to educate non-technical team members, including the project owner, on forex fundamentals and quantitative concepts. Collaboration & Documentation: Work with development teams to integrate model outputs into systems; maintain clear, comprehensive documentation of assumptions, processes, and results. Required Qualifications 3–5 years of experience in quantitative research, preferably in forex or systematic trading. Hands-on experience with backtesting platforms and statistical validation techniques. Strong understanding of FX market dynamics, macro drivers, and risk modeling. Excellent communication and teaching skills to mentor non-experts effectively. Bachelor’s or Master’s degree in Quantitative Finance, Mathematics, Statistics, or a related field. Verifiable 2-year portfolio showcasing prior models, backtests, or strategy reports. Preferred Skills Experience across multiple forex instruments or asset classes. Proficiency in Python or R (e.g., pandas, NumPy, statsmodels, scikit-learn). Familiarity with regulatory frameworks (e.g., RBI/FEMA). Exposure to tools like QuantConnect, MLflow, or similar platforms.
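As a toy illustration of the systematic strategy backtesting this role involves, here is a minimal moving-average crossover sketch in pandas; the price series is simulated, and the signal is a placeholder for the kind of models the analyst would actually build.

```python
import numpy as np
import pandas as pd

# Synthetic EUR/USD-style price series (illustrative only, not market data)
rng = np.random.default_rng(5)
price = pd.Series(1.10 * np.exp(np.cumsum(rng.normal(0, 0.003, 1000))))

# Simple moving-average crossover signal as a toy systematic strategy
fast, slow = price.rolling(10).mean(), price.rolling(50).mean()
position = (fast > slow).astype(int).shift(1).fillna(0)  # trade next bar, no look-ahead

# Vectorized backtest of daily strategy returns
returns = price.pct_change().fillna(0)
strategy_returns = position * returns
print("cumulative return:", (1 + strategy_returns).prod() - 1)
print("annualized Sharpe:", strategy_returns.mean() / strategy_returns.std() * np.sqrt(252))
```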
Posted 3 weeks ago
2.0 - 5.0 years
0 Lacs
noida, uttar pradesh, india
On-site
Job Title: AI/ML Engineer (OCR & Load Forecasting) Experience: 2 to 5 Years Location: Noida Type: Full-time Domain: OCR, Time Series Forecasting, AI/ML About the Role We are seeking talented and passionate AI/ML Engineers to join our team for exciting projects in Optical Character Recognition (OCR) and Load/Consumption/Revenue Forecasting. You'll be working on cutting-edge solutions that involve image-based data extraction and time series forecasting models to solve real-world energy and utility sector problems. Key Responsibilities Develop and fine-tune OCR models to extract structured data from electric meter images and scanned documents. Deploy models into production environments using tools such as TensorFlow Lite, ONNX, and TensorRT. Build and deploy time series forecasting models (e.g., XGBoost, LSTM, Prophet) for load and consumption prediction at consumer, transformer, and substation levels. Work with large-scale datasets (images and time series) and perform data preprocessing, augmentation, and feature engineering. Design and maintain robust, scalable, and efficient code in Python. Collaborate with backend and DevOps teams to integrate models into production pipelines. Conduct performance tuning and optimization of models for inference speed and accuracy. Evaluate models using appropriate metrics and validation techniques. Stay updated with the latest research in computer vision and forecasting. Required Skills Strong foundation in Python, NumPy, Pandas, and scikit-learn. Hands-on experience with Deep Learning frameworks: TensorFlow, Keras, or PyTorch. Experience with OCR tools/libraries like Tesseract, EasyOCR, or custom CNN/CRNN models. Good understanding of image preprocessing, object detection (e.g., YOLO, SSD), and image segmentation. Exposure to annotation tools like Label Studio or LabelImg. Proficient in time series modeling techniques including XGBoost, ARIMA, LSTM, and Prophet. Experience working with tools like OpenCV, Matplotlib, and Seaborn. Familiarity with cloud platforms (AWS/GCP) and model deployment is a plus. Version control using Git and collaborative development practices. Preferred Qualifications Bachelor’s or Master’s degree in Computer Science, Data Science, Electrical Engineering, or related fields. Experience with load forecasting in the utility/energy domain is highly desirable. Exposure to IoT data, metering infrastructure, or smart grid analytics is a plus. Why Join Us? Work on real-world AI problems impacting millions of users. Be part of a dynamic, growing startup team with full ownership of your work. Flexible work culture with opportunities to explore multiple domains. Candidates can also apply for this job posting on our official website: https://www.alignmycareer.com/job/68ac6eede8b79e43d1ea4eb2
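As a hedged sketch of the lag-feature-based load forecasting described above, here is a minimal gradient-boosting example on synthetic hourly load; the features, holdout horizon, and model choice are illustrative, not the team's actual pipeline.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error

# Synthetic hourly load with daily seasonality (illustrative feeder-level data)
rng = np.random.default_rng(11)
idx = pd.date_range("2024-01-01", periods=24 * 90, freq="h")
load = 500 + 150 * np.sin(2 * np.pi * idx.hour / 24) + rng.normal(0, 20, len(idx))
df = pd.DataFrame({"load": load}, index=idx)

# Lag and calendar features, a common starting point for tree-based load forecasting
df["hour"] = df.index.hour
df["dayofweek"] = df.index.dayofweek
df["lag_24"] = df["load"].shift(24)
df = df.dropna()

train, test = df.iloc[:-168], df.iloc[-168:]  # hold out the final week
model = GradientBoostingRegressor().fit(train[["hour", "dayofweek", "lag_24"]], train["load"])
pred = model.predict(test[["hour", "dayofweek", "lag_24"]])
print("MAPE:", mean_absolute_percentage_error(test["load"], pred))
```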
Posted 3 weeks ago
0 years
0 Lacs
india
On-site
We’re Hiring: CTO / Co-Founder – Twinamics 🚀 We are looking for a CTO / Technical Co-Founder to join the founding team and drive the deep tech vision forward. This is not just a job — it’s a chance to co-create the future of enterprise AI from the ground up. Company Description Twinamics is building the Autonomous Brain for Enterprises. We transform messy, siloed data into real-time intelligence and autonomous actions—helping businesses grow revenue without growing headcount. At the core of Twinamics is our proprietary DIPPCAA Engine (Data → Insight → Prediction → Prescription → Command → Action → Adaptation). It powers an end-to-end Data-to-Action Infrastructure that connects data, reasons over it, and executes business-critical decisions through AI Employees (AI-CXOs and their Agent Workforce). If you’re excited to work at the frontier of agentic AI, orchestration frameworks, and enterprise-scale automation —this role is for you. What You’ll Do Own the technical strategy and architecture of Twinamics’ AI infrastructure. Lead development of the DIPPCAA Engine, digital twin knowledge graphs, and multi-agent orchestration layer. Push the boundaries of event prediction, quantum-inspired AI models, and real-time decision inference. Build and mentor a world-class engineering + research team. Translate cutting-edge research into production-grade systems with enterprise impact. Collaborate with product designers and developers to bring enterprise-ready AI infra to life. Collaborate closely with the founder on product vision, fundraising, and scaling. Skillsets We’re Looking For Deep interest in research (PhD / publications are a plus but not mandatory). Core AI & ML Strong background in Machine Learning and Deep Learning (esp. time-series & sequential modeling). Experience with LLMs, SLMs, or hybrid architectures for reasoning + prediction. Knowledge of probabilistic modeling (Bayesian methods, Monte Carlo, Markov Decision Processes). Experience in time-series forecasting (ARIMA, Prophet, RNN/LSTM/GRU, Transformer-based models). Familiarity with anomaly detection techniques to capture unexpected signals. Understanding of multi-signal fusion (internal + external data streams). Strong grasp of causal inference & correlation vs. causation for accurate event detection. Prescriptive AI & Decisioning Ability to move from prediction → prescription (recommend optimal actions, not just forecasts). Knowledge of reinforcement learning, optimization algorithms, or decision theory . Familiarity with control systems for closed-loop feedback in enterprise workflows. Data Layer & Infra Hands-on with data pipelines (ETL/ELT, Apache Kafka, Airflow, dbt, or similar). Experience with vector databases (Pinecone, Weaviate, Milvus, pgvector) for memory/state management. Strong SQL + NoSQL experience (Postgres, Mongo, etc.) for structured/unstructured data. Data architecture skills: schema design, feature engineering, real-time + batch pipelines . Enterprise Integration Ability to connect models into ERP, CRM, Finance, and Supply Chain systems . Strong API design & integration skills (REST, GraphQL, gRPC). Bonus Points Exposure to knowledge graphs or graph databases (Neo4j, TigerGraph) for event relationships. Familiarity with streaming data (IoT, sensor data, transaction logs) . Twinamics is on a mission to build the Autonomous Brain for Enterprises — turning messy, siloed data into real-time intelligence and autonomous actions . 
Our proprietary DIPPCAA Engine (Data → Insight → Prediction → Prescription → Command → Action → Adaptation) powers a Data-to-Action Infrastructure that unifies enterprise data, reasons over it, and executes decisions through AI Employees (AI-CXOs and their Agent Workforce) . What We’re Looking For Strong background in AI/ML, LLMs, and multi-agent systems . Hands-on expertise with knowledge graphs, RAG pipelines, real-time data systems, or decision intelligence . Experience in building and scaling technical products from zero-to-one. Startup DNA: you thrive in ambiguity, move fast, and love solving hard problems. Bonus: Experience in event-driven architectures, reinforcement learning, or quantum-inspired AI . Why Join Twinamics? Be part of the core founding team shaping a frontier AI company. Work at the intersection of deep research and enterprise-scale impact . Tackle high-value problems across procurement, logistics, hospitality, manufacturing, and retail . Build the AI Employees of the future — from AI CFOs to AI Supply Chain Heads. Equity + ownership: grow with the company you help build. 👉 If you’re excited about building the Autonomous Brain for Enterprises and want to work on the frontier of AI + research + real-world impact , let’s talk. 📩 DM me or drop a note at mmr@twinamics.com / twinamics@gmail.com #AI #Hiring #StartupJobs #CTO #Cofounder #Twinamics
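As a small illustration of the anomaly-detection capability mentioned in this posting, here is a minimal rolling z-score sketch; the metric series and threshold are synthetic and illustrative, not part of the DIPPCAA Engine.

```python
import numpy as np
import pandas as pd

# Synthetic business metric with an injected spike (an "unexpected signal")
rng = np.random.default_rng(21)
values = rng.normal(100, 5, 200)
values[150] = 160  # anomalous event
series = pd.Series(values)

# Rolling z-score: a simple baseline for flagging unexpected deviations
window = 30
rolling_mean = series.rolling(window).mean()
rolling_std = series.rolling(window).std()
z = (series - rolling_mean) / rolling_std

print("flagged indices:", list(series.index[z.abs() > 4]))
```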
Posted 3 weeks ago
3.0 years
0 Lacs
india
On-site
Note: Please do not apply if your salary expectations are higher than the provided salary range or if your experience is less than 3 years. If you have experience in the travel industry and have worked on hotel, car rental or ferry booking before, then we can negotiate the package. Company Description: Our company has been promoting Greece for the last 25 years through travel sites visited from all around the world, with 10 million visitors per year, such as www.greeka.com and www.ferriesingreece.com. Through the websites, we provide a range of travel services for a seamless holiday experience, such as online car rental reservations, ferry tickets, transfers, tours, etc. Role Description: We are seeking a highly skilled Artificial Intelligence / Machine Learning Engineer to join our dynamic team. You will work closely with our development team and QAs to deliver cutting-edge solutions that improve our candidate screening and employee onboarding processes. Major Responsibilities & Job Requirements include: • Develop and implement NLP/LLM Models. • Minimum of 3-4 years of experience as an AI/ML Developer or similar role, with demonstrable expertise in computer vision techniques. • Develop and implement AI models using Python, TensorFlow, and PyTorch. • Proven experience in computer vision, including fine-tuning OCR models (e.g., Tesseract, LayoutLMv3, EasyOCR, PaddleOCR, or custom-trained models). • Strong understanding and hands-on experience with RAG (Retrieval-Augmented Generation) architectures and pipelines for building intelligent Q&A, document summarization, and search systems. • Experience working with LangChain, LLM agents, and chaining tools to build modular and dynamic LLM workflows. • Familiarity with agent-based frameworks and orchestration of multi-step reasoning with tools, APIs, and external data sources. • Familiarity with Cloud AI Solutions, such as IBM, Azure, Google & AWS. • Work on natural language processing (NLP) tasks and create language models (LLMs) for various applications. • Design and maintain SQL databases for storing and retrieving data efficiently. • Utilize machine learning and deep learning techniques to build predictive models. • Collaborate with cross-functional teams to integrate AI solutions into existing systems. • Stay updated with the latest advancements in AI technologies, including ChatGPT, Gemini, Claude, and Big Data solutions. • Write clean, maintainable, and efficient code when required. • Handle large datasets and perform big data analysis to extract valuable insights. • Fine-tune pre-trained LLMs using specific types of data and ensure optimal performance. • Proficiency in cloud services from Amazon AWS. • Extract and parse text from CVs, application forms, and job descriptions using advanced NLP techniques such as Word2Vec, BERT, and GPT-NER. • Develop similarity functions and matching algorithms to align candidate skills with job requirements. • Experience with microservices, Flask, FastAPI, Node.js. • Expertise in Spark, PySpark for big data processing. • Knowledge of advanced techniques such as SVD/PCA, LSTM, NeuralProphet. • Apply debiasing techniques to ensure fairness and accuracy in the ML pipeline. • Experience in coordinating with clients to understand their needs and delivering AI solutions that meet their requirements. Qualifications: • Bachelor's or Master’s degree in Computer Science, Data Science, Artificial Intelligence, or a related field. • In-depth knowledge of NLP techniques and libraries, including Word2Vec, BERT, GPT, and others.
• Experience with database technologies and vector representation of data. • Familiarity with similarity functions and distance metrics used in matching algorithms. • Ability to design and implement custom ontologies and classification models. • Excellent problem-solving skills and attention to detail. • Strong communication and collaboration skills.
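The listing above calls for similarity functions and matching algorithms to align candidate skills with job requirements, alongside embedding techniques such as Word2Vec and BERT. As a minimal sketch of that idea (not the company's actual pipeline), the snippet below ranks hypothetical CV snippets against a job description using TF-IDF vectors and cosine similarity from scikit-learn; the example texts are invented for illustration, and a production system would more likely use contextual embeddings as the listing suggests.

```python
# Minimal sketch: rank candidate CV snippets against a job description
# using TF-IDF vectors and cosine similarity (scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

job_description = "Python developer with NLP, BERT fine-tuning and FastAPI experience"
cv_snippets = [  # hypothetical candidate texts
    "5 years of Python, built NLP pipelines with BERT and spaCy",
    "Front-end engineer, React and TypeScript",
    "Data engineer, Spark and Airflow, some FastAPI services",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([job_description] + cv_snippets)

# First row is the job description; compare it against every CV snippet.
scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
for snippet, score in sorted(zip(cv_snippets, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {snippet}")
```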
Posted 3 weeks ago
0 years
0 Lacs
hyderabad, telangana, india
On-site
Description
Keyskills – Must Have: Python, Microsoft Azure ML Services, Jupyter, Tableau, Advanced Analytics, SQL, Snowflake, NoSQL, Java, Spring Boot, Git
Requirements
Job Overview: We are seeking an experienced and results-driven Data Scientist to join our growing analytics team. As a Data Scientist, you will work at the intersection of data, business, and technology. Your role is to extract actionable insights from large volumes of structured and unstructured data, develop predictive and other analytical models, and contribute to strategic decision-making through advanced analytics and machine learning. You will partner with cross-functional teams including engineering, product management, and business operations to create scalable, production-grade models that support the company's Digital First Approach, enabling innovation and enhancing customer experiences.
Key Responsibilities: Conduct end-to-end data science projects: from understanding business needs to delivering ML solutions that solve real-world problems. Perform exploratory data analysis (EDA) to identify trends, patterns, and outliers in large datasets. Engineer meaningful features from raw data to improve model performance. Develop and validate predictive models using supervised and unsupervised learning algorithms. Apply advanced statistical techniques and machine learning methods to solve classification, regression, and clustering problems. Utilize and compare various algorithms including Isolation Forest, Autoencoders, One-Class SVM, ARIMA, and LSTM-based models (a minimal anomaly-detection sketch follows this listing). Collaborate with data engineers and software developers to deploy models in production using tools like Spring Boot, GitLab, and APIs. Evaluate and monitor model performance over time and recommend improvements. Use visualization platforms like Tableau, Power BI, and Google Analytics to communicate insights and findings to business stakeholders and leadership. Maintain proper documentation for all models and methodologies used. Stay current with advancements in ML/AI tools, libraries, and techniques. Participate in Agile ceremonies and manage tasks using Jira.
Technical Skills & Tools: Languages: Python (NumPy, Pandas, Scikit-learn, TensorFlow, etc.), R (ggplot2, caret, etc.); ML Platforms & IDEs: Azure ML Studio, Jupyter Notebook, RStudio, Anaconda; Modeling: supervised and unsupervised methods; Visualization: Tableau, Power BI, Google Analytics; Database Knowledge: SQL (Snowflake, MySQL), NoSQL (MongoDB); Deployment & Integration: understanding of Java APIs, Spring Boot, and GitLab for model integration; Statistical Analysis: hypothesis testing, A/B testing, regression, classification, clustering.
Qualifications: Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field. Proven experience as a Data Scientist or similar role. Proficiency in Python, R, SQL, and data manipulation libraries (e.g., pandas, numpy). Experience with machine learning frameworks (e.g., scikit-learn, TensorFlow, PyTorch). Familiarity with data visualization tools (e.g., Tableau, Power BI, matplotlib, seaborn). Strong understanding of statistics, probability, and mathematical modeling.
Experience working with structured and unstructured data. Ability to convey complex technical results in a clear and concise manner.
Nice to Have: Experience in the telecom domain. Exposure to big data tools such as Apache Spark or Hadoop. Knowledge of CI/CD practices in data science.
What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you'll experience an inclusive culture of acceptance and belonging, where you'll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.
Learning and development. We are committed to your continuous learning and development. You'll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.
Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you'll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what's possible and bring new solutions to market. In the process, you'll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.
Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!
High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you're placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.
About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we've been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
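The responsibilities above include comparing algorithms such as Isolation Forest, Autoencoders, and One-Class SVM for anomaly detection. Below is a minimal sketch of the Isolation Forest step using scikit-learn; the synthetic data and the contamination value are illustrative assumptions, not part of the posting.

```python
# Minimal sketch: flag outliers in a numeric dataset with Isolation Forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 2))    # typical observations
outliers = rng.uniform(low=6.0, high=9.0, size=(10, 2))   # injected anomalies
X = np.vstack([normal, outliers])

model = IsolationForest(n_estimators=200, contamination=0.02, random_state=0)
labels = model.fit_predict(X)          # -1 = anomaly, 1 = normal
print("anomalies found:", int((labels == -1).sum()))
```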
Posted 3 weeks ago
5.0 - 8.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Skills: API Integration & Deployment (Flask/FastAPI, MLOps), Cloud Platforms (AWS/GCP/Azure), Model Optimization & Performance Tuning, Machine Learning & Data Analysis, Deep Learning (TensorFlow / PyTorch), Python Programming
Job Title: Senior AI/ML Engineer (5 to 8 Years of EXP) Location: Onsite Bangalore, Karnataka Company: VectorStack Employment Type: Full-time Joining: Immediate (Urgent Requirement)
About VectorStack
VectorStack is a technology solutions provider driving digital transformation and business performance. With expertise across Tech Advancement, Product Evolution, Design Innovation, and Business Transformation, we deliver strategies that create measurable impact. Our solutions power industries like Retail Tech, Ad Tech, FinTech, and EdTech, enabling businesses to unlock their true potential. Learn more at: www.vectorstack.co
About The Role
We are seeking a highly experienced and passionate AI Engineer with 5-8 years of relevant expertise in AI/ML solutioning. This is a senior role ideal for a technologist who thrives on solving complex problems through machine learning, deep learning, and data-driven innovation.
As a Senior Contributor, You Will: Lead the end-to-end design, development, and deployment of AI/ML solutions. Provide technical leadership and mentorship to junior engineers. Collaborate cross-functionally to ensure AI initiatives deliver business value and innovation at scale.
Key Responsibilities
AI Solution Delivery: Design, develop, and deploy scalable AI/ML models across multiple domains. Model Development & Optimization: Build, fine-tune, and optimize ML/DL models for performance and efficiency. Leadership & Mentoring: Guide junior engineers, influence architectural decisions, and foster knowledge sharing. Data Engineering: Oversee preprocessing, transformation, and feature engineering for large and complex datasets. Cross-Functional Collaboration: Partner with Product, Data, and Engineering teams to align AI projects to business needs. Deployment & Monitoring: Deploy models in production using AWS/GCP/Azure with MLOps best practices (a minimal FastAPI serving sketch follows this listing). Research & Innovation: Stay current with AI/ML advancements and integrate emerging tools and frameworks. Code Quality & Documentation: Ensure clean, efficient, and version-controlled code with strong documentation.
Required Qualifications
Experience: 5-8 years overall in software/AI development, with 5-6 years of hands-on AI/ML expertise. Education: Bachelor's/Master's in Computer Science, Data Science, AI/ML, or related fields. Programming Proficiency: Expert-level Python (NumPy, Pandas, Scikit-learn, Matplotlib, etc.). Machine Learning & AI: Strong knowledge of regression, classification, clustering, recommendation systems, cross-validation, and feature selection. Deep Learning & Neural Networks: Expertise in CNNs, RNNs, LSTM, ResNet, VGG, etc. Experience in computer vision tasks (image classification, detection, segmentation). Version Control: Proficient with Git and collaborative development tools. Software Engineering: Solid background in algorithms, data structures, API development, and modular code.
Preferred Skills
Web Frameworks: Experience with Flask/Django for AI-driven applications. Computer Vision: Hands-on with OpenCV, real-time image processing, and object tracking. NLP: Experience in NER, text classification, sentiment analysis, transformers, and BERT models. Cloud & MLOps: Deployment on AWS/GCP/Azure with Docker, Kubernetes, MLflow, or SageMaker. Big Data & Pipelines: Exposure to Spark, Hadoop, or Airflow for large-scale data pipelines.
Soft Skills
Excellent analytical and problem-solving abilities. Strong communication skills to convey complex technical concepts simply. Proven leadership and mentoring capabilities. Strong sense of ownership, accountability, and detail orientation.
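The role lists API integration and deployment with Flask/FastAPI under MLOps practices. The sketch below shows one minimal way to expose a pre-trained scikit-learn model through a FastAPI endpoint; the model file name, payload schema, and feature layout are hypothetical assumptions for illustration, not a prescribed deployment pattern. It could be served locally with `uvicorn app:app` once a model has been saved with joblib.

```python
# Minimal sketch: serve a pickled scikit-learn model behind a FastAPI endpoint.
# Assumes a pre-trained model saved as model.joblib (hypothetical file name).
from fastapi import FastAPI
from pydantic import BaseModel
import joblib

app = FastAPI(title="ml-inference-sketch")
model = joblib.load("model.joblib")     # hypothetical pre-trained classifier


class Features(BaseModel):
    values: list[float]                  # flat feature vector, schema assumed


@app.post("/predict")
def predict(features: Features) -> dict:
    label = model.predict([features.values])[0]
    # Convert a NumPy scalar to a plain Python value so it serializes to JSON.
    return {"prediction": label.item() if hasattr(label, "item") else label}
```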
Posted 3 weeks ago
5.0 years
0 Lacs
pune, maharashtra, india
On-site
Job Description: We are seeking a highly skilled and motivated AI Engineer with expertise in large language models (LLMs), AI workflows, and machine learning. This role combines deep technical knowledge in ML/AI with hands-on experience building intelligent, production-ready systems that enhance cybersecurity investigation, prioritization, and response. You will work at the intersection of LLM-driven automation, workflow orchestration, and classical ML models to improve how alerts are prioritized, classified, and contextualized, reducing alert fatigue and enabling faster, more effective decision-making. Your work will directly influence the development of agentic AI systems, workflow automation, and recommendation engines within a cloud security platform.
Key Responsibilities
LLM Integration & Workflows: Build, fine-tune, and integrate large language models (LLMs) into existing systems. Develop agentic workflows for investigation, classification, and automated response in cybersecurity. Apply techniques like retrieval-augmented generation (RAG), prompt engineering, and fine-tuning for domain-specific tasks.
Machine Learning Development: Design, implement, and optimize ML models for prioritization, ranking, clustering, anomaly detection, and classification. Apply both classical forecasting models (AR, ARIMA, SARIMA, ES) and modern architectures (XGBoost, LSTM, DeepAR, N-BEATS, Temporal Fusion Transformer).
Data Preparation & Feature Engineering: Collect, preprocess, and transform structured and unstructured data (including logs, text, and access patterns). Engineer features to maximize model interpretability and performance.
Model Training, Evaluation, and Deployment: Train and evaluate models using rigorous metrics (precision, recall, AUC, F1, etc.); a brief evaluation sketch follows this listing. Optimize hyperparameters and fine-tune LLMs for task-specific improvements. Deploy ML/LLM models into production at scale with strong monitoring, drift detection, and observability.
Collaboration & Documentation: Work closely with data scientists, ML engineers, security researchers, and software teams to build end-to-end solutions. Document models, workflows, and pipelines for clarity, reproducibility, and knowledge sharing.
Requirements
Bachelor's/Master's degree in Computer Science, AI/ML, Data Science, or a related field. 5+ years of experience in ML/AI, including 3+ years deploying production-grade systems. Experience contributing to publications (patents, libraries, or peer-reviewed papers) is a plus. Strong knowledge of machine learning algorithms for classification, clustering, ranking, and anomaly detection. Proficiency with LLM frameworks and APIs (OpenAI, Hugging Face Transformers, LangChain, LlamaIndex). Hands-on experience building workflow automation with LLMs and integrating them into applications. Solid programming skills in Python (experience with PyTorch, TensorFlow, scikit-learn). Knowledge of NLP tasks (text classification, summarization, embeddings, semantic search). Experience with recommendation systems or reinforcement learning is a strong plus. Proven track record of deploying ML/AI models into production environments with scalability in mind. Familiarity with cloud platforms (AWS, GCP, Azure) and containerization (Docker, Kubernetes). Understanding of MLOps best practices (CI/CD for ML, monitoring, retraining strategies). Strong problem-solving and analytical mindset. Excellent communication and teamwork skills. Ability to work in a fast-paced, evolving startup environment.
Write to me at rajeshwari.vh@careerxperts.com for more details.
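The listing above asks for models to be evaluated with metrics such as precision, recall, F1, and AUC. A minimal sketch of that evaluation step with scikit-learn follows; the toy labels, scores, and 0.5 threshold are illustrative assumptions.

```python
# Minimal sketch: standard binary-classification evaluation metrics.
from sklearn.metrics import precision_score, recall_score, f1_score, roc_auc_score

y_true = [0, 0, 1, 1, 1, 0, 1, 0]                         # ground-truth labels (toy data)
y_scores = [0.1, 0.4, 0.8, 0.65, 0.9, 0.3, 0.55, 0.2]     # model probabilities
y_pred = [1 if s >= 0.5 else 0 for s in y_scores]         # thresholded decisions

print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("f1:       ", f1_score(y_true, y_pred))
print("roc auc:  ", roc_auc_score(y_true, y_scores))      # AUC uses raw scores
```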
Posted 3 weeks ago
0 years
0 Lacs
india
On-site
Company Description
Twinamics is building the Autonomous Brain for Enterprises. We transform messy, siloed data into real-time intelligence and autonomous actions, helping businesses grow revenue without growing headcount. At the core of Twinamics is our proprietary DIPPCAA Engine (Data → Insight → Prediction → Prescription → Command → Action → Adaptation). It powers an end-to-end Data-to-Action Infrastructure that connects data, reasons over it, and executes business-critical decisions through AI Employees (AI-CXOs and their Agent Workforce). If you're excited to work at the frontier of agentic AI, orchestration frameworks, and enterprise-scale automation, this role is for you.
What You'll Do
As an AI Engineer at Twinamics, you'll: Design and implement agentic AI systems with stateful reasoning, memory, and orchestration. Develop Digital Twin AI Employees (e.g., Finance CXO, Sales Lead) using LLMs, SLMs, and custom orchestration logic. Integrate AI agents with enterprise systems (ERP, CRM, WhatsApp, Accounting APIs, Voice Platforms). Build plug-and-play AI templates for cross-industry use cases (hospitality, manufacturing, supply chain, logistics). Optimize for scalability: retries, fail-safes, state persistence, and performance tuning. Collaborate with product designers and developers to bring enterprise-ready AI infra to life.
Skillsets We're Looking For
Core AI & ML: Strong background in Machine Learning and Deep Learning (esp. time-series & sequential modeling). Experience with LLMs, SLMs, or hybrid architectures for reasoning + prediction. Knowledge of probabilistic modeling (Bayesian methods, Monte Carlo, Markov Decision Processes). Experience in time-series forecasting (ARIMA, Prophet, RNN/LSTM/GRU, Transformer-based models). Familiarity with anomaly detection techniques to capture unexpected signals. Understanding of multi-signal fusion (internal + external data streams). Strong grasp of causal inference & correlation vs. causation for accurate event detection.
Prescriptive AI & Decisioning: Ability to move from prediction → prescription (recommend optimal actions, not just forecasts). Knowledge of reinforcement learning, optimization algorithms, or decision theory. Familiarity with control systems for closed-loop feedback in enterprise workflows.
Data Layer & Infra: Hands-on with data pipelines (ETL/ELT, Apache Kafka, Airflow, dbt, or similar). Experience with vector databases (Pinecone, Weaviate, Milvus, pgvector) for memory/state management. Strong SQL + NoSQL experience (Postgres, Mongo, etc.) for structured/unstructured data. Data architecture skills: schema design, feature engineering, real-time + batch pipelines.
Enterprise Integration: Ability to connect models into ERP, CRM, Finance, and Supply Chain systems. Strong API design & integration skills (REST, GraphQL, gRPC).
Bonus Points: Exposure to knowledge graphs or graph databases (Neo4j, TigerGraph) for event relationships. Familiarity with streaming data (IoT, sensor data, transaction logs).
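Among the skills listed above is classical time-series forecasting with models such as ARIMA. The snippet below is a minimal, hypothetical sketch using statsmodels on a synthetic daily series; the series and the (1, 1, 1) order are assumptions for illustration, not a tuned configuration.

```python
# Minimal sketch: fit an ARIMA(1,1,1) model and forecast the next 7 points.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic daily series: upward trend plus noise (stand-in for real metrics).
idx = pd.date_range("2024-01-01", periods=120, freq="D")
values = np.linspace(100, 160, 120) + np.random.default_rng(0).normal(0, 3, 120)
series = pd.Series(values, index=idx)

fitted = ARIMA(series, order=(1, 1, 1)).fit()
forecast = fitted.forecast(steps=7)     # point forecasts for the next week
print(forecast.round(2))
```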
Posted 3 weeks ago
2.0 years
0 Lacs
india
On-site
Position: Data Scientist
Desired Experience: 2-4 years
You will act as a key member of the Data consulting team, working directly with the partners and senior stakeholders of the clients. You will work collaboratively with interdisciplinary scientists, IT, and engineering professionals across the organization to solve critical problems and answer important questions that drive key decisions for our business. Communication and organisation skills are key for this position, along with a problem-solving attitude.
Your expertise: 2+ years in an analytics role, preferably with direct experience in big data solutions and approaches. SME in statistics, analytics, big data, data science, machine learning, deep learning, cloud, mobile, and full stack technologies. Hands-on experience analysing large amounts of data to derive actionable insights. Database background (SQL, Hadoop, Apache Spark, or other Big Data platforms) will be a distinct advantage. Broad machine learning experience - algorithm evaluation, preparation, analysis, modelling, and execution. Working knowledge of traditional statistical model building (e.g., regression, classification, time series, segmentation), machine learning (random forest, boosting algorithms, SVM, KNN, etc.), deep learning (CNN, RNN, LSTM, transfer learning), and NLP (stemming, lemmatization, named entity extraction, latent semantic analysis, etc.). Hands-on experience in Python and R with a good proficiency level. Experience in building analytical/data science solution proposals. Good communication skills and partnerships with technology, product strategy, and data office. Experience collaborating with and influencing senior leaders and business partners across time zones.
Job Type: Full-time Work Location: In person
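Of the classical techniques this profile names (random forest, boosting, SVM, KNN), the sketch below shows a random forest evaluated with 5-fold cross-validation on a bundled scikit-learn dataset; the dataset choice and hyperparameters are illustrative assumptions rather than a recommended setup.

```python
# Minimal sketch: random forest classifier scored with 5-fold cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
clf = RandomForestClassifier(n_estimators=300, random_state=0)

scores = cross_val_score(clf, X, y, cv=5, scoring="f1")
print("fold F1 scores:", scores.round(3))
print("mean F1:", scores.mean().round(3))
```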
Posted 3 weeks ago
3.0 years
0 Lacs
gurugram, haryana, india
On-site
Responsibilities:
Research: Stay up-to-date with the latest research in the NLP/NLU space and work on various problems like topic modeling, machine translation, token classification / NER, semantic understanding, and knowledge graphs.
Design: Take ownership of Pythonic applications within the team, work independently to ensure their launch, improvement, and maintenance, and participate in product design discussions and decisions.
Implement: Enhance current product features and proactively develop new features which have a reasonable value proposition for the end-user; suggest improvements and optimizations to deployed algorithms and implement them while participating in agile development.
Collaborate: Partner with product managers to make data-driven decisions and help solve problems for end-users, as well as with other data science teams to establish and follow organizational best practices.
Leadership: Perform technical reviews, bring overall leadership within the team, and provide mentorship and training to junior team members.
Requirements: B.Tech / M.Tech from a top-tier engineering college with a strong mathematical background. 3+ years of experience in machine learning, 2+ years of experience working with NLP/NLU based technologies. Experience in machine / deep learning frameworks such as TensorFlow, Keras, PyTorch, and good knowledge of CNN, RNN/LSTM/GRU, Transformer models, transfer learning, and ensemble learning, as well as classical ML techniques like LDA, SVD, and clustering. Strong Python programming skills; familiarity with back-end development / DevOps (Flask, uWSGI, MySQL, Celery, Docker) will be a plus. Bonus: Prior experience with AWS-related services and familiarity with platforms like MLflow for managing the ML lifecycle and experimentation. Strong problem-solving skills and readiness to learn new concepts and apply them to find practical solutions to complex problems. Ability to work in a fast-paced start-up with strict deadlines. Reasonable communication and teamwork skills.
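The responsibilities above mention token classification / NER. As a minimal sketch (assuming the Hugging Face transformers library and its default pretrained NER checkpoint, which is downloaded on first use), the snippet below tags entities in one example sentence; in practice the team would likely fine-tune a domain-specific model.

```python
# Minimal sketch: named-entity recognition with the transformers pipeline.
# Downloads a default pretrained token-classification model on first run.
from transformers import pipeline

ner = pipeline("token-classification", aggregation_strategy="simple")
text = "Priya joined Infosys in Bengaluru after finishing her M.Tech at IIT Delhi."

for entity in ner(text):
    # Each result carries the aggregated entity group, confidence, and text span.
    print(f'{entity["entity_group"]:6} {entity["score"]:.2f}  {entity["word"]}')
```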
Posted 3 weeks ago
5.0 years
0 Lacs
indore, madhya pradesh, india
On-site
Job Title: Quantitative Analyst (Forex) Location: Indore, MP, India (Onsite only - Work from office) Experience Level: 3–5 years Employment Type: Full-time Role Overview Tecnomi is seeking a talented Quantitative Analyst to join our team full-time and play a key role in building, validating, and optimising systematic forex trading strategies. You will work closely with the project owner and development team to develop actionable models, implement risk frameworks, and contribute to data-driven insights. This full-time position offers the opportunity to drive innovative solutions in a collaborative environment, with a focus on long-term project impact. Key Responsibilities Strategy Development: Design, test, and optimise quantitative trading strategies using price data, volatility measures, sentiment indicators, and macroeconomic variables. Modelling & Backtesting: Build robust forecasting models (e.g., time-series, LSTM/Transformer hybrids) and perform rigorous backtesting to ensure reliability. Risk Frameworks: Develop and integrate risk-control tools (e.g., Value-at-Risk, drawdown limits, exposure thresholds) while aligning with compliance best practices. Data Integration: Aggregate and analyse multi-source data (news feeds, order-book snapshots, economic calendars) for model inputs and enhancements. Mentorship: Provide ongoing guidance and knowledge-sharing sessions to educate non-technical team members, including the project owner, on forex fundamentals and quantitative concepts. Collaboration & Documentation: Work with development teams to integrate model outputs into systems; maintain clear, comprehensive documentation of assumptions, processes, and results. Required Qualifications 3–5 years of experience in quantitative research, preferably in forex or systematic trading. Hands-on experience with backtesting platforms and statistical validation techniques. Strong understanding of FX market dynamics, macro drivers, and risk modelling. Excellent communication and teaching skills to mentor non-experts effectively. Bachelor’s or Master’s degree in Quantitative Finance, Mathematics, Statistics, or a related field. Verifiable 3-year portfolio showcasing prior models, backtests, or strategy reports. Preferred Skills Experience across multiple forex instruments or asset classes. Proficiency in Python or R (e.g., pandas, NumPy, statsmodels, scikit-learn). Familiarity with regulatory frameworks (e.g., RBI/FEMA). Exposure to tools like QuantConnect, MLflow, or similar platforms.
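The risk-framework responsibilities above mention Value-at-Risk and drawdown limits. A minimal sketch of historical 1-day VaR and maximum drawdown on a synthetic return series is shown below; the simulated returns and the 95% confidence level are illustrative assumptions, not a production risk engine.

```python
# Minimal sketch: historical Value-at-Risk and maximum drawdown from returns.
import numpy as np

rng = np.random.default_rng(7)
daily_returns = rng.normal(loc=0.0002, scale=0.006, size=750)  # synthetic FX P&L

# Historical 95% VaR: loss threshold exceeded on only 5% of days.
var_95 = -np.percentile(daily_returns, 5)

# Maximum drawdown: worst peak-to-trough decline of the cumulative equity curve.
equity = np.cumprod(1 + daily_returns)
running_peak = np.maximum.accumulate(equity)
max_drawdown = ((equity - running_peak) / running_peak).min()

print(f"1-day 95% VaR: {var_95:.4%}")
print(f"max drawdown:  {max_drawdown:.2%}")
```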
Posted 3 weeks ago
3.0 - 6.0 years
7 - 13 Lacs
gurugram
Work from Office
Develop and deploy AI/ML and Deep Learning models. Apply Neural Networks, CNNs, RNNs, LSTMs, Transformers, and Autoencoders. Work on NLP techniques. Build predictive analytics and optimization pipelines. Perform exploratory data analysis (EDA).
Required Candidate Profile: Strong proficiency in Python. 3–4 years of hands-on experience in AI/ML model development. Expertise in Deep Learning architectures (CNN). Solid background in statistical analysis. Strong knowledge of SQL.
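Since the profile emphasizes CNN-based deep learning, below is a minimal, hypothetical Keras sketch of a small image classifier; the input shape, layer sizes, and class count are assumptions for illustration rather than a tuned architecture.

```python
# Minimal sketch: a small convolutional image classifier defined with Keras.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(64, 64, 3)),          # assumed 64x64 RGB inputs
    layers.Conv2D(16, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),   # assumed 10 output classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```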
Posted 3 weeks ago