
585 Preprocess Jobs - Page 4

JobPe aggregates listings for easy access; you apply directly on the original job portal.

3.0 years

5 - 9 Lacs

India

On-site

We are looking for a skilled and passionate AI/ML Engineer to join our team and help us build intelligent systems that leverage machine learning and artificial intelligence. You will design, develop, and deploy machine learning models, work closely with cross-functional teams, and contribute to cutting-edge solutions that solve real-world problems.

Key Responsibilities:
- Design and implement machine learning models and algorithms for various use cases (e.g., prediction, classification, NLP, computer vision).
- Analyze and preprocess large datasets to build robust training pipelines.
- Conduct research to stay up to date with the latest AI/ML advancements and integrate relevant techniques into projects.
- Train, fine-tune, and optimize models for performance and scalability.
- Deploy models to production using tools such as Docker, Kubernetes, or cloud services (AWS, GCP, Azure).
- Collaborate with software engineers, data scientists, and product teams to integrate AI/ML solutions into applications.
- Monitor model performance in production and continuously iterate for improvements.
- Document design choices, code, and models for transparency and reproducibility.

Preferred Qualifications:
- Knowledge of NLP, LLMs, and GenAI.
- Background in deep learning architectures such as CNNs, RNNs, GANs, or transformer models.
- Experience with NLP libraries (e.g., Hugging Face Transformers, spaCy) or computer vision tools (e.g., OpenCV).
- Knowledge of MLOps practices and tools (e.g., MLflow, Kubeflow, SageMaker).
- Contributions to open-source AI/ML projects or publications in relevant conferences/journals.

Required: 3+ years of experience.
Work Mode: On-site
Job Types: Full-time, Permanent
Pay: ₹500,000.00 - ₹900,000.00 per year
Benefits: Flexible schedule, health insurance, leave encashment, Provident Fund
Schedule: Day shift, fixed shift, Monday to Friday
Supplemental Pay: Performance bonus
Work Location: In person

Posted 5 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Designation: ML / MLOps Engineer
Location: Noida (Sector 132)

Key Responsibilities:
• Model Development & Algorithm Optimization: Design, implement, and optimize ML models and algorithms using libraries and frameworks such as TensorFlow, PyTorch, and scikit-learn to solve complex business problems.
• Training & Evaluation: Train and evaluate models using historical data, ensuring accuracy, scalability, and efficiency while fine-tuning hyperparameters.
• Data Preprocessing & Cleaning: Clean, preprocess, and transform raw data into a suitable format for model training and evaluation, applying industry best practices to ensure data quality.
• Feature Engineering: Conduct feature engineering to extract meaningful features from data that enhance model performance and improve predictive capabilities.
• Model Deployment & Pipelines: Build end-to-end pipelines and workflows for deploying machine learning models into production environments, leveraging Azure Machine Learning and containerization technologies like Docker and Kubernetes.
• Production Deployment: Develop and deploy machine learning models to production environments, ensuring scalability and reliability using tools such as Azure Kubernetes Service (AKS).
• End-to-End ML Lifecycle Automation: Automate the end-to-end machine learning lifecycle, including data ingestion, model training, deployment, and monitoring, ensuring seamless operations and faster model iteration.
• Performance Optimization: Monitor and improve inference speed and latency to meet real-time processing requirements, ensuring efficient and scalable solutions.
• NLP, CV, GenAI Programming: Work on machine learning projects involving Natural Language Processing (NLP), Computer Vision (CV), and Generative AI (GenAI), applying state-of-the-art techniques and frameworks to improve model performance.
• Collaboration & CI/CD Integration: Collaborate with data scientists and engineers to integrate ML models into production workflows, building and maintaining continuous integration/continuous deployment (CI/CD) pipelines using tools like Azure DevOps, Git, and Jenkins.
• Monitoring & Optimization: Continuously monitor the performance of deployed models, adjusting parameters and optimizing algorithms to improve accuracy and efficiency.
• Security & Compliance: Ensure all machine learning models and processes adhere to industry security standards and compliance protocols, such as GDPR and HIPAA.
• Documentation & Reporting: Document machine learning processes, models, and results to ensure reproducibility and effective communication with stakeholders.

Required Qualifications:
• Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
• 3+ years of experience in machine learning operations (MLOps), cloud engineering, or similar roles.
• Proficiency in Python, with hands-on experience using libraries such as TensorFlow, PyTorch, scikit-learn, Pandas, and NumPy.
• Strong experience with Azure Machine Learning services, including Azure ML Studio, Azure Databricks, and Azure Kubernetes Service (AKS).
• Knowledge and experience in building end-to-end ML pipelines, deploying models, and automating the machine learning lifecycle.
• Expertise in Docker, Kubernetes, and container orchestration for deploying machine learning models at scale.
• Experience in data engineering practices and familiarity with cloud storage solutions like Azure Blob Storage and Azure Data Lake.
• Strong understanding of NLP, CV, or GenAI programming, along with the ability to apply these techniques to real-world business problems.
• Experience with Git, Azure DevOps, or similar tools to manage version control and CI/CD pipelines.
• Solid experience in machine learning algorithms, model training, evaluation, and hyperparameter tuning.

Posted 5 days ago

Apply

0 years

0 Lacs

India

Remote


AI and Machine Learning Intern
Company: INLIGHN TECH
Location: Remote (100% Virtual)
Duration: 3 Months
Stipend for Top Interns: ₹15,000
Certificate Provided | Letter of Recommendation | Full-Time Offer Based on Performance

About the Company:
INLIGHN TECH empowers students and fresh graduates with real-world experience through hands-on, project-driven internships. The AI and Machine Learning Internship is crafted to provide practical exposure to building intelligent systems, enabling interns to bridge theoretical knowledge with real-world applications.

Role Overview:
As an AI and Machine Learning Intern, you will work on projects involving data preprocessing, model development, and performance evaluation. This internship will strengthen your skills in algorithm design, model optimization, and deploying AI solutions to solve real-world problems.

Key Responsibilities:
- Collect, clean, and preprocess datasets for training machine learning models
- Implement machine learning algorithms for classification, regression, and clustering
- Develop deep learning models using frameworks like TensorFlow or PyTorch
- Evaluate model performance using metrics such as accuracy, precision, and recall
- Collaborate on AI-driven projects, such as chatbots, recommendation engines, or prediction systems
- Document code, methodologies, and results for reproducibility and knowledge sharing

Qualifications:
- Pursuing or recently completed a degree in Computer Science, Data Science, Artificial Intelligence, or a related field
- Strong foundation in Python and understanding of libraries such as Scikit-learn, NumPy, Pandas, and Matplotlib
- Familiarity with machine learning concepts like supervised and unsupervised learning
- Experience or interest in deep learning frameworks (TensorFlow, Keras, PyTorch)
- Good problem-solving skills and a passion for AI innovation
- Eagerness to learn and contribute to real-world ML applications

Internship Benefits:
- Hands-on experience with real-world AI and ML projects
- Certificate of Internship upon successful completion
- Letter of Recommendation for top performers
- Build a strong portfolio of AI models and machine learning solutions
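The evaluation metrics this listing names (accuracy, precision, recall) reduce to simple counts over predictions. A minimal sketch in plain Python, using invented toy labels purely for illustration:

```python
# Toy illustration of accuracy, precision, and recall.
# The labels below are made up for the example.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
print(accuracy, precision, recall)  # 0.75 0.75 0.75
```

In practice these are usually computed with `sklearn.metrics` (`accuracy_score`, `precision_score`, `recall_score`), which handle edge cases like zero predicted positives.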

Posted 5 days ago

Apply

0 years

0 Lacs

India

Remote


Data Science Intern
Company: INLIGHN TECH
Location: Remote (100% Virtual)
Duration: 3 Months
Stipend for Top Interns: ₹15,000
Certificate Provided | Letter of Recommendation | Full-Time Offer Based on Performance

About the Company:
INLIGHN TECH empowers students and fresh graduates with real-world experience through hands-on, project-driven internships. The Data Science Internship is designed to equip you with the skills required to extract insights, build predictive models, and solve complex problems using data.

Role Overview:
As a Data Science Intern, you will work on real-world datasets to develop machine learning models, perform data wrangling, and generate actionable insights. This internship will help you strengthen your technical foundation in data science while working on projects that have a tangible business impact.

Key Responsibilities:
- Collect, clean, and preprocess data from various sources
- Apply statistical methods and machine learning techniques to extract insights
- Build and evaluate predictive models for classification, regression, or clustering tasks
- Visualize data using libraries like Matplotlib, Seaborn, or tools like Power BI
- Document findings and present results to stakeholders in a clear and concise manner
- Collaborate with team members on data-driven projects and innovations

Qualifications:
- Pursuing or recently completed a degree in Data Science, Computer Science, Mathematics, or a related field
- Proficiency in Python and data science libraries (NumPy, Pandas, Scikit-learn, etc.)
- Understanding of statistical analysis and machine learning algorithms
- Familiarity with SQL and data visualization tools or libraries
- Strong analytical, problem-solving, and critical thinking skills
- Eagerness to learn and apply data science techniques to solve real-world problems

Internship Benefits:
- Hands-on experience with real datasets and end-to-end data science projects
- Certificate of Internship upon successful completion
- Letter of Recommendation for top performers
- Build a strong portfolio of data science projects and models

Posted 5 days ago

Apply

0 years

0 Lacs

India

Remote


Job Title: Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM:
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities:
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements:
🎓 Enrolled in or graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python or R for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.

Stipend & Benefits:
💰 Stipend: ₹7,500 - ₹15,000 (performance-based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.

How to Apply:
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 12th June 2025

Equal Opportunity:
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
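Exploratory data analysis of the kind this internship describes often begins with grouped summary statistics. A minimal hedged sketch with pandas, on an invented toy dataset (the column names are hypothetical, chosen only for illustration):

```python
import pandas as pd

# Invented toy dataset standing in for the "large datasets" such a role works with.
df = pd.DataFrame({
    "channel": ["ads", "ads", "organic", "organic"],
    "signups": [10, 14, 30, 26],
})

# Grouped mean: a typical first EDA step before modeling or visualization.
mean_signups = df.groupby("channel")["signups"].mean()
print(mean_signups.to_dict())  # {'ads': 12.0, 'organic': 28.0}
```

From here, `df.describe()` and a quick plot (Matplotlib or Seaborn) usually follow before any predictive modeling.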

Posted 5 days ago

Apply


0 years

0 Lacs

India

Remote


Machine Learning Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 12th June 2025

About WebBoost Solutions by UM:
WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.

Role Overview:
As a Machine Learning Intern, you’ll work on real-world projects, gaining practical experience in machine learning and data analysis.

Responsibilities:
✅ Design, test, and optimize machine learning models.
✅ Analyze and preprocess datasets.
✅ Develop algorithms and predictive models for various applications.
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn.
✅ Document findings and create reports to present insights.

Requirements:
🎓 Enrolled in or graduate of a relevant program (AI, ML, Data Science, Computer Science, or related field).
📊 Knowledge of machine learning concepts and algorithms.
🐍 Proficiency in Python or R (preferred).
🤝 Strong analytical and teamwork skills.

Benefits:
💰 Stipend: ₹7,500 - ₹15,000 (performance-based).
✔ Practical machine learning experience.
✔ Internship Certificate & Letter of Recommendation.
✔ Build your portfolio with real-world projects.

How to Apply:
📩 Submit your application by 12th June 2025 with the subject: "Machine Learning Intern Application".

Equal Opportunity:
WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.

Posted 5 days ago

Apply

0 years

0 Lacs

India

Remote


CryptoChakra is a leading cryptocurrency analytics and education platform committed to demystifying digital asset markets for traders, investors, and enthusiasts worldwide. By integrating cutting-edge AI-driven predictions, blockchain analytics, and immersive learning modules, we empower users to navigate market volatility with confidence. Our platform combines advanced tools like Python, TensorFlow, and AWS to deliver actionable insights, risk assessments, and educational content that bridge the gap between complex data and strategic decision-making. As a remote-first innovator, we champion accessibility in decentralized finance, fostering a future where crypto literacy is universal.

Position: Fresher Data Scientist Intern
Remote | Full-Time Internship | Compensation: Paid/Unpaid based on suitability

Role Summary:
Join CryptoChakra’s data science team to gain hands-on experience in transforming raw blockchain data into impactful insights. This role is tailored for recent graduates or students eager to apply foundational skills in machine learning, statistical analysis, and data storytelling to real-world crypto challenges.

Key Responsibilities:
- Data Processing: Clean and preprocess blockchain datasets from sources like Etherscan or CoinGecko using Python/R.
- Predictive Modeling: Assist in building and testing ML models for price forecasting or DeFi trend analysis.
- Insight Generation: Create visualizations (Tableau, Matplotlib) to simplify complex trends for educational content.
- Collaboration: Work with engineers and educators to refine analytics tools and tutorials.
- Documentation: Maintain clear records of methodologies and findings for team reviews.

Who We’re Looking For

Technical Skills:
- Foundational knowledge of Python/R for data manipulation (Pandas, NumPy).
- Basic understanding of statistics (regression, hypothesis testing).
- Familiarity with data visualization tools (Tableau, Power BI) or libraries (Seaborn).
- Curiosity about blockchain technology, DeFi, or crypto markets.

Soft Skills:
- Eagerness to learn and adapt in a fast-paced remote environment.
- Strong problem-solving mindset and attention to detail.
- Ability to communicate technical concepts clearly.

Preferred (Not Required):
- Academic projects involving data analysis or machine learning.
- Exposure to SQL, AWS, or big data tools.
- Pursuing a degree in Data Science, Computer Science, Statistics, or related fields.

What We Offer:
- Mentorship: Guidance from experienced data scientists and blockchain experts.
- Skill Development: Training in real-world tools like TensorFlow and Tableau.
- Portfolio Projects: Contribute to live projects featured on CryptoChakra’s platform.
- Flexibility: Remote work with adaptable hours for students.

Posted 5 days ago

Apply

15.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


We are looking for a passionate and curious AI/ML Engineer (Fresher) to join our growing engineering team. This is a unique opportunity to work on real-world machine learning applications and contribute to building cutting-edge AI solutions.

Your Responsibilities:
- Assist in designing, developing, and training machine learning models using structured and unstructured data
- Collect, clean, and preprocess large datasets for model building
- Perform exploratory data analysis and statistical modeling
- Collaborate with senior data scientists and engineers to build scalable AI systems
- Run experiments, tune hyperparameters, and evaluate model performance using industry-standard metrics
- Document models, processes, and experiment results clearly and consistently
- Support integrating AI/ML models into production environments
- Stay updated with the latest trends and techniques in machine learning, deep learning, and AI
- Participate in code reviews, sprint planning, and product discussions
- Follow best practices in software development, version control, and model reproducibility

Skill Sets / Experience We Require:
- Strong understanding of machine learning fundamentals (regression, classification, clustering, etc.)
- Hands-on experience with Python and ML libraries such as scikit-learn, pandas, NumPy
- Basic familiarity with deep learning frameworks like TensorFlow, PyTorch, or Keras
- Knowledge of data preprocessing, feature engineering, and model validation techniques
- Understanding of probability, statistics, and linear algebra
- Familiarity with tools like Jupyter, Git, and cloud-based notebooks
- Problem-solving mindset and eagerness to learn
- Good communication skills and the ability to work in a team
- Internship/project experience in AI/ML is a plus

Education:
- B.Tech / M.Tech / M.Sc in Computer Science, Data Science, Artificial Intelligence, or related field
- Relevant certifications in AI/ML (Coursera, edX, etc.) are a plus

About Us:
TechAhead is a global digital transformation company with a strong presence in the USA and India. We specialize in AI-first product design thinking and bespoke development solutions. With over 15 years of proven expertise, we have partnered with Fortune 500 companies and leading global brands to drive digital innovation and deliver excellence. At TechAhead, we are committed to continuous learning, growth, and crafting tailored solutions that meet the unique needs of our clients. Join us to shape the future of digital innovation worldwide and drive impactful results with cutting-edge AI tools and strategies!

Posted 5 days ago

Apply

4.0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: Senior AI Engineer (Tech Lead)

Role Overview:
We are seeking a highly skilled and experienced Senior AI Engineer with a minimum of 4 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role.

Responsibilities:
- Contribute to the design and implementation of state-of-the-art AI solutions.
- Lead a team of 4-6 developers.
- Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI.
- Collaborate with stakeholders to identify business opportunities and define AI project goals.
- Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
- Utilize generative AI techniques, such as LLMs and agentic frameworks, to develop innovative solutions for enterprise industry use cases.
- Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
- Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment.
- Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
- Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs.
- Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly.
- Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency.
- Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases.
- Ensure compliance with data privacy, security, and ethical considerations in AI applications.
- Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum 4 years of experience in Python, Data Science, Machine Learning, OCR, and document intelligence.
- Experience in leading a team of 4-6 developers.
- Demonstrated ability to conceptualize technical solutions, apply accurate estimation techniques, and effectively engage with customer stakeholders.
- In-depth knowledge of machine learning, deep learning, and generative AI techniques.
- Proficiency in programming languages such as Python and R, and frameworks like TensorFlow or PyTorch.
- Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
- Familiarity with computer vision techniques for image recognition, object detection, or image generation.
- Strong knowledge of Python frameworks such as Django, Flask, or FastAPI.
- Experience with RESTful API design and development.
- Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment.
- Expertise in data engineering, including data curation, cleaning, and preprocessing.
- Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems.
- Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models.
- Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
- Understanding of data privacy, security, and ethical considerations in AI applications.
- Track record of driving innovation and staying updated with the latest AI research and advancements.

Good to Have Skills:
- Understanding of agentic AI concepts and frameworks.
- Proficiency in designing or interacting with agent-based AI architectures.
- Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
- Utilize optimization tools and techniques, including MIP (Mixed Integer Programming).
- Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models.
- Implement CI/CD pipelines for streamlined model deployment and scaling processes.
- Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
- Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation.
- Implement monitoring and logging tools to ensure AI model performance and reliability.
- Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 5 days ago

Apply

4.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description: Senior AI Engineer (Tech Lead) Role Overview: We are seeking a highly skilled and experienced Senior AI Engineers with a minimum of 4 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Leading a team of 4-6 developers Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, Agentic Framework to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. 
Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Minimum 4 years of experience in Python, Data Science, Machine Learning, OCR and document intelligence Experience in leading a team of 4-6 developers Demonstrated ability to conceptualize technical solutions, apply accurate estimation techniques, and effectively engage with customer stakeholders In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. Familiarity with computer vision techniques for image recognition, object detection, or image generation. Strong knowledge of Python frameworks such as Django, Flask, or FastAPI. 
Experience with RESTful API design and development. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Understanding of agentic AI concepts and frameworks Proficiency in designing or interacting with agent-based AI architectures Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. 
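The posting's similarity-search and vector-database responsibilities can be sketched with a minimal, dependency-light example: brute-force cosine similarity over an in-memory embedding matrix. A vector store such as Redis or FAISS replaces this loop with an index; the embeddings and corpus below are made up for illustration.

```python
import numpy as np

def top_k_similar(query: np.ndarray, corpus: np.ndarray, k: int = 2) -> list[int]:
    """Return indices of the k corpus rows most cosine-similar to the query."""
    # Normalise rows so a plain dot product equals cosine similarity.
    q = query / np.linalg.norm(query)
    c = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    scores = c @ q
    # Highest-scoring indices first.
    return np.argsort(scores)[::-1][:k].tolist()

# Toy 4-document corpus with 3-dimensional "embeddings" (hypothetical values).
corpus = np.array([
    [1.0, 0.0, 0.0],
    [0.9, 0.1, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])
print(top_k_similar(np.array([1.0, 0.05, 0.0]), corpus, k=2))  # → [0, 1]
```

In production the same idea sits behind a retrieval-augmented generation (RAG) step: the query vector comes from an embedding model, and the returned documents are fed to the LLM as context.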
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 5 days ago

Apply

2.0 - 3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


1. Understanding of English grammar – Parts of Speech, Subject-Verb Agreement rules and Articles.
2. Conducting Training Needs Analysis and Training Needs Identification.
3. Quartile Management.
4. Conducting Root Cause Analysis – 5 Why technique.

Desired Profile -
•At least 2-3 years of relevant experience as a voice coach or trainer.
•Must have conducted email, call or chat audits, and have knowledge of different feedback mechanisms (Sandwich model, GROW model, etc.)
•Must have some background in customer service in order to know about culture sensitization, first contact resolution, interpersonal skills, etc.
•Must speak well with no obvious sound/pronunciation errors, particularly s/sh errors, v/w errors.
•Check if the person attained a coach/trainer position via IJP alone, or underwent a proper Train the Trainer program to be certified.

For CL10:
1. Understanding of English grammar – Subject-Verb Agreement rules, Tenses and Transliterations.
2. Different feedback mechanisms, particularly the GROW model.
3. Training Needs Identification and Analysis, Root Cause Analysis and Quartile Management.
4. Sounds in English – Vowel sounds (Monophthongs and Diphthongs), Consonant sounds (Fricatives, Affricates, Plosives, Nasal).
5. Manner of articulation and Place of articulation of English sounds.
6. How would you train new hires on English sounds? How would you correct sound corruptions? What are some common sound corruptions in people from different regions of India, and how would you correct them? (South – v/w, East – v/b, z/j, s/sh, North – incorrect syllable stress, West – s/sh)

Desired Profile -
•At least 4-5 years of relevant experience as a preprocess trainer, having conducted New Hire trainings on grammar and voice & accent.
•Must have conducted email, call or chat audits, and have knowledge of different feedback mechanisms (Sandwich model, GROW model, etc.)
•Must have strong knowledge of customer service, culture sensitization, first contact resolution, interpersonal skills, etc.
•No observable errors in grammar and pronunciation.
•Strong knowledge of English sounds (20 vowel and 24 consonant sounds), ways to correct sound corruptions, and knowledge of major sound corruptions in people from different regions.

Posted 5 days ago

Apply

4.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description: Senior AI Engineer (Tech Lead) Role Overview: We are seeking a highly skilled and experienced Senior AI Engineer with a minimum of 4 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Lead a team of 4-6 developers. Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs and agentic frameworks, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. 
Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Minimum 4 years of experience in Python, Data Science, Machine Learning, OCR, and document intelligence. Experience in leading a team of 4-6 developers. Demonstrated ability to conceptualize technical solutions, apply accurate estimation techniques, and effectively engage with customer stakeholders. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. Familiarity with computer vision techniques for image recognition, object detection, or image generation. Strong knowledge of Python frameworks such as Django, Flask, or FastAPI. 
Experience with RESTful API design and development. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Understanding of agentic AI concepts and frameworks Proficiency in designing or interacting with agent-based AI architectures Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. 
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 5 days ago

Apply

1.0 - 2.0 years

0 Lacs

Delhi

On-site

NASSCOM Campus, Sector 126, Noida, NCR About the Role We are seeking a dynamic and technically proficient AI/ML Engineer to support our AI/ML R&D initiatives in cybersecurity and take ownership of TechSagar.in — a knowledge repository for India's emerging technology capabilities. The ideal candidate will possess hands-on experience in generative AI, emerging technologies, and product management. This is a hybrid role combining deep technical development with stakeholder engagement and platform evangelism. Key Responsibilities AI/ML & Cybersecurity Innovation Support R&D efforts to prototype generative AI models for real-time threat detection and cybersecurity. Design, develop, and deploy machine learning models tailored to cyber threat intelligence and anomaly detection. Research and implement novel AI approaches, including multi-agent and reasoning-based systems. Develop distributed security monitoring frameworks using tools like AutoGen and CrewAI. Build LLM-powered threat analysis tools using LangChain and LlamaIndex, and integrate with enterprise infrastructure. Apply MLOps best practices for model deployment, performance monitoring, and continuous integration. Optimize vector stores (Qdrant, FAISS, Pinecone, etc.) for RAG-based systems. Create synthetic datasets for AI training and model evaluation. Use Pydantic for data validation within AI pipelines. TechSagar Product Responsibilities Manage and evolve the TechSagar.in platform—enhancing features, ensuring data integrity, and driving usage. Liaise with tech partners, government bodies, startups, and academia to enrich platform content. Strategize and execute industry engagement plans to market TechSagar and establish its relevance. Represent TechSagar in external forums, conferences, and industry meetings. Collect user feedback, define product roadmap, and ensure alignment with AI/ML advancements. 
Required Qualifications Bachelor’s or Master’s degree in Computer Science, Artificial Intelligence, or a related field. 1–2 years of hands-on experience in AI/ML model development and deployment. Strong programming expertise in Python. Familiarity with LangChain, LlamaIndex, and large language models (LLMs). Experience in applying AI to cybersecurity or vulnerability analysis. Good understanding of machine learning algorithms, data pipelines, and model evaluation. Excellent communication skills for technical and stakeholder engagement. Preferred Skills Exposure to generative AI, LLMs, and chain-of-thought reasoning techniques. Working knowledge of MLOps tools such as MLflow and Docker. Familiarity with FastAPI or Flask for API development. Ability to preprocess, clean, and analyze large datasets efficiently. Experience in integrating AI tools with legacy or existing security systems. Technologies & Frameworks LLM Frameworks: LangChain, LlamaIndex Multi-agent Systems: AutoGen, CrewAI Vector Databases: FAISS, Pinecone, Qdrant, Elasticsearch, AstraDB MLOps Tools: MLflow, Docker Programming & APIs: Python, FastAPI/Flask Data Validation: Pydantic Why Join Us? Be at the forefront of AI innovation in cybersecurity and national technology initiatives. Lead and shape a strategic tech product (TechSagar) with national impact. Collaborate with thought leaders in the AI, cybersecurity, and emerging tech ecosystem.
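The anomaly-detection responsibility above can be illustrated with a deliberately simple baseline: z-score flagging over a univariate signal. The role's real stack would use the ML tooling the posting lists; the per-minute login counts here are hypothetical.

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Flag indices whose z-score exceeds the threshold.

    A naive statistical baseline for anomaly detection, not a
    production threat-detection model."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # constant signal: nothing can be anomalous
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

# Hypothetical per-minute login counts with one obvious spike at the end.
logins = [12, 14, 11, 13, 12, 15, 13, 12, 14, 95]
print(zscore_anomalies(logins, threshold=2.5))  # → [9]
```

A baseline like this is useful mainly as a sanity check against which learned detectors (isolation forests, autoencoders, etc.) are compared.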

Posted 6 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Designation: ML/MLOps Engineer Location: Noida (Sector-132) Key Responsibilities: • Model Development & Algorithm Optimization: Design, implement, and optimize ML models and algorithms using libraries and frameworks such as TensorFlow, PyTorch, and scikit-learn to solve complex business problems. • Training & Evaluation: Train and evaluate models using historical data, ensuring accuracy, scalability, and efficiency while fine-tuning hyperparameters. • Data Preprocessing & Cleaning: Clean, preprocess, and transform raw data into a suitable format for model training and evaluation, applying industry best practices to ensure data quality. • Feature Engineering: Conduct feature engineering to extract meaningful features from data that enhance model performance and improve predictive capabilities. • Model Deployment & Pipelines: Build end-to-end pipelines and workflows for deploying machine learning models into production environments, leveraging Azure Machine Learning and containerization technologies like Docker and Kubernetes. • Production Deployment: Develop and deploy machine learning models to production environments, ensuring scalability and reliability using tools such as Azure Kubernetes Service (AKS). • End-to-End ML Lifecycle Automation: Automate the end-to-end machine learning lifecycle, including data ingestion, model training, deployment, and monitoring, ensuring seamless operations and faster model iteration. • Performance Optimization: Monitor and improve inference speed and latency to meet real-time processing requirements, ensuring efficient and scalable solutions. • NLP, CV, GenAI Programming: Work on machine learning projects involving Natural Language Processing (NLP), Computer Vision (CV), and Generative AI (GenAI), applying state-of-the-art techniques and frameworks to improve model performance. 
• Collaboration & CI/CD Integration: Collaborate with data scientists and engineers to integrate ML models into production workflows, building and maintaining continuous integration/continuous deployment (CI/CD) pipelines using tools like Azure DevOps, Git, and Jenkins. • Monitoring & Optimization: Continuously monitor the performance of deployed models, adjusting parameters and optimizing algorithms to improve accuracy and efficiency. • Security & Compliance: Ensure all machine learning models and processes adhere to industry security standards and compliance protocols, such as GDPR and HIPAA. • Documentation & Reporting: Document machine learning processes, models, and results to ensure reproducibility and effective communication with stakeholders. Required Qualifications: • Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field. • 3+ years of experience in machine learning operations (MLOps), cloud engineering, or similar roles. • Proficiency in Python, with hands-on experience using libraries such as TensorFlow, PyTorch, scikit-learn, Pandas, and NumPy. • Strong experience with Azure Machine Learning services, including Azure ML Studio, Azure Databricks, and Azure Kubernetes Service (AKS). • Knowledge and experience in building end-to-end ML pipelines, deploying models, and automating the machine learning lifecycle. • Expertise in Docker, Kubernetes, and container orchestration for deploying machine learning models at scale. • Experience in data engineering practices and familiarity with cloud storage solutions like Azure Blob Storage and Azure Data Lake. • Strong understanding of NLP, CV, or GenAI programming, along with the ability to apply these techniques to real-world business problems. • Experience with Git, Azure DevOps, or similar tools to manage version control and CI/CD pipelines. 
• Solid experience in machine learning algorithms, model training, evaluation, and hyperparameter tuning.
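The pipeline-building and hyperparameter-tuning duties above can be sketched with scikit-learn, one of the libraries the posting names. This is a minimal illustration on synthetic data, not the employer's actual pipeline; the parameter grid and data shape are chosen arbitrarily.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for real training data.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Preprocessing and the model live in one Pipeline object, so exactly the
# same transformations are applied at training time and at inference time.
pipe = Pipeline([("scale", StandardScaler()),
                 ("clf", LogisticRegression(max_iter=1000))])

# Hyperparameter tuning over the regularisation strength with 5-fold CV.
search = GridSearchCV(pipe, {"clf__C": [0.01, 0.1, 1.0, 10.0]}, cv=5)
search.fit(X_train, y_train)
print(round(search.score(X_test, y_test), 2))
```

Serialising `search.best_estimator_` and wrapping it in a container image is then the usual handoff point to the Docker/AKS deployment steps the posting describes.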

Posted 6 days ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Data Scientist – Roles And Responsibilities We are seeking a skilled Data Scientist to join our team and leverage data to create actionable insights and innovative solutions. The ideal candidate will have strong analytical skills, expertise in statistical modeling, and proficiency in programming and machine learning techniques. You will work closely with cross-functional teams to identify business opportunities, optimize processes, and develop data-driven strategies. Key Responsibilities Data Collection & Preparation: Gather, clean, and preprocess large datasets from various sources to ensure data quality and usability. Exploratory Data Analysis: Perform in-depth analysis to identify trends, patterns, and correlations that inform business decisions. Model Development: Design, build, and deploy machine learning models and statistical algorithms to solve complex problems, such as predictive analytics, classification, or recommendation systems. Data Visualization: Create compelling visualizations and dashboards to communicate insights to stakeholders using tools like Tableau, Power BI, or Python libraries (e.g., Matplotlib, Seaborn). Collaboration: Work with team leads, engineers, and business leaders to understand requirements, define key metrics, and translate insights into actionable strategies. Experimentation: Design and analyze A/B tests or other experiments to evaluate the impact of business initiatives. Automation: Develop pipelines and scripts to automate data processing and model deployment. Keep up with advancements in data science, machine learning, and industry trends to implement cutting-edge techniques. Preferred Qualifications Experience with deep learning, natural language processing (NLP), or computer vision. Knowledge of software engineering practices, such as version control (Git) and CI/CD pipelines. Contributions to open-source projects or publications in data science. 
Technical Skills Proficiency in programming languages like Python. Experience with SQL for querying and managing databases. Knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn). Familiarity with big data tools (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is a plus. Experience with data visualization tools (e.g., Tableau, Power BI, or Plotly). Strong understanding of statistics, probability, and experimental design.
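The "design and analyze A/B tests" responsibility can be made concrete with a standard two-proportion z-test, written here with only the standard library; the conversion counts are invented for illustration.

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 200/2400 conversions in control, 260/2400 in variant.
z, p = two_proportion_ztest(conv_a=200, n_a=2400, conv_b=260, n_b=2400)
print(round(z, 2), round(p, 4))  # z ≈ 2.94, p < 0.01: significant at α = 0.05
```

In practice the test statistic is only part of the analysis; minimum detectable effect and sample-size planning are decided before the experiment runs.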

Posted 6 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Data Scientist – Roles And Responsibilities We are seeking a skilled Data Scientist to join our team and leverage data to create actionable insights and innovative solutions. The ideal candidate will have strong analytical skills, expertise in statistical modeling, and proficiency in programming and machine learning techniques. You will work closely with cross-functional teams to identify business opportunities, optimize processes, and develop data-driven strategies. Key Responsibilities Data Collection & Preparation: Gather, clean, and preprocess large datasets from various sources to ensure data quality and usability. Exploratory Data Analysis: Perform in-depth analysis to identify trends, patterns, and correlations that inform business decisions. Model Development: Design, build, and deploy machine learning models and statistical algorithms to solve complex problems, such as predictive analytics, classification, or recommendation systems. Data Visualization: Create compelling visualizations and dashboards to communicate insights to stakeholders using tools like Tableau, Power BI, or Python libraries (e.g., Matplotlib, Seaborn). Collaboration: Work with team leads, engineers, and business leaders to understand requirements, define key metrics, and translate insights into actionable strategies. Experimentation: Design and analyze A/B tests or other experiments to evaluate the impact of business initiatives. Automation: Develop pipelines and scripts to automate data processing and model deployment. Keep up with advancements in data science, machine learning, and industry trends to implement cutting-edge techniques. Preferred Qualifications Experience with deep learning, natural language processing (NLP), or computer vision. Knowledge of software engineering practices, such as version control (Git) and CI/CD pipelines. Contributions to open-source projects or publications in data science. 
Technical Skills Proficiency in programming languages like Python. Experience with SQL for querying and managing databases. Knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn). Familiarity with big data tools (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is a plus. Experience with data visualization tools (e.g., Tableau, Power BI, or Plotly). Strong understanding of statistics, probability, and experimental design.

Posted 6 days ago

Apply

0 years

0 Lacs

India

Remote


Data Science Intern (Paid) Company: Unified Mentor Location: Remote Duration: 3 months Application Deadline: 12th June 2025 Opportunity: Full-time role based on performance + Internship Certificate About Unified Mentor Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers. Responsibilities Collect, preprocess, and analyze large datasets Develop predictive models and machine learning algorithms Perform exploratory data analysis (EDA) to extract insights Create data visualizations and dashboards for effective communication Collaborate with cross-functional teams to deliver data-driven solutions Requirements Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field Proficiency in Python or R for data analysis and modeling Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred) Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib Strong analytical and problem-solving skills Excellent communication and teamwork abilities Stipend & Benefits Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid) Hands-on experience in data science projects Certificate of Internship & Letter of Recommendation Opportunity to build a strong portfolio of data science models and applications Potential for full-time employment based on performance How to Apply Submit your resume and a cover letter with the subject line "Data Science Intern Application." Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.
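The exploratory-data-analysis responsibility can be illustrated with a dependency-free sketch of the summary statistics EDA typically starts from, a toy analogue of pandas' describe(); the order values are hypothetical.

```python
import statistics

def describe(column):
    """Basic summary statistics for one numeric column (toy version)."""
    return {
        "count": len(column),
        "mean": round(statistics.fmean(column), 2),
        "median": statistics.median(column),
        "stdev": round(statistics.stdev(column), 2),
        "min": min(column),
        "max": max(column),
    }

# Hypothetical order values; the outlier (500) pulls the mean far from
# the median, which is exactly the kind of signal EDA is meant to surface.
orders = [120, 95, 130, 110, 500, 105, 98]
print(describe(orders))
```

Comparing mean against median, as here, is a quick skew check before any modeling begins.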

Posted 6 days ago

Apply

0 years

0 Lacs

India

Remote


Job Title: Data Science Intern (Paid) Company: WebBoost Solutions by UM Location: Remote Duration: 3 months Opportunity: Full-time based on performance, with a Certificate of Internship About WebBoost Solutions by UM WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career. Responsibilities ✅ Collect, preprocess, and analyze large datasets. ✅ Develop predictive models and machine learning algorithms. ✅ Perform exploratory data analysis (EDA) to extract meaningful insights. ✅ Create data visualizations and dashboards for effective communication of findings. ✅ Collaborate with cross-functional teams to deliver data-driven solutions. Requirements 🎓 Enrolled in or graduate of a program in Data Science, Computer Science, Statistics, or a related field. 🐍 Proficiency in Python or R for data analysis and modeling. 🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred). 📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib). 🧐 Strong analytical and problem-solving skills. 🗣 Excellent communication and teamwork abilities. Stipend & Benefits 💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based). ✔ Hands-on experience in data science projects. ✔ Certificate of Internship & Letter of Recommendation. ✔ Opportunity to build a strong portfolio of data science models and applications. ✔ Potential for full-time employment based on performance. How to Apply 📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application." 📅 Deadline: 12th June 2025 Equal Opportunity WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.

Posted 6 days ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Job Description The Data Scientist – Operations will play a key role in transforming operational processes through advanced analytics and data-driven decision-making. This role focuses on optimizing supply chain, manufacturing, and overall operations by developing predictive models, streamlining workflows, and uncovering insights to enhance efficiency and reduce costs. Key Responsibilities Advanced Analytics and Data Modeling Develop predictive models for demand forecasting, inventory optimization, and supply chain resilience. Leverage machine learning techniques to optimize production schedules, logistics, and procurement. Build algorithms to predict and mitigate risks in operational processes. Operational Efficiency Analyze manufacturing and supply chain data to identify bottlenecks and recommend process improvements. Implement solutions for waste reduction, cost optimization, and improved throughput. Conduct root cause analysis for operational inefficiencies and develop actionable insights. Collaboration with Stakeholders Partner with operations, supply chain, and procurement teams to understand analytical needs and deliver insights. Collaborate with IT and data engineering teams to ensure data availability and accuracy. Present findings and recommendations to non-technical stakeholders in an accessible manner. Data Management and Tools Work with large datasets to clean, preprocess, and analyze data. Location(s) Ahmedabad - Venus Stratum GCC Kraft Heinz is an Equal Opportunity Employer – Underrepresented Ethnic Minority Groups/Women/Veterans/Individuals with Disabilities/Sexual Orientation/Gender Identity and other protected classes.
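Demand-forecasting work like that described above usually begins with a naive baseline that any trained model must beat. A minimal sketch, assuming weekly unit-demand history (the numbers are invented):

```python
def moving_average_forecast(history, window=3):
    """Naive demand forecast: the average of the last `window` observations.

    A baseline to beat, not a production forecasting model."""
    if len(history) < window:
        raise ValueError("not enough history for the requested window")
    return sum(history[-window:]) / window

# Hypothetical weekly unit demand for one SKU.
demand = [100, 104, 98, 103, 107, 105]
print(moving_average_forecast(demand, window=3))  # → 105.0
```

Reporting a model's error relative to such a baseline (e.g. as a skill score) makes forecast improvements legible to the operations stakeholders the role partners with.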

Posted 6 days ago

Apply

0 years

0 Lacs

India

Remote


AI and Machine Learning Intern Company: INLIGHN TECH Location: Remote (100% Virtual) Duration: 3 Months Stipend for Top Interns: ₹15,000 Certificate Provided | Letter of Recommendation | Full-Time Offer Based on Performance About the Company: INLIGHN TECH empowers students and fresh graduates with real-world experience through hands-on, project-driven internships. The AI and Machine Learning Internship is crafted to provide practical exposure to building intelligent systems, enabling interns to bridge theoretical knowledge with real-world applications. Role Overview: As an AI and Machine Learning Intern, you will work on projects involving data preprocessing, model development, and performance evaluation. This internship will strengthen your skills in algorithm design, model optimization, and deploying AI solutions to solve real-world problems. Key Responsibilities: Collect, clean, and preprocess datasets for training machine learning models Implement machine learning algorithms for classification, regression, and clustering Develop deep learning models using frameworks like TensorFlow or PyTorch Evaluate model performance using metrics such as accuracy, precision, and recall Collaborate on AI-driven projects, such as chatbots, recommendation engines, or prediction systems Document code, methodologies, and results for reproducibility and knowledge sharing Qualifications: Pursuing or recently completed a degree in Computer Science, Data Science, Artificial Intelligence, or a related field Strong foundation in Python and understanding of libraries such as Scikit-learn, NumPy, Pandas, and Matplotlib Familiarity with machine learning concepts like supervised and unsupervised learning Experience or interest in deep learning frameworks (TensorFlow, Keras, PyTorch) Good problem-solving skills and a passion for AI innovation Eagerness to learn and contribute to real-world ML applications Internship Benefits: Hands-on experience with real-world AI and ML projects 
Certificate of Internship upon successful completion Letter of Recommendation for top performers Build a strong portfolio of AI models and machine learning solutions
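The model-evaluation responsibility above (accuracy, precision, recall) can be grounded with a small, dependency-free computation of precision and recall from label vectors; the labels are made up for illustration.

```python
def precision_recall(y_true, y_pred, positive=1):
    """Precision and recall for a binary classifier (standard definitions)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    # Guard against division by zero when a class is never predicted/present.
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Hypothetical ground truth and predictions.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
print(precision_recall(y_true, y_pred))  # → (0.75, 0.75)
```

On imbalanced data these two metrics matter more than accuracy, which is why postings list all three.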

Posted 6 days ago

Apply

0 years

0 Lacs

India

Remote


Machine Learning Intern (Paid) Company: WebBoost Solutions by UM Location: Remote Duration: 3 months Opportunity: Full-time based on performance, with a Certificate of Internship Application Deadline: 12th June 2025 About WebBoost Solutions by UM WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science. Role Overview As a Machine Learning Intern, you’ll work on real-world projects, gaining practical experience in machine learning and data analysis. Responsibilities ✅ Design, test, and optimize machine learning models. ✅ Analyze and preprocess datasets. ✅ Develop algorithms and predictive models for various applications. ✅ Use tools like TensorFlow, PyTorch, and Scikit-learn. ✅ Document findings and create reports to present insights. Requirements 🎓 Enrolled in or graduate of a relevant program (AI, ML, Data Science, Computer Science, or related field). 📊 Knowledge of machine learning concepts and algorithms. 🐍 Proficiency in Python or R (preferred). 🤝 Strong analytical and teamwork skills. Benefits 💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid) ✔ Practical machine learning experience. ✔ Internship Certificate & Letter of Recommendation. ✔ Build your portfolio with real-world projects. How to Apply 📩 Submit your application by 12th June 2025 with the subject: "Machine Learning Intern Application". Equal Opportunity WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.

Posted 6 days ago


0 years

0 Lacs

India

Remote


Data Science Intern

Company: INLIGHN TECH
Location: Remote (100% Virtual)
Duration: 3 Months
Stipend for Top Interns: ₹15,000
Certificate Provided | Letter of Recommendation | Full-Time Offer Based on Performance

About the Company:
INLIGHN TECH empowers students and fresh graduates with real-world experience through hands-on, project-driven internships. The Data Science Internship is designed to equip you with the skills required to extract insights, build predictive models, and solve complex problems using data.

Role Overview:
As a Data Science Intern, you will work on real-world datasets to develop machine learning models, perform data wrangling, and generate actionable insights. This internship will help you strengthen your technical foundation in data science while working on projects with tangible business impact.

Key Responsibilities:
Collect, clean, and preprocess data from various sources
Apply statistical methods and machine learning techniques to extract insights
Build and evaluate predictive models for classification, regression, or clustering tasks
Visualize data using libraries like Matplotlib and Seaborn, or tools like Power BI
Document findings and present results to stakeholders in a clear and concise manner
Collaborate with team members on data-driven projects and innovations

Qualifications:
Pursuing or recently completed a degree in Data Science, Computer Science, Mathematics, or a related field
Proficiency in Python and data science libraries (NumPy, Pandas, Scikit-learn, etc.)
Understanding of statistical analysis and machine learning algorithms
Familiarity with SQL and data visualization tools or libraries
Strong analytical, problem-solving, and critical thinking skills
Eagerness to learn and apply data science techniques to solve real-world problems

Internship Benefits:
Hands-on experience with real datasets and end-to-end data science projects
Certificate of Internship upon successful completion
Letter of Recommendation for top performers
Build a strong portfolio of data science projects and models
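The collect-clean-preprocess responsibility above can be illustrated with a short pandas sketch. The DataFrame here is invented for the example: it drops duplicate rows, imputes missing numeric values with the median, and flags missing categories.

```python
# Minimal data-cleaning sketch with pandas (invented sample data):
# deduplicate, impute missing ages with the median, mark missing cities.
import pandas as pd

df = pd.DataFrame({
    "age":  [25, 32, None, 32, 41],
    "city": ["Delhi", "Mumbai", "Delhi", "Mumbai", None],
})

df = df.drop_duplicates()                         # remove exact duplicate rows
df["age"] = df["age"].fillna(df["age"].median())  # impute missing ages with the median
df["city"] = df["city"].fillna("Unknown")         # keep missing categories visible

print(df)
```

Median imputation and an explicit "Unknown" category are simple defaults; a real project would choose the imputation strategy per column after exploratory analysis.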

Posted 6 days ago


5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Role: Data Scientist
Location: Noida (Hybrid)

Required Skills and Qualifications
Bachelor's/Master's/Ph.D. degree in Data Science, Computer Science, Statistics, Mathematics, or a related field
5+ years of prior experience in a data science role or a related field preferred
Proficiency in programming languages such as Python or R for data analysis and modeling; Python strongly desired
Proficiency with data mining, mathematics, and statistical analysis using Python and R
Strong understanding of machine learning techniques, algorithms, and their applications
Advanced experience in pattern recognition and predictive modeling
Experience with classical statistical techniques such as regression and multivariate analysis
Experience with time-series forecasting and modeling
Experience with Excel, PowerPoint, Tableau, and SQL
Experience with SQL and data warehousing (e.g., GCP, Hadoop, Teradata, Oracle, DB2)
Experience with BI, ETL, and reporting/visualization/dashboard tools
Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is a plus
Exposure to big-data-based analytical solutions and hands-on experience with data lakes, data cleansing, and data management
Strong problem-solving skills and the ability to translate business questions into data science tasks
Ability to derive insights from data, build visualizations, and tell a story with data

Responsibilities
Identify relevant data sources and sets to mine for client business needs, and collect large structured and unstructured datasets and variables
Collaborate with domain experts, engineers, and stakeholders to understand business requirements and translate them into analytical solutions
Communicate complex findings and insights to both technical and non-technical audiences through clear visualizations, presentations, and reports
Clean, preprocess, and transform data to ensure its quality and suitability for analysis
Perform data and error analysis to improve models using Python and R
Conduct exploratory data analysis to identify patterns, trends, and anomalies, and translate them into actionable recommendations with clear objectives in mind
Continuously evaluate model performance and make improvements based on real-world outcomes
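The classical regression technique named in the qualifications above can be sketched in a few lines. This is a toy illustration with invented, noise-free data, fitting ordinary least squares with scikit-learn.

```python
# Toy ordinary-least-squares regression sketch (invented data):
# the points lie exactly on y = 2x + 1, so the fit recovers those
# coefficients and extrapolates cleanly.
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0]])  # single feature, one column per feature
y = np.array([3.0, 5.0, 7.0, 9.0])          # targets: y = 2x + 1

model = LinearRegression().fit(X, y)
print(model.coef_[0], model.intercept_)      # slope and intercept of the fitted line
print(model.predict([[5.0]])[0])             # prediction for an unseen x = 5
```

Real data would be noisy, so a data scientist would also report fit diagnostics (e.g., R², residual plots) rather than trusting the coefficients alone.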

Posted 6 days ago
