0 years
0 Lacs
India
Remote
Job Title: Machine Learning Developer
Company: Lead India
Location: Remote
Job Type: Full-Time
Salary: ₹3.5 LPA

About Lead India: Lead India is a forward-thinking organization focused on creating social impact through technology, innovation, and data-driven solutions. We believe in empowering individuals and building platforms that make governance more participatory and transparent.

Job Summary: We are looking for a Machine Learning Developer to join our remote team. You will be responsible for building and deploying predictive models, working with large datasets, and delivering intelligent solutions that enhance our platform's capabilities and user experience.

Key Responsibilities:
- Design and implement machine learning models for classification, regression, and clustering tasks
- Collect, clean, and preprocess data from various sources
- Evaluate model performance using appropriate metrics
- Deploy machine learning models into production environments
- Collaborate with data engineers, analysts, and software developers
- Continuously research and implement state-of-the-art ML techniques
- Maintain documentation for models, experiments, and code

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Data Science, or a related field (or equivalent practical experience)
- Solid understanding of machine learning algorithms and statistical techniques
- Hands-on experience with Python libraries such as scikit-learn, pandas, NumPy, and matplotlib
- Familiarity with Jupyter notebooks and experimentation workflows
- Experience working with datasets using tools like SQL or Excel
- Strong problem-solving skills and attention to detail
- Ability to work independently in a remote environment

Nice to Have:
- Experience with deep learning frameworks like TensorFlow or PyTorch
- Exposure to cloud-based ML platforms (e.g., AWS SageMaker, Google Vertex AI)
- Understanding of model deployment using Flask, FastAPI, or Docker
- Knowledge of natural language processing or computer vision

What We Offer:
- Fixed annual salary of ₹3.5 LPA
- 100% remote work and flexible hours
- Opportunity to work on impactful, mission-driven projects using real-world data
- Supportive and collaborative environment for continuous learning and innovation
Posted 5 days ago
0.0 - 3.0 years
0 Lacs
Kochi, Kerala
On-site
We are a fast-growing technology startup based in Cochin, Kerala, focused on building innovative AI-powered software solutions for the healthcare, retail, and hospitality industries. We're looking for a passionate AI Developer / Engineer to join our team and help us take our products to the next level.

Key Responsibilities:
- Design, develop, and deploy AI/ML models for real-world applications.
- Build and optimize NLP, Computer Vision, or Predictive Analytics modules.
- Preprocess data and build datasets for training and inference.
- Integrate AI models into production-ready software using Python and REST APIs.
- Collaborate with software developers, product managers, and domain experts.

Required Skills:
- 2–3 years of experience in AI/ML development.
- Proficient in Python and frameworks like TensorFlow, PyTorch, and Scikit-learn.
- Experience with NLP, OCR, or Computer Vision projects.
- Solid understanding of data preprocessing, model training, and evaluation metrics.
- Ability to work with APIs, databases (SQL/NoSQL), and cloud tools.
- Experience with version control systems (e.g., Git).

Preferred Skills (Bonus):
- Experience with AI in healthcare or OCR for documents/prescriptions.
- Knowledge of LLMs (e.g., GPT, LLaMA) or Generative AI.
- Deployment experience using Docker, Kubernetes, or AWS/GCP/Azure.
- Familiarity with Flutter, Node.js, or full-stack environments.

Job Type: Full-time
Pay: From ₹24,000.00 per month
Work Location: In person
Expected Start Date: 19/08/2025
Posted 5 days ago
0.6 - 5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company: Omnipresent Robot Technologies Pvt. Ltd.
Location: Noida Sector-80 | Type: Full-Time

About Us: Omnipresent Robot Tech Pvt. Ltd. is an innovative startup pushing the boundaries of robotics, drones, and space tech. We recently contributed to ISRO's Chandrayaan-3 mission by developing the perception and navigation module for the Pragyaan rover. Join our dynamic team to work on satellite-based defense projects and grow your career!

Position Overview: We are looking for AI/ML Engineers for senior and junior roles to assist in the development of AI models and algorithms for our satellite-based defense project. You will work with a skilled team to train, test, and deploy ML models, gaining hands-on experience in cutting-edge AI applications.

Key Responsibilities:
• Assist in designing and developing AI models using ML/DL techniques.
• Implement, test, and fine-tune ML models using popular frameworks (e.g., TensorFlow, PyTorch).
• Load and deploy models on embedded platforms (such as the Jetson Orin NX).
• Analyze datasets, preprocess data, and extract features for training.
• Support code compatibility and optimization on embedded systems.
• Monitor and evaluate model performance, suggesting improvements.
• Collaborate with senior engineers to integrate AI models into production environments.
• Stay updated on the latest AI trends and apply new techniques to projects.

Qualifications:
• B.Tech. in Computer Science, IT, or a related field.
• 0.6–5 years of experience in ML model development.
• Proficiency in Python and familiarity with ML frameworks (e.g., TensorFlow, PyTorch, Scikit-learn).
• Understanding of data preprocessing, model training, and deployment.
• Basic knowledge of GPU acceleration (CUDA) and embedded platforms (Jetson Orin NX).
• Familiarity with data processing tools (e.g., NumPy, Pandas).
• Strong problem-solving and analytical skills.
• Effective communication and team collaboration abilities.

Why Join Us?
• Be part of high-impact satellite defense projects.
• Learn from experts in AI and embedded systems.
• Work in a start-up environment that fosters innovation and creativity.
Posted 5 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Work Level: Individual
Core: Communication Skills, Problem Solving, Execution
Leadership: Decisive, Team Alignment, Working Independently
Industry Type: IT Services & Consulting
Function: Data Analyst
Key Skills: Data Analytics, Data Analysis, Python, R, MySQL, Cloud, AWS, Big Data, Big Data Platforms, Business Intelligence (BI), Tableau, Data Science, Statistical Modeling
Education: Graduate

Note: This is a requirement for one of the Workassist Hiring Partners.

Primary Responsibility (this is a remote position):
- Collect, clean, and preprocess data from various sources.
- Perform exploratory data analysis (EDA) to identify trends and patterns.
- Develop dashboards and reports using tools like Excel, Power BI, or Tableau.
- Use SQL to query and manipulate large datasets.
- Assist in building predictive models and performing statistical analyses.
- Present insights and recommendations based on data findings.
- Collaborate with cross-functional teams to support data-driven decision-making.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers.

For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2
(Note: There are many more opportunities apart from this one on the portal. Depending on your skills, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 5 days ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
🎓 Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.

Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (performance-based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.

How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 29th July 2025

Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
Posted 5 days ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
🎓 Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.

Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (performance-based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.

How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 28th July 2025

Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
Posted 6 days ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Payer Analytics Specialist

Position Summary
The Payer Analytics Specialist is responsible for driving insights and supporting decision-making by analyzing healthcare payer data, creating data pipelines, and managing complex analytics projects. This role involves collaborating with cross-functional teams (Operations, Product, IT, and external partners) to ensure robust data integration, reporting, and advanced analytics capabilities. The ideal candidate will have strong technical skills, payer domain expertise, and the ability to manage third-party data sources effectively.

Key Responsibilities

Data Integration and ETL Pipelines:
- Develop, maintain, and optimize end-to-end data pipelines, including ingestion, transformation, and loading of internal and external data sources.
- Collaborate with IT and Data Engineering teams to design scalable, secure, and high-performing data workflows.
- Implement best practices in data governance, version control, data security, and documentation.

Analytics and Reporting:
- Data Analysis: Analyze CPT-level data to identify trends, patterns, and insights relevant to healthcare services and payer rates.
- Benchmarking: Compare and benchmark rates provided by different health insurance payers within designated zip codes to assess competitive positioning.
- Build and maintain analytical models for cost, quality, and utilization metrics, leveraging tools such as Python, R, or SQL-based BI tools.
- Develop dashboards and reports to communicate findings to stakeholders across the organization.

Third-Party Data Management:
- Ingest and preprocess third-party data from multiple sources and transform it into unified structures for analytics and reporting.
- Ensure compliance with transparency requirements and enable downstream analytics.
- Design automated workflows to update and validate data, working closely with external vendors and technical teams.
- Establish best practices for data quality checks (e.g., encounter completeness, claim-level validations) and troubleshooting.

Project Management and Stakeholder Collaboration:
- Manage analytics project lifecycles: requirement gathering, project scoping, resource planning, timeline monitoring, and delivery.
- Partner with key stakeholders (Finance, Operations, Population Health) to define KPIs, data needs, and reporting frameworks.
- Communicate technical concepts and results to non-technical audiences, providing clear insights and recommendations.

Quality Assurance and Compliance:
- Ensure data quality by implementing validation checks, audits, and anomaly detection frameworks.
- Maintain compliance with HIPAA, HITECH, and other relevant healthcare regulations and data privacy requirements.
- Participate in internal and external audits of data processes.

Continuous Improvement and Thought Leadership:
- Stay current with industry trends, analytics tools, and regulatory changes affecting payer analytics.
- Identify opportunities to enhance existing data processes, adopt new technologies, and promote a data-driven culture within the organization.
- Mentor junior analysts and share best practices in data analytics, reporting, and pipeline development.

Required Qualifications

Education & Experience:
- Bachelor's degree in Health Informatics, Data Science, Computer Science, Statistics, or a related field (Master's degree a plus).
- 3-5+ years of experience in healthcare analytics, payer operations, or related fields.

Technical Skills:
- Data Integration & ETL: Proficiency in building data pipelines using tools like SQL, Python, R, or ETL platforms (e.g., Talend, Airflow, or Data Factory).
- Databases & Cloud: Experience working with relational databases (SQL Server, PostgreSQL) and cloud environments (AWS, Azure, GCP).
- BI & Visualization: Familiarity with BI tools (Tableau, Power BI, Looker) for dashboard creation and data storytelling.
- MRF, All-Claims, & Definitive Healthcare Data: Hands-on experience (or strong familiarity) with healthcare transparency data sets, claims data ingestion strategies, and provider/facility-level data from third-party sources like Definitive Healthcare.

Healthcare Domain Expertise:
- Strong understanding of claims data structures (UB-04, CMS-1500), coding systems (ICD, CPT, HCPCS), and payer processes.
- Knowledge of healthcare regulations (HIPAA, HITECH, transparency rules) and how they impact data sharing and management.

Analytical & Problem-Solving Skills:
- Proven ability to synthesize large datasets, pinpoint issues, and recommend data-driven solutions.
- Comfort with statistical analysis and predictive modeling using Python or R.

Soft Skills:
- Excellent communication and presentation skills, with the ability to convey technical concepts to non-technical stakeholders.
- Strong project management and organizational skills, with the ability to handle multiple tasks and meet deadlines.
- Collaborative mindset and willingness to work cross-functionally to achieve shared objectives.

Preferred/Additional Qualifications
- Advanced degree (MBA, MPH, MS in Analytics, or similar).
- Experience with healthcare cost transparency regulations and handling MRF data specifically for compliance.
- Familiarity with DataOps or DevOps practices to automate and streamline data pipelines.
- Certification in BI or data engineering (e.g., Microsoft Certified: Azure Data Engineer, AWS Data Analytics Specialty).
- Experience establishing data stewardship programs and leading data governance initiatives.

Why Join Us
- Impactful Work: Play a key role in leveraging payer data to reduce costs, improve quality, and shape population health strategies.
- Innovation: Collaborate on advanced analytics projects using state-of-the-art tools and platforms.
- Growth Opportunity: Be part of an expanding analytics team where you can lead initiatives, mentor others, and deepen your healthcare data expertise.
- Supportive Culture: Work in an environment that values open communication, knowledge sharing, and continuous learning.

(ref:hirist.tech)
Posted 6 days ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position: MLOps Engineer, US Healthcare Claims Management
Location: Gurgaon, Sector 18, Delhi NCR, India
Shift: UK shift
Company: Neolytix

About The Role
We are looking for an AI/ML Engineer to develop and implement intelligent claims analysis, prioritization, and denial resolution models for our AI-driven app for healthcare claims management. The ideal candidate will work closely with healthcare experts and engineers to orchestrate Large Language Models (LLMs) and Small Language Models (SLMs) that optimize revenue cycle processes, minimize Accounts Receivable (AR) dollars, and improve claim resolution efficiency.

Key Responsibilities
- Design and implement AI/ML models for claims prioritization, denial resolution, and revenue cycle optimization.
- Develop and fine-tune LLMs and SLMs to enhance automation in claims management workflows.
- Preprocess and structure claims data for effective model training.
- Deploy, manage, and scale AI models using Azure-based cloud services.
- Continuously monitor model performance, retrain, and refine algorithms to improve accuracy and operational efficiency.
- Ensure compliance with HIPAA, PHI security standards, and healthcare data privacy regulations.
- Document technical solutions and create best practices for scalable AI-driven claims management.

Qualifications
- Bachelor's or master's degree in computer science, AI/ML, engineering, or a related field.
- 3+ years of experience in AI/ML engineering, preferably in healthcare claims automation.
- Proficiency in Python, with strong knowledge of LLMs, NLP, and text analytics for claims processing.
- Experience with Azure ML services for deploying machine learning models.
- Familiarity with healthcare claims data formats (EDI 837/835) and revenue cycle processes.
- Strong analytical, problem-solving, and teamwork skills.

Preferred Skills
- Experience with AI-driven denial resolution and automated claims adjudication.
- Understanding of EHR/EMR systems and insurance reimbursement processes.
- Familiarity with DevOps, CI/CD pipelines, and MLOps for scalable AI deployments.

What We Offer
- Competitive salary and benefits package.
- Opportunity to contribute to innovative AI solutions in the healthcare industry.
- Dynamic and collaborative work environment.
- Opportunities for continuous learning and professional growth.

(ref:hirist.tech)
Posted 6 days ago
2.0 years
2 - 4 Lacs
Cochin
On-site
- Design, develop, and deploy machine learning and deep learning models.
- Work on NLP, computer vision, and recommendation systems.
- Analyze and preprocess datasets for model training and testing.
- Optimize models for scalability, accuracy, and efficiency.
- Stay updated with the latest AI research, tools, and frameworks.
- Collaborate with data engineers, product managers, and developers.

Job Type: Full-time
Pay: ₹20,000.00 - ₹40,000.00 per month
Benefits: Provident Fund
Schedule: Day shift
Ability to commute/relocate: Kochi, Kerala: Reliably commute or planning to relocate before starting work (Preferred)
Experience: AI/ML: 2 years (Required)
Location: Kochi, Kerala (Preferred)
Work Location: In person
Expected Start Date: 28/07/2025
Posted 6 days ago
2.0 years
1 - 4 Lacs
Mohali
On-site
Overview
Tricky WebSolutions is seeking a skilled and enthusiastic AI/ML Developer with at least 2 years of experience to join our innovative team. The successful candidate will be responsible for developing and deploying machine learning models, conducting data analysis, and collaborating with cross-functional teams to integrate AI solutions into our products and services.

Responsibilities
Model Development:
- Design, develop, and deploy machine learning models for various applications.
- Implement algorithms and conduct experiments to test and validate models.
- Optimize and improve the performance of existing models.
Data Analysis:
- Collect, preprocess, and analyze large datasets to extract meaningful insights.
- Develop data pipelines and ensure data quality and integrity.
Collaboration:
- Work closely with data scientists, software developers, and product managers to integrate AI solutions into web applications.
- Communicate findings and technical concepts to non-technical stakeholders.
Research and Innovation:
- Stay updated with the latest advancements in AI and machine learning technologies.
- Explore and propose new techniques and tools to enhance AI capabilities.
Deployment and Maintenance:
- Implement machine learning models in production environments.
- Monitor model performance and maintain AI systems.

Requirements
Educational Background: Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
Technical Skills:
- Proficiency in programming languages such as Python or R.
- Experience with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch, Scikit-learn).
- Strong understanding of machine learning algorithms and principles.
- Familiarity with data preprocessing techniques and tools.
- Experience with cloud platforms (e.g., AWS, Google Cloud, Azure) is a plus.
Analytical Skills:
- Strong problem-solving abilities and attention to detail.
- Ability to analyze complex data and interpret results.
Communication Skills:
- Excellent written and verbal communication skills.
- Ability to explain technical concepts to a non-technical audience.
Experience:
- At least 2 years of experience in AI/ML development or a related field.
- Experience with natural language processing (NLP) and computer vision is a plus.
- Experience in deploying machine learning models in production environments.

Preferred Qualifications
- Certifications: Relevant certifications in AI/ML are advantageous.
- Project Management: Experience with agile methodologies and project management tools.
- Publications: Contributions to research papers or relevant publications in the field of AI/ML.

How to Apply
Interested candidates are invited to send their CV to hr@trickywebsolutions.com. Please include "AI/ML Developer Application" in the subject line of the email.

Job Type: Full-time
Pay: ₹10,000.00 - ₹40,000.00 per month
Work Location: In person
Posted 6 days ago
2.0 years
4 - 6 Lacs
Ahmedabad
On-site
Why Glasier Inc. For Your Dream Job?
We're a passionate group of tech enthusiasts and creatives who live and breathe innovation. We're looking for energetic innovators, thinkers, and doers who thrive on learning, adapting quickly, and executing in real time, whether you're a creative thinker with an eye for design, a marketer with a story to tell, or a passionate professional. Apply now.

Python Developer
Openings: 01
Experience: 2 - 2.5 years

Job Description:
- Design, build, and deploy ML models and algorithms.
- Preprocess and analyze large datasets for training and evaluation.
- Work with data scientists and engineers to integrate models into applications.
- Optimize model performance and accuracy.
- Stay up to date with AI/ML trends, libraries, and tools.
- Strong experience with Python and libraries such as NumPy, Pandas, Scikit-learn, TensorFlow, or PyTorch.
- Solid understanding of machine learning algorithms and principles.
- Experience working with data preprocessing and model deployment.
- Familiarity with cloud platforms (AWS, GCP, Azure) is a plus.

Perks & Benefits of Working With Glasier Inc.
We take care of our team members so they can deliver their best work. Here are a few of the benefits and perks we offer to our employees:
- 5-day work week
- Mentorship
- Mindfulness
- Flexible working hours
- International exposure
- Dedicated pantry area
- Free snacks and drinks
- Open work culture
- Competitive salary and benefits
- Festival, birthday, and work anniversary celebrations
- Performance appreciation bonus and rewards
- Employee-friendly leave policies

Join our team now: send us an email at hr@glasierinc.com, WhatsApp +91 95102 61901, or call +91 95102 61901.
Posted 6 days ago
1.0 years
0 Lacs
Gwalior
On-site
Job Title: Data Science Intern
Company: Techieshubhdeep IT Solutions Pvt. Ltd.
Location: 21 Nehru Colony, Thatipur, Gwalior, Madhya Pradesh
Contact: +91 7880068399

About Us:
Techieshubhdeep IT Solutions Pvt. Ltd. is a growing technology company specializing in IT services, software development, and innovative digital solutions. We are committed to nurturing talent and providing a platform for aspiring professionals to learn and excel in their careers.

Role Overview:
We are seeking a Data Science Intern who will assist our team in developing data-driven solutions, performing statistical analysis, and creating machine learning models to solve real-world business challenges.

Key Responsibilities:
- Collect, clean, and preprocess structured and unstructured data.
- Perform exploratory data analysis (EDA) to identify trends and patterns.
- Assist in building, testing, and optimizing machine learning models.
- Work with large datasets and perform statistical modeling.
- Document processes, findings, and model performance.
- Collaborate with senior data scientists and software engineers on live projects.

Required Skills & Qualifications:
- Currently pursuing or recently completed a degree in Computer Science, Data Science, Statistics, Mathematics, or related fields.
- Basic understanding of Python/R and libraries like NumPy, Pandas, Scikit-learn, Matplotlib, etc.
- Familiarity with SQL and database management.
- Strong analytical skills and problem-solving abilities.
- Good communication skills and willingness to learn.

What We Offer:
- Hands-on training on real-world projects.
- Guidance from experienced industry professionals.
- Internship certificate upon successful completion.
- Potential for full-time employment based on performance.

Job Types: Full-time, Internship, Fresher, Walk-In
Pay: ₹5,000.00 - ₹15,000.00 per year
Schedule: Day shift, Monday to Friday, Morning shift
Ability to commute/relocate: Gwalior, Madhya Pradesh: Reliably commute or planning to relocate before starting work (Required)
Experience: Total work: 1 year (Preferred); Data science: 1 year (Preferred)
Language: Hindi (Preferred), English (Preferred)
Work Location: In person
Posted 6 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Work Level: Individual
Core: Communication Skills, Problem Solving, Execution
Leadership: Decisive, Team Alignment, Working Independently
Industry Type: IT Services & Consulting
Function: Data Analyst
Key Skills: Data Analytics, Data Analysis, Python, R, MySQL, Cloud, AWS, Big Data, Big Data Platforms, Business Intelligence (BI), Tableau, Data Science, Statistical Modeling
Education: Graduate

Note: This is a requirement for one of the Workassist Hiring Partners.

Primary Responsibility: Collect, clean, and analyze data from various sources. Assist in creating dashboards, reports, and visualizations.

We are looking for a highly motivated Data Analyst Intern to join our team remotely. As a Data Analyst Intern, you will work closely with our data team to collect, clean, analyze, and visualize data to provide actionable insights. This internship is an excellent opportunity to gain hands-on experience in data analytics while working on real-world projects.

Responsibilities (this is a remote position):
- Collect, clean, and preprocess data from various sources.
- Perform exploratory data analysis (EDA) to identify trends and patterns.
- Develop dashboards and reports using tools like Excel, Power BI, or Tableau.
- Use SQL to query and manipulate large datasets.
- Assist in building predictive models and performing statistical analyses.
- Present insights and recommendations based on data findings.
- Collaborate with cross-functional teams to support data-driven decision-making.

Requirements:
- Currently pursuing or recently completed a degree in Data Science, Statistics, Mathematics, Computer Science, or a related field.
- Strong analytical and problem-solving skills.
- Proficiency in Excel and SQL for data analysis.
- Experience with data visualization tools like Power BI, Tableau, or Google Data Studio.
- Basic knowledge of Python or R for data analysis is a plus.
- Understanding of statistical methods and data modeling concepts.
- Strong attention to detail and ability to work independently.
- Excellent communication skills to present insights clearly.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers.

For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2
(Note: There are many more opportunities apart from this one on the portal. Depending on your skills, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Work Level: Individual
Core: Communication Skills, Problem Solving, Execution
Leadership: Decisive, Team Alignment, Working Independently
Industry Type: IT Services & Consulting
Function: Data Analyst
Key Skills: MySQL, Python, Big Data, Data Science, Data Analytics, Data Analysis, Cloud, AWS, Business Intelligence (BI), Statistical Modeling, R, Big Data Platforms, Tableau
Education: Graduate

Note: This is a requirement for one of the Workassist Hiring Partners.

Primary Responsibility: Collect, clean, and analyze data from various sources. Assist in creating dashboards, reports, and visualizations.

We are looking for a highly motivated Data Analyst Intern to join our team remotely. As a Data Analyst Intern, you will work closely with our data team to collect, clean, analyze, and visualize data to provide actionable insights. This internship is an excellent opportunity to gain hands-on experience in data analytics while working on real-world projects.

Responsibilities (this is a remote position):
- Collect, clean, and preprocess data from various sources.
- Perform exploratory data analysis (EDA) to identify trends and patterns.
- Develop dashboards and reports using tools like Excel, Power BI, or Tableau.
- Use SQL to query and manipulate large datasets.
- Assist in building predictive models and performing statistical analyses.
- Present insights and recommendations based on data findings.
- Collaborate with cross-functional teams to support data-driven decision-making.

Requirements:
- Currently pursuing or recently completed a degree in Data Science, Statistics, Mathematics, Computer Science, or a related field.
- Strong analytical and problem-solving skills.
- Proficiency in Excel and SQL for data analysis.
- Experience with data visualization tools like Power BI, Tableau, or Google Data Studio.
- Basic knowledge of Python or R for data analysis is a plus.
- Understanding of statistical methods and data modeling concepts.
- Strong attention to detail and ability to work independently.
- Excellent communication skills to present insights clearly.

Preferred Skills:
- Experience with big data technologies (Google BigQuery, AWS, etc.).
- Familiarity with machine learning techniques and predictive modeling.
- Knowledge of business intelligence (BI) tools and reporting frameworks.

What We Offer:
- Fully remote internship with flexible working hours.
- Hands-on experience with real-world datasets and business problems.
- Mentorship from experienced data analysts and industry professionals.
- Opportunity to contribute to meaningful projects and make an impact.
- Certificate of completion and potential for a full-time opportunity based on performance.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers.

For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2
(Note: There are many more opportunities apart from this one on the portal. Depending on your skills, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
0 years
0 Lacs
India
Remote
Job Title: Data Science Intern
Company: Coorix.ai
Location: Remote
Duration: 3 months
Opportunity: Full-time role based on performance + Internship Certificate

About Coorix.ai
Coorix.ai provides students and graduates with hands-on experience in Data Science and AI, helping them build skills and portfolios through real-world projects.

Responsibilities
- Collect, preprocess, and analyze large datasets
- Develop predictive models and machine learning algorithms
- Perform exploratory data analysis (EDA) to extract insights
- Create data visualizations and dashboards for effective communication
- Collaborate with cross-functional teams to deliver data-driven solutions

Requirements
- Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field
- Proficiency in Python or R for data analysis and modeling
- Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred)
- Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib
- Strong analytical and problem-solving skills
- Excellent communication and teamwork abilities

Stipend & Benefits
- Stipend: ₹7,500 - ₹15,000 (performance-based, paid)
- Hands-on experience in data science projects
- Certificate of Internship & Letter of Recommendation
- Opportunity to build a strong portfolio of data science models and applications
- Potential for full-time employment based on performance

How to Apply
Submit your application with the subject line "Data Science Intern Application."
📅 Deadline: 26th July 2025

Note: Coorix.ai is an equal opportunity employer, welcoming diverse applicants.
Posted 1 week ago
2.0 - 5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: AI/ML Engineer
Experience: 2 - 5 Years
Location: Gurgaon
Key Skills: Python, TensorFlow, Machine Learning Algorithms
Employment Type: Full-Time
Compensation: 8 - 15 LPA

Job Description
We are seeking a highly motivated and skilled AI/ML Engineer to join our growing team in Gurgaon. The ideal candidate should have hands-on experience in developing and deploying machine learning models using Python and TensorFlow. You will work on designing intelligent systems and solving real-world problems using cutting-edge ML algorithms.

Key Responsibilities
- Design, develop, and deploy robust ML models for classification, regression, recommendation, and anomaly detection tasks.
- Implement and optimize deep learning models using TensorFlow or related frameworks.
- Work with cross-functional teams to gather requirements, understand data pipelines, and deliver ML-powered features.
- Clean, preprocess, and explore large datasets to uncover patterns and extract insights.
- Evaluate model performance using standard metrics and implement strategies for model improvement.
- Automate model training and deployment pipelines using best practices in MLOps.
- Collaborate with data scientists, software developers, and product managers to bring AI features into production.
- Document model architecture, data workflows, and code in a clear and organized manner.
- Stay updated with the latest research and advancements in machine learning and AI.

Requirements
- Bachelor's or Master's degree in Computer Science, Data Science, AI/ML, or a related field.
- 2-5 years of hands-on experience in developing and deploying ML models in real-world projects.
- Strong programming skills in Python and proficiency in ML libraries like TensorFlow, scikit-learn, NumPy, and Pandas.
- Solid understanding of supervised, unsupervised, and deep learning algorithms.
- Experience with data wrangling, feature engineering, and model evaluation techniques.
- Familiarity with version control tools (Git) and deployment tools is a plus.
- Good communication and problem-solving skills.

Preferred (Nice to Have)
- Experience with cloud platforms (AWS, GCP, Azure) and ML services.
- Exposure to NLP, computer vision, or reinforcement learning.
- Familiarity with Docker, Kubernetes, or CI/CD pipelines for ML projects.

(ref:hirist.tech)
Posted 1 week ago
0 years
0 Lacs
India
On-site
Job Summary:
We are seeking a talented and driven Machine Learning Engineer to design, build, and deploy ML models that solve complex business problems and enhance decision-making capabilities. You will work closely with data scientists, engineers, and product teams to develop scalable machine learning pipelines, deploy models into production, and continuously improve their performance.

Key Responsibilities:
- Design, develop, and deploy machine learning models for classification, regression, clustering, recommendation, NLP, or computer vision tasks.
- Collaborate with data scientists to prepare and preprocess large-scale datasets for training and evaluation.
- Implement and optimize machine learning pipelines and workflows using tools like MLflow, Airflow, or Kubeflow.
- Integrate models into production environments and ensure model performance, monitoring, and retraining.
- Conduct A/B testing and performance evaluations to validate model accuracy and business impact.
- Stay up-to-date with the latest advancements in ML/AI research and tools.
- Write clean, efficient, and well-documented code for reproducibility and scalability.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field.
- Strong knowledge of machine learning algorithms, data structures, and statistical methods.
- Proficient in Python and ML libraries/frameworks (e.g., scikit-learn, TensorFlow, PyTorch, XGBoost).
- Experience with data manipulation libraries (e.g., pandas, NumPy) and visualization tools (e.g., Matplotlib, Seaborn).
- Familiarity with cloud platforms (AWS, GCP, or Azure) and model deployment tools.
- Experience with version control systems (Git) and software engineering best practices.

Preferred Qualifications:
- Experience in deep learning, natural language processing (NLP), or computer vision.
- Knowledge of big data technologies like Spark, Hadoop, or Hive.
- Exposure to containerization (Docker), orchestration (Kubernetes), and CI/CD pipelines.
- Familiarity with MLOps practices and tools.
Posted 1 week ago
1.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Associate AI/ML Engineer – Global Data Analytics, Technology (Maersk)
This position will be based in India – Bangalore/Pune.

A.P. Moller - Maersk
A.P. Moller – Maersk is the global leader in container shipping services. The business operates in 130 countries and employs 80,000 staff. An integrated container logistics company, Maersk aims to connect and simplify its customers' supply chains. Today, we have more than 180 nationalities represented in our workforce across 131 countries, which means we have an elevated level of responsibility to continue building an inclusive workforce that is truly representative of our customers, their customers, and our vendor partners too. We are responsible for moving 20% of global trade and are on a mission to become the Global Integrator of Container Logistics. To achieve this, we are transforming into an industrial digital giant by combining our assets across air, land, ocean, and ports with our growing portfolio of digital assets to connect and simplify our customers' supply chains through global end-to-end solutions, all the while rethinking the way we engage with customers and partners.

The Brief
In this role as an Associate AI/ML Engineer on the Global Data and Analytics (GDA) team, you will support the development of strategic, visibility-driven recommendation systems that serve both internal stakeholders and external customers. This initiative aims to deliver actionable insights that enhance supply chain execution, support strategic decision-making, and enable innovative service offerings.

Data AI/ML (Artificial Intelligence and Machine Learning) Engineering involves the use of algorithms and statistical models to enable systems to analyse data, learn patterns, and make data-driven predictions or decisions without explicit human programming. AI/ML applications leverage vast amounts of data to identify insights, automate processes, and solve complex problems across a wide range of fields, including healthcare, finance, e-commerce, and more. AI/ML processes transform raw data into actionable intelligence, enabling automation, predictive analytics, and intelligent solutions. Data AI/ML combines advanced statistical modelling, computational power, and data engineering to build intelligent systems that can learn, adapt, and automate decisions.

What I'll be doing – your accountabilities
- Build and maintain machine learning models for various applications, such as natural language processing, computer vision, and recommendation systems
- Perform exploratory data analysis (EDA) to identify patterns and trends in data
- Clean, preprocess, perform hyperparameter tuning, and analyze large datasets to prepare them for AI/ML model training
- Build, test, and optimize machine learning models and experiment with algorithms and frameworks to improve model performance
- Use programming languages, machine learning frameworks and libraries, algorithms, data structures, statistics, and databases to optimize and fine-tune machine learning models to ensure scalability and efficiency
- Learn to define user requirements and align solutions with business needs
- Work on AI/ML engineering projects, perform feature engineering, and collaborate with teams to understand business problems
- Learn best practices in data and AI/ML engineering and performance optimization
- Contribute to research papers and technical documentation
- Contribute to project documentation and maintain data quality standards

Foundational Skills
Understands programming skills beyond the fundamentals and can demonstrate this skill in most situations without guidance. Understands the skills below beyond the fundamentals and can demonstrate them in most situations without guidance:
- AI & Machine Learning
- Data Analysis
- Machine Learning Pipelines
- Model Deployment

Specialized Skills
Able to understand beyond the fundamentals and demonstrate in most situations without guidance the following skills:
- Deep Learning
- Statistical Analysis
- Data Engineering
- Big Data Technologies
- Natural Language Processing (NLP)
- Data Architecture
- Data Processing Frameworks
- Proficiency in Python programming
- Proficiency in Python-based statistical analysis and data visualization tools
- While having a limited understanding of technical documentation, focused on growing this skill

Qualifications & Requirements
- BSc/MSc/PhD in computer science, data science, or a related discipline with 1+ years of industry experience building cloud-based ML solutions for production at scale, including solution architecture and solution design experience
- Good problem-solving skills, for both technical and non-technical domains
- Good broad understanding of ML and statistics covering standard ML for regression and classification, forecasting and time-series modeling, and deep learning
- 3+ years of hands-on experience building ML solutions in Python, including knowledge of common Python data science libraries (e.g., scikit-learn, PyTorch, etc.)
- Hands-on experience building end-to-end data products based on AI/ML technologies
- Some experience with scenario simulations
- Experience with collaborative development workflows: version control (we use GitHub), code reviews, DevOps (including automated testing), CI/CD
- Team player, eager to collaborate and a good collaborator

Preferred Experiences
In addition to the basic qualifications, it would be great if you have:
- Hands-on experience with common OR solvers such as Gurobi
- Experience with a common dashboarding technology (we use Power BI) or a web-based frontend such as Dash, Streamlit, etc.
- Experience working in cross-functional product engineering teams following agile development methodologies (Scrum/Kanban/...)
- Experience with Spark and distributed computing
- Strong hands-on experience with MLOps solutions, including open-source solutions
- Experience with cloud-based orchestration technologies, e.g. Airflow, KubeFlow, etc.
- Experience with containerization (Kubernetes & Docker)

As a performance-oriented company, we strive to always recruit the best person for the job – regardless of gender, age, nationality, sexual orientation or religious beliefs. We are proud of our diversity and see it as a genuine source of strength for building high-performing teams.

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements.

We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.
Posted 1 week ago
3.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Role Summary
1. Demonstrate solid proficiency in Python development, writing clean, maintainable code.
2. Collaborate in the design and implementation of AI-driven applications leveraging large language models (LLMs).
3. Develop and maintain Django-based RESTful APIs to support backend services.
4. Integrate with LLM provider APIs (e.g., GPT, Claude, Cohere) and agent frameworks (LangChain, AgentStudio).
5. Build and optimize data pipelines for model training and inference using Pandas, NumPy, and Scikit-learn.
6. Ensure robust unit and integration testing via pytest to maintain high code quality.
7. Participate in agile ceremonies, contributing estimations, design discussions, and retrospectives.
8. Troubleshoot, debug, and optimize performance in multi-threaded and distributed environments.
9. Document code, APIs, and data workflows in accordance with software development best practices.
10. Continuously learn and apply new AI/ML tools, frameworks, and cloud services.

Key Responsibilities
1. Write, review, and optimize Python code for backend services and data science workflows.
2. Design and implement Django REST APIs, ensuring scalability and security.
3. Integrate LLMs into applications: handle prompt construction, API calls, and result processing.
4. Leverage agent frameworks (LangChain, AgentStudio) to orchestrate complex LLM workflows.
5. Develop and maintain pytest suites covering unit, integration, and end-to-end tests.
6. Build ETL pipelines to preprocess data for model training and feature engineering.
7. Work with relational databases (PostgreSQL) and vector stores (FAISS, Weaviate, Milvus).
8. Containerize applications using Docker and deploy on Kubernetes or serverless platforms (AWS, GCP, Azure).
9. Monitor and troubleshoot application performance, logging, and error handling.
10. Collaborate with data scientists to deploy and serve ML models via FastAPI or vLLM.
11. Maintain CI/CD pipelines for automated testing and deployment.
12. Engage in technical learning sessions and share best practices across the team.

Desired Skills & Qualifications
- 1–3 years of hands-on experience in Python application development.
- Proven pytest expertise, with a focus on test-driven development.
- Practical knowledge of Django (or FastAPI) for building RESTful services.
- Experience with LLM APIs (OpenAI, Anthropic, Cohere) and prompt engineering.
- Familiarity with at least one agent framework (LangChain, AgentStudio).
- Working experience in data science libraries: NumPy, Pandas, Scikit-learn.
- Exposure to ML model serving tools (MLflow, FastAPI, vLLM).
- Experience with container orchestration (Docker, Kubernetes, Docker Swarm).
- Basic understanding of cloud platforms (AWS, Azure, or GCP).
- Knowledge of SQL and database design; familiarity with vector databases.
- Eagerness to learn emerging AI/ML technologies and frameworks.
- Excellent problem-solving, debugging, and communication skills.

Education & Attitude
- Bachelor's or Master's in Computer Science, Data Science, Statistics, Mathematics, or a related field.
- Growth-mindset learner: proactive in upskilling and sharing knowledge.
- Strong collaboration ethos and adaptability in a fast-paced AI environment.
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Greetings!

One of our clients, a top MNC, is looking for Gen AI and Machine Learning Engineers.

Important Notes:
- Please share only those profiles who can join immediately or within 7 days.
- Base Locations: Gurgaon and Bengaluru (hybrid setup, 3 days work from office).

Role: Associate and Sr. Associate L1/L2 (Multiple Positions)

SKILLS:
- Bachelor's or master's degree in Computer Science, Data Science, Engineering, or a related field.
- Experience with agentic AI frameworks.
- Strong programming skills in languages such as Python, SQL/NoSQL, etc.
- Ability to build an analytical approach based on business requirements, then develop, train, and deploy machine learning models and AI algorithms.
- Exposure to Gen AI models such as OpenAI, Google Gemini, Runway ML, etc.
- Experience in developing and deploying AI/ML and deep learning solutions with libraries and frameworks such as TensorFlow, PyTorch, Scikit-learn, OpenCV, and/or Keras.
- Knowledge of math, probability, and statistics.
- Familiarity with a variety of machine learning, NLP, and deep learning algorithms.
- Exposure to developing APIs using Flask/Django.
- Good experience in cloud infrastructure such as AWS, Azure, or GCP.
- Exposure to Gen AI, vector DBs/embeddings, and LLMs (Large Language Models).

GOOD TO HAVE:
- Experience with MLOps: MLflow, Kubeflow, CI/CD pipelines, etc.
- Experience in Docker, Kubernetes, etc.
- Exposure to HTML, CSS, JavaScript/jQuery, Node.js, Angular/React.
- Experience in Flask/Django is a bonus.

RESPONSIBILITIES:
- Collaborate with software engineers, business stakeholders, and/or domain experts to translate business requirements into product features, tools, projects, and AI/ML, NLP/NLU, and deep learning solutions.
- Develop, implement, and deploy AI/ML solutions.
- Preprocess and analyze large datasets to identify patterns, trends, and insights.
- Evaluate, validate, and optimize AI/ML models to ensure their accuracy, efficiency, and generalizability.
- Deploy applications and AI/ML models into cloud environments such as AWS/Azure/GCP.
- Monitor and maintain the performance of AI/ML models in production environments, identifying opportunities for improvement and updating models as needed.
- Document AI/ML model development processes, results, and lessons learned to facilitate knowledge sharing and continuous improvement.

Interested candidates who match the JD and can join ASAP, please apply along with the details below:
- Total exp:
- Relevant exp in AI/ML:
- Applying for Gurgaon or Bengaluru:
- Open for hybrid:
- Current CTC:
- Expected CTC:
- Can join ASAP:

We will call you once we receive your updated profile along with the above-mentioned details.

Thanks,
Venkat Solti
solti.v@anlage.co.in
Posted 1 week ago
0 years
0 Lacs
Rajasthan, India
Remote
Job Title: Jr. AI Engineer
Location: Remote
Job Type: Full-Time

Job Summary:
We are looking for a Jr. AI Engineer with a passion for Artificial Intelligence, Natural Language Processing (NLP), and Large Language Models (LLMs). The ideal candidate should have prior internship experience in AI/ML, familiarity with modern AI frameworks, and a strong desire to learn and grow in this field. You will work with the team to develop, fine-tune, and integrate LLMs, and contribute to building intelligent and scalable AI solutions.

Key Responsibilities:
1. AI & LLM Development
● Assist in fine-tuning and customizing LLMs for specific use cases.
● Curate, preprocess, and manage datasets for training and evaluation.
● Experiment with AI model architectures and fine-tuning strategies under guidance.
● Support the integration of LLMs into web applications and APIs.
2. Data Handling & Analysis
● Perform data cleaning, transformation, and preparation for AI pipelines.
● Conduct exploratory data analysis and model evaluation.
● Work on data annotation and augmentation tasks when required.
3. AI Research Support
● Stay updated with the latest AI and NLP trends, research papers, and tools.
● Assist senior engineers in implementing state-of-the-art AI techniques.
● Contribute to internal technical documentation and knowledge sharing.
4. Testing & Optimization
● Help test AI models for accuracy, performance, and reliability.
● Identify issues in model predictions and suggest improvements.
● Support the deployment and monitoring of AI models on cloud or edge environments.
5. Collaboration & Learning
● Collaborate with developers, data scientists, and AI engineers.
● Participate in code reviews and brainstorming sessions.
● Continuously learn new AI technologies and frameworks.

Qualifications:
● Education: Bachelor's degree in Computer Science, Data Science, AI, or a related field (or equivalent experience).
● Experience: Internship experience in AI/ML, NLP, or data science projects is mandatory.
● Skills:
○ Basic understanding of machine learning algorithms and NLP concepts.
○ Familiarity with Python and AI/ML libraries (e.g., TensorFlow, PyTorch, Hugging Face Transformers).
○ Knowledge of data preprocessing and dataset management.
○ Experience with REST APIs and version control (Git) is a plus.
○ Strong problem-solving and analytical skills.

What We Offer:
● Competitive salary and growth opportunities.
● Hands-on exposure to LLMs and cutting-edge AI technologies.
● Collaborative and learning-focused environment.
● Guidance from senior AI engineers to build your career.
Posted 1 week ago
0.0 - 2.0 years
0 - 0 Lacs
Kochi, Kerala
On-site
- Design, develop, and deploy machine learning and deep learning models.
- Work on NLP, computer vision, and recommendation systems.
- Analyze and preprocess datasets for model training and testing.
- Optimize models for scalability, accuracy, and efficiency.
- Stay updated with the latest AI research, tools, and frameworks.
- Collaborate with data engineers, product managers, and developers.

Job Type: Full-time
Pay: ₹20,000.00 - ₹40,000.00 per month
Benefits: Provident Fund
Schedule: Day shift
Ability to commute/relocate: Kochi, Kerala: Reliably commute or planning to relocate before starting work (Preferred)
Experience: AI/ML: 2 years (Required)
Location: Kochi, Kerala (Preferred)
Work Location: In person
Expected Start Date: 28/07/2025
Posted 1 week ago
0 years
0 Lacs
India
Remote
🤖 Machine Learning Intern – Remote | Learn AI by Building It
📍 Location: Remote / Virtual
💼 Type: Internship (Unpaid)
🎁 Perks: Certificate After Completion || Letter of Recommendation (6 Months)
🕒 Schedule: 5–7 hrs/week | Flexible Timing

Join Skillfied Mentor as a Machine Learning Intern and move beyond online courses. You'll work on real datasets, build models, and see your algorithms in action, all while gaining experience that hiring managers actually look for. Whether you're aiming for a career in AI, data science, or automation, this internship will build your foundation with hands-on learning.

🔧 What You'll Do:
- Work with real datasets to clean, preprocess, and transform data
- Build machine learning models using Python, NumPy, Pandas, Scikit-learn
- Perform classification, regression, and clustering tasks
- Use Jupyter Notebooks for experimentation and documentation
- Collaborate on mini-projects and model evaluation tasks
- Present insights in simple, digestible formats

🎓 What You'll Gain:
✅ Full Python course included during the internship
✅ Hands-on projects to showcase on your resume or portfolio
✅ Certificate of Completion + LOR (6-month internship)
✅ Experience with industry-relevant tools & techniques
✅ Remote flexibility: manage your time with just 5–7 hours/week

🗓️ Application Deadline: 30th July 2025
👉 Apply now to start your ML journey with Skillfied Mentor
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role Description:
We are seeking a full-time Computer Vision Developer for an on-site role based in Gurugram. You will be responsible for designing, developing, and optimizing computer vision pipelines and algorithms for real-time applications. The role involves working on object detection, tracking, OCR, and edge AI deployment. You'll collaborate closely with cross-functional teams to integrate vision models into embedded systems and smart devices, contributing to the development of AI-powered products from prototype to production.

Key Responsibilities:
- Develop and implement computer vision algorithms
- Optimize models for real-time inference on edge devices (e.g., Jetson, Raspberry Pi, ARM boards)
- Preprocess and annotate image/video datasets for training and validation
- Train, fine-tune, and evaluate deep learning models using frameworks like PyTorch or TensorFlow
- Integrate models into production pipelines, working with embedded engineers where required
- Conduct research on state-of-the-art vision techniques and evaluate applicability
- Debug, profile, and optimize performance for low-latency deployments
- Collaborate across software, hardware, and product teams for end-to-end solution delivery

Qualifications:
- Strong hands-on experience in computer vision
- Proficiency with deep learning frameworks (PyTorch, TensorFlow, ONNX)
- Experience with pattern recognition, object detection, segmentation, or OCR
- Familiarity with embedded systems, NVIDIA Jetson, or ARM-based platforms is a plus
- Solid understanding of image processing, linear algebra, and optimization techniques
- Experience with data annotation tools (LabelImg, CVAT, Roboflow)
- Strong problem-solving and debugging skills
- Bachelor's or Master's degree in Computer Science, AI, or related fields
- Bonus: Experience with real-time video processing, GStreamer, or edge model quantization
Posted 1 week ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
🎓 Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.

Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (performance-based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.

How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 26th July 2025

Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
Posted 1 week ago