0 years
0 Lacs
India
Remote
Machine Learning Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 25th May 2025

About WebBoost Solutions by UM
WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.

Role Overview
As a Machine Learning Intern, you'll work on real-world projects, gaining practical experience in machine learning and data analysis.

Responsibilities
- Design, test, and optimize machine learning models.
- Analyze and preprocess datasets.
- Develop algorithms and predictive models for various applications.
- Use tools like TensorFlow, PyTorch, and scikit-learn.
- Document findings and create reports to present insights.

Requirements
- Enrolled in or a graduate of a relevant program (AI, ML, Data Science, Computer Science, or a related field).
- Knowledge of machine learning concepts and algorithms.
- Proficiency in Python or R (preferred).
- Strong analytical and teamwork skills.

Benefits
- Stipend: ₹7,500–₹15,000 (performance-based).
- Practical machine learning experience.
- Internship Certificate and Letter of Recommendation.
- Build your portfolio with real-world projects.

How to Apply
Submit your application by 25th May 2025 with the subject line "Machine Learning Intern Application".

Equal Opportunity
WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.
Posted 3 weeks ago
1.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Title: Data Scientist
Contract Duration: 1 Year
Location: Mumbai
Experience Required: 3–7 Years

Project Requirement
We are seeking Junior Data Scientists with strong Python skills and experience in Jupyter Notebooks to support ongoing initiatives within the Teradata Data Lake/Lakehouse Platform.

Responsibilities
- Collect, clean, and preprocess data from various sources for analysis.
- Perform exploratory data analysis to identify patterns, trends, and insights.
- Assist in developing and implementing machine learning models and algorithms using Python and Jupyter Notebooks.
- Collaborate with senior data scientists and cross-functional teams to understand business needs and propose data-driven solutions.
- Create clear data reports and visualizations to communicate findings effectively.
- Continuously improve data quality and stay current with the latest trends in data science.
- Document all procedures related to data handling and analysis.
- Support the implementation of on-premise AI/ML solutions.
- Learn and apply Explainable AI (XAI) techniques to improve model transparency.

Requirements
- Bachelor's degree in Computer Science, Statistics, Mathematics, or a related field.
- 3–7 years of relevant experience in Data Science or Analytics.
- Strong proficiency in Python and Jupyter Notebooks.
- Familiarity with Teradata and data lake/lakehouse architectures.
- Basic understanding of machine learning algorithms and statistical methods.
- Strong problem-solving and analytical skills.
- Effective communication skills and the ability to work collaboratively in a team.
- Exposure to on-premise AI/ML tools and solutions.
- Basic knowledge of Explainable AI concepts and practices.

Skills
Mandatory: Python (Data Science), Jupyter Notebooks
Preferred: Teradata, Machine Learning, Data Lakehouse, Explainable AI
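The clean/preprocess and exploratory-analysis duties listed above can be sketched in a few lines of pandas. This is a minimal illustration only; the DataFrame, column names, and fill strategy are invented for the example, not taken from the posting.

```python
# Minimal sketch of a clean -> preprocess -> explore loop in pandas.
# The data below is invented for illustration.
import pandas as pd

raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3, 4],
    "spend": [120.0, None, 85.5, 40.0, 300.0],
    "region": ["N", "S", "S", None, "E"],
})

# Clean: drop duplicate customers, then fill missing values with simple defaults.
clean = raw.drop_duplicates(subset="customer_id")
clean = clean.assign(
    spend=clean["spend"].fillna(clean["spend"].median()),
    region=clean["region"].fillna("unknown"),
)

# Explore: summary statistics and a per-region aggregate.
summary = clean["spend"].describe()
by_region = clean.groupby("region")["spend"].mean()
print(summary)
print(by_region)
```

In a Jupyter Notebook, `summary` and `by_region` would typically be displayed inline and then charted, which is the workflow the responsibilities describe.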
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company Description
NIQ is the world's leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world's population.

Job Description
Responsibilities:
- Develop and maintain Power BI dashboards and reports.
- Utilize Power Query for data transformation and manipulation.
- Write and optimize SQL queries for data extraction and analysis.
- Perform statistical analysis using R.
- Develop and implement data analysis scripts in Python.
- Collaborate with cross-functional teams to understand data needs and deliver solutions.
- Present findings and recommendations to stakeholders.

This role requires a strong analytical mindset, proficiency in BI tools, and the ability to translate complex data into actionable insights.

- Produce comprehensive Discover reports for our clients worldwide, which includes gathering and analysing data, ensuring accuracy and relevance, and presenting findings in a clear and actionable format.
- Collaborate closely with various teams to understand client needs and deliver Discover reports that drive business decisions.
- Collaborate with cross-functional teams to understand business requirements and translate them into specifications.
- Ensure data accuracy, integrity, and consistency across all BI solutions.
- Stay updated with the latest BI technologies and industry trends to continuously improve BI processes.

Key Duties:
- Collect, clean, and preprocess data from various sources.
- Design and implement data models to support reporting and analytics.
- Execute, monitor, and continuously improve assigned production tasks, including maintenance and data quality checks.
- Identify trends, patterns, and anomalies in data sets.
- Provide technical support and training to team members on data tools and techniques.
- Stay updated with the latest industry trends and best practices in data analysis.
- Demonstrate proficiency in VBA (for creating custom Excel dashboards for real-time reporting) and Power BI, as these skills are highly advantageous.
- Understand the regular execution process with thorough attention to detail and identify opportunities for automation and improvement.
- Maintain the high quality of setups across the various markets reported by NielsenIQ and analyse any potential client data concerns.
- Engage frequently in cross-departmental collaboration as a crucial link in the chain of NielsenIQ activities.
- Execute production tasks to ensure data accuracy and trend analysis within scheduled deadlines.
- Investigate data inquiries and challenges in collaboration with local, regional, and offshore teams.
- Prepare accurate tracking KPIs to monitor and improve quality performance promptly.

Qualifications
- Minimum 3+ years of industry experience.
- Advanced proficiency in Power BI, Python, and SQL.
- Intermediate to advanced proficiency in Tableau.
- Strong SQL and Python skills for data querying and manipulation.
- Experience with R for statistical analysis.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights effectively.
- Strong attention to detail and organizational skills.
- Interest in the Market Research domain, with knowledge of collating, cleansing, analysing, interpreting, and visualizing large volumes of data.
- Good communication skills.
- Enthusiasm for learning and growing within the function.
- Ability to learn upstream and downstream processes to ensure efficiency and quality delivery in the current role.
- Flexibility with shift timings, including night shifts.

Additional Information
Our Benefits
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)

About NIQ
For more information, visit NIQ.com. Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
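The "write and optimize SQL queries for data extraction and analysis" duty above can be illustrated end to end with Python's built-in sqlite3 module. The table and figures are invented for the example; the actual role would target production databases and BI tooling, not SQLite.

```python
# Tiny SQL extraction/aggregation example using Python's stdlib sqlite3,
# with an invented sales table (illustration only).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("South", 80.0), ("North", 60.0), ("East", 200.0)],
)

# Aggregate revenue per region, highest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)
conn.close()
```

The same query shape (GROUP BY plus an ordered aggregate) is the building block behind most of the dashboard and KPI reporting the posting describes.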
Posted 3 weeks ago
0 years
0 Lacs
India
Remote
CryptoChakra is a leading cryptocurrency analytics and education platform committed to demystifying digital asset markets for traders, investors, and enthusiasts worldwide. By integrating cutting-edge AI-driven predictions, blockchain analytics, and immersive learning modules, we empower users to navigate market volatility with confidence. Our platform combines advanced tools like Python, TensorFlow, and AWS to deliver actionable insights, risk assessments, and educational content that bridge the gap between complex data and strategic decision-making. As a remote-first innovator, we champion accessibility in decentralized finance, fostering a future where crypto literacy is universal.

Position: Fresher Data Scientist Intern
Remote | Full-Time Internship | Compensation: Paid/Unpaid based on suitability

Role Summary
Join CryptoChakra's data science team to gain hands-on experience in transforming raw blockchain data into impactful insights. This role is tailored for recent graduates or students eager to apply foundational skills in machine learning, statistical analysis, and data storytelling to real-world crypto challenges.

Key Responsibilities
- Data Processing: Clean and preprocess blockchain datasets from sources like Etherscan or CoinGecko using Python/R.
- Predictive Modeling: Assist in building and testing ML models for price forecasting or DeFi trend analysis.
- Insight Generation: Create visualizations (Tableau, Matplotlib) to simplify complex trends for educational content.
- Collaboration: Work with engineers and educators to refine analytics tools and tutorials.
- Documentation: Maintain clear records of methodologies and findings for team reviews.

Who We're Looking For
Technical Skills
- Foundational knowledge of Python/R for data manipulation (Pandas, NumPy).
- Basic understanding of statistics (regression, hypothesis testing).
- Familiarity with data visualization tools (Tableau, Power BI) or libraries (Seaborn).
- Curiosity about blockchain technology, DeFi, or crypto markets.

Soft Skills
- Eagerness to learn and adapt in a fast-paced remote environment.
- Strong problem-solving mindset and attention to detail.
- Ability to communicate technical concepts clearly.

Preferred (Not Required)
- Academic projects involving data analysis or machine learning.
- Exposure to SQL, AWS, or big data tools.
- Pursuing a degree in Data Science, Computer Science, Statistics, or a related field.

What We Offer
- Mentorship: Guidance from experienced data scientists and blockchain experts.
- Skill Development: Training in real-world tools like TensorFlow and Tableau.
- Portfolio Projects: Contribute to live projects featured on CryptoChakra's platform.
- Flexibility: Remote work with adaptable hours for students.
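At its simplest, the price-forecasting responsibility above means fitting a model to a historical price series. The sketch below fits a plain least-squares trend line with NumPy; the closing prices are invented for illustration, and real work would pull data from an API such as CoinGecko and use far more careful models.

```python
# Toy price-trend fit: ordinary least squares over a short, invented
# daily closing-price series (not real market data).
import numpy as np

closes = np.array([100.0, 102.5, 101.0, 105.0, 107.5, 106.0, 110.0])
days = np.arange(len(closes))

# Fit a degree-1 polynomial (linear trend): close ~ slope * day + intercept.
slope, intercept = np.polyfit(days, closes, 1)

# Naive one-step-ahead "forecast": extend the trend line by one day.
next_day = len(closes)
forecast = slope * next_day + intercept
print(f"slope={slope:.3f}, forecast for day {next_day}: {forecast:.2f}")
```

A linear trend is a deliberately crude baseline; its value in an internship setting is that any proposed ML model (the posting mentions DeFi trend analysis) should at least beat it.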
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: EY GDS – Data and Analytics (D&A) – Senior Data Scientist

Role Overview:
We are seeking a highly skilled and experienced Senior Data Scientist with 3–7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. In this role, you will play a key part in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role.

Responsibilities:
- Contribute to the design and implementation of state-of-the-art AI solutions.
- Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI.
- Collaborate with stakeholders to identify business opportunities and define AI project goals.
- Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
- Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases.
- Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
- Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment.
- Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
- Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs.
- Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly.
- Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency.
- Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases.
- Ensure compliance with data privacy, security, and ethical considerations in AI applications.
- Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus.
- Minimum 3–7 years of experience in Data Science and Machine Learning.
- In-depth knowledge of machine learning, deep learning, and generative AI techniques.
- Proficiency in programming languages such as Python and R, and frameworks like TensorFlow or PyTorch.
- Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
- Familiarity with computer vision techniques for image recognition, object detection, or image generation.
- Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment.
- Expertise in data engineering, including data curation, cleaning, and preprocessing.
- Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems.
- Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models.
- Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
- Understanding of data privacy, security, and ethical considerations in AI applications.
- Track record of driving innovation and staying updated with the latest AI research and advancements.

Good to Have Skills:
- Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
- Utilize optimization tools and techniques, including MIP (Mixed Integer Programming).
- Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models.
- Implement CI/CD pipelines for streamlined model deployment and scaling processes.
- Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
- Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation.
- Implement monitoring and logging tools to ensure AI model performance and reliability.
- Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
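The similarity-search responsibility in the listing above reduces, at its core, to comparing embedding vectors. Here is a brute-force cosine-similarity sketch with NumPy; the documents and 4-dimensional "embeddings" are invented, and a production system would use real embedding models and a vector database (the posting mentions Redis) rather than a full scan.

```python
# Minimal cosine-similarity retrieval over toy embedding vectors.
# Vectors and document texts are invented for illustration.
import numpy as np

docs = ["refund policy", "shipping times", "return a damaged item"]
# Pretend 4-dimensional embeddings, one row per document.
doc_vecs = np.array([
    [0.9, 0.1, 0.0, 0.2],
    [0.1, 0.8, 0.3, 0.0],
    [0.7, 0.2, 0.1, 0.4],
])
query_vec = np.array([0.8, 0.1, 0.0, 0.3])

def normalize(m):
    # L2-normalize along the last axis so dot products become cosines.
    return m / np.linalg.norm(m, axis=-1, keepdims=True)

scores = normalize(doc_vecs) @ normalize(query_vec)
best = int(np.argmax(scores))
print(docs[best], scores)
```

Vector databases implement exactly this ranking, but with approximate-nearest-neighbor indexes so it scales past a few thousand documents.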
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist

Role Overview:
We are seeking a highly skilled and experienced Senior Data Scientist with 3–7 years of experience in Data Science and Machine Learning, preferably with exposure to NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. You will play a key part in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate has a deep understanding of AI technologies and experience designing and implementing cutting-edge AI models and systems; expertise in data engineering, DevOps, and MLOps practices will also be valuable in this role.

Responsibilities:
- Contribute to the design and implementation of state-of-the-art AI solutions.
- Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI.
- Collaborate with stakeholders to identify business opportunities and define AI project goals.
- Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
- Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases.
- Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
- Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment.
- Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
- Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs.
- Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly.
- Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency.
- Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases.
- Ensure compliance with data privacy, security, and ethical considerations in AI applications.
- Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus.
- 3–7 years of experience in Data Science and Machine Learning.
- In-depth knowledge of machine learning, deep learning, and generative AI techniques.
- Proficiency in programming languages such as Python or R, and frameworks like TensorFlow or PyTorch.
- Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
- Familiarity with computer vision techniques for image recognition, object detection, or image generation.
- Experience with cloud platforms such as Azure, AWS, or GCP, and deploying AI solutions in a cloud environment.
- Expertise in data engineering, including data curation, cleaning, and preprocessing.
- Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems.
- Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models.
- Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
- Understanding of data privacy, security, and ethical considerations in AI applications.
- Track record of driving innovation and staying updated with the latest AI research and advancements.

Good to Have Skills:
- Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
- Utilize optimization tools and techniques, including MIP (Mixed Integer Programming).
- Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models.
- Implement CI/CD pipelines for streamlined model deployment and scaling processes.
- Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
- Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation.
- Implement monitoring and logging tools to ensure AI model performance and reliability.
- Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
Posted 3 weeks ago
0 years
0 Lacs
India
Remote
Job Title: Machine Learning Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with Certificate of Internship
Application Deadline: 24th May 2025

About Unified Mentor
Unified Mentor provides students and graduates with hands-on learning opportunities and career growth in Machine Learning and Data Science.

Role Overview
As a Machine Learning Intern, you will work on real-world projects, enhancing your practical skills in data analysis and model development.

Responsibilities
- Design, test, and optimize machine learning models
- Analyze and preprocess datasets
- Develop algorithms and predictive models
- Use tools like TensorFlow, PyTorch, and Scikit-learn
- Document findings and create reports

Requirements
- Enrolled in or a graduate of a relevant program (Computer Science, AI, Data Science, or related field)
- Knowledge of machine learning concepts and algorithms
- Proficiency in Python or R (preferred)
- Strong analytical and teamwork skills

Benefits
- Stipend: ₹7,500 – ₹15,000 (performance-based)
- Hands-on machine learning experience
- Internship Certificate & Letter of Recommendation
- Real-world project contributions for your portfolio

Equal Opportunity
Unified Mentor is an equal-opportunity employer, welcoming candidates from all backgrounds.
Posted 3 weeks ago
0 years
0 Lacs
India
Remote
Job Title: Data Science Intern
Job Type: Internship (3 to 6 Months)
Location: Remote / Pune, India
Stipend: Unpaid (with opportunity for a full-time offer upon completion)
Work Mode: Remote (with optional in-office collaboration)

About Coreline Solutions
Coreline Solutions is an innovation-led IT services and consulting company helping organizations leverage the power of data and technology. We specialize in custom software development, digital transformation, and building intelligent data solutions. With a culture rooted in learning and growth, we offer opportunities that challenge, empower, and elevate your career.

Website: www.corelinesolutions.site
Email: hr@corelinesolutions.site
Address: 2nd Floor, TechHub Plaza, Pune, India

About the Role
We are looking for a highly motivated Data Science Intern to join our team. This is an exciting opportunity for students or recent graduates who are eager to apply theoretical knowledge to real-world datasets and gain hands-on experience in data science projects. You'll work closely with our data science and engineering teams on impactful initiatives involving predictive modeling, data wrangling, and algorithm development.

Key Responsibilities
- Assist in designing and building machine learning models.
- Collect, clean, and preprocess structured and unstructured datasets.
- Perform exploratory data analysis (EDA) to identify patterns and insights.
- Support data science projects by implementing algorithms and validating results.
- Work on statistical modeling, feature engineering, and model evaluation.
- Contribute to the development of automation tools and pipelines.

Qualifications
- Pursuing or recently completed a degree in Data Science, Computer Science, Statistics, Engineering, or a related field.
- Strong foundation in Python and relevant libraries (NumPy, pandas, scikit-learn, Matplotlib, etc.).
- Understanding of statistics, linear regression, classification, clustering, and model evaluation.
- Experience with Jupyter Notebooks, Git, and collaborative coding.
- Basic knowledge of SQL and database systems.
- Strong problem-solving and analytical thinking skills.

Preferred (Nice to Have)
- Exposure to deep learning frameworks like TensorFlow or PyTorch.
- Knowledge of cloud platforms (AWS, Google Cloud, or Azure).
- Experience with real-world datasets or open-source projects.
- Understanding of business problem framing and solution deployment.

What You'll Gain
- Exposure to real-time data science problems and solutions.
- Mentorship and feedback from experienced data scientists and engineers.
- Access to in-house training materials and tools.
- Internship Certificate on successful completion.
- Letter of Recommendation for exceptional performance.
- Strong chance of full-time placement based on performance.

Equal Opportunity Statement
Coreline Solutions is an equal opportunity employer. We are committed to fostering an inclusive workplace where diversity is valued and discrimination of any kind is not tolerated.

Application Instructions
Send your resume and a short cover letter to hr@corelinesolutions.site with the subject line: "Application for Data Science Intern – [Your Full Name]". Stay connected and follow our LinkedIn page to keep up with more openings and updates from Coreline Solutions.
Posted 3 weeks ago
0 years
0 Lacs
India
Remote
Job Title: Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
- Collect, preprocess, and analyze large datasets.
- Develop predictive models and machine learning algorithms.
- Perform exploratory data analysis (EDA) to extract meaningful insights.
- Create data visualizations and dashboards for effective communication of findings.
- Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
- Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field.
- Proficiency in Python or R for data analysis and modeling.
- Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
- Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.

Stipend & Benefits
- Stipend: ₹7,500 – ₹15,000 (performance-based).
- Hands-on experience in data science projects.
- Certificate of Internship & Letter of Recommendation.
- Opportunity to build a strong portfolio of data science models and applications.
- Potential for full-time employment based on performance.

How to Apply
Submit your resume and a cover letter with the subject line "Data Science Intern Application".
Deadline: 24th May 2025

Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
What we're looking for:
- At least 5 years of experience designing and building AI applications for customers and deploying them into production.
- Software engineering experience building secure, scalable, and performant applications for customers.
- Experience with document extraction using AI, conversational AI, vision AI, NLP, or generative AI.
- Design, develop, and operationalize existing ML models by fine-tuning and personalizing them.
- Evaluate machine learning models and perform the necessary tuning.
- Develop prompts that instruct LLMs to generate relevant and accurate responses.
- Collaborate with data scientists and engineers to analyze and preprocess datasets for prompt development, including data cleaning, transformation, and augmentation.
- Conduct thorough analysis to evaluate LLM responses, and iteratively modify prompts to improve LLM performance.
- Hands-on customer experience with RAG solutions or fine-tuning of LLM models.
- Build and deploy scalable machine learning pipelines on GCP or an equivalent cloud platform, involving data warehouses, machine learning platforms, dashboards, or CRM tools.
- Experience with the end-to-end workflow, including but not limited to data cleaning, exploratory data analysis, handling outliers and imbalances, analyzing data distributions (univariate, bivariate, multivariate), transforming numerical and categorical data into features, feature selection, model selection, model training, and deployment.
- Proven experience building and deploying machine learning models in production environments for real-life applications.
- Good understanding of natural language processing, computer vision, or other deep learning techniques.
- Expertise in Python, NumPy, Pandas, and various ML libraries (e.g., XGBoost, TensorFlow, PyTorch, Scikit-learn, LangChain).
- Familiarity with Google Cloud or any other cloud platform and its machine learning services.
- Excellent communication, collaboration, and problem-solving skills.

Good to Have
- Google Cloud Certified Professional Machine Learning Engineer or TensorFlow Developer certifications, or equivalent.
- Experience working with one or more public cloud platforms, namely GCP, AWS, or Azure.
- Experience with AutoML and vision techniques.
- Master's degree in statistics, machine learning, or related fields.
Posted 3 weeks ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About the Business
Manupatra provides legal, regulatory, and analytics offerings that help customers increase their productivity, improve decision-making, achieve better outcomes, and advance the rule of law. As a digital pioneer, the company was the first to bring legal and business information online in India.

About Our Team
Manupatra, which serves customers in more than 20 countries, is a leading provider of information-based analytics and decision tools for professional and business customers. Our company has been a long-time leader in deploying advanced technologies to the legal market to improve productivity and transform the overall business and practice of law.

About the Role
For this Senior AI Engineer / Data Engineer role, we are looking for a skilled LLM application developer to join our team. You will be responsible for implementing large language model (LLM) based applications, working with proprietary and open-source models as well as popular frameworks to ensure seamless integration and deployment.

Responsibilities
- Build standalone applications that interact with LLM models.
- Build RAG-based applications.
- Understand vector databases, such as Solr, for LLM use cases.
- Preprocess and manage data for training and deployment.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code.
- Document development processes, code, and APIs.
- Design, prototype, implement, deploy, and maintain features for NLP or AI-related projects.

Requirements
- B.Tech in Computer Science, Data Science, or a related field, or equivalent experience.
- Proven experience building customer-facing ML-based APIs.
- Experience developing applications that scale to handle TBs of data.
- Strong knowledge of API integration (RESTful, GraphQL).
- Experience with data preprocessing, SQL, and NoSQL databases as well as vector stores (e.g., Postgres, MySQL, Solr, Elasticsearch/OpenSearch).
- Familiarity with deployment tools (Docker, Kubernetes).
- Experience with DevOps tools like Jenkins, Terraform, and CloudFormation templates is highly preferred.
- Excellent problem-solving and communication skills.
- Experience with Spark/Hadoop, EMR, or any other big data technology is a plus.
- Certifications in machine learning, data science, or cloud computing.
- Portfolio showcasing past projects or contributions to open-source projects.
- At least 3+ years of software engineering experience as a team member or team mentor in a mid-to-large technical company.
- Experience working with Python (e.g., Flask, Django, FastAPI) and, optionally, at least one other language such as Golang, Java, or SQL.
- Experience successfully implementing development processes, coding best practices, and code reviews; familiarity with CI/CD, DevOps, Redis, Docker, Kubernetes, and Azure.
- Good sense of software architecture design, application scaling, performance, and security.
- Solid verbal and written communication skills.
Posted 3 weeks ago
0 years
0 Lacs
India
Remote
Data Science Intern
Company: INLIGHN TECH
Location: Remote (100% Virtual)
Duration: 3 Months
Stipend for Top Interns: ₹15,000
Certificate Provided | Letter of Recommendation | Full-Time Offer Based on Performance

About the Company
INLIGHN TECH empowers students and fresh graduates with real-world experience through hands-on, project-driven internships. The Data Science Internship is designed to equip you with the skills required to extract insights, build predictive models, and solve complex problems using data.

Role Overview
As a Data Science Intern, you will work on real-world datasets to develop machine learning models, perform data wrangling, and generate actionable insights. This internship will help you strengthen your technical foundation in data science while working on projects that have a tangible business impact.

Key Responsibilities
- Collect, clean, and preprocess data from various sources
- Apply statistical methods and machine learning techniques to extract insights
- Build and evaluate predictive models for classification, regression, or clustering tasks
- Visualize data using libraries like Matplotlib and Seaborn, or tools like Power BI
- Document findings and present results to stakeholders in a clear and concise manner
- Collaborate with team members on data-driven projects and innovations

Qualifications
- Pursuing or recently completed a degree in Data Science, Computer Science, Mathematics, or a related field
- Proficiency in Python and data science libraries (NumPy, Pandas, Scikit-learn, etc.)
- Understanding of statistical analysis and machine learning algorithms
- Familiarity with SQL and data visualization tools or libraries
- Strong analytical, problem-solving, and critical thinking skills
- Eagerness to learn and apply data science techniques to solve real-world problems

Internship Benefits
- Hands-on experience with real datasets and end-to-end data science projects
- Certificate of Internship upon successful completion
- Letter of Recommendation for top performers
- Build a strong portfolio of data science projects and models
Posted 3 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Micoworks is a company with a clear mission: to Empower every brand for the better future. This ambitious goal sets the stage for their vision and core values.

Who we are
By 2030, Micoworks aims to be the Asia No.1 Brand Empowerment Company. This mid-term goal outlines their dedication to becoming the leading force in empowering brands across Asia. To achieve their mission and vision, Micoworks identifies four key values that guide their work:
- WOW THE CUSTOMER
- SMART SPEED
- OPEN MIND
- ALL FOR ONE
Micoworks' mission, vision, and values paint a picture of a company dedicated to empowering brands, working with agility and open-mindedness, and prioritising customer success.

Job Summary
The Senior Data Scientist will work on data-driven initiatives to solve complex business challenges, leveraging advanced analytics, machine learning, and statistical modeling. This role requires expertise in translating data insights into actionable strategies and collaborating with cross-functional teams. Ideal candidates will have a strong background in analytics or tech-driven industries.

Key Responsibilities
- Develop and deploy predictive models (e.g., customer lifetime value, media mix modeling, time-series forecasting) using Python/R, TensorFlow, or PyTorch.
- Clean, preprocess, and validate large datasets (structured/unstructured) from multiple sources.
- Partner with stakeholders (e.g., marketing, finance) to design data-driven solutions (e.g., A/B testing).
- Ensure adherence to data privacy and ethical AI practices.
- Research and implement cutting-edge techniques (e.g., NLP, deep learning) to enhance business strategies.

Required Qualifications
- Education: Master's/PhD in Statistics, Computer Science, Econometrics, or related quantitative fields.
- Experience: 5+ years in data science, with expertise in:
  - Programming: Python/R, SQL, Spark, and libraries (Pandas, Scikit-learn).
  - Statistical methods: decision trees, regression, deep learning, and experimental design.
  - Cloud platforms: Azure, Databricks, or AWS.
- Soft skills: strong storytelling, stakeholder management, and problem-solving.
Posted 3 weeks ago
0 years
0 Lacs
India
Remote
AI and Machine Learning Intern
Company: INLIGHN TECH
Location: Remote (100% Virtual)
Duration: 3 Months
Stipend for Top Interns: ₹15,000
Certificate Provided | Letter of Recommendation | Full-Time Offer Based on Performance

About the Company
INLIGHN TECH empowers students and fresh graduates with real-world experience through hands-on, project-driven internships. The AI and Machine Learning Internship is crafted to provide practical exposure to building intelligent systems, enabling interns to bridge theoretical knowledge with real-world applications.

Role Overview
As an AI and Machine Learning Intern, you will work on projects involving data preprocessing, model development, and performance evaluation. This internship will strengthen your skills in algorithm design, model optimization, and deploying AI solutions to solve real-world problems.

Key Responsibilities
- Collect, clean, and preprocess datasets for training machine learning models
- Implement machine learning algorithms for classification, regression, and clustering
- Develop deep learning models using frameworks like TensorFlow or PyTorch
- Evaluate model performance using metrics such as accuracy, precision, and recall
- Collaborate on AI-driven projects, such as chatbots, recommendation engines, or prediction systems
- Document code, methodologies, and results for reproducibility and knowledge sharing

Qualifications
- Pursuing or recently completed a degree in Computer Science, Data Science, Artificial Intelligence, or a related field
- Strong foundation in Python and understanding of libraries such as Scikit-learn, NumPy, Pandas, and Matplotlib
- Familiarity with machine learning concepts like supervised and unsupervised learning
- Experience or interest in deep learning frameworks (TensorFlow, Keras, PyTorch)
- Good problem-solving skills and a passion for AI innovation
- Eagerness to learn and contribute to real-world ML applications

Internship Benefits
- Hands-on experience with real-world AI and ML projects
- Certificate of Internship upon successful completion
- Letter of Recommendation for top performers
- Build a strong portfolio of AI models and machine learning solutions
Posted 3 weeks ago
40.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: AI/ML Developer

Job Description
We are seeking a talented AI/ML Developer with expertise in machine learning and deep learning. This role requires a strong understanding of machine learning frameworks and deep learning models, along with specialization in either Large Language Models (LLMs) or Computer Vision. The ideal candidate will possess programming proficiency in Python and have experience with a variety of libraries and deployment tools.

Responsibilities
- Develop and implement machine learning models, focusing on deep learning architectures such as CNNs, RNNs, Transformers, and GANs.
- Specialize in either Large Language Models (LLMs) or Computer Vision applications.
- Analyze and preprocess structured and unstructured datasets to identify potential AI/ML use cases.
- Design, build, train, and deploy machine learning models using cloud platforms like AWS, GCP, or Azure.
- Optimize model performance and scalability in production environments using tools such as Docker and Kubernetes.
- Solve complex AI/ML problems efficiently and effectively.

Essential Skills
- Strong expertise in machine learning frameworks, including TensorFlow.
- Proficiency in Python and libraries like NumPy, OpenCV, Hugging Face, and Scikit-learn.
- Experience deploying models on cloud platforms and using deployment tools such as Docker and Kubernetes.
- Proven track record of solving complex AI/ML problems.

Additional Skills & Qualifications
- Experience in analyzing and preprocessing data to develop AI/ML use cases.
- Ability to design and deploy models for NLP applications using Large Language Models.
- Capability to build models for image recognition, object detection, and video analytics in Computer Vision.

Work Environment
This position offers an innovative work environment where cutting-edge technologies are utilized to develop advanced AI/ML solutions. The role involves collaboration with cross-functional teams and requires adaptability to dynamic project requirements. The work setting encourages continuous learning and application of machine learning techniques.

About Actalent
Actalent is a global leader in engineering and sciences services. For more than 40 years, we've helped visionary companies advance their goals. Headquartered in the United States, our teams span 150 offices across North America, EMEA, and APAC, with four delivery centers in India led by 1,000+ extraordinary employees who connect their passion with purpose every day. Our Bangalore, Hyderabad, Pune, and Chennai delivery centers are hubs of engineering expertise, with core capabilities in mechanical and electrical engineering, systems and software, and manufacturing engineering. Our teams deliver work across multiple industries, including transportation, consumer and industrial products, and life sciences. We serve more than 4,500 clients, including many Fortune 500 brands. Learn more about how we can work together at actalentservices.com.
Posted 3 weeks ago
3.0 years
0 Lacs
India
Remote
About BeGig BeGig is the leading tech freelancing marketplace. We empower innovative, early-stage, non-tech founders to bring their visions to life by connecting them with top-tier freelance talent. By joining BeGig, you're not just taking on one roleβyouβre signing up for a platform that will continuously match you with high-impact opportunities tailored to your expertise. Your Opportunity Join our network as a Computer Vision Engineer and help startups build AI systems that understand, analyze, and act on visual data. From object detection and facial recognition to medical imaging and video analytics, you'll work on real-world use cases that require state-of-the-art computer vision solutions. Role Overview As a Computer Vision Engineer, you will: Design, train, and deploy computer vision models for specific business applications Work with image, video, or 3D data to extract insights and automate workflows Collaborate with teams to integrate CV models into scalable products What Youβll Do Build and fine-tune models for classification, detection, segmentation, tracking, or OCR Use libraries like OpenCV, PyTorch, TensorFlow, Detectron2, or YOLO Preprocess and augment datasets to improve model robustness Deploy models using APIs, edge devices, or cloud-based inference tools Monitor performance and continuously optimize for accuracy and speed Technical Requirements 3+ years of experience in computer vision or deep learning Proficient in Python and frameworks like PyTorch, TensorFlow, or Keras Experience with OpenCV, scikit-image, and image/video processing pipelines Familiarity with model deployment using ONNX, TensorRT, or cloud services Bonus: experience with real-time CV, synthetic data, or 3D vision What Weβre Looking For A hands-on developer who can take vision-based problems from idea to production A freelancer who enjoys working with data-rich products and diverse use cases Someone who can collaborate with both technical and product teams to deliver real impact 
Why Join Us
- Work on challenging computer vision projects across industries
- Fully remote and flexible freelance opportunities
- Get matched with future roles in CV, AI, and edge deployment
- Join a growing network solving real-world problems with intelligent vision systems

Ready to bring vision to life? Apply now to become a Computer Vision Engineer with BeGig.
Posted 3 weeks ago
Hyderabad, Telangana, India
Remote
Work Level: Individual
Core: Disciplined
Leadership: Team Alignment
Industry Type: Information Technology
Function: Data Analyst
Key Skills: Python, Power App, Data Analysis, PL/SQL
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners. This is a remote position.

Responsibilities:
- Collect, clean, and preprocess data from various sources.
- Perform exploratory data analysis (EDA) to identify trends and patterns.
- Develop dashboards and reports using tools like Excel, Power BI, or Tableau.
- Use SQL to query and manipulate large datasets.
- Assist in building predictive models and performing statistical analyses.
- Present insights and recommendations based on data findings.
- Collaborate with cross-functional teams to support data-driven decision-making.

Requirements:
- Currently pursuing or recently completed a degree in Data Science, Statistics, Mathematics, Computer Science, or a related field.
- Strong analytical and problem-solving skills.
- Proficiency in Excel and SQL for data analysis.
- Experience with data visualization tools like Power BI, Tableau, or Google Data Studio.
- Basic knowledge of Python or R for data analysis is a plus.
- Understanding of statistical methods and data modeling concepts.
- Strong attention to detail and ability to work independently.
- Excellent communication skills to present insights clearly.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide.
Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: there are many more opportunities on the portal besides this one; depending on your skills, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
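The "use SQL to query and manipulate large datasets" responsibility in the listing above can be practised with nothing but the Python standard library. This sketch uses an in-memory SQLite database as a stand-in for a real warehouse; the `sales` table and its rows are entirely hypothetical illustration data:

```python
import sqlite3

# In-memory database standing in for a production table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("North", 80.0), ("South", 200.0)],
)

# A GROUP BY aggregate of the kind used during exploratory analysis.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('North', 200.0), ('South', 200.0)]
conn.close()
```

The same query shape (aggregate, group, order) transfers directly to larger engines such as Teradata or Postgres.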
Posted 3 weeks ago
India
Remote
Machine Learning Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 24th May 2025

About WebBoost Solutions by UM
WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.

Role Overview
As a Machine Learning Intern, you'll work on real-world projects, gaining practical experience in machine learning and data analysis.

Responsibilities
- Design, test, and optimize machine learning models.
- Analyze and preprocess datasets.
- Develop algorithms and predictive models for various applications.
- Use tools like TensorFlow, PyTorch, and scikit-learn.
- Document findings and create reports to present insights.

Requirements
- Enrolled in or a graduate of a relevant program (AI, ML, Data Science, Computer Science, or a related field).
- Knowledge of machine learning concepts and algorithms.
- Proficiency in Python or R (preferred).
- Strong analytical and teamwork skills.

Benefits
- Stipend: ₹7,500 - ₹15,000 (performance-based, paid).
- Practical machine learning experience.
- Internship Certificate & Letter of Recommendation.
- Build your portfolio with real-world projects.

How to Apply
Submit your application by 24th May 2025 with the subject line "Machine Learning Intern Application".

Equal Opportunity
WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.
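At its simplest, the "design, test, and optimize machine learning models" task the listing describes starts with fitting a model to data. As a library-free sketch (the toy data points are an invented illustration, not anything from the listing), here is one-variable ordinary least squares, the closed-form ancestor of what scikit-learn's `LinearRegression` does:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b on paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept follows from the means.
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Toy data lying exactly on y = 2x + 1, so the fit recovers those coefficients.
a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
print(a, b)  # 2.0 1.0
```

Real projects add held-out test data and an error metric on top of this, but the fit/evaluate loop is the same.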
Posted 3 weeks ago
India
Remote
Job Title: Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
- Collect, preprocess, and analyze large datasets.
- Develop predictive models and machine learning algorithms.
- Perform exploratory data analysis (EDA) to extract meaningful insights.
- Create data visualizations and dashboards for effective communication of findings.
- Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
- Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field.
- Proficiency in Python or R for data analysis and modeling.
- Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
- Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.

Stipend & Benefits
- Stipend: ₹7,500 - ₹15,000 (performance-based).
- Hands-on experience in data science projects.
- Certificate of Internship & Letter of Recommendation.
- Opportunity to build a strong portfolio of data science models and applications.
- Potential for full-time employment based on performance.

How to Apply
Submit your resume and a cover letter with the subject line "Data Science Intern Application".
Deadline: 24th May 2025

Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
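The "collect, preprocess, and analyze" step the listing opens with often begins with standardizing numeric features. As a minimal standard-library sketch (the `scores` values are invented illustration data), here is z-score normalization using Python's `statistics` module:

```python
from statistics import mean, stdev

def standardize(values):
    """Z-score normalization: rescale to mean 0 and (sample) standard deviation 1."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

scores = [10, 20, 30, 40, 50]
z = standardize(scores)
print([round(v, 3) for v in z])
```

Note that `stdev` is the sample standard deviation (divides by n - 1); libraries that divide by n will give slightly different z-scores on small datasets.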
Posted 3 weeks ago
The preprocess job market in India is thriving with opportunities for skilled professionals in various industries. Preprocess roles are crucial for data processing, cleaning, and transformation tasks that are essential for businesses to make informed decisions and gain insights from data. Job seekers with expertise in preprocess tools and techniques are in high demand across industries like IT, finance, healthcare, marketing, and more.
Major cities across India are actively hiring for preprocess roles, offering a wide range of opportunities for job seekers looking to kickstart or advance their careers in this field.
The average salary range for preprocess professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 3-5 lakhs per annum, while experienced preprocess specialists can command salaries ranging from INR 8-15 lakhs per annum.
In the preprocess domain, a typical career path may progress as follows:
- Junior Preprocess Analyst
- Preprocess Specialist
- Senior Preprocess Engineer
- Preprocess Team Lead
- Preprocess Manager
As professionals gain experience and expertise in preprocess tools and techniques, they can advance to higher roles with more responsibilities and leadership opportunities.
In addition to expertise in preprocess tools and techniques, professionals in this field are often expected to have or develop skills in:
- Data analysis
- Data visualization
- Programming languages like Python, R, or SQL
- Machine learning
- Statistical analysis
Having a diverse skill set can enhance the career prospects of preprocess professionals and open up new opportunities in the data-driven industry.
As you explore preprocess jobs in India, remember to equip yourself with the necessary skills and knowledge to stand out in a competitive job market. Prepare thoroughly for interviews, showcase your expertise in preprocess tools and techniques, and apply confidently to secure exciting opportunities in this dynamic field. Good luck on your job search journey!