
585 Preprocess Jobs - Page 7

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

2.0 - 3.0 years

0 Lacs

Greater Hyderabad Area

On-site


Job Description
Design and develop machine learning models tailored to mechanical engineering challenges, including predictive modelling, simulation optimisation, and failure analysis. Utilise deep learning and other advanced ML techniques to improve the accuracy and efficiency of CAE simulations. Preprocess and analyse large datasets from CAE simulations, experimental tests, and manufacturing processes for modelling. Train, validate, and fine-tune machine learning models using real-world engineering data. Optimise models for performance, scalability, and robustness in production environments. Collaborate with CAE engineers to integrate ML models into existing simulation workflows (e.g., FEA, CFD, structural analysis). Automate repetitive simulation tasks and enable predictive analytics for design optimisation. Work closely with mechanical engineers, data scientists, and software developers to identify business challenges and develop data-driven solutions. Deploy machine learning models into production environments and monitor their performance. Maintain and update models to ensure reliability and continuous improvement. Stay abreast of the latest advancements in machine learning, AI, and CAE technologies. Apply innovative approaches to solve complex engineering problems.

Requirements
Bachelor's or Master's degree in Mechanical Engineering, Computer Science, or a related field. Proven 2-3 years of experience in developing and deploying machine learning models, preferably in the mechanical engineering or CAE domain. Hands-on experience with CAE tools such as ANSYS, Abaqus, or similar FEA/CFD software. Strong programming skills in Python, R, or Java. Proficiency in machine learning frameworks (TensorFlow, PyTorch, scikit-learn). Experience with data preprocessing, feature engineering, and statistical analysis. Solid understanding of mathematics and statistics, with strong problem-solving skills. Excellent analytical thinking and the ability to tackle complex engineering challenges. Strong communication and teamwork skills to collaborate across disciplines.
Preferred: experience with physics-informed machine learning and digital twin technologies. Preferred: familiarity with automation of CAE workflows and predictive modelling for product design.

Benefits
A challenging job and the chance to team up with a young, dynamic professional group. The chance to build yourself as we grow. Remuneration that stays competitive and attractive to retain the best. The opportunity to join an organization experiencing year-on-year growth.
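
For readers unfamiliar with this kind of CAE surrogate-modelling work, here is a minimal, purely illustrative sketch of training a regression model on simulation-style data with scikit-learn; the data is synthetic and the feature names are hypothetical, so treat it as an outline rather than the employer's actual workflow.

```python
# Illustrative only: fit a surrogate model that predicts a simulated stress value
# from design parameters. Data, feature names, and the 80/20 split are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 4))                                  # e.g. thickness, load, radius, material index
y = 200 * X[:, 0] - 50 * X[:, 1] + 10 * rng.normal(size=500)    # pretend "max stress" from a solver

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = RandomForestRegressor(n_estimators=200, random_state=42).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```

Once a surrogate like this is accurate enough, it can stand in for expensive FEA/CFD runs during design-space exploration.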

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote


🧠 Data Science Intern (Remote)
Company: Coreline Solutions
Type: Internship (3 to 6 Months)
Mode: 100% Remote
Stipend: Unpaid (Full-time opportunity may be offered upon successful completion)

🌐 About Us
Coreline Solutions is a digital consulting and tech company focused on building smart, data-driven solutions for modern businesses. From AI integration to analytics platforms, we empower companies through technology, data science, and intelligent systems. Our team believes in continuous learning, transparency, and innovation, and we're looking for passionate interns to grow with us.

🎯 About the Internship
We're seeking a Data Science Intern who is eager to explore real-world applications of machine learning, data analysis, and automation. You'll be working alongside our engineering and analytics team to contribute to projects that improve business processes, insights, and outcomes. This internship is entirely remote, giving you the flexibility to learn and contribute from wherever you are.

📌 Key Responsibilities
Collect, clean, and preprocess structured and unstructured data
Perform exploratory data analysis (EDA) to extract insights
Build and evaluate predictive models using Python and ML libraries
Visualize data through tools such as matplotlib, seaborn, or Power BI
Support teams with statistical analysis, feature engineering, and reporting
Document models, results, and learnings in a collaborative environment

✅ What We're Looking For
Currently pursuing or recently completed a degree in Data Science, Computer Science, Statistics, or a related field
Good understanding of Python, NumPy, pandas, scikit-learn, and basic ML algorithms
Familiarity with SQL and data visualization tools
Analytical thinking and a curiosity for solving complex problems
Ability to work independently and meet project deadlines remotely

💡 Bonus Skills (Preferred but Not Required)
Exposure to cloud services (AWS, GCP, or Azure)
Basic knowledge of Git/GitHub for version control
Interest in NLP, deep learning, or data engineering

🎁 What You'll Gain
Hands-on experience with live data projects and business use-cases
Mentorship from experienced data scientists and tech leads
Internship Certificate upon completion
Letter of Recommendation for high-performing interns
Possibility of full-time placement based on performance and company needs

🤝 Our Commitment
We are proud to be an equal opportunity organization. Coreline Solutions values diversity and is committed to creating an inclusive space where all team members, interns, and applicants feel respected and supported. All internship communications and personal data will be handled responsibly and securely, in alignment with LinkedIn's Privacy Policy and Professional Community Policies.

📬 How to Apply
To apply, send your updated resume and a brief introduction to 📩 hr@corelinesolutions.site
Use the subject line: "Application – Data Science Intern – [Your Full Name]"
📌 Before applying, make sure your LinkedIn profile reflects your latest skills and projects.
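
The EDA step called out above is often the first deliverable in an internship like this; the snippet below is a minimal, hypothetical example of that first pass (invented file and column names, assuming pandas and matplotlib are installed).

```python
# Hypothetical first-pass EDA: the CSV, column names, and metric are illustrative.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("orders.csv", parse_dates=["order_date"])        # assumed dataset
df.info()                                                         # dtypes and non-null counts
print(df.isna().mean().sort_values(ascending=False).head())       # columns with the most missing values
print(df.describe(include="all").T)                               # summary statistics

monthly = df.groupby(df["order_date"].dt.to_period("M"))["revenue"].sum()
monthly.plot(kind="bar", title="Monthly revenue")
plt.tight_layout()
plt.show()
```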

Posted 1 week ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description
Kenvue is currently recruiting for Analyst, Data Science. This position reports to the Manager, Data Science and is based at Bengaluru, India.

Who We Are
At Kenvue, we realize the extraordinary power of everyday care. Built on over a century of heritage and rooted in science, we're the house of iconic brands - including Neutrogena, Aveeno, Tylenol, Listerine, Johnson's and BAND-AID® Brand Adhesive Bandages that you already know and love. Science is our passion; care is our talent. Our global team is made up of ~22,000 diverse and brilliant people, passionate about insights and innovation and committed to delivering the best products to our customers. With expertise and empathy, being a Kenvuer means having the power to impact the lives of millions of people every day. We put people first, care fiercely, earn trust with science and solve with courage – and have brilliant opportunities waiting for you! Join us in shaping our future – and yours.

Role reports to: Manager, Data Science
Location: Bengaluru, India
Travel %: 10%

What You Will Do
As Analyst, Data Science, you will develop and implement advanced analytics, optimization models, and machine learning solutions to solve complex Operations Research business problems. In this role, you will work closely with cross-functional teams to analyze large datasets, identify patterns, and generate insights that drive business value.

Key Responsibilities
Develop, refine, and review mathematical models to represent supply chain systems, including inventory management, production planning, transportation logistics, and distribution networks.
Apply advanced optimization techniques, such as linear programming, integer programming, network flow, simulation, and heuristic algorithms, to solve complex supply chain problems.
Collaborate with stakeholders to understand business objectives, constraints, and requirements, and translate them into mathematical models and optimization problems.
Collaborate with data scientists and business analysts to gather and preprocess data from various sources, ensuring data accuracy and integrity.
Analyze large datasets, extract relevant information, and identify patterns and trends to support decision-making processes.
Follow code versioning through Bitbucket/Git, documenting work on Confluence.
Build APIs for seamless integration with applications.
Stay up to date on the latest developments in operations research, supply chain management, and optimization techniques.
Conduct research and explore innovative approaches to address supply chain challenges and drive continuous improvement.

Required Qualifications
What We Are Looking For
Master's degree in Industrial Engineering, Operations Research, Management Science, or a related field.
2 – 4 years of business experience.
At least 2 years of experience in Supply Chain Optimization.
Proficiency in mathematical modelling and optimization techniques, such as linear programming, integer programming, network flow, simulation, and heuristic algorithms.
Strong programming skills in languages such as Python, with experience in optimization libraries (e.g., Gurobi, FICO) and data manipulation tools (e.g., pandas, NumPy).
Experience with data analysis, statistical modelling, and visualization using tools like SQL and Streamlit.
Hands-on Python programmer with strong experience in OOP.
Knowledge of supply chain concepts, including demand forecasting, inventory management, production planning, transportation logistics, and distribution networks.

Desired Qualifications
APICS certification – CPIM, CSCP
Microsoft certification (e.g., Azure Fundamentals)
Machine Learning certification

Kenvue is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, protected veteran status, or any other legally protected characteristic, and will not be discriminated against on the basis of disability.

Primary Location: Asia Pacific-India-Karnataka-Bangalore
Job Function: Digital Product Development
Job Qualifications: Master's degree in Industrial Engineering, Operations Research, Management Science, or a related field
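
As a rough illustration of the linear-programming work this role describes, the sketch below solves a toy two-product production-planning problem with scipy.optimize.linprog; the numbers are invented and the formulation is far simpler than a real supply-chain model.

```python
# Toy LP (all coefficients are made up): maximize 40*x1 + 30*x2 profit subject to
# machine-hour and labour-hour capacities. linprog minimizes, so negate the objective.
from scipy.optimize import linprog

c = [-40, -30]                       # negative profits per unit of products 1 and 2
A_ub = [[2, 1],                      # machine-hours consumed per unit
        [1, 2]]                      # labour-hours consumed per unit
b_ub = [100, 80]                     # available machine-hours and labour-hours
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
print("plan:", res.x, "profit:", -res.fun)
```

Real engagements typically swap in a dedicated solver such as Gurobi and add integer variables, but the modelling pattern (objective, constraints, bounds) stays the same.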

Posted 1 week ago

Apply

3.0 years

4 - 7 Lacs

Ahmedabad

On-site

Required Qualifications:
Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
3+ years of professional experience in AI/ML data engineering.

Technical Skills:
Proficiency in programming languages such as Python and R.
Strong understanding of machine learning algorithms and frameworks (e.g., TensorFlow, PyTorch, scikit-learn).
Experience with conversational AI platforms (preferred: Oracle Digital Assistant).
Experience with data processing tools and platforms (e.g., Apache Spark, Hadoop).
Familiarity with Oracle Cloud Infrastructure (OCI) and its services for deploying ODA and AI/ML models.
Experience with natural language processing (NLP) techniques and libraries (e.g., NLTK, spaCy).
Understanding of deep learning architectures for NLP (e.g., transformers, BERT).
Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
Knowledge of statistical analysis and data visualization using R.

Soft Skills:
Strong analytical and problem-solving skills.
Excellent communication and teamwork abilities.
Eagerness to learn new technologies and methodologies.

Key Responsibilities:
1. AI/ML Model Development:
Design, develop, and implement machine learning models to address complex business challenges.
Preprocess and analyze large datasets to extract valuable insights for model training.
Optimize model performance through hyperparameter tuning and algorithm selection.
Utilize R in conjunction with Python for a comprehensive data analysis toolkit.
2. Oracle Digital Assistant (ODA) Integration:
Develop and deploy intelligent conversational agents using Oracle Digital Assistant (ODA).
Integrate AI/ML models seamlessly with ODA to enhance the capabilities of chatbots and virtual assistants.
Continuously refine ODA solutions based on user feedback and performance metrics.
Leverage Oracle Cloud Infrastructure (OCI) for deploying and managing ODA solutions, ensuring scalability and reliability.
3. Data Engineering:
Build and maintain robust data pipelines to support AI/ML model training and deployment.
Implement data validation processes to ensure data quality and integrity.
Collaborate closely with data scientists and analysts to define data requirements and deliver actionable insights.
4. Collaboration and Communication:
Work effectively across functional teams, including data scientists, software engineers, and product managers.
Communicate technical concepts clearly to both technical and non-technical stakeholders.
Document development processes, models, and integrations for future reference.
5. Research and Innovation:
Stay abreast of the latest advancements in AI/ML, NLP, and conversational AI technologies.
Experiment with new algorithms and approaches to improve existing solutions.
Contribute to the development of best practices and standards for AI/ML engineering.
6. Documentation:
Create and maintain comprehensive project documentation.
Document code and APIs to ensure they are understandable and usable by other developers.

Job Type: Full-time
Pay: ₹448,780.74 - ₹767,935.03 per year
Schedule: Day shift
Experience: Data Engineer: 2 years (Preferred)
Work Location: In person
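
Since this role centres on conversational AI, here is a deliberately small, hypothetical sketch of an intent classifier of the kind that often backs a chatbot's NLU layer; the utterances and labels are invented, it uses scikit-learn rather than Oracle Digital Assistant, and it is only meant to show the shape of the problem.

```python
# Tiny, invented intent-classification example (not ODA code): TF-IDF features plus
# logistic regression, trained on a handful of made-up utterances.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

utterances = ["reset my password", "I forgot my login",
              "where is my invoice", "send last month's bill"]
intents = ["account", "account", "billing", "billing"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(utterances, intents)
print(clf.predict(["cannot log in to my account"]))   # expected: ['account']
```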

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote


Job Title: Machine Learning Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with Certificate of Internship
Application Deadline: 7th June 2025

About Unified Mentor
Unified Mentor provides students and graduates with hands-on learning opportunities and career growth in Machine Learning and Data Science.

Role Overview
As a Machine Learning Intern, you will work on real-world projects, enhancing your practical skills in data analysis and model development.

Responsibilities
✅ Design, test, and optimize machine learning models
✅ Analyze and preprocess datasets
✅ Develop algorithms and predictive models
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn
✅ Document findings and create reports

Requirements
🎓 Enrolled in or a graduate of a relevant program (Computer Science, AI, Data Science, or related field)
🧠 Knowledge of machine learning concepts and algorithms
💻 Proficiency in Python or R (preferred)
🤝 Strong analytical and teamwork skills

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based, Paid)
✔ Hands-on machine learning experience
✔ Internship Certificate & Letter of Recommendation
✔ Real-world project contributions for your portfolio

Equal Opportunity
Unified Mentor is an equal-opportunity employer, welcoming candidates from all backgrounds.

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote


Machine Learning Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 7th June 2025

About WebBoost Solutions by UM
WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.

Role Overview
As a Machine Learning Intern, you'll work on real-world projects, gaining practical experience in machine learning and data analysis.

Responsibilities
✅ Design, test, and optimize machine learning models.
✅ Analyze and preprocess datasets.
✅ Develop algorithms and predictive models for various applications.
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn.
✅ Document findings and create reports to present insights.

Requirements
🎓 Enrolled in or graduate of a relevant program (AI, ML, Data Science, Computer Science, or related field).
📊 Knowledge of machine learning concepts and algorithms.
🐍 Proficiency in Python or R (preferred).
🤝 Strong analytical and teamwork skills.

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based, Paid)
✔ Practical machine learning experience.
✔ Internship Certificate & Letter of Recommendation.
✔ Build your portfolio with real-world projects.

How to Apply
📩 Submit your application by 7th June 2025 with the subject: "Machine Learning Intern Application".

Equal Opportunity
WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.

Posted 1 week ago

Apply


3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Designation: ML / MLOps Engineer
Location: Noida (Sector 132)

Key Responsibilities:
• Model Development & Algorithm Optimization: Design, implement, and optimize ML models and algorithms using libraries and frameworks such as TensorFlow, PyTorch, and scikit-learn to solve complex business problems.
• Training & Evaluation: Train and evaluate models using historical data, ensuring accuracy, scalability, and efficiency while fine-tuning hyperparameters.
• Data Preprocessing & Cleaning: Clean, preprocess, and transform raw data into a suitable format for model training and evaluation, applying industry best practices to ensure data quality.
• Feature Engineering: Conduct feature engineering to extract meaningful features from data that enhance model performance and improve predictive capabilities.
• Model Deployment & Pipelines: Build end-to-end pipelines and workflows for deploying machine learning models into production environments, leveraging Azure Machine Learning and containerization technologies like Docker and Kubernetes.
• Production Deployment: Develop and deploy machine learning models to production environments, ensuring scalability and reliability using tools such as Azure Kubernetes Service (AKS).
• End-to-End ML Lifecycle Automation: Automate the end-to-end machine learning lifecycle, including data ingestion, model training, deployment, and monitoring, ensuring seamless operations and faster model iteration.
• Performance Optimization: Monitor and improve inference speed and latency to meet real-time processing requirements, ensuring efficient and scalable solutions.
• NLP, CV, GenAI Programming: Work on machine learning projects involving Natural Language Processing (NLP), Computer Vision (CV), and Generative AI (GenAI), applying state-of-the-art techniques and frameworks to improve model performance.
• Collaboration & CI/CD Integration: Collaborate with data scientists and engineers to integrate ML models into production workflows, building and maintaining continuous integration/continuous deployment (CI/CD) pipelines using tools like Azure DevOps, Git, and Jenkins.
• Monitoring & Optimization: Continuously monitor the performance of deployed models, adjusting parameters and optimizing algorithms to improve accuracy and efficiency.
• Security & Compliance: Ensure all machine learning models and processes adhere to industry security standards and compliance protocols, such as GDPR and HIPAA.
• Documentation & Reporting: Document machine learning processes, models, and results to ensure reproducibility and effective communication with stakeholders.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
• 3+ years of experience in machine learning operations (MLOps), cloud engineering, or similar roles.
• Proficiency in Python, with hands-on experience using libraries such as TensorFlow, PyTorch, scikit-learn, Pandas, and NumPy.
• Strong experience with Azure Machine Learning services, including Azure ML Studio, Azure Databricks, and Azure Kubernetes Service (AKS).
• Knowledge and experience in building end-to-end ML pipelines, deploying models, and automating the machine learning lifecycle.
• Expertise in Docker, Kubernetes, and container orchestration for deploying machine learning models at scale.
• Experience in data engineering practices and familiarity with cloud storage solutions like Azure Blob Storage and Azure Data Lake.
• Strong understanding of NLP, CV, or GenAI programming, along with the ability to apply these techniques to real-world business problems.
• Experience with Git, Azure DevOps, or similar tools to manage version control and CI/CD pipelines.
• Solid experience in machine learning algorithms, model training, evaluation, and hyperparameter tuning.
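
To make the model-deployment part of this role concrete, here is a minimal, hypothetical serving sketch using FastAPI; the pickled model path, endpoint name, and payload schema are all assumptions, and a production setup would add validation, logging, authentication, and containerization on top.

```python
# Minimal model-serving sketch (assumes a scikit-learn model pickled to model.pkl).
# Run with:  uvicorn app:app --host 0.0.0.0 --port 8000
import pickle
from typing import List

from fastapi import FastAPI
from pydantic import BaseModel

with open("model.pkl", "rb") as f:          # hypothetical artifact from a training pipeline
    model = pickle.load(f)

app = FastAPI()

class Features(BaseModel):
    values: List[float]                     # flat numeric feature vector

@app.post("/predict")
def predict(payload: Features):
    prediction = model.predict([payload.values])[0]
    return {"prediction": float(prediction)}
```

Packaging an app like this in a Docker image and deploying it to AKS behind a CI/CD pipeline is the usual next step in the lifecycle the posting describes.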

Posted 1 week ago

Apply

10.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description

Job Summary:
As an AI Developer at Emerson, you will be responsible for analyzing complex data sets to identify trends, develop predictive models, and deliver actionable insights. You will collaborate with multi-functional teams to understand business needs and provide data-driven solutions that improve decision-making and boost business growth.

In this Role, Your Responsibilities Will Be:
Analyze large, complex data sets using statistical methods and machine learning techniques to extract meaningful insights.
Develop and implement predictive models and algorithms to solve business problems and improve processes.
Develop visual representations and dashboards to clearly communicate findings and insights to collaborators.
Work with data engineers, product managers, and other team members to understand business requirements and deliver solutions.
Clean and preprocess data to ensure accuracy and completeness for analysis.
Prepare and present reports on data analysis, model performance, and important data to collaborators and management.
Participate in regular Scrum events such as Sprint Planning, Sprint Review, and Sprint Retrospective.
Stay updated with the latest industry trends and advancements in data science and machine learning techniques.

Who You Are:
You quickly and decisively act in constantly evolving, unexpected situations. You adjust communication content and style to meet the needs of diverse partners. You always keep the end in sight and put in extra effort to meet deadlines. You analyze multiple and diverse sources of information to define problems accurately before moving to solutions. You observe situational and group dynamics and select the best-fit approach.

For This Role, You Will Need:
Bachelor's degree in Computer Science, Data Science, Statistics, or a related field, or equivalent experience (preferred).
Total 10-12 years of industry experience.
More than 5 years of experience in a data science or analytics role, with a strong track record of building and deploying models.
Solid grasp of machine learning techniques and algorithms, such as GPTs, CNN, RNN, k-NN, Naive Bayes, SVM, Decision Forests, etc.
Experience with NLP, NLG, and Large Language Models such as BERT, LLaMA, LaMDA, GPT, BLOOM, PaLM, DALL-E, etc.
Proficiency in programming languages such as Python or R, and experience with data manipulation libraries (e.g., pandas, NumPy).
Experience with machine learning frameworks and libraries such as Go, TensorFlow, PyTorch.
Familiarity with data visualization tools (e.g., Tableau, Power BI, Matplotlib, Seaborn).
Experience with SQL and NoSQL databases such as MongoDB, Cassandra, and vector databases.
Strong analytical and problem-solving skills, with the ability to work with complex data sets and extract actionable insights.
Strong verbal and written communication skills, capable of explaining complex technical details to non-technical collaborators.

Preferred Qualifications that Set You Apart:
Prior experience in the engineering domain and working with teams in the Scaled Agile Framework (SAFe) is nice to have.
Possession of relevant certifications in data science from reputed universities specializing in AI.
Familiarity with cloud platforms; Microsoft Azure is preferred.
Ability to work in a fast-paced environment and handle various projects simultaneously.
Strong analytical and troubleshooting skills, with the ability to resolve issues related to model performance and infrastructure.

Our Culture & Commitment to You
At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives, because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive. Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams, working together, are key to driving growth and delivering business results.
We recognize the importance of employee wellbeing. We prioritize providing competitive benefits plans, a variety of medical insurance plans, an Employee Assistance Program, employee resource groups, recognition, and much more. Our culture offers flexible time off plans, including paid parental leave (maternal and paternal), vacation and holiday leave.

About Us
WHY EMERSON

Our Commitment to Our People
At Emerson, we are motivated by a spirit of collaboration that helps our diverse, multicultural teams across the world drive innovation that makes the world healthier, safer, smarter, and more sustainable. And we want you to join us in our bold aspiration. We have built an engaged community of inquisitive, dedicated people who thrive knowing they are welcomed, trusted, celebrated, and empowered to solve the world's most complex problems, for our customers, our communities, and the planet. You'll contribute to this vital work while further developing your skills through our award-winning employee development programs. We are a proud corporate citizen in every city where we operate and are committed to our people, our communities, and the world at large. We take this responsibility seriously and strive to make a positive impact through every endeavor.
At Emerson, you'll see firsthand that our people are at the center of everything we do. So, let's go. Let's think differently. Learn, collaborate, and grow. Seek opportunity. Push boundaries. Be empowered to make things better. Speed up to break through. Let's go, together.

Accessibility Assistance or Accommodation
If you have a disability and are having difficulty accessing or using this website to apply for a position, please contact: idisability.administrator@emerson.com.

About Emerson
Emerson is a global leader in automation technology and software. Through our deep domain expertise and legacy of flawless execution, Emerson helps customers in critical industries like life sciences, energy, power and renewables, chemical, and advanced factory automation operate more sustainably while improving productivity, energy security, and reliability. With global operations and a comprehensive portfolio of software and technology, we are helping companies implement digital transformation to measurably improve their operations, conserve valuable resources, and enhance their safety. We offer equitable opportunities, celebrate diversity, and embrace challenges with confidence that, together, we can make an impact across a broad spectrum of countries and industries. Whether you're an established professional looking for a career change, an undergraduate student exploring possibilities, or a recent graduate with an advanced degree, you'll find your chance to make a difference with Emerson. Join our team – let's go!

No calls or agencies, please.

Posted 1 week ago

Apply


4.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Long Description

Day-to-Day Responsibilities:

Data Analysis
Understand the issues the business is facing and identify relevant data sources to define the scope of the associated impacts. Collect, select, and validate data relevant to the analysis. Extract and translate business data into actionable data. Analyze and validate heterogeneous, possibly unstructured data masses to extract useful knowledge to optimize the company's offers, services, or processes. Analyze large, complex data sets to identify trends, patterns, and insights. Utilize insights to propose process and/or system changes to drive continual service improvements.

Data Management
Implement processes for acquiring and integrating data from various sources. Clean and preprocess data to prepare it for analysis. Assist in data reconciliation and validation checks across various systems. Coordinate with IT and other departments to implement data solutions. Comply with established data governance policies and procedures and help maintain them. Complete end-to-end quality assurance activities to ensure reporting is accurate and aligns with established business definitions and objectives. Ensure periodic audits for reporting to maintain data integrity and validity.

General Reporting
Provide guidance to internal stakeholders on best practices for reporting, including source identification, report set-up, and report maintenance in systems of record. Collaborate with business leaders to understand their data needs and provide analytical support for decision-making. Develop and maintain key performance indicators to track the Global Services program's effectiveness. Complete regular analyses of key performance indicators to ensure performance demonstrates global best practices and standard procedures. Create and maintain advanced dashboards and reports using tools like Tableau, Power BI, or similar, to drive operational and business improvements. Develop and present insightful reports that highlight workflow status and value. Provide input and expertise regarding appropriate financial and operational KPIs/metrics to identify the most appropriate measures for Services reporting components. Present data findings to senior management and stakeholders in a clear and concise manner.

Advanced Analytics
Perform research, analyze reports, and create statistical models for presentation/review. Summarize findings and communicate results to partners. Develop and implement data models, algorithms, and statistical analyses to address business needs. Use statistical methods to interpret data and generate actionable insights. Integrate advanced analytics practices into reporting standards introduced to the business. Build and validate predictive models to forecast business outcomes and support strategic decision-making.

Documentation
Conduct data process mapping activities across various systems to drive continual service improvements and platform enhancements/business process improvements. Assist in documenting data lineage/data flows. Perform database structure research and create data mapping documents to understand relevant data elements along with their definitions and use. Coordinate and provide demos, training, and knowledge transfer for any Services reporting solution that launches.

IT Reporting Initiative
Collate reporting requirements prior to submission to IT, organizing and identifying data sources and intended reporting outputs for both internal and external stakeholders. Develop and present business case documentation and rationale for investments and the value/impact of reporting initiatives via the IT Steering Committee. Ensure new reporting meets the pre-defined acceptance criteria so that the expected business value/customer satisfaction outputs are achieved. Coordinate release planning, testing, and implementation of reporting features and functionality with the IT team and LOB. Work cross-functionally to ensure that external customer reporting requirements are accepted by Marketing, Product, and other functional leads. Responsible for business stakeholder management, providing clear and consistent communication of all report development to the business, management, and technology teams. Troubleshoot and resolve bugs/defects in Services reporting solutions, with the assistance of IT, as reported or observed by stakeholders. Special projects as required.

Must Haves
Proven ability to significantly contribute toward or lead operational initiatives with a results-oriented approach.
Ability to build internal relationships.
Exceptional attention to detail and structured problem-solving skills.
Excellent presentation skills; comfortable sharing thought leadership with executive stakeholders.
Proficiency in various business productivity systems.
Ability to balance multiple tasks with changing priorities.
Ability to negotiate conflict and maintain constructive working relationships with people at all levels of the organization.
Self-starter capable of working independently and ensuring deadlines are met.
Excellent communication and interpersonal skills, both verbal and written.
Skilled with Microsoft Office (Word, PowerPoint, Outlook, Excel, etc.).
Proficiency in SQL and with reporting and visualization tools such as SQL Server Reporting Services, DAX, Visual Studio, Report Builder, Qlik, and Power BI.
Extensive experience with ITSM platforms and integrated reporting (e.g., ServiceNow).
Understanding of data languages such as DAX, VBA, MDX, R, Python, and Power Query.
Knowledge of industry-specific metrics and reporting standards in both financial and operational domains (e.g., ASA, MTTR, NPV, ROI, cost ratios).
Understanding of relational databases and experience with database management systems.
Proven experience in advanced analytics, data mining, predictive modeling, and data processing techniques.
Demonstrated success in setting up reporting standards, as well as strategies for data upkeep and collection.

Nice-to-Haves
Familiarity with cloud platforms (e.g., AWS, Azure) is a plus.
A minimum of a 4-year degree or comparable industry experience is required.
Degree in a quantitative field (e.g., Statistics, Mathematics, or Computer Science) is preferred.
4+ years in a BI, data/business analyst, or reporting-centric role preferred.
Involvement in a Services role within the IT/UCC industry.
Familiarity with Agile development practices.
Demonstrated experience understanding and executing technical strategies and processes, resulting in exceeding business targets.
Certifications: data analysis or business intelligence tools (e.g., Microsoft Certified: Power BI Data Analyst Associate); process and quality improvement (e.g., Six Sigma, Lean, Quality); ITIL.
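
As a small, invented illustration of the KPI reporting described above, the snippet below computes a mean-time-to-resolve (MTTR) roll-up per team with pandas; the ticket data and column names are hypothetical, and the result would typically feed a Power BI or Tableau dataset.

```python
# Hypothetical KPI roll-up: mean time to resolve (hours) per team from ticket data.
import pandas as pd

tickets = pd.DataFrame({
    "team": ["NOC", "NOC", "Desk", "Desk"],
    "opened": pd.to_datetime(["2024-01-01 08:00", "2024-01-02 09:00",
                              "2024-01-01 10:00", "2024-01-03 11:00"]),
    "resolved": pd.to_datetime(["2024-01-01 12:00", "2024-01-02 15:00",
                                "2024-01-01 11:00", "2024-01-04 09:00"]),
})
tickets["ttr_hours"] = (tickets["resolved"] - tickets["opened"]).dt.total_seconds() / 3600
kpis = tickets.groupby("team")["ttr_hours"].agg(mttr_hours="mean", ticket_count="count")
print(kpis)
```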

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description: Data Scientist (Intern)

Position Overview:
As a Data Scientist, you will be responsible for analyzing and interpreting complex datasets to provide valuable insights and solve challenging business problems. You will work closely with cross-functional teams to gather data, build predictive models, develop algorithms, and deliver data-driven solutions. This position is ideal for a fresher who possesses strong analytical skills, programming expertise, and a passion for working with data.

Key Responsibilities:
Data Collection and Preprocessing: Collect, clean, and preprocess large volumes of structured and unstructured data from various sources. Conduct data quality assessments and implement data cleaning techniques to ensure accuracy and reliability.
Exploratory Data Analysis (EDA): Perform exploratory data analysis to understand the characteristics of the data and identify patterns, trends, and outliers. Utilize statistical techniques and visualizations to gain insights from the data.
Statistical Modeling and Machine Learning: Develop predictive models and algorithms using statistical techniques and machine learning algorithms. Apply regression analysis, classification, clustering, time series analysis, and other relevant methods to solve business problems. Evaluate model performance, fine-tune parameters, and optimize models for better accuracy.
Feature Engineering: Identify and engineer relevant features from raw data to enhance model performance. Conduct feature selection techniques to improve model interpretability and efficiency.
Model Deployment and Evaluation: Collaborate with software engineers to deploy models into production systems. Monitor model performance, diagnose issues, and implement improvements as needed.
Data Visualization and Reporting: Communicate findings and insights effectively through clear and concise data visualizations, reports, and presentations. Present complex data-driven concepts to non-technical stakeholders in a way that is easy to understand.
Continuous Learning and Research: Stay up-to-date with the latest advancements in data science, machine learning, and related fields. Conduct research and experimentation to explore new methodologies and approaches.
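
The model-evaluation responsibility above usually starts with a cross-validated baseline; the sketch below shows one way to get that number, using a bundled scikit-learn dataset rather than any company data.

```python
# Cross-validated baseline on a public toy dataset (illustrative only).
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```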

Posted 1 week ago

Apply

2.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site


Role: Computer Vision Engineer
Experience: 2+ years
Location: Indore (Onsite)

Responsibilities:
Develop, implement, and deploy machine learning models to address specific business challenges.
Collaborate with cross-functional teams to gather and analyze data, identify opportunities for leveraging machine learning, and define project goals.
Clean, preprocess, and analyze large datasets to extract meaningful insights and features for model training.
Stay abreast of the latest developments in machine learning and apply innovative solutions to enhance model performance.
Work closely with software engineers to integrate machine learning models into production systems.
Conduct thorough testing and validation of models to ensure accuracy, robustness, and scalability.
Collaborate with stakeholders to understand feedback and iterate on models for continuous improvement.
Document and communicate machine learning solutions effectively to both technical and non-technical audiences.
Keep up-to-date with industry best practices and contribute to a culture of learning and knowledge sharing within the team.

Qualifications:
Bachelor's/Master's/Ph.D. in Computer Science, Machine Learning, Data Science, or a related field.
Proven experience working on machine learning projects, with a focus on model development, deployment, and optimization.
Proficiency in programming languages such as Python, R, or Java, and familiarity with relevant libraries and frameworks (e.g., TensorFlow, PyTorch, scikit-learn).
Strong understanding of machine learning algorithms, statistics, and data structures.
Experience with data preprocessing, feature engineering, and model evaluation.
Solid understanding of software development practices and version control systems.
Excellent problem-solving and analytical skills.
Effective communication skills and ability to collaborate in a team-oriented environment.
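
For a computer-vision role, the data-preprocessing item above often boils down to steps like the hypothetical snippet below, which loads an image and shapes it into a model-ready batch; the file name and 224x224 target size are assumptions, and it relies only on opencv-python and NumPy.

```python
# Hypothetical image preprocessing: load, convert to RGB, resize, scale, and batch.
import cv2
import numpy as np

img = cv2.imread("part_photo.jpg")                  # BGR uint8; file name is illustrative
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)          # most frameworks expect RGB
img = cv2.resize(img, (224, 224))                   # a common CNN input size
x = img.astype(np.float32) / 255.0                  # scale pixel values to [0, 1]
x = np.transpose(x, (2, 0, 1))[np.newaxis, ...]     # HWC -> NCHW batch of one
print(x.shape)                                      # (1, 3, 224, 224)
```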

Posted 1 week ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description

Job Role: AI Engineer
Experience: 3 to 5 years
Location: Client Office – Pune, India
Job Type: Full-Time
Department: Artificial Intelligence / Engineering
Work Mode: On-site at client location

About the Role
We are seeking a highly skilled and versatile Senior AI Engineer with 3 to 5 years of hands-on experience to join our client's team in Pune. This role focuses on designing, developing, and deploying cutting-edge AI and machine learning solutions for high-scale, high-concurrency applications where security, scalability, and performance are paramount. You will work closely with cross-functional teams, including data scientists, DevOps engineers, security specialists, and business stakeholders, to deliver robust AI solutions that drive measurable business impact in dynamic, large-scale environments.

Job Summary:
We are seeking a passionate and experienced Node.js Developer to join our backend engineering team. As a key contributor, you will be responsible for building scalable, high-performance APIs, microservices, and backend systems that power our products and services. You will leverage modern technologies and best practices to design and implement robust, maintainable, and efficient solutions. You should have a deep understanding of Node.js, NestJS, and Express.js, along with hands-on experience designing and building complex backend systems.

Key Responsibilities
Architect, develop, and deploy advanced machine learning and deep learning models across domains like NLP, computer vision, predictive analytics, or reinforcement learning, ensuring scalability and performance under high-traffic conditions.
Preprocess, clean, and analyze large-scale structured and unstructured datasets using advanced statistical, ML, and big data techniques.
Collaborate with data engineering and DevOps teams to integrate AI/ML models into production-grade pipelines, ensuring seamless operation under high concurrency.
Optimize models for latency, throughput, accuracy, and resource efficiency, leveraging distributed computing and parallel processing where necessary.
Implement robust security measures, including data encryption, secure model deployment, and adherence to compliance standards (e.g., GDPR, CCPA).
Partner with client-side technical teams to translate complex business requirements into scalable, secure AI-driven solutions.
Stay at the forefront of AI/ML advancements, experimenting with emerging tools, frameworks, and techniques (e.g., generative AI, federated learning, or AutoML).
Write clean, modular, and maintainable code, along with comprehensive documentation and reports for model explainability, reproducibility, and auditability.
Proactively monitor and maintain deployed models, ensuring reliability and performance in production environments with millions of concurrent users.

Required Qualifications
Bachelor's or Master's degree in Computer Science, Machine Learning, Data Science, or a related technical field.
5+ years of experience building and deploying AI/ML models in production environments with high-scale traffic and concurrency.
Advanced proficiency in Python and modern AI/ML frameworks, including TensorFlow, PyTorch, Scikit-learn, and JAX.
Hands-on expertise in at least two of the following domains: NLP, computer vision, time-series forecasting, or generative AI.
Deep understanding of the end-to-end ML lifecycle, including data preprocessing, feature engineering, hyperparameter tuning, model evaluation, and deployment.
Proven experience with cloud platforms (AWS, GCP, or Azure) and their AI/ML services (e.g., SageMaker, Vertex AI, or Azure ML).
Strong knowledge of containerization (Docker, Kubernetes) and RESTful API development for secure and scalable model deployment.
Familiarity with secure coding practices, data privacy regulations, and techniques for safeguarding AI systems against adversarial attacks.

Preferred Skills
Expertise in MLOps frameworks and tools such as MLflow, Kubeflow, or SageMaker for streamlined model lifecycle management.
Hands-on experience with large language models (LLMs) or generative AI frameworks (e.g., Hugging Face Transformers, LangChain, or Llama).
Proficiency in big data technologies and orchestration tools (e.g., Apache Spark, Airflow, or Kafka) for handling massive datasets and real-time pipelines.
Experience with distributed training techniques (e.g., Horovod, Ray, or TensorFlow Distributed) for large-scale model development.
Knowledge of CI/CD pipelines and infrastructure-as-code tools (e.g., Terraform, Ansible) for scalable and automated deployments.
Familiarity with security frameworks and tools for AI systems, such as model hardening, differential privacy, or encrypted computation.
Proven ability to work in global, client-facing roles, with strong communication skills to bridge technical and business teams.

Share your CV at hr.mobilefirst@gmail.com / 6355560672.
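
Since Hugging Face Transformers appears in the preferred skills, here is a very small, hedged example of its high-level pipeline API for text classification; note that pipeline() downloads a default pretrained model on first run, and which default it picks can vary between library versions.

```python
# Minimal Transformers pipeline example (downloads a default pretrained model on first use).
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
results = classifier(["The deployment was seamless.", "Latency spiked after the release."])
print(results)   # a list of {'label': ..., 'score': ...} dictionaries
```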

Posted 1 week ago

Apply

0 years

0 Lacs

Lucknow, Uttar Pradesh, India

On-site


Selected Intern's Day-to-day Responsibilities Include:
Collaborate with the AI research team to design and develop AI models tailored to address specific challenges in the food industry.
Conduct research and experiments to explore novel approaches in LLMs and deep learning for solving real-world problems related to food production, distribution, and consumption.
Collect, preprocess, and analyze large datasets to train and evaluate AI models, ensuring accuracy, reliability, and scalability.
Implement algorithms and techniques for natural language processing, logic reasoning, and mathematical modeling to extract valuable insights from diverse sources of data.
Develop APIs for seamless integration of AI models with existing systems and third-party tools, ensuring compatibility, efficiency, and reliability.
Integrate AI solutions with third-party tools and platforms to enhance functionality and performance, leveraging APIs and SDKs for seamless data exchange and collaboration.
Perform quality assurance (QA) testing to validate the functionality, performance, and reliability of AI models, APIs, and integrated systems.
Utilize the Python stack and Conda-like technologies for development, ensuring consistency, reproducibility, and scalability of AI solutions.
Collaborate with cross-functional teams to identify requirements, define specifications, and prioritize features for API development and integration.
Stay updated on the latest advancements in AI, LLMs, and deep learning research, as well as emerging tools and technologies for API creation and integration.

About Company:
FoodNEST(S) - Bringing the restaurant to your doorstep. FoodNEST(S) is a fast-growing food tech startup founded in 2021 by Vatsal Asthana, an ambitiously keen food enthusiast. FoodNEST was founded with the sole purpose of reiterating the authentic food tradition of India and keeping alive all the astounding Indian delicacies that have been perfected across generations. We offer a curated selection of cuisines, from regional to international, from a variety of restaurants, making it easy for people to get their favorite restaurants at their doorstep.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Working as an AI/ML Engineer at Navtech, you will:
Design, develop, and deploy machine learning models for classification, regression, clustering, recommendations, or NLP tasks.
Clean, preprocess, and analyze large datasets to extract meaningful insights and features.
Work closely with data engineers to develop scalable and reliable data pipelines.
Experiment with different algorithms and techniques to improve model performance.
Monitor and maintain production ML models, including retraining and model drift detection.
Collaborate with software engineers to integrate ML models into applications and services.
Document processes, experiments, and decisions for reproducibility and transparency.
Stay current with the latest research and trends in machine learning and AI.

Who Are We Looking for Exactly?
2–4 years of hands-on experience in building and deploying ML models in real-world applications.
Strong knowledge of Python and ML libraries such as Scikit-learn, TensorFlow, PyTorch, XGBoost, or similar.
Experience with data preprocessing, feature engineering, and model evaluation techniques.
Solid understanding of ML concepts such as supervised and unsupervised learning, overfitting, regularization, etc.
Experience working with Jupyter, pandas, NumPy, and visualization libraries like Matplotlib or Seaborn.
Familiarity with version control (Git) and basic software engineering practices.
You consistently demonstrate strong verbal and written communication skills as well as strong analytical and problem-solving abilities.
You should have a Master's or Bachelor's (BS) degree in Computer Science, Software Engineering, IT, Technology Management, or a related field, with all education in the English medium.

We'll REALLY love you if you:
Have knowledge of cloud platforms (AWS, Azure, GCP) and ML services (SageMaker, Vertex AI, etc.).
Have knowledge of GenAI prompting and hosting of LLMs.
Have experience with NLP libraries (spaCy, Hugging Face Transformers, NLTK).
Have familiarity with MLOps tools and practices (MLflow, DVC, Kubeflow, etc.).
Have exposure to deep learning and neural network architectures.
Have knowledge of REST APIs and how to serve ML models (e.g., Flask, FastAPI, Docker).

Why Navtech?
Performance review and appraisal twice a year.
Competitive pay package with additional bonus & benefits.
Work with US, UK & Europe based industry-renowned clients for exponential technical growth.
Medical insurance cover for self & immediate family.
Work with a culturally diverse team from different geographies.

About Us
Navtech is a premier IT software and services provider. Navtech's mission is to increase public cloud adoption and build cloud-first solutions that become trendsetting platforms of the future. We have been recognized as the Best Cloud Service Provider at GoodFirms for ensuring good results with quality services. Here, we strive to innovate and push technology and service boundaries to provide best-in-class technology solutions to our clients at scale. We deliver to our clients globally from our state-of-the-art design and development centers in the US and Hyderabad. We're a fast-growing company with clients in the United States, UK, and Europe. We are also a certified AWS partner. You will join a team of talented developers, quality engineers, and product managers whose mission is to impact more than 100 million people across the world with technological services by the year 2030.
Navtech is looking for an AI/ML Engineer to join our growing data science and machine learning team. In this role, you will be responsible for building, deploying, and maintaining machine learning models and pipelines that power intelligent products and data-driven decisions.
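
One of the responsibilities above is model drift detection; a common, simple starting point is to compare a feature's training distribution against recent production data with a two-sample statistical test. The sketch below does this with SciPy on synthetic data, and the 0.05 threshold is a conventional but arbitrary choice.

```python
# Simple drift check on one feature: two-sample Kolmogorov-Smirnov test (synthetic data).
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
train_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)   # distribution seen at training time
prod_feature = rng.normal(loc=0.3, scale=1.0, size=1_000)    # recent production data (shifted)

stat, p_value = ks_2samp(train_feature, prod_feature)
if p_value < 0.05:
    print(f"Possible drift (KS={stat:.3f}, p={p_value:.4f}); consider investigating or retraining")
```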

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote


AI & Machine Learning Intern
📍 Location: Remote (100% Virtual)
📅 Duration: 3 Months
💸 Stipend for Top Interns: ₹15,000
🎁 Perks: Certificate | Letter of Recommendation | Full-Time Offer (Based on Performance)

About INLIGHN TECH
INLIGHN TECH is focused on delivering practical, project-driven learning experiences to help students and graduates build careers in emerging technologies. Our AI & Machine Learning Internship is designed to offer hands-on experience in building intelligent systems and solving real-world problems using data.

🚀 Internship Overview
As an AI & ML Intern, you will work on projects involving machine learning models, data preprocessing, and algorithm development. This internship will equip you with the skills to apply AI techniques in various domains, including natural language processing, computer vision, and predictive analytics.

🔧 Key Responsibilities
Clean and preprocess datasets for training and testing machine learning models
Build, train, and evaluate ML models using Python libraries like scikit-learn, TensorFlow, PyTorch, and Keras
Work on projects involving classification, regression, clustering, NLP, or image processing
Analyze model performance and optimize results through hyperparameter tuning
Collaborate with team members to implement AI solutions for real-world scenarios
Present findings through visualizations, reports, and presentations

✅ Qualifications
Pursuing or recently completed a degree in Computer Science, Data Science, Engineering, or related fields
Strong foundation in Python programming and statistics
Understanding of machine learning algorithms and AI concepts
Familiarity with Jupyter Notebook, pandas, NumPy, and visualization libraries like Matplotlib/Seaborn
Bonus: exposure to NLP, deep learning, or AI model deployment tools
Curiosity, creativity, and a passion for solving problems with data

🎓 What You'll Gain
Hands-on experience with real datasets and applied ML projects
Knowledge of industry-standard AI tools and workflows
A portfolio of AI/ML projects you can showcase to employers
Internship Certificate upon successful completion
Letter of Recommendation for outstanding performers
Opportunity for a Full-Time Offer based on performance
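
The hyperparameter-tuning responsibility above is commonly handled with a grid search; the example below runs one on a bundled scikit-learn dataset with an invented parameter grid, purely to illustrate the mechanics.

```python
# Illustrative grid search on a public toy dataset (the parameter grid is arbitrary).
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
search = GridSearchCV(
    SVC(),
    param_grid={"C": [0.1, 1, 10], "gamma": ["scale", 0.01]},
    cv=3,
    n_jobs=-1,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```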

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site


Role Overview
As a Data Science Analyst at Jai Kisan, you will play a critical role in transforming data into actionable insights to inform strategic business decisions. You'll work cross-functionally with product, engineering, operations, and leadership teams to unlock the full potential of data through advanced analytics, automation, and AI-driven insights. This role requires a solid foundation in data handling, modern analytics tooling, and a deep curiosity for leveraging emerging technologies like LLMs, vector databases, and cloud-native platforms.

Key Responsibilities
Collect, clean, preprocess, and validate datasets from diverse structured and unstructured sources including APIs, data lakes, and real-time streams.
Conduct exploratory data analysis (EDA) to identify trends, correlations, and business opportunities using statistical and machine learning techniques.
Build, maintain, and optimize scalable data pipelines using Airflow, dbt, or Dagster to support both batch and real-time analytics.
Develop and deploy AI/ML models, including LLM-based applications, for predictive analytics, recommendation systems, and automation use cases.
Work with vector databases (e.g., Pinecone, Weaviate, Chroma) for semantic search and embedding-based applications.
Design and manage dashboards and self-serve analytics tools using Power BI, Looker Studio, or Tableau to enable data-driven decisions.
Collaborate with backend and data engineers to integrate data solutions into microservices and APIs.
Interpret and clearly communicate complex analytical findings to stakeholders, including non-technical teams.
Stay ahead of industry trends including AI advancements, data governance, MLOps, vector search, and cloud-native services.

Required Skills & Technologies
Databases: Proficient in SQL, PostgreSQL, and MongoDB, with working knowledge of vector databases (e.g., Pinecone, FAISS, Weaviate).
Languages & Tools: Strong programming experience in Python (pandas, NumPy, scikit-learn, LangChain, PyTorch/TensorFlow), SQL, and optionally R.
Data Workflow Tools: Experience with Apache Airflow, dbt, Dagster, or similar tools.
BI & Visualization: Proficiency in Power BI, Tableau, Looker Studio, Plotly, or Matplotlib.
AI/ML: Exposure to LLMs (GPT, BERT, etc.), embedding models, and AI prompt engineering for analytics augmentation.
Data APIs & Embeddings: Familiarity with OpenAI, Cohere, and Hugging Face APIs for vector search and semantic understanding.
Cloud Platforms: Hands-on experience with AWS, GCP, or Azure, especially with services like S3, BigQuery, Redshift, Athena, or Azure Synapse.
Version Control & DevOps: Experience using Git, CI/CD pipelines, and Docker is a plus.

Qualifications
Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Mathematics, Economics, or a related field.
1–3 years of hands-on experience in a data analysis or applied machine learning role.
Strong problem-solving and storytelling abilities with a deep sense of ownership.
Excellent communication and collaboration skills; ability to translate technical findings into business impact.

Good To Have
Experience in MCP (Multi-Cloud Platforms) and cloud-agnostic data pipelines.
Understanding of data mesh, data fabric, or modern data stack architectures.
Contributions to open-source analytics tools or AI projects.
Knowledge of data privacy, compliance standards (GDPR, SOC2), and data security best practices.
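
Behind the vector-database and semantic-search items above sits a simple idea: embeddings are compared by cosine similarity. The snippet below shows that core computation with plain NumPy; the vectors are random stand-ins rather than real model embeddings, and no particular vector database is implied.

```python
# Core of semantic search: cosine similarity between a query embedding and a corpus.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

corpus = np.random.default_rng(0).normal(size=(100, 384))   # 100 pretend document embeddings
query = np.random.default_rng(1).normal(size=(1, 384))      # one pretend query embedding
scores = cosine_similarity(query, corpus)[0]
print(scores.argsort()[::-1][:5])                            # indices of the 5 closest documents
```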

Posted 1 week ago

Apply

1.0 - 4.0 years

0 Lacs

Thiruvananthapuram

On-site

Job Title: Python Developer – AI/ML
Experience: 1–4 Years
Location: Trivandrum – Work From Office (Kerala candidates only)
Employment Type: Full-Time

Key Responsibilities:
Develop and deploy ML models using Python
Work with libraries like NumPy, Pandas, Scikit-learn, TensorFlow, or PyTorch
Preprocess and analyze large datasets
Collaborate with data scientists and engineers for model integration
Optimize code and ensure performance in production environments

Requirements:
Strong Python programming skills
Hands-on experience with ML algorithms and data processing
Familiarity with model training, evaluation, and versioning
Knowledge of REST APIs and cloud platforms is a plus

Mail to thasleema@qcentro.com

Job Type: Permanent
Location Type: In-person
Application Question(s): Kerala candidates only
Experience: Python AI/ML: 1 year (Required)
Work Location: In person
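For the "deploy ML models" and "REST APIs" items above, here is a minimal, illustrative FastAPI sketch that serves a scikit-learn model behind a /predict endpoint. The inline demo model and its 4-feature input are assumptions for demonstration; a real service would load a trained, persisted artifact instead.

```python
# Minimal sketch: serving a scikit-learn model over a REST API with FastAPI.
# A tiny model is fitted inline so the sketch runs on its own; real deployments
# would load a trained artifact (e.g., via joblib) instead.
from typing import List

import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.linear_model import LogisticRegression

# Tiny stand-in model trained on synthetic data (illustrative only)
rng = np.random.default_rng(0)
X_demo = rng.normal(size=(200, 4))
y_demo = (X_demo[:, 0] + X_demo[:, 1] > 0).astype(int)
model = LogisticRegression().fit(X_demo, y_demo)

app = FastAPI()


class PredictRequest(BaseModel):
    features: List[float]  # this demo expects 4 values


@app.post("/predict")
def predict(req: PredictRequest):
    # scikit-learn expects a 2-D array: one inner list per sample
    pred = model.predict([req.features])
    return {"prediction": int(pred[0])}

# Run locally with:  uvicorn main:app --reload   (assuming this file is main.py)
```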

Posted 1 week ago

Apply

0 years

0 Lacs

Delhi

On-site

Job Summary:
We are looking for a highly skilled Data Management and Data Mining Specialist to join our team in Noida. The ideal candidate will manage and maintain data resources, ensure data quality, and apply data mining techniques to uncover valuable insights. This role involves close collaboration with cross-functional teams to optimize data processes, enhance decision-making, and support strategic business objectives.

Key Responsibilities:
Utilize advanced data mining techniques to identify patterns, trends, and actionable insights.
Collaborate with inside sales and business teams to understand requirements and provide data-driven solutions.
Support marketing and sales teams by delivering data insights for targeted campaigns and initiatives.
Clean, preprocess, and validate datasets to ensure accuracy and reliability.
Design and implement data models, pipelines, and efficient storage solutions.
Manage data uploads and maintenance on platforms like Outreach and Salesforce.
Ensure timely and effective data channelling to facilitate smooth inside sales operations.
Oversee and monitor all data uploads and processes on Outreach.

Skills and Qualifications:
Educational Background: Bachelor's or Master's degree in any field.
Technical Skills: Strong knowledge of tools such as ZoomInfo, Salesforce, Zero Bounce, Outreach, and LinkedIn. Hands-on experience with Microsoft Excel (advanced) and PowerPoint for data analysis and presentation. Familiarity with data modelling, data cleaning, and visualization tools is a plus.
Soft Skills: Strong attention to detail and problem-solving abilities. Excellent communication skills for interacting with diverse teams. Ability to multitask and manage deadlines in a fast-paced environment.
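The listing does not name a programming language, but as a purely illustrative sketch of what "clean, preprocess, and validate datasets" can look like before a CRM upload, here is a small pandas example that de-duplicates and validates a contact list. The column names and sample records are hypothetical.

```python
# Minimal sketch: cleaning and de-duplicating a contact list before a CRM upload.
# Column names and sample records are hypothetical; real data would come from an
# export (CSV/Excel) rather than an inline DataFrame.
import pandas as pd

df = pd.DataFrame({
    "email": [" Alice@Example.com", "bob@example.com", "alice@example.com", "not-an-email"],
    "company": ["Acme Corp ", "Globex", "Acme Corp", ""],
    "phone": ["+91-9000000000", None, "+91-9000000000", None],
})

# Normalise text fields and drop exact duplicates on the key field
df["email"] = df["email"].str.strip().str.lower()
df["company"] = df["company"].str.strip()
df = df.drop_duplicates(subset=["email"])

# Keep only rows with a plausibly valid email address
df = df[df["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)]

# Flag records missing key fields for manual review instead of silently dropping them
df["needs_review"] = df["phone"].isna() | df["company"].eq("")

print(df)
```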

Posted 1 week ago

Apply

1.0 years

0 Lacs

Lucknow

On-site

Job Title: AI Intern (LLM and Deep Learning Focus)
Company: FoodNEST(S) Technologies OPC Pvt Ltd.
Location: Lucknow, India
Duration: 6 months (with potential for full-time employment)
Compensation: Accommodation with food and a ₹4,000 monthly stipend

About Us:
FoodNEST(S) Technologies is a dynamic, fast-growing startup in Lucknow that is revolutionizing the AI landscape across businesses, and we are looking for highly capable and proactive individuals to join us. We're committed to using artificial intelligence and deep learning to turn heavy business workloads into seamless automated workflows. We're looking for top college graduates and final-year students to join our vibrant team. If you're passionate about technology, this is your chance to make a real impact in a fast-paced environment!

Internship Overview:
Join us at FoodNEST(S) Technologies as an AI Intern and dive into the world of cutting-edge technology! You'll team up with our seasoned experts to design and implement advanced AI solutions, specifically focusing on Large Language Models and Deep Learning techniques in the food industry. This internship isn't just about learning; it's a hands-on opportunity in a fast-paced startup environment where you'll make a real difference through impactful projects that drive innovation and change.

Responsibilities:
Collaborate with the AI research team to design and develop AI models tailored to address specific challenges in the food industry.
Conduct research and experiments to explore novel approaches in LLM and Deep Learning for solving real-world problems related to food production, distribution, and consumption.
Collect, preprocess, and analyze large datasets to train and evaluate AI models, ensuring accuracy, reliability, and scalability.
Implement algorithms and techniques for natural language processing, logic reasoning, and mathematical modeling to extract valuable insights from diverse sources of data.
Develop APIs for seamless integration of AI models with existing systems and third-party tools, ensuring compatibility, efficiency, and reliability.
Integrate AI solutions with third-party tools and platforms to enhance functionality and performance, leveraging APIs and SDKs for seamless data exchange and collaboration.
Perform quality assurance (QA) testing to validate the functionality, performance, and reliability of AI models, APIs, and integrated systems.
Utilize the Python stack and Conda-like technologies for development, ensuring consistency, reproducibility, and scalability of AI solutions.
Collaborate with cross-functional teams to identify requirements, define specifications, and prioritize features for API development and integration.
Stay updated on the latest advancements in AI, LLM, and Deep Learning research, as well as emerging tools and technologies for API creation and integration.

Qualifications:
Currently pursuing or recently graduated from a top-tier college or university, with a focus on computer science, artificial intelligence, machine learning, or related fields.
Strong understanding of AI concepts, including machine learning algorithms, deep learning architectures, and natural language processing techniques.
Proficiency in Python and AI frameworks such as TensorFlow, PyTorch, or similar.
Experience with API development and integration, including RESTful APIs, web services, and data exchange protocols.
Familiarity with third-party tools and platforms for AI development, such as cloud services (AWS, Azure, Google Cloud), data analytics tools, and collaboration platforms.
Knowledge of the Python stack and tools like Conda for development and environment management.
Excellent analytical and problem-solving skills, with a keen interest in tackling complex challenges and driving innovation.
Ability to work independently as well as collaboratively in a team environment, with strong communication and interpersonal skills.
Startup mindset with a passion for entrepreneurship, adaptability to dynamic and fast-changing environments, and a willingness to take on diverse responsibilities.
Prior experience or coursework in LLM and Deep Learning applications in the food industry is a plus, but not required.

Benefits:
Accommodation and food provided (breakfast, lunch, dinner) for candidates.
Monthly stipend of ₹4,000.
Opportunity for hands-on experience and mentorship from industry experts in the cutting-edge technology space.
Potential for full-time employment with FoodNEST(S) Technologies upon successful completion of the internship.
Chance to be part of a dynamic and innovative startup that is reshaping the future through technology-driven solutions.

Candidates who can apply:
1. Candidates who are ready to join immediately or within 15 days if offered the internship.
2. Candidates who are ready to relocate, as this is an on-site internship in Lucknow.

Join FoodNEST(S) Technologies and be part of a team that is revolutionizing businesses. Apply now to embark on an exciting journey.

Application Question(s):
Proficiency in Neural Networks, Generative AI, the Python language, Statistics, and Mathematics.
Can you provide an example from your past experience where you demonstrated a startup mindset by taking initiative, working under pressure, and adapting to rapidly changing circumstances to achieve a significant goal? How did you handle the challenges, and what was the outcome?

Job Type: Internship
Contract length: 6 months
Pay: ₹4,000.00 per month
Benefits: Food provided
Schedule: Day shift, Monday to Friday
Education: Bachelor's (Preferred)
Experience: total work: 1 year (Preferred)
Work Location: In person
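Since the internship centers on deep learning fundamentals, here is a minimal, illustrative PyTorch training-loop sketch on synthetic data. The network size, data, and hyperparameters are assumptions for demonstration only and are not taken from the listing.

```python
# Minimal PyTorch training-loop sketch on synthetic data.
# The network size, data, and hyperparameters are illustrative assumptions only.
import torch
from torch import nn

torch.manual_seed(0)

# Synthetic binary-classification data: 512 samples, 16 features
X = torch.randn(512, 16)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)

model = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(20):
    optimizer.zero_grad()
    logits = model(X)
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()

with torch.no_grad():
    accuracy = ((torch.sigmoid(model(X)) > 0.5).float() == y).float().mean()
print(f"final loss {loss.item():.4f}, train accuracy {accuracy.item():.2%}")
```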

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Indore

On-site

Experience: 3–8 years
Location: Indore
Type: On-site
Skills/Requirements: Python

Job Description: Data Analyst

Responsibilities:
Participate in discovery processes, collaborating with key stakeholders to identify business requirements and expected outcomes.
Collect, clean, and preprocess data from various sources to ensure data quality and consistency.
Use statistical tools to interpret data sets, paying particular attention to trends and patterns that could be valuable for diagnostic and predictive analytics.
Design and build dashboards and reports using Power BI or other data visualisation tools to provide actionable insights.
Create and maintain comprehensive documentation for all data analysis processes and methodologies.
Collaborate with development teams to integrate data insights into software applications and business processes.
Define business-specific performance metrics to measure data analysis effectiveness and monitor these metrics over time.

Requirements:
Graduate in Mathematics, Statistics, Computer Science, or a related field; advanced degree preferred.
3 or more years of relevant data analysis experience, particularly in collecting, organising, analysing, and disseminating significant amounts of information with attention to detail and accuracy.
Proficiency in using SQL for data querying and manipulation.
Hands-on experience with data visualisation tools such as Power BI, Tableau, or similar.
Excellent understanding of statistical techniques and their applications.
Programming experience in Python for data analysis.
Proficiency in Microsoft Excel, including advanced functions and data analysis tools.
Strong analytical skills with the ability to collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy.
Great communication skills and ability to convey complex topics to a cross-functional audience.

Skills: Data Analysis, SQL, Power BI, Tableau, Statistics, Python, Excel, Data Visualization, Data Cleaning.
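As a small illustration of the trend analysis and dashboard-feeding work described above, here is a pandas sketch that builds a monthly revenue summary with month-over-month growth. The column names and sample data are hypothetical, not from the employer.

```python
# Minimal sketch: monthly trend summary of the kind a Power BI / Tableau dashboard
# might consume. Column names and the sample data are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "order_date": pd.to_datetime(
        ["2024-01-15", "2024-01-20", "2024-02-10", "2024-02-25", "2024-03-05", "2024-03-18"]
    ),
    "region": ["North", "South", "North", "South", "North", "South"],
    "revenue": [1200, 800, 1500, 950, 1350, 1100],
})

monthly = (
    df.assign(month=df["order_date"].dt.to_period("M"))
      .groupby(["month", "region"], as_index=False)["revenue"]
      .sum()
)

# Month-over-month growth per region, a typical diagnostic metric
monthly["mom_growth_pct"] = monthly.groupby("region")["revenue"].pct_change() * 100

print(monthly)
```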

Posted 1 week ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Designation: ML / MLOps Engineer
Location: Noida (Sector 132)

Key Responsibilities:
• Model Development & Algorithm Optimization: Design, implement, and optimize ML models and algorithms using libraries and frameworks such as TensorFlow, PyTorch, and scikit-learn to solve complex business problems.
• Training & Evaluation: Train and evaluate models using historical data, ensuring accuracy, scalability, and efficiency while fine-tuning hyperparameters.
• Data Preprocessing & Cleaning: Clean, preprocess, and transform raw data into a suitable format for model training and evaluation, applying industry best practices to ensure data quality.
• Feature Engineering: Conduct feature engineering to extract meaningful features from data that enhance model performance and improve predictive capabilities.
• Model Deployment & Pipelines: Build end-to-end pipelines and workflows for deploying machine learning models into production environments, leveraging Azure Machine Learning and containerization technologies like Docker and Kubernetes.
• Production Deployment: Develop and deploy machine learning models to production environments, ensuring scalability and reliability using tools such as Azure Kubernetes Service (AKS).
• End-to-End ML Lifecycle Automation: Automate the end-to-end machine learning lifecycle, including data ingestion, model training, deployment, and monitoring, ensuring seamless operations and faster model iteration.
• Performance Optimization: Monitor and improve inference speed and latency to meet real-time processing requirements, ensuring efficient and scalable solutions.
• NLP, CV, GenAI Programming: Work on machine learning projects involving Natural Language Processing (NLP), Computer Vision (CV), and Generative AI (GenAI), applying state-of-the-art techniques and frameworks to improve model performance.
• Collaboration & CI/CD Integration: Collaborate with data scientists and engineers to integrate ML models into production workflows, building and maintaining continuous integration/continuous deployment (CI/CD) pipelines using tools like Azure DevOps, Git, and Jenkins.
• Monitoring & Optimization: Continuously monitor the performance of deployed models, adjusting parameters and optimizing algorithms to improve accuracy and efficiency.
• Security & Compliance: Ensure all machine learning models and processes adhere to industry security standards and compliance protocols, such as GDPR and HIPAA.
• Documentation & Reporting: Document machine learning processes, models, and results to ensure reproducibility and effective communication with stakeholders.

Required Qualifications:
• Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
• 3+ years of experience in machine learning operations (MLOps), cloud engineering, or similar roles.
• Proficiency in Python, with hands-on experience using libraries such as TensorFlow, PyTorch, scikit-learn, Pandas, and NumPy.
• Strong experience with Azure Machine Learning services, including Azure ML Studio, Azure Databricks, and Azure Kubernetes Service (AKS).
• Knowledge and experience in building end-to-end ML pipelines, deploying models, and automating the machine learning lifecycle.
• Expertise in Docker, Kubernetes, and container orchestration for deploying machine learning models at scale.
• Experience in data engineering practices and familiarity with cloud storage solutions like Azure Blob Storage and Azure Data Lake.
• Strong understanding of NLP, CV, or GenAI programming, along with the ability to apply these techniques to real-world business problems.
• Experience with Git, Azure DevOps, or similar tools to manage version control and CI/CD pipelines.
• Solid experience in machine learning algorithms, model training, evaluation, and hyperparameter tuning.
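To illustrate one common step in the pipeline and deployment work described above, here is a minimal scikit-learn sketch that packages preprocessing and a model into a single persisted artifact that a serving container could load at startup. The feature names, target, and tiny inline dataset are assumptions; this is not the team's actual pipeline.

```python
# Minimal sketch: packaging preprocessing + model as a single deployable artifact.
# Feature names, the target, and the tiny inline dataset are hypothetical; in a real
# pipeline the data would come from a feature store or training table.
import joblib
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "age": [25, 40, 31, 52, 23, 47, 36, 29],
    "income": [30000, 82000, 45000, 99000, 28000, 76000, 54000, 39000],
    "segment": ["retail", "enterprise", "retail", "enterprise",
                "retail", "enterprise", "retail", "retail"],
    "churned": [1, 0, 1, 0, 1, 0, 0, 1],
})
X, y = df[["age", "income", "segment"]], df["churned"]

preprocess = ColumnTransformer([
    ("num", StandardScaler(), ["age", "income"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["segment"]),
])

pipeline = Pipeline([
    ("preprocess", preprocess),
    ("model", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X, y)

# Persist one artifact that a serving container (e.g., a Docker image) can load
joblib.dump(pipeline, "churn_pipeline.joblib")
print(pipeline.predict(pd.DataFrame([{"age": 33, "income": 50000, "segment": "retail"}])))
```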

Posted 1 week ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Job Title: Data Scientist – Operations
Location: Kraft Heinz Global Capability Center (GCC), Ahmedabad

About Kraft Heinz
At Kraft Heinz, we are revolutionizing the food and beverage industry by leveraging data and innovation to deliver exceptional value to our customers. Our Global Capability Center (GCC) in Ahmedabad serves as a critical hub for operational excellence, driving efficiency and innovation across the organization.

Role Overview
The Data Scientist – Operations will play a key role in transforming operational processes through advanced analytics and data-driven decision-making. This role focuses on optimizing supply chain, manufacturing, and overall operations by developing predictive models, streamlining workflows, and uncovering insights to enhance efficiency and reduce costs.

Key Responsibilities
Advanced Analytics and Data Modeling
Develop predictive models for demand forecasting, inventory optimization, and supply chain resilience.
Leverage machine learning techniques to optimize production schedules, logistics, and procurement.
Build algorithms to predict and mitigate risks in operational processes.

Operational Efficiency
Analyze manufacturing and supply chain data to identify bottlenecks and recommend process improvements.
Implement solutions for waste reduction, cost optimization, and improved throughput.
Conduct root cause analysis for operational inefficiencies and develop actionable insights.

Collaboration with Stakeholders
Partner with operations, supply chain, and procurement teams to understand analytical needs and deliver insights.
Collaborate with IT and data engineering teams to ensure data availability and accuracy.
Present findings and recommendations to non-technical stakeholders in an accessible manner.

Data Management and Tools
Work with large datasets to clean, preprocess, and analyze data.

Location(s)
Ahmedabad - Venus Stratum GCC

Kraft Heinz is an Equal Opportunity Employer – Underrepresented Ethnic Minority Groups/Women/Veterans/Individuals with Disabilities/Sexual Orientation/Gender Identity and other protected classes.
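As an illustration of the demand-forecasting work described above, here is a minimal sketch that builds lag features over a synthetic weekly demand series and evaluates a gradient-boosted model on a time-based hold-out. The data and column names are assumptions for demonstration only, not Kraft Heinz data or methods.

```python
# Minimal sketch: demand forecasting with lag features and a gradient-boosted model.
# The synthetic weekly demand series and column names are illustrative assumptions only.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
dates = pd.date_range("2023-01-01", periods=104, freq="W")  # two years of weekly data
demand = 500 + 50 * np.sin(np.arange(104) * 2 * np.pi / 52) + rng.normal(0, 20, 104)
df = pd.DataFrame({"week": dates, "units_sold": demand})

# Lag features: demand 1 and 4 weeks earlier
for lag in (1, 4):
    df[f"lag_{lag}"] = df["units_sold"].shift(lag)
df = df.dropna()

# Time-based split: hold out the last 8 weeks for evaluation
train, test = df.iloc[:-8], df.iloc[-8:]
features = ["lag_1", "lag_4"]

model = GradientBoostingRegressor(random_state=42)
model.fit(train[features], train["units_sold"])

preds = model.predict(test[features])
print("MAE on hold-out weeks:", round(mean_absolute_error(test["units_sold"], preds), 1))
```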

Posted 1 week ago

Apply


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies