2.0 years
1 - 3 Lacs
Madurai
On-site
Hello, we are seeking a highly skilled and motivated Python Developer with 2 to 3 years of expertise in Artificial Intelligence (AI) and Machine Learning (ML) to join our technology team. The ideal candidate will have a strong foundation in Python programming, experience with AI/ML frameworks, and the ability to build scalable, intelligent systems that solve real-world problems.
Experience: Minimum 2 to 3 years
Required Skills:
Programming: Strong proficiency in Python and libraries such as NumPy, Pandas, and Scikit-learn.
AI/ML Frameworks: Experience with TensorFlow, PyTorch, Keras, or similar libraries; understanding of supervised, unsupervised, and deep learning algorithms.
Data Handling: Expertise in data preprocessing, feature engineering, and visualization; familiarity with SQL and NoSQL databases.
Tools & Technologies: Experience with Jupyter Notebooks, Git, Docker, and REST APIs; familiarity with cloud platforms such as AWS, Azure, or GCP is a plus.
Soft Skills: Problem-solving and analytical thinking; effective communication and teamwork; self-motivation and adaptability to fast-changing environments.
Preferred Qualifications:
Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field.
2+ years of experience in Python development with a focus on AI/ML.
Previous experience deploying ML models to production environments.
Key Responsibilities:
1. Design, develop, and deploy AI/ML models using Python and associated frameworks.
2. Collaborate with data scientists, engineers, and product teams to implement intelligent features in applications.
3. Collect, preprocess, and analyze structured and unstructured data for training and evaluation.
4. Build APIs or services to serve AI/ML models in production environments.
5. Optimize model performance and monitor outcomes for accuracy and relevance.
6. Research and implement best practices, tools, and technologies in the AI/ML ecosystem.
7. Maintain proper documentation and version control of code and models.
8. Ensure scalability, efficiency, and security of deployed solutions.
Work location: Madurai; Madurai-based candidates are highly preferred.
Office timings: 10 AM to 7:30 PM
Interested candidates can forward their updated resume to this WhatsApp number: +918122781352. An initial telephonic interview will be held for experienced candidates; further details will be discussed based on performance.
Job Types: Full-time, Permanent
Pay: ₹15,000.00 - ₹30,000.00 per month
Schedule: Day shift, fixed shift, morning shift
Supplemental Pay: Performance bonus, yearly bonus
Work Location: In person
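Responsibility 4 above, serving ML models behind an API, is commonly handled with a lightweight web framework. The snippet below is an illustrative sketch only, not part of the listing; the model file name, the /predict path, and the flat feature-vector schema are assumptions.

```python
# Minimal, hypothetical model-serving sketch with FastAPI (assumes a scikit-learn
# model was previously saved to model.joblib). Run with: uvicorn app:app
from typing import List

import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical path to a trained model

class Features(BaseModel):
    values: List[float]  # flat numeric feature vector, in training-column order

@app.post("/predict")
def predict(features: Features):
    X = np.asarray(features.values, dtype=float).reshape(1, -1)
    return {"prediction": model.predict(X).tolist()}
```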
Posted 16 hours ago
0 years
0 Lacs
India
Remote
🤖 Machine Learning Intern – Remote | Learn AI by Building It 📍 Location: Remote / Virtual 💼 Type: Internship (Unpaid) 🎁 Perks: Certificate After Completion || Letter of Recommendation (6 Months) 🕒 Schedule: 5–7 hrs/week | Flexible Timing Join Skillfied Mentor as a Machine Learning Intern and move beyond online courses. You’ll work on real datasets, build models, and see your algorithms in action — all while gaining experience that hiring managers actually look for. Whether you're aiming for a career in AI, data science, or automation — this internship will build your foundation with hands-on learning. 🔧 What You’ll Do: Work with real datasets to clean, preprocess, and transform data Build machine learning models using Python, NumPy, Pandas, Scikit-learn Perform classification, regression, and clustering tasks Use Jupyter Notebooks for experimentation and documentation Collaborate on mini-projects and model evaluation tasks Present insights in simple, digestible formats 🎓 What You’ll Gain: ✅ Full Python course included during the internship ✅ Hands-on projects to showcase on your resume or portfolio ✅ Certificate of Completion + LOR (6-month internship) ✅ Experience with industry-relevant tools & techniques ✅ Remote flexibility — manage your time with just 5–7 hours/week 🗓️ Application Deadline: 5th August 2025 👉 Apply now to start your ML journey with Skillfied Mentor
Posted 16 hours ago
0 years
0 Lacs
India
Remote
Machine Learning Intern (Paid) Company: Unified Mentor Location: Remote Duration: 3 months Opportunity: Full-time based on performance, with Certificate of Internship Application Deadline : 01st August 2025 About Unified Mentor Unified Mentor provides students and graduates with hands-on learning opportunities and career growth in Machine Learning and Data Science. Role Overview As a Machine Learning Intern, you will work on real-world projects, enhancing your practical skills in data analysis and model development. Responsibilities ✅ Design, test, and optimize machine learning models ✅ Analyze and preprocess datasets ✅ Develop algorithms and predictive models ✅ Use tools like TensorFlow, PyTorch, and Scikit-learn ✅ Document findings and create reports Requirements 🎓 Enrolled in or a graduate of a relevant program (Computer Science, AI, Data Science, or related field) 🧠 Knowledge of machine learning concepts and algorithms 💻 Proficiency in Python or R (preferred) 🤝 Strong analytical and teamwork skills Benefits 💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid) ✔ Hands-on machine learning experience ✔ Internship Certificate & Letter of Recommendation ✔ Real-world project contributions for your portfolio Equal Opportunity Unified Mentor is an equal-opportunity employer, welcoming candidates from all backgrounds.
Posted 19 hours ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Assist in developing Python-based applications and scripts supporting AI/ML workflows. Support model building and data analysis using libraries like NumPy, Pandas, Scikit-learn, TensorFlow, or PyTorch. Clean and preprocess datasets for training and validation using best practices in data wrangling. Work under senior developers/data scientists to deploy, test, and validate ML models. Contribute to automation scripts and tools that optimize data pipelines and ML operations. Document code, algorithms, and workflows for internal and external use. Stay updated with emerging trends in Python development and machine learning frameworks. Participate in code reviews and team meetings, ensuring continuous learning and improvement. Good knowledge of SQL is required.
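As a rough illustration of the preprocessing and model-support tasks listed above (not part of the posting), a typical Pandas/Scikit-learn workflow might look like the following; the column names and CSV path are hypothetical.

```python
# Illustrative data-wrangling and training sketch (hypothetical columns and file path).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("training_data.csv")                    # hypothetical dataset
df = df.drop_duplicates()
df["age"] = df["age"].fillna(df["age"].median())         # impute missing numerics
df = pd.get_dummies(df, columns=["category"])            # one-hot encode categoricals

X = df.drop(columns=["target"])
y = df["target"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("validation accuracy:", model.score(X_test, y_test))
```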
Posted 19 hours ago
4.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Greetings! One of our clients, a top MNC, is looking for a Data Scientist.
Important note: only candidates who can join immediately or within 7 days should apply.
Base locations: Gurgaon and Bengaluru (hybrid setup, 3 days work from office).
Role: Data Scientist
Experience: 4 to 8 years; immediate joiners only
Skills (must have):
Bachelor’s or master’s degree in Computer Science, Data Science, Engineering, or a related field.
Strong programming skills in languages such as Python, SQL, etc.
Experience in developing and deploying AI/ML and deep learning solutions with libraries and frameworks such as Scikit-learn, TensorFlow, PyTorch, etc.
Experience in ETL and data warehouse tools such as Azure Data Factory, Azure Data Lake, or Databricks.
Knowledge of math, probability, and statistics.
Familiarity with a variety of ML algorithms.
Good experience with cloud infrastructure such as Azure (preferred), AWS, or GCP.
Exposure to Gen AI, vector DBs, and LLMs (Large Language Models).
Skills (good to have):
Experience in Flask/Django or Streamlit is a bonus.
Experience with MLOps: MLflow, Kubeflow, CI/CD pipelines, etc.
Experience with Docker, Kubernetes, etc.
Collaborate with software engineers, business stakeholders, and/or domain experts to translate business requirements into product features, tools, projects, and AI/ML, NLP/NLU, and deep learning solutions.
Develop, implement, and deploy AI/ML solutions.
Preprocess and analyze large datasets to identify patterns, trends, and insights.
Evaluate, validate, and optimize AI/ML models to ensure their accuracy, efficiency, and generalizability.
Deploy applications and AI/ML models into cloud environments such as AWS, Azure, or GCP.
Monitor and maintain the performance of AI/ML models in production environments, identifying opportunities for improvement and updating models as needed.
Document AI/ML model development processes, results, and lessons learned to facilitate knowledge sharing and continuous improvement.
Interested candidates who match the JD and can join ASAP should apply along with the details below:
Total experience:
Relevant experience as a Data Scientist:
Applying for Gurgaon or Bengaluru:
Open to hybrid:
Current CTC:
Expected CTC:
Can join ASAP:
We will call you once we receive your updated profile along with the above details.
Thanks,
Venkat Solti
solti.v@anlage.co.in
Posted 20 hours ago
10.0 - 12.0 years
0 Lacs
Delhi, India
On-site
Sr/Lead ML Engineer
Placement type (FTE/C/CTH): C/CTH
Duration: 6 months, with extension
Location: Phoenix, AZ; must be onsite 5 days a week
Start date: 2 weeks from the offer
Interview process: one and done
Reason for position: integrating ML into the Grafana observability platform
Team Overview: onshore and offshore
Project Description: AI/ML for Observability (AIOps). Developed machine learning and deep learning solutions for observability data to enhance IT operations. Implemented time series forecasting, anomaly detection, and event correlation models. Integrated LLMs using prompt engineering, fine-tuning, and RAG for incident summarization. Built MCP client-server architecture for seamless integration with the Grafana ecosystem.
Duties/Day-to-Day Overview:
Machine Learning & Model Development: Design and develop ML/DL models for time series forecasting (e.g., system load, CPU/memory usage), anomaly detection in logs, metrics, or traces, and event classification and correlation to reduce alert noise. Select, train, and tune models using frameworks like TensorFlow, PyTorch, or scikit-learn. Evaluate model performance using metrics like precision, recall, F1-score, and AUC.
ML Pipeline Engineering: Build scalable data pipelines for training and inference (batch or streaming). Preprocess large observability datasets from tools like Prometheus, Kafka, or BigQuery. Deploy models using cloud-native services (e.g., GCP Vertex AI, Azure ML, Docker/Kubernetes). Maintain retraining pipelines and monitor for model drift.
LLM Integration for Observability Intelligence: Implement LLM-based workflows for summarizing incidents or logs. Develop and refine prompts for GPT, LLaMA, or other large language models. Integrate Retrieval-Augmented Generation (RAG) with vector databases (e.g., FAISS, Pinecone). Control latency, hallucinations, and cost in production LLM pipelines.
Grafana & MCP Ecosystem Integration: Build or extend MCP client/server components for Grafana. Surface ML model outputs (e.g., anomaly scores, predictions) in observability dashboards. Collaborate with observability engineers to integrate ML insights into existing monitoring tools.
Collaboration & Agile Delivery: Participate in daily stand-ups, sprint planning, and retrospectives. Collaborate with data engineers on pipeline performance and data ingestion, frontend developers on real-time data visualizations, and SRE and DevOps teams on alert tuning and feedback loop integration. Translate model outputs into actionable insights for platform teams.
Testing, Documentation & Version Control: Write unit, integration, and regression tests for ML code and pipelines. Maintain documentation on models, data sources, assumptions, and APIs. Use Git, CI/CD pipelines, and model versioning tools (e.g., MLflow, DVC).
Top Requirements (must haves), AI/ML Engineer Skills:
Design and develop machine learning algorithms and deep learning applications and systems for observability data (AIOps).
Hands-on experience with time series forecasting/prediction and anomaly detection ML algorithms (an illustrative sketch follows this posting).
Hands-on experience with event classification and correlation ML algorithms.
Hands-on experience integrating with LLMs via prompting, fine-tuning, or RAG for effective summarization.
Working knowledge of implementing MCP clients and servers for the Grafana ecosystem, or similar exposure.
Key Skills:
Programming languages: Python, R
ML frameworks: TensorFlow, PyTorch, scikit-learn
Cloud platforms: Google Cloud, Azure
Front-End Frameworks/Libraries: Experience with frameworks like React, Angular, or Vue.js, and libraries like jQuery.
Design Tools: Proficiency in design software like Figma, Adobe XD, or Sketch.
Databases: Knowledge of database technologies like MySQL, MongoDB, or PostgreSQL.
Server-Side Languages: Familiarity with server-side languages like Python, Node.js, or Java.
Version Control: Experience with Git and other version control systems.
Testing: Knowledge of testing frameworks and methodologies.
Agile Development: Experience with agile development methodologies.
Communication and Collaboration: Strong communication and collaboration skills.
Experience: Lead – 10 to 12 years (onshore and offshore); Developers (Engineers) – 6 to 8 years.
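To illustrate the time-series anomaly-detection work described in the posting above (purely a sketch, not the client's actual pipeline), scikit-learn's IsolationForest can flag unusual points in a metric stream; the synthetic CPU-usage series below is an assumption.

```python
# Illustrative anomaly detection on a synthetic CPU-usage series (not production code).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
cpu = 40 + 5 * np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 1, 500)  # baseline load
cpu[[120, 310, 450]] += 35  # inject three spikes to detect

# Simple features: the raw value plus its deviation from a short rolling mean.
rolling = np.convolve(cpu, np.ones(5) / 5, mode="same")
X = np.column_stack([cpu, cpu - rolling])

detector = IsolationForest(contamination=0.01, random_state=0).fit(X)
anomalies = np.where(detector.predict(X) == -1)[0]  # -1 marks anomalous timesteps
print("anomalous timesteps:", anomalies)
```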
Posted 1 day ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description:
Job Title: Python Developer / Backend Developer (AI/ML knowledge preferred), AWS Cloud
We are currently seeking a talented Python Developer with a strong foundation in software development and a keen interest in artificial intelligence and machine learning. While AI/ML knowledge is not mandatory, it is considered an asset for this role. As a Python Developer at EXL, you will have the opportunity to work on diverse projects and collaborate with cross-functional teams to deliver high-quality solutions.
Responsibilities:
Develop and maintain scalable and robust Python applications and services.
Collaborate with software engineers, data scientists, and other stakeholders to integrate AI/ML components into software solutions.
Assist in implementing AI/ML algorithms and models using Python-based libraries and frameworks.
Participate in code reviews, testing, and debugging activities to ensure the quality and reliability of software products.
Stay updated on emerging technologies and trends in AI/ML to contribute insights and ideas for enhancing our products and services.
Work closely with data engineers to access, preprocess, and analyze data for AI/ML model development.
Document code, processes, and best practices to facilitate knowledge sharing and collaboration within the team.
Provide support and assistance to other team members as needed.
Qualifications:
Bachelor’s or master’s degree in Computer Science, Engineering, or a related field.
Strong proficiency in the Python programming language.
Strong proficiency in AWS Cloud.
Familiarity with software development methodologies, tools, and best practices.
Understanding of basic concepts in artificial intelligence and machine learning is good to have.
Strong proficiency in Python programming for ML development.
Hands-on experience working with ML frameworks (TensorFlow, Scikit-learn, etc.).
Knowledge of Azure cloud, especially working with Azure ML Studio and Cognitive Services.
Knowledge of working with SQL and NoSQL databases and REST APIs.
Knowledge of Azure OpenAI is good to have and preferred.
Dataset preparation and cleansing for model creation.
Working knowledge of different types of data (structured, semi-structured, and unstructured).
Expertise in Python frameworks such as FastAPI, Flask, and Django.
Experience working with huge datasets and performing data analysis with Pandas and NumPy.
Experience working with Python ORM libraries.
Ability to handle large datasets.
Ability to work independently and collaboratively in a fast-paced environment.
Excellent problem-solving skills and attention to detail.
Effective communication and interpersonal skills.
While prior experience or knowledge in AI/ML is preferred, we welcome candidates who are passionate about learning and growing in this field. If you are a talented Python Developer looking to expand your skills and contribute to exciting projects, we encourage you to apply and join our dynamic team at EXL.
Posted 1 day ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid) Company: Unified Mentor Location: Remote Duration: 3 months Application Deadline: 01st August 2025 Opportunity: Full-time role based on performance + Internship Certificate About Unified Mentor Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers. Responsibilities Collect, preprocess, and analyze large datasets Develop predictive models and machine learning algorithms Perform exploratory data analysis (EDA) to extract insights Create data visualizations and dashboards for effective communication Collaborate with cross-functional teams to deliver data-driven solutions Requirements Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field Proficiency in Python or R for data analysis and modeling Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred) Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib Strong analytical and problem-solving skills Excellent communication and teamwork abilities Stipend & Benefits Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid) Hands-on experience in data science projects Certificate of Internship & Letter of Recommendation Opportunity to build a strong portfolio of data science models and applications Potential for full-time employment based on performance How to Apply Submit your resume and a cover letter with the subject line "Data Science Intern Application." Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
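The similarity-search requirement in the EY posting above can be illustrated with a simple cosine-similarity top-k lookup over embedding vectors. This is a sketch only; in practice a vector database such as Redis would hold the embeddings, and the vectors below are random placeholders.

```python
# Illustrative top-k cosine similarity search over placeholder embedding vectors.
import numpy as np

rng = np.random.default_rng(42)
corpus_embeddings = rng.normal(size=(1000, 384))   # stand-in for stored document embeddings
query_embedding = rng.normal(size=384)             # stand-in for an embedded query

def top_k_cosine(query: np.ndarray, corpus: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k corpus rows most similar to the query."""
    corpus_norm = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    query_norm = query / np.linalg.norm(query)
    scores = corpus_norm @ query_norm
    return np.argsort(scores)[::-1][:k]

print("top matches:", top_k_cosine(query_embedding, corpus_embeddings))
```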
Posted 1 day ago
2.0 - 3.0 years
2 - 6 Lacs
Hyderābād
Remote
We are looking for a highly motivated and skilled Generative AI (GenAI) Developer to join our dynamic team. You will be responsible for building and deploying GenAI solutions using large language models (LLMs) to address real-world business challenges. The role involves working with cross-functional teams, applying prompt engineering and fine-tuning techniques, and building scalable AI-driven applications. A strong foundation in machine learning and NLP, along with a passion for emerging GenAI technologies, is essential.
Responsibilities:
Design, develop, and implement GenAI solutions using large language models (LLMs) to address specific business needs using Python.
Collaborate with stakeholders to identify opportunities for GenAI integration and translate requirements into scalable solutions.
Preprocess and analyze unstructured data (text, documents, etc.) for model training, fine-tuning, and evaluation.
Apply prompt engineering, fine-tuning, and RAG (Retrieval-Augmented Generation) techniques to optimize LLM outputs.
Deploy GenAI models and APIs into production environments, ensuring performance, scalability, and reliability.
Monitor and maintain deployed solutions, incorporating improvements based on feedback and real-world usage.
Stay up to date with the latest advancements in GenAI, LLMs, and orchestration tools (e.g., LangChain, LlamaIndex).
Write clean, maintainable, and well-documented code, and contribute to team-wide code reviews and best practices.
Requirements:
2-3 years of relevant, proven experience as an AI Developer.
Proficiency in Python.
Good understanding of multiple GenAI models (OpenAI, LLaMA 2, Mistral) and the ability to set up local GPTs using Ollama, LM Studio, etc.
Experience with LLMs, RAG (Retrieval-Augmented Generation), and vector databases (e.g., FAISS, Pinecone) (an illustrative retrieval sketch follows this posting).
Experience with multi-agent frameworks for creating workflows, such as LangChain or similar tools like LlamaIndex, LangGraph, etc.
Knowledge of machine learning frameworks, libraries, and tools.
Excellent problem-solving skills and a solution mindset.
Strong communication and teamwork skills.
Ability to work independently and manage one's time effectively.
Experience with any of the cloud platforms (AWS, GCP, Azure).
Benefits:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centres.
Work-Life Balance: Accellor prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training, a stress management program, professional certifications, and technical and soft skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, personal accident insurance, a periodic health awareness program, extended maternity leave, annual performance bonuses, and referral bonuses.
Disclaimer: Accellor is proud to be an equal opportunity employer.
We do not discriminate in hiring or any employment decision based on race, color, religion, national origin, age, sex (including pregnancy, childbirth, or related medical conditions), marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or any other applicable legally protected characteristic.
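As a hedged illustration of the RAG retrieval step named in the requirements above (not Accellor's implementation), the snippet below indexes document embeddings in FAISS and retrieves nearest neighbours for a query; the embed() helper and the sample documents are hypothetical stand-ins for whatever embedding model and corpus the real stack would use.

```python
# Illustrative FAISS retrieval step for a RAG pipeline (embed() is a hypothetical stub).
import faiss
import numpy as np

def embed(texts: list[str]) -> np.ndarray:
    """Hypothetical embedding function; replace with a real embedding model."""
    rng = np.random.default_rng(abs(hash(tuple(texts))) % (2**32))
    return rng.normal(size=(len(texts), 768)).astype("float32")

documents = ["refund policy ...", "shipping times ...", "warranty terms ..."]
doc_vectors = embed(documents)

index = faiss.IndexFlatL2(doc_vectors.shape[1])   # exact L2 search over 768-d vectors
index.add(doc_vectors)

query_vector = embed(["How long does shipping take?"])
distances, ids = index.search(query_vector, 2)    # retrieve the 2 nearest documents
retrieved = [documents[i] for i in ids[0]]
print(retrieved)  # context passages that would be inserted into the LLM prompt
```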
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 day ago
0 years
2 - 6 Lacs
Indore
On-site
We are a cutting-edge technology company that specializes in developing innovative artificial intelligence and machine learning solutions. Our mission is to harness the power of AI to drive business growth, improve efficiency, and enhance customer experience. Job Summary: We are seeking an experienced Artificial Intelligence Engineer to join our team in Indore, Madhya Pradesh, India. As an AI Engineer, you will be responsible for designing, developing, and deploying AI and ML models to solve complex business problems. You will work closely with our data scientists, product managers, and other engineers to integrate AI into our products and services. Key Responsibilities: 1. Design and Develop AI/ML Models: Design, develop, and deploy AI/ML models using Python and other relevant technologies. Collaborate with data scientists to gather requirements, collect data, and develop models. Implement and test models using various frameworks and libraries (e.g., TensorFlow, PyTorch, Scikit-learn). 2. Prompt Engineering: Develop and refine natural language processing (NLP) models using prompt engineering techniques. Create high-quality prompts to elicit accurate and relevant responses from AI models. Optimize prompt design to improve model performance and reduce errors. 3. Python Development: Develop and maintain Python scripts and code to support AI/ML model deployment. Utilize Python libraries and frameworks to build and integrate AI/ML models into our products. Collaborate with other engineers to ensure seamless integration with existing systems. 4. Data Preprocessing and Analysis: Collect, preprocess, and analyze data to support AI/ML model development. Clean, transform, and feature-engineer data to improve model performance. Work with data scientists to identify and address data quality issues. 5. Collaboration and Communication: Work closely with data scientists, product managers, and other engineers to integrate AI
Posted 1 day ago
6.0 years
0 Lacs
India
Remote
We’re Hiring: Machine Learning Engineer (Part-Time | Flexible Remote)
Are you an experienced ML Engineer looking for a flexible, part-time opportunity to work on real-world impact projects? Join us in building intelligent systems that match candidates to projects using structured skills, assessments, and feedback data. This is your chance to own end-to-end ML pipelines and work on meaningful automation in the HRTech space — all on your own schedule.
🔍 Role Overview: We’re looking for an ML Engineer to architect and deploy predictive models that power candidate–project matching intelligence, leveraging structured applicant data. You'll design scalable ML workflows and inference pipelines on AWS.
🔧 Key Responsibilities:
Build data pipelines to ingest and preprocess applicant data (skills, assessments, feedback)
Engineer task-specific features and transformation logic
Train predictive models (logistic regression, XGBoost, etc.) on SageMaker
Automate batch ETL, retraining flows, and storage with AWS S3
Deploy inference endpoints with Lambda, and integrate with systems in production
Monitor model drift, performance, and feedback loops for continuous learning
Document architecture and workflows, and ensure explainability
✅ You’ll Need:
4–6+ years in ML/AI engineering roles
Strong command of Python, scikit-learn, XGBoost, and feature engineering
Proven experience with the AWS ML stack: SageMaker, Lambda, S3
Hands-on SQL/NoSQL and automated data workflows
Familiarity with CI/CD for ML (CodePipeline, CodeBuild, etc.)
Ability to independently own schema, features, and model delivery
Bachelor’s or Master’s in CS, Engineering, or a related field
⭐ Bonus Points For:
NLP experience (extracting features from feedback/comments)
Familiarity with serverless architectures
Background in recruitment, talent platforms, or skills-matching systems
👉 Apply now or DM us to know more.
#Hiring #MachineLearning #MLJobs #RemoteJobs #PartTime #AWS #HRTech #MLOps #AI #RecruitmentTech #SageMaker #FlexibleWork
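Below is a minimal sketch of the candidate–project matching model described above, using XGBoost and scikit-learn as the posting lists. The feature names and the synthetic data are placeholders, and on AWS this training step would typically run inside a SageMaker job rather than locally.

```python
# Illustrative candidate-project match model (synthetic features; placeholder names).
import numpy as np
import xgboost as xgb
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000
X = np.column_stack([
    rng.integers(0, 11, n),   # hypothetical: number of matching skills
    rng.uniform(0, 100, n),   # hypothetical: assessment score
    rng.uniform(1, 5, n),     # hypothetical: average past feedback rating
])
# Synthetic label: placements succeed more often with more skill overlap and better feedback.
y = (0.2 * X[:, 0] + 0.01 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 1, n) > 3.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=7)
model = xgb.XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)

print("test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```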
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 day ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid) Company: Unified Mentor Location: Remote Duration: 3 months Application Deadline: 31st July 2025 Opportunity: Full-time role based on performance + Internship Certificate About Unified Mentor Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers. Responsibilities Collect, preprocess, and analyze large datasets Develop predictive models and machine learning algorithms Perform exploratory data analysis (EDA) to extract insights Create data visualizations and dashboards for effective communication Collaborate with cross-functional teams to deliver data-driven solutions Requirements Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field Proficiency in Python or R for data analysis and modeling Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred) Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib Strong analytical and problem-solving skills Excellent communication and teamwork abilities Stipend & Benefits Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid) Hands-on experience in data science projects Certificate of Internship & Letter of Recommendation Opportunity to build a strong portfolio of data science models and applications Potential for full-time employment based on performance How to Apply Submit your resume and a cover letter with the subject line "Data Science Intern Application." Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.
Posted 1 day ago
3.0 years
0 Lacs
India
On-site
Job Title: Supply Chain Optimization Specialist
Experience: 3+ Years
Department: Operations Research / Supply Chain Analytics
Position Overview:
We are seeking a highly analytical and skilled Supply Chain Optimization Specialist with a strong background in mathematical modeling, optimization, and data analysis. The ideal candidate will play a critical role in improving supply chain operations by developing advanced models and providing data-driven insights. You will collaborate with cross-functional teams to ensure effective implementation of optimized solutions in real-world supply chain systems.
Key Responsibilities:
Mathematical Modeling & Optimization
Develop, refine, and validate mathematical models for inventory management, production planning, transportation logistics, and distribution networks.
Apply advanced optimization techniques including linear programming, integer programming, network flows, simulation, and heuristics to solve complex supply chain challenges.
Perform sensitivity analysis, scenario modeling, and risk assessment to evaluate system performance under various conditions.
Translate business objectives, constraints, and requirements into mathematical frameworks and optimization problems.
Data Analysis & Insights
Analyze large-scale supply chain data to extract actionable insights and identify performance trends.
Partner with data scientists and analysts to gather, clean, and preprocess data from multiple sources, ensuring accuracy and completeness.
Provide recommendations to optimize cost, improve efficiency, and enhance customer satisfaction through data-driven decisions.
Solution Development & Deployment
Present analytical findings, models, and recommendations to stakeholders in a clear, structured format.
Provide input on trade-offs between analytical rigor and speed-to-market solutions.
Collaborate with internal teams including Data Engineers, Data Scientists, Business Analysts, and Project Managers to test and deploy solutions effectively.
Research & Innovation
Stay abreast of emerging trends in supply chain management, operations research, and optimization methodologies.
Research and propose innovative approaches to address new and evolving supply chain challenges.
Qualifications:
Master’s degree in Industrial Engineering, Operations Research, Management Science, or a related field.
3+ years of professional experience in supply chain modeling and optimization.
Strong command of optimization techniques such as linear/integer programming, network flow modeling, simulation, and heuristic algorithms.
Programming proficiency in Python, R, or MATLAB, with hands-on experience using optimization libraries like Gurobi, CPLEX, FICO.
Expertise in data manipulation using pandas, NumPy, and similar tools.
Solid understanding of SQL for data extraction; experience with visualization platforms like Tableau or Power BI.
Strong knowledge of supply chain processes, including demand forecasting, inventory management, production planning, transportation logistics, and distribution networks.
Preferred Skills:
Excellent problem-solving and critical thinking abilities.
Strong communication skills to explain technical solutions to non-technical stakeholders.
Experience working in cross-functional and collaborative environments.
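To make the "linear/integer programming for transportation logistics" requirement concrete, here is a toy transportation-cost model written with SciPy's HiGHS-backed linprog. All plants, warehouses, costs, and capacities are invented for illustration; a real engagement would more likely use Gurobi, CPLEX, or another solver named in the posting.

```python
# Tiny transportation-cost LP with SciPy (scipy.optimize.linprog, HiGHS solver).
# 2 plants ship to 3 warehouses; all numbers are invented for illustration.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 9.0],      # shipping cost per unit, plant 1 -> warehouses 1..3
                 [5.0, 3.0, 7.0]])     # plant 2 -> warehouses 1..3
supply = np.array([80.0, 70.0])        # plant capacities
demand = np.array([50.0, 60.0, 40.0])  # warehouse requirements

n_plants, n_wh = cost.shape
c = cost.flatten()                     # decision variables x[i, j], flattened row-major

# Supply constraints: shipments out of each plant cannot exceed its capacity.
A_supply = np.zeros((n_plants, n_plants * n_wh))
for i in range(n_plants):
    A_supply[i, i * n_wh:(i + 1) * n_wh] = 1.0

# Demand constraints: each warehouse must receive at least its requirement
# (written as -sum_i x[i, j] <= -demand[j] to fit the A_ub x <= b_ub form).
A_demand = np.zeros((n_wh, n_plants * n_wh))
for j in range(n_wh):
    A_demand[j, j::n_wh] = -1.0

A_ub = np.vstack([A_supply, A_demand])
b_ub = np.concatenate([supply, -demand])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
print("optimal cost:", round(res.fun, 2))
print("shipment plan:\n", res.x.reshape(n_plants, n_wh).round(1))
```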
Posted 1 day ago
2.0 - 3.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
We are looking for a highly motivated and skilled Generative AI (GenAI) Developer to join our dynamic team. You will be responsible for building and deploying GenAI solutions using large language models (LLMs) to address real-world business challenges. The role involves working with cross-functional teams, applying prompt engineering and fine-tuning techniques, and building scalable AI-driven applications. A strong foundation in machine learning and NLP, and a passion for emerging GenAI technologies, is essential.
Responsibilities
Design, develop, and implement GenAI solutions in Python using large language models (LLMs) to address specific business needs
Collaborate with stakeholders to identify opportunities for GenAI integration and translate requirements into scalable solutions
Preprocess and analyze unstructured data (text, documents, etc.) for model training, fine-tuning, and evaluation
Apply prompt engineering, fine-tuning, and RAG (Retrieval-Augmented Generation) techniques to optimize LLM outputs
Deploy GenAI models and APIs into production environments, ensuring performance, scalability, and reliability
Monitor and maintain deployed solutions, incorporating improvements based on feedback and real-world usage
Stay up to date with the latest advancements in GenAI, LLMs, and orchestration tools (e.g., LangChain, LlamaIndex)
Write clean, maintainable, and well-documented code, and contribute to team-wide code reviews and best practices
Requirements
2-3 years of relevant, proven experience as an AI Developer
Proficiency in Python
Good understanding of multiple GenAI models (OpenAI, LLaMA 2, Mistral) and the ability to set up local GPTs using Ollama, LM Studio, etc.
Experience with LLMs, RAG (Retrieval-Augmented Generation), and vector databases (e.g., FAISS, Pinecone)
Experience with multi-agent frameworks for creating workflows, such as LangChain or similar tools like LlamaIndex and LangGraph
Knowledge of machine learning frameworks, libraries, and tools
Excellent problem-solving skills and a solution mindset
Strong communication and teamwork skills
Ability to work independently and manage one's time effectively
Experience with any of the major cloud platforms (AWS, GCP, Azure)
Benefits
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centres.
Work-Life Balance: Accellor prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training, a stress management program, professional certifications, and technical and soft skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, personal accident insurance, periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Disclaimer: Accellor is proud to be an equal opportunity employer. We do not discriminate in hiring or any employment decision based on race, color, religion, national origin, age, sex (including pregnancy, childbirth, or related medical conditions), marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or any other applicable legally protected characteristic.
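The requirements above combine RAG with vector databases such as FAISS. The sketch below shows only the retrieval step of such a pipeline, with TF-IDF standing in for a proper embedding model so the example stays self-contained; the documents and question are invented, and a production system would embed with an LLM provider and send the assembled prompt to the model.

```python
# Retrieval step of a RAG pipeline: index documents as vectors, then fetch the
# chunks most similar to a question before passing them to an LLM prompt.
# TF-IDF stands in for a real embedding model to keep the sketch self-contained.
import faiss                                        # pip install faiss-cpu
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "Refunds are processed within 5 business days of approval.",
    "Premium subscribers get 24/7 chat support.",
    "Invoices can be downloaded from the billing dashboard.",
]

vectorizer = TfidfVectorizer()
doc_vecs = vectorizer.fit_transform(docs).toarray().astype("float32")

index = faiss.IndexFlatL2(doc_vecs.shape[1])        # exact L2-distance index
index.add(doc_vecs)

question = "How long does a refund take?"
q_vec = vectorizer.transform([question]).toarray().astype("float32")
distances, ids = index.search(q_vec, 2)             # top-2 nearest chunks

context = "\n".join(docs[i] for i in ids[0])
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)                                       # this prompt would be sent to the LLM
```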
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description -
We are seeking a Machine Learning Engineer to assist in developing and implementing object detection and AI-based predictions. You will have the opportunity to work on real-world applications and contribute to the development of novel algorithms. As an ML engineer, you’ll collaborate with our team and work on various applied AI/ML tasks.
Key Responsibilities -
Assist in training and fine-tuning ML models for real-time AI-based tasks.
Work with large datasets to prepare, annotate, preprocess, and augment data for training purposes.
Implement and test model architectures to improve accuracy, speed, and performance.
Help analyze and optimize model performance based on results and metrics.
Document research and findings, contributing to team knowledge and project reports.
Participate in code reviews and contribute to software development best practices.
Stay updated with the latest trends and research in computer vision and applied ML.
Qualifications -
Bachelor's or master's degree in Computer Science, Electrical Engineering, or a related field.
Solid understanding of AI/ML concepts, data cleaning, synthetic data generation, and other relevant concepts.
Hands-on experience with MLOps.
Hands-on experience with YOLO or similar deep learning object detection frameworks (e.g., Faster R-CNN, SSD).
Proficiency in programming languages such as Python.
Experience with OpenCV and PyTorch is a must.
Experience with statistical analysis and modeling is a plus.
Experience with CUDA and GPU acceleration is a plus.
Experience with ROS and C++ is also a huge plus.
Strong problem-solving skills, attention to detail, and a collaborative mindset.
Preferred Skills -
Knowledge of data cleaning techniques and data preprocessing methods.
Familiarity with version control systems like Git.
Experience in deploying models into production environments (optional).
Exposure to ROS and OpenCV in C++.
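As a rough illustration of the object-detection stack named above (PyTorch plus a Faster R-CNN-style detector), here is a minimal inference sketch using torchvision's pretrained model. The image path is a placeholder, and a reasonably recent torchvision (0.13+) is assumed for the weights argument.

```python
# Object-detection inference sketch with a pretrained Faster R-CNN from torchvision.
# The image path is a placeholder; a production pipeline would batch frames,
# filter by class, and track detections over time.
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

img = read_image("sample.jpg")                      # uint8 tensor, shape [C, H, W]
img = convert_image_dtype(img, torch.float32)       # scale to [0, 1] as the model expects

with torch.no_grad():
    outputs = model([img])                          # list with one dict per image

det = outputs[0]
keep = det["scores"] > 0.5                          # confidence threshold
for box, label, score in zip(det["boxes"][keep], det["labels"][keep], det["scores"][keep]):
    print(f"label={int(label)}  score={score:.2f}  box={[round(v, 1) for v in box.tolist()]}")
```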
Posted 1 day ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.
Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.
Requirements
🎓 Enrolled in or graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.
Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.
How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 31st July 2025
Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
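For the "develop predictive models" responsibility above, a minimal scikit-learn workflow looks like the sketch below: split the data, fit a pipeline, and report metrics. A bundled dataset is used so the example runs as-is; a real internship project would start from the organisation's own cleaned data.

```python
# Minimal predictive-modeling sketch with scikit-learn: split, fit, evaluate.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Scaling + model in one pipeline so preprocessing is fitted on training data only.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test)))
```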
Posted 1 day ago
2.0 - 3.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Experience - 2-3 Years
Must Have Skills - Excellent verbal and written English communication skills.
Role of the Data Analyst:
This role covers a mix of critical components: dealing with large data sets and turning them into meaningful insights that support strategic decision-making across the organization, while working closely with key business stakeholders to analyze business performance, identify trends, and develop data-driven solutions that enhance operational efficiency.
Your Role Accountabilities:
OPERATIONS/PROJECT MANAGEMENT
● Analyze large datasets using SQL and MySQL Workbench to identify trends and patterns.
● Work closely with cross-functional teams to understand business objectives and translate them into data-driven insights.
● Develop, maintain, and enhance dashboards and reports using Microsoft Excel, Pivot Tables, and Charts.
● Conduct deep-dive analyses to explain metric anomalies and performance dips (e.g., user engagement or sales).
● Present findings in a clear and concise manner to both technical and non-technical stakeholders.
● Support ongoing optimization of business operations and hiring strategies through data analysis.
● Continuously improve data processes and stay up to date with best practices in analytics.
● Contribute to the automation of recurring reports and data extraction processes to increase team efficiency.
● Clean, validate, and preprocess raw data to ensure data quality and accuracy before analysis.
● Collaborate with other teams, including sourcing, procurement, and mobility, to deliver high-quality customer service.
STRATEGY
● Collaborate with key stakeholders to understand team needs and dependencies to better align business processes.
● Assist in developing and executing a methodology to evaluate, prioritize, and monitor the success of business processes.
● Work closely with various cross-functional organizations to understand change and draw up a strategy to cover support for business users.
● Collaborate with key stakeholders and gather requirements to plan the budget, track expenses, and forecast future spend.
● Create comprehensive and meaningful strategy presentations for senior executives.
● Ability to build a framework and drive development through dynamic business intelligence tools and dashboards for use in ongoing business planning and goal measurement through KPIs and worksheets.
● Ability to handle multiple assignments concurrently.
● A passion for accuracy and translating insights into a compelling narrative; able to maintain a balance between the details and the larger picture.
ANALYTICS
● Develop comprehensive performance analysis of business processes and review ways of improvement.
● Actively participate in stakeholder meetings with the goal of understanding all major projects and initiatives planned.
Qualifications & Experiences:
● 2-3 years of experience as a Data Analyst.
● Expert user of Microsoft Office (Excel, PowerPoint, Word) to prepare documents, presentations, and graphs.
● Educational qualification – B.Tech or any Master’s degree in computer science or a related field.
Not required but preferred experience:
● Familiarity with streaming and similar products/services.
● Experience working in a national or global company.
● Comfortable working in a highly iterative and somewhat unstructured environment.
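A typical "deep-dive into a metric dip" from the accountabilities above can be prototyped in pandas before it is rebuilt as an Excel pivot or SQL report. The sketch below uses invented weekly engagement numbers; in practice the frame would come from MySQL (for example via pandas.read_sql).

```python
# Sketch of a deep-dive into a metric dip with pandas: aggregate by week and
# segment, then flag week-over-week drops. The inline data is invented.
import pandas as pd

data = pd.DataFrame({
    "week":     ["2025-W01"] * 3 + ["2025-W02"] * 3,
    "segment":  ["web", "android", "ios"] * 2,
    "sessions": [1200, 3400, 900, 1150, 2600, 950],
})

pivot = data.pivot_table(index="segment", columns="week", values="sessions", aggfunc="sum")
pivot["wow_change_pct"] = (pivot["2025-W02"] / pivot["2025-W01"] - 1) * 100

# Segments with the sharpest drop explain most of the overall dip.
print(pivot.sort_values("wow_change_pct").round(1))
```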
Posted 2 days ago
0 years
0 Lacs
Greater Kolkata Area
On-site
We are seeking a skilled and passionate AI/ML Engineer to join our team and help us develop intelligent systems that leverage machine learning and artificial intelligence. You will design, develop, and deploy machine learning models, work closely with cross-functional teams, and contribute to cutting-edge solutions that solve real-world problems.
Responsibilities
Design and implement machine learning models and algorithms for various use cases (e.g., prediction, classification, NLP, computer vision).
Analyze and preprocess large datasets to build robust training pipelines.
Research to stay up-to-date with the latest AI/ML advancements and integrate relevant techniques into projects.
Train, fine-tune, and optimize models for performance and scalability.
Deploy models to production using tools such as Docker, Kubernetes, or cloud services (AWS, GCP, Azure).
Collaborate with software engineers, data scientists, and product teams to integrate AI/ML solutions into applications.
Monitor model performance in production and continuously iterate for improvements.
Document design choices, code, and models for transparency and reproducibility.
Requirements
Experience with NLP libraries (e.g., Hugging Face Transformers, spaCy) or computer vision tools (e.g., OpenCV).
Preferred: experience in real image processing and RAG-oriented solutions.
Background in deep learning architectures such as CNNs, RNNs, GANs, or transformer models.
Knowledge of MLOps practices and tools (e.g., MLflow, Kubeflow, SageMaker).
Contributions to open-source AI/ML projects or publications in relevant conferences/journals.
This job was posted by Tista Saha from SentientGeeks.
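Where the requirements mention NLP libraries such as Hugging Face Transformers, the smallest useful example is the pipeline API shown below. It downloads a default sentiment model on first run; the review texts are invented, and production code would pin an explicit model name and version.

```python
# NLP inference sketch with the Hugging Face Transformers pipeline API.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")   # downloads a default model on first call
reviews = [
    "The onboarding flow was smooth and the support team responded quickly.",
    "The app kept crashing whenever I tried to upload a document.",
]
for review, result in zip(reviews, classifier(reviews)):
    print(f"{result['label']:>8}  {result['score']:.3f}  {review}")
```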
Posted 2 days ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Data is now more important than ever, and the information distilled from it is crucial to make meaningful decisions. GeoIQ is a product developed by data scientists for data scientists. We obtain, visualise, and analyse data from heterogeneous sources to build smarter variables that aid decision making. As a part of the data science team, you will be responsible for building predictive models and day-to-day analysis for our clients, building algorithms on the data collected from hundreds of sources to build location-defining attributes.
Responsibilities
Collaborate with the data science team to understand project requirements and objectives.
Collect, clean, and preprocess data from various sources to ensure its accuracy and suitability for analysis.
Develop and maintain data pipelines to automate data ingestion and transformation processes.
Conduct exploratory data analysis to identify patterns, trends, and insights in large datasets.
Utilise statistical techniques to perform data analysis and generate actionable insights for clients.
Build and implement predictive models using machine learning algorithms to support decision-making processes.
Create visualisations and dashboards to communicate analysis results and findings effectively.
Collaborate with cross-functional teams to understand their data needs and provide analytical support.
Continuously improve data quality, data integrity, and data security practices.
Stay up-to-date with the latest trends and advancements in data analysis and machine learning.
Assist in the development and improvement of GeoIQ's location AI platform through data-driven insights.
Participate in team meetings and brainstorming sessions to contribute innovative ideas and solutions.
Requirements
Proficient in Python. Experience with R is a plus.
Ability to analyse large datasets and draw insightful observations.
Work experience of 1+ years with at least 6 months working with Python.
Prior experience with data extraction, manipulation, and analysis.
Prior experience with SQL.
Knowledge of statistical techniques.
Experience with working on spatial data will be an added advantage.
This job was posted by Saurav Mehta from GeoIQ.
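As one hedged example of the "location-defining attributes" idea above, the sketch below derives a simple spatial feature (haversine distance from each site to a reference point) with NumPy and pandas. The coordinates and the helper function are illustrative assumptions, not GeoIQ's actual data or methodology.

```python
# Sketch of a simple location feature: straight-line (haversine) distance from
# each site to a reference point, added as a model-ready column with pandas.
import numpy as np
import pandas as pd

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = np.sin((lat2 - lat1) / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

sites = pd.DataFrame({
    "site_id": ["S1", "S2", "S3"],
    "lat": [12.9716, 12.9352, 13.0827],   # illustrative coordinates only
    "lon": [77.5946, 77.6245, 77.5877],
})
centre_lat, centre_lon = 12.9719, 77.5937  # an arbitrary reference point

sites["dist_to_centre_km"] = haversine_km(sites["lat"], sites["lon"], centre_lat, centre_lon)
print(sites.round(3))
```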
Posted 2 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Mandate 2: Remote Work
About Swiggy
Swiggy Instamart is building the convenience grocery segment in India. We offer more than 30,000+ assortments/products to our customers within 10-15 minutes. We are striving to augment our consumer promise of enabling unparalleled convenience by making grocery delivery instant and delightful. Instamart has been operating in 90+ cities across India and plans to expand to a few more soon. We have seen immense love from customers so far and are excited to redefine how India shops.
Role and Responsibilities:
Analyse various business scenarios and recommend prompt actions for exponentially expanding the business.
Co-create initiatives with the business teams to meet business objectives.
Gather relevant data from various sources and clean, preprocess, and transform raw data into a usable format.
Help the manager with data and automate interactive dashboards to enable real-time monitoring of key metrics.
Provide recommendations based on data findings to support decision-making.
Oversee and maintain our marketing software stack, including CRM, marketing finance tools, analytics platforms, and more.
Collaborate with marketing teams to ensure the successful execution of campaigns, including BTL, ATL, activations, and sampling.
Work closely with sales, product, and other teams to align marketing efforts.
Assist in the management of the marketing budget, tracking expenses and ensuring cost-effectiveness.
Monitor campaign performance and make real-time adjustments as needed.
Desired Candidate:
Minimum 3 years of work experience with SQL and MS Excel.
Understanding of SQL (Structured Query Language) to work with databases.
Ability to clean, explore, and structure raw data.
A degree in fields like Mathematics, Statistics, or Computer Science.
"We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, disability status, or any other characteristic protected by the law"
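The "clean, preprocess, and transform raw data into a usable format" responsibility above usually reduces to a few pandas steps before anything reaches a dashboard. The sketch below uses invented order records and column names; the schema is an assumption for illustration, not an actual data model.

```python
# Sketch of the clean -> transform -> aggregate step behind a monitoring
# dashboard: parse timestamps, drop bad rows, and roll orders up to a daily metric.
import pandas as pd

raw = pd.DataFrame({
    "order_id":   [101, 102, 102, 103, 104],
    "created_at": ["2025-07-01 09:12", "2025-07-01 10:05", "2025-07-01 10:05",
                   "2025-07-02 18:40", None],
    "gmv":        [349.0, 812.5, 812.5, None, 120.0],
})

clean = (
    raw.drop_duplicates(subset="order_id")            # remove duplicate order rows
       .assign(created_at=lambda d: pd.to_datetime(d["created_at"]))
       .dropna(subset=["created_at", "gmv"])          # keep only usable records
)

daily = clean.set_index("created_at").resample("D")["gmv"].agg(["count", "sum"])
daily.columns = ["orders", "gmv"]
print(daily)                                          # feed into the dashboard layer
```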
Posted 2 days ago
The preprocess job market in India is thriving with opportunities for skilled professionals in various industries. Preprocess roles are crucial for data processing, cleaning, and transformation tasks that are essential for businesses to make informed decisions and gain insights from data. Job seekers with expertise in preprocess tools and techniques are in high demand across industries like IT, finance, healthcare, marketing, and more.
Major Indian cities such as Bengaluru, Hyderabad, Chennai, New Delhi, and Kolkata are actively hiring for preprocess roles, offering a wide range of opportunities for job seekers looking to kickstart or advance their careers in this field.
The average salary range for preprocess professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 3-5 lakhs per annum, while experienced preprocess specialists can command salaries ranging from INR 8-15 lakhs per annum.
In the preprocess domain, a typical career path may progress as follows:
- Junior Preprocess Analyst
- Preprocess Specialist
- Senior Preprocess Engineer
- Preprocess Team Lead
- Preprocess Manager
As professionals gain experience and expertise in preprocess tools and techniques, they can advance to higher roles with more responsibilities and leadership opportunities.
In addition to expertise in preprocess tools and techniques, professionals in this field are often expected to have or develop skills in:
- Data analysis
- Data visualization
- Programming languages like Python, R, or SQL
- Machine learning
- Statistical analysis
Having a diverse skill set can enhance the career prospects of preprocess professionals and open up new opportunities in the data-driven industry.
As you explore preprocess jobs in India, remember to equip yourself with the necessary skills and knowledge to stand out in a competitive job market. Prepare thoroughly for interviews, showcase your expertise in preprocess tools and techniques, and apply confidently to secure exciting opportunities in this dynamic field. Good luck on your job search journey!