3.0 - 7.0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
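As a hedged illustration of the Hugging Face Transformers integration this posting mentions, the sketch below loads a small public checkpoint through the `pipeline` API and generates text from a prompt. The model name, prompt, and generation settings are placeholder assumptions for demonstration, not EY specifics.

```python
# Minimal sketch: leveraging a pre-trained Hugging Face model for text generation.
# Model choice and prompt are illustrative assumptions only.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # small, publicly available checkpoint

prompt = "Summarize the key risks in the attached supplier contract:"
outputs = generator(prompt, max_new_tokens=60, num_return_sequences=1)
print(outputs[0]["generated_text"])
```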
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
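One responsibility above is implementing similarity search so relevant information can be retrieved from generative AI outputs stored in a vector database. The sketch below shows only the core ranking step with plain NumPy and cosine similarity; the random vectors and 384-dimension size are stand-ins for real embeddings that would normally be produced by an embedding model and stored in a system such as Redis.

```python
# Minimal cosine-similarity retrieval sketch over placeholder embeddings.
import numpy as np

rng = np.random.default_rng(0)
doc_embeddings = rng.normal(size=(1000, 384))   # 1000 documents, 384-dim embeddings (assumed)
query_embedding = rng.normal(size=384)

def top_k_cosine(query, docs, k=5):
    # Normalize, then rank by dot product (equivalent to cosine similarity).
    docs_n = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    query_n = query / np.linalg.norm(query)
    scores = docs_n @ query_n
    top_idx = np.argsort(scores)[::-1][:k]
    return top_idx, scores[top_idx]

idx, scores = top_k_cosine(query_embedding, doc_embeddings)
print(idx, scores)
```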
Posted 2 days ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid) Company: Unified Mentor Location: Remote Duration: 3 months Application Deadline: 31st July 2025 Opportunity: Full-time role based on performance + Internship Certificate About Unified Mentor Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers. Responsibilities Collect, preprocess, and analyze large datasets Develop predictive models and machine learning algorithms Perform exploratory data analysis (EDA) to extract insights Create data visualizations and dashboards for effective communication Collaborate with cross-functional teams to deliver data-driven solutions Requirements Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field Proficiency in Python or R for data analysis and modeling Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred) Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib Strong analytical and problem-solving skills Excellent communication and teamwork abilities Stipend & Benefits Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid) Hands-on experience in data science projects Certificate of Internship & Letter of Recommendation Opportunity to build a strong portfolio of data science models and applications Potential for full-time employment based on performance How to Apply Submit your resume and a cover letter with the subject line "Data Science Intern Application." Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.
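For context on the "develop predictive models" responsibility listed above, here is a minimal scikit-learn train/evaluate loop. The public dataset and model choice are assumptions for illustration, not part of the internship brief.

```python
# Illustrative-only sketch: fit a classifier and report held-out accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```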
Posted 2 days ago
3.0 years
0 Lacs
India
On-site
Job Title: Supply Chain Optimization Specialist Experience: 3+ Years Department: Operations Research / Supply Chain Analytics Position Overview: We are seeking a highly analytical and skilled Supply Chain Optimization Specialist with a strong background in mathematical modeling, optimization, and data analysis. The ideal candidate will play a critical role in improving supply chain operations by developing advanced models and providing data-driven insights. You will collaborate with cross-functional teams to ensure effective implementation of optimized solutions in real-world supply chain systems. Key Responsibilities: Mathematical Modeling & Optimization Develop, refine, and validate mathematical models for inventory management, production planning, transportation logistics, and distribution networks. Apply advanced optimization techniques including linear programming, integer programming, network flows, simulation, and heuristics to solve complex supply chain challenges. Perform sensitivity analysis, scenario modeling, and risk assessment to evaluate system performance under various conditions. Translate business objectives, constraints, and requirements into mathematical frameworks and optimization problems. Data Analysis & Insights Analyze large-scale supply chain data to extract actionable insights and identify performance trends. Partner with data scientists and analysts to gather, clean, and preprocess data from multiple sources ensuring accuracy and completeness. Provide recommendations to optimize cost, improve efficiency, and enhance customer satisfaction through data-driven decisions. Solution Development & Deployment Present analytical findings, models, and recommendations to stakeholders in a clear, structured format. Provide input on trade-offs between analytical rigor and speed-to-market solutions. Collaborate with internal teams including Data Engineers, Data Scientists, Business Analysts, and Project Managers to test and deploy solutions effectively. Research & Innovation Stay abreast of emerging trends in supply chain management, operations research, and optimization methodologies. Research and propose innovative approaches to address new and evolving supply chain challenges. Qualifications: Master’s degree in Industrial Engineering, Operations Research, Management Science , or a related field. 3+ years of professional experience in supply chain modeling and optimization. Strong command of optimization techniques such as linear/integer programming, network flow modeling, simulation, and heuristic algorithms . Programming proficiency in Python, R , or MATLAB , with hands-on experience using optimization libraries like Gurobi, CPLEX, FICO . Expertise in data manipulation using pandas, NumPy , and similar tools. Solid understanding of SQL for data extraction; experience with visualization platforms like Tableau or Power BI . Strong knowledge of supply chain processes, including demand forecasting, inventory management, production planning, transportation logistics , and distribution networks . Preferred Skills: Excellent problem-solving and critical thinking abilities. Strong communication skills to explain technical solutions to non-technical stakeholders. Experience working in cross-functional and collaborative environments.
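To make the modeling work described above concrete, here is a rough sketch of a small transportation-style mixed integer program. It uses the open-source PuLP library purely as a stand-in for the commercial solvers named in the posting (Gurobi, CPLEX, FICO); the plants, warehouses, costs, and capacities are made-up illustrative data.

```python
# Toy MIP: ship goods from plants to warehouses at minimum cost.
import pulp

plants = {"P1": 100, "P2": 150}                      # supply capacity (assumed)
warehouses = {"W1": 80, "W2": 70, "W3": 90}          # demand (assumed)
cost = {("P1", "W1"): 4, ("P1", "W2"): 6, ("P1", "W3"): 9,
        ("P2", "W1"): 5, ("P2", "W2"): 4, ("P2", "W3"): 7}

prob = pulp.LpProblem("transportation", pulp.LpMinimize)
ship = pulp.LpVariable.dicts("ship", cost.keys(), lowBound=0, cat="Integer")

prob += pulp.lpSum(cost[k] * ship[k] for k in cost)              # minimize total shipping cost
for p, cap in plants.items():                                    # respect plant capacity
    prob += pulp.lpSum(ship[(p, w)] for w in warehouses) <= cap
for w, dem in warehouses.items():                                # meet warehouse demand
    prob += pulp.lpSum(ship[(p, w)] for p in plants) >= dem

prob.solve()
print(pulp.LpStatus[prob.status], pulp.value(prob.objective))
```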
Posted 2 days ago
2.0 - 3.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
We are looking for a highly motivated and skilled Generative AI (GenAI) Developer to join our dynamic team. You will be responsible for building and deploying GenAI solutions using large language models (LLMs) to address real-world business challenges. The role involves working with cross-functional teams, applying prompt engineering and fine-tuning techniques, and building scalable AI-driven applications. A strong foundation in machine learning and NLP, along with a passion for emerging GenAI technologies, is essential. Responsibilities Design, develop, and implement GenAI solutions in Python using large language models (LLMs) to address specific business needs. Collaborate with stakeholders to identify opportunities for GenAI integration and translate requirements into scalable solutions. Preprocess and analyze unstructured data (text, documents, etc.) for model training, fine-tuning, and evaluation. Apply prompt engineering, fine-tuning, and RAG (Retrieval-Augmented Generation) techniques to optimize LLM outputs. Deploy GenAI models and APIs into production environments, ensuring performance, scalability, and reliability. Monitor and maintain deployed solutions, incorporating improvements based on feedback and real-world usage. Stay up to date with the latest advancements in GenAI, LLMs, and orchestration tools (e.g., LangChain, LlamaIndex). Write clean, maintainable, and well-documented code, and contribute to team-wide code reviews and best practices. Requirements 2-3 years of proven, relevant experience as an AI Developer. Proficiency in Python. Good understanding of multiple GenAI models (OpenAI, LLaMA 2, Mistral) and the ability to set up local GPTs using tools such as Ollama and LM Studio. Experience with LLMs, RAG (Retrieval-Augmented Generation), and vector databases (e.g., FAISS, Pinecone). Experience with multi-agent frameworks for creating workflows, such as LangChain or similar tools like LlamaIndex and LangGraph. Knowledge of machine learning frameworks, libraries, and tools. Excellent problem-solving skills and a solution mindset. Strong communication and teamwork skills. Ability to work independently and manage one's time effectively. Experience with any of the cloud platforms (AWS, GCP, Azure). Benefits Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them. Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centres. Work-Life Balance: Accellor prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training, stress management programs, professional certifications, and technical and soft skill trainings. Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Personal Accident Insurance, periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Disclaimer: Accellor is proud to be an equal opportunity employer. 
We do not discriminate in hiring or any employment decision based on race, color, religion, national origin, age, sex (including pregnancy, childbirth, or related medical conditions), marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or other applicable legally protected characteristics.
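As a rough sketch of the RAG retrieval step this role calls for, the example below indexes placeholder document embeddings in FAISS (one of the vector stores named in the posting) and retrieves the nearest neighbours for a query. A real pipeline would first embed documents with an LLM-compatible embedding model; the dimensions and random vectors here are assumptions.

```python
# Hedged RAG-retrieval sketch with FAISS over placeholder embeddings.
import numpy as np
import faiss

dim = 384
doc_vectors = np.random.rand(500, dim).astype("float32")   # placeholder document embeddings
index = faiss.IndexFlatL2(dim)                             # exact L2 search index
index.add(doc_vectors)

query = np.random.rand(1, dim).astype("float32")           # placeholder query embedding
distances, ids = index.search(query, 4)                    # retrieve 4 nearest documents
print(ids[0], distances[0])
# The retrieved documents would then be inserted into the LLM prompt as grounding context.
```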
Posted 3 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description - We are seeking a Machine Learning Engineer to assist in developing and implementing object detection and AI-based prediction systems. You will have the opportunity to work on real-world applications and contribute to the development of novel algorithms. As an ML engineer, you'll collaborate with our team and work on various applied AI/ML tasks. Key Responsibilities - Assist in training and fine-tuning ML models for real-time AI-based tasks. Work with large datasets to prepare, annotate, preprocess, and augment data for training purposes. Implement and test model architectures to improve accuracy, speed, and performance. Help analyze and optimize model performance based on results and metrics. Document research and findings, contributing to team knowledge and project reports. Participate in code reviews and contribute to software development best practices. Stay updated with the latest trends and research in computer vision and applied ML. Qualifications - Bachelor's or master's degree in Computer Science, Electrical Engineering, or a related field. Solid understanding of AI/ML concepts, data cleaning, synthetic data generation, and other relevant concepts. Hands-on experience with MLOps. Hands-on experience with YOLO or similar deep learning object detection frameworks (e.g., Faster R-CNN, SSD). Proficiency in programming languages such as Python. Experience with OpenCV and PyTorch is a must. Experience with statistical analysis and modeling is a plus. Experience with CUDA and GPU acceleration is a plus. Experience with ROS and C++ is also a huge plus. Strong problem-solving skills, attention to detail, and a collaborative mindset. Preferred Skills - Knowledge of data cleaning techniques and data preprocessing methods. Familiarity with version control systems like Git. Experience in deploying models into production environments (optional). Exposure to ROS and OpenCV in C++.
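To illustrate the object-detection work this role revolves around, here is a hedged inference sketch using a pre-trained Faster R-CNN from torchvision (one of the frameworks the posting names). The image path and confidence threshold are placeholders, and a real pipeline would add preprocessing, class-name mapping, and batching.

```python
# Assumption-laden object-detection inference sketch with torchvision.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = Image.open("sample.jpg").convert("RGB")   # hypothetical input image
with torch.no_grad():
    predictions = model([to_tensor(image)])[0]    # dict of boxes, labels, scores

keep = predictions["scores"] > 0.5                # simple confidence threshold
print(predictions["boxes"][keep], predictions["labels"][keep])
```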
Posted 3 days ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid) Company: WebBoost Solutions by UM Location: Remote Duration: 3 months Opportunity: Full-time based on performance, with a Certificate of Internship About WebBoost Solutions by UM WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career. Responsibilities ✅ Collect, preprocess, and analyze large datasets. ✅ Develop predictive models and machine learning algorithms. ✅ Perform exploratory data analysis (EDA) to extract meaningful insights. ✅ Create data visualizations and dashboards for effective communication of findings. ✅ Collaborate with cross-functional teams to deliver data-driven solutions. Requirements 🎓 Enrolled in or graduate of a program in Data Science, Computer Science, Statistics, or a related field. 🐍 Proficiency in Python for data analysis and modeling. 🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred). 📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib). 🧐 Strong analytical and problem-solving skills. 🗣 Excellent communication and teamwork abilities. Stipend & Benefits 💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based). ✔ Hands-on experience in data science projects. ✔ Certificate of Internship & Letter of Recommendation. ✔ Opportunity to build a strong portfolio of data science models and applications. ✔ Potential for full-time employment based on performance. How to Apply 📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application." 📅 Deadline: 31st July 2025 Equal Opportunity WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
Posted 3 days ago
2.0 - 3.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Experience - 2-3 Years Must Have Skills - Excellent verbal and written English communication skills. Role of the Data Analyst: This role covers a mix of critical components, dealing with large data sets and turning them into meaningful insights that support strategic decision-making across the organization, working closely with key business stakeholders to analyze business performance, analyze trends, and develop data-driven solutions that enhance operational efficiency. Your Role Accountabilities: OPERATIONS/PROJECT MANAGEMENT ● Analyze large datasets using SQL and MySQL Workbench to identify trends and patterns. ● Work closely with cross-functional teams to understand business objectives and translate them into data-driven insights. ● Develop, maintain, and enhance dashboards and reports using Microsoft Excel, Pivot Tables, and Charts. ● Conduct deep-dive analyses to explain metric anomalies and performance dips (e.g., user engagement or sales). ● Present findings in a clear and concise manner to both technical and non-technical stakeholders. ● Support ongoing optimization of business operations and hiring strategies through data analysis. ● Continuously improve data processes and stay up to date with best practices in analytics. ● Contribute to the automation of recurring reports and data extraction processes to increase team efficiency. ● Clean, validate, and preprocess raw data to ensure data quality and accuracy before analysis. ● Collaborate with other teams, including sourcing, procurement, and mobility, to deliver high-quality customer service. STRATEGY ● Collaborate with key stakeholders to understand team needs and dependencies to better align business processes. ● Assist in developing and executing a methodology to evaluate, prioritize, and monitor the success of business processes. ● Work closely with various cross-functional organizations to understand changes and draw up a strategy to cover support for business users. ● Collaborate with key stakeholders and gather requirements to plan the budget, track expenses, and forecast future spend. ● Create comprehensive and meaningful strategy presentations for senior executives. ● Ability to build a framework and drive development through dynamic business intelligence tools, dashboards, and worksheets for use in ongoing business planning and goal measurement through KPIs. ● Ability to handle multiple assignments concurrently. ● A passion for accuracy and translating insights into a compelling narrative; able to maintain a balance between the details and the larger picture. ANALYTICS ● Develop comprehensive performance analysis of business processes and review ways of improvement. ● Actively participate in stakeholder meetings with the goal of understanding all major projects and initiatives planned. Qualifications & Experiences: ● 2-3 years of experience as a Data Analyst. ● Expert user of Microsoft Office (Excel, PowerPoint, Word) to prepare all documents, presentations, and graphs. ● Educational qualification – B.Tech or a Master's degree in Computer Science or a related discipline. Not required but preferred experience: ● Familiarity with streaming and similar products/services. ● Experience working in a national or global company. ● Comfortable working in a highly iterative and somewhat unstructured environment.
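As an illustrative companion to the "deep-dive analyses to explain metric anomalies" accountability above, the pandas sketch below flags weeks where active users drop well below their trailing average. The file name, column names, and 15% threshold are hypothetical placeholders, not details of this role.

```python
# Hedged sketch: detect weekly engagement dips against a trailing baseline.
import pandas as pd

df = pd.read_csv("engagement_events.csv", parse_dates=["event_date"])  # placeholder extract

weekly = (df.groupby(pd.Grouper(key="event_date", freq="W"))["user_id"]
            .nunique()
            .rename("weekly_active_users")
            .to_frame())

# Flag weeks that fall more than 15% below the trailing 4-week average.
weekly["baseline"] = weekly["weekly_active_users"].rolling(4).mean().shift(1)
weekly["dip"] = weekly["weekly_active_users"] < 0.85 * weekly["baseline"]
print(weekly[weekly["dip"]])
```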
Posted 3 days ago
0 years
0 Lacs
Greater Kolkata Area
On-site
We are seeking a skilled and passionate AI/ML Engineer to join our team and help us develop intelligent systems that leverage machine learning and artificial intelligence. You will design, develop, and deploy machine learning models, work closely with cross-functional teams, and contribute to cutting-edge solutions that solve real-world problems. Responsibilities Design and implement machine learning models and algorithms for various use cases (e.g., prediction, classification, NLP, computer vision). Analyze and preprocess large datasets to build robust training pipelines. Conduct research to stay up to date with the latest AI/ML advancements and integrate relevant techniques into projects. Train, fine-tune, and optimize models for performance and scalability. Deploy models to production using tools such as Docker, Kubernetes, or cloud services (AWS, GCP, Azure). Collaborate with software engineers, data scientists, and product teams to integrate AI/ML solutions into applications. Monitor model performance in production and continuously iterate for improvements. Document design choices, code, and models for transparency and reproducibility. Requirements Experience with NLP libraries (e.g., Hugging Face Transformers, spaCy) or computer vision tools (e.g., OpenCV). Preferred: experience in real-world image processing and RAG-oriented solutions. Background in deep learning architectures such as CNNs, RNNs, GANs, or transformer models. Knowledge of MLOps practices and tools (e.g., MLflow, Kubeflow, SageMaker). Contributions to open-source AI/ML projects or publications in relevant conferences/journals. This job was posted by Tista Saha from SentientGeeks.
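To give a flavour of the MLOps tooling mentioned above, here is a minimal experiment-tracking sketch with MLflow: it fits a toy model and logs a parameter and a metric so runs can be compared later. The dataset, model, and run name are assumptions for illustration only.

```python
# Minimal MLflow tracking sketch around a toy scikit-learn model.
import mlflow
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="baseline-logreg"):
    model = LogisticRegression(max_iter=500)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("max_iter", 500)     # hyperparameter for later comparison
    mlflow.log_metric("accuracy", acc)    # headline evaluation metric
```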
Posted 3 days ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Data is now more important than ever, and the information distilled from it is crucial to make meaningful decisions. GeoIQ is a product developed by data scientists for data scientists. We obtain, visualise, and analyse data from heterogeneous sources to build smarter variables that aid decision making. As a part of the data science team, you will be responsible for building predictive models and day-to-day analysis for our clients, building algorithms on the data collected from hundreds of sources to build location-defining attributes. Responsibilities Collaborate with the data science team to understand project requirements and objectives. Collect, clean, and preprocess data from various sources to ensure its accuracy and suitability for analysis. Develop and maintain data pipelines to automate data ingestion and transformation processes. Conduct exploratory data analysis to identify patterns, trends, and insights in large datasets. Utilise statistical techniques to perform data analysis and generate actionable insights for clients. Build and implement predictive models using machine learning algorithms to support decision-making processes. Create visualisations and dashboards to communicate analysis results and findings effectively. Collaborate with cross-functional teams to understand their data needs and provide analytical support. Continuously improve data quality, data integrity, and data security practices. Stay up-to-date with the latest trends and advancements in data analysis and machine learning. Assist in the development and improvement of GeoIQ's location AI platform through data-driven insights. Participate in team meetings and brainstorming sessions to contribute innovative ideas and solutions. Requirements Proficient in Python. Experience with R is a plus. Ability to analyse large datasets and draw insightful observations. Work experience of 1+ years with at least 6 months working with Python. Prior experience with data extraction, manipulation, and analysis. Prior experience with SQL. Knowledge of statistical techniques. Experience with working on Spatial Data will be an added advantage. This job was posted by Saurav Mehta from GeoIQ.
Posted 3 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Mandate 2: Remote Work About Swiggy Swiggy Instamart is building the convenience grocery segment in India. We offer more than 30,000+ assortments/products to our customers within 10-15 mins. We are striving to augment our consumer promise of enabling unparalleled convenience by making grocery delivery instant and delightful. Instamart has been operating in 90+ cities across India and plans to expand to a few more soon. We have seen immense love from customers so far and are excited to redefine how India shops. Role and Responsibilities: Analyze various business scenarios and recommend prompt actions for exponentially expanding the business. Co-create initiatives with the business teams to meet business objectives. Gather relevant data from various sources and clean, preprocess, and transform raw data into a usable format. Help managers with data and automate interactive dashboards to enable real-time monitoring of key metrics. Provide recommendations based on data findings to support decision-making. Oversee and maintain our marketing software stack, including CRM, marketing finance tools, analytics platforms, and more. Collaborate with marketing teams to ensure the successful execution of campaigns, including BTL, ATL, activations, and sampling. Work closely with sales, product, and other teams to align marketing efforts. Assist in the management of the marketing budget, tracking expenses, and ensuring cost-effectiveness. Monitor campaign performance and make real-time adjustments as needed. Desired Candidate: Minimum 3 years of work experience with SQL and MS Excel. Understanding of SQL (Structured Query Language) to work with databases. Ability to clean, explore, and structure raw data. A degree in fields like Mathematics, Statistics, or Computer Science. "We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, disability status, or any other characteristic protected by the law"
Posted 3 days ago
0.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
We are a cutting-edge technology company that specializes in developing innovative artificial intelligence and machine learning solutions. Our mission is to harness the power of AI to drive business growth, improve efficiency, and enhance customer experience. Job Summary: We are seeking an experienced Artificial Intelligence Engineer to join our team in Indore, Madhya Pradesh, India. As an AI Engineer, you will be responsible for designing, developing, and deploying AI and ML models to solve complex business problems. You will work closely with our data scientists, product managers, and other engineers to integrate AI into our products and services. Key Responsibilities: 1. Design and Develop AI/ML Models: Design, develop, and deploy AI/ML models using Python and other relevant technologies. Collaborate with data scientists to gather requirements, collect data, and develop models. Implement and test models using various frameworks and libraries (e.g., TensorFlow, PyTorch, Scikit-learn). 2. Prompt Engineering: Develop and refine natural language processing (NLP) models using prompt engineering techniques. Create high-quality prompts to elicit accurate and relevant responses from AI models. Optimize prompt design to improve model performance and reduce errors. 3. Python Development: Develop and maintain Python scripts and code to support AI/ML model deployment. Utilize Python libraries and frameworks to build and integrate AI/ML models into our products. Collaborate with other engineers to ensure seamless integration with existing systems. 4. Data Preprocessing and Analysis: Collect, preprocess, and analyze data to support AI/ML model development. Clean, transform, and feature-engineer data to improve model performance. Work with data scientists to identify and address data quality issues. 5. Collaboration and Communication: Work closely with data scientists, product managers, and other engineers to integrate AI
Posted 3 days ago
3.0 - 5.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Job Description: Geospatial Analyst Location: Gurgaon (On-site) Employment: Full-Time Experience Level: 3-5 years About Us Aaizel Tech Labs is a pioneering tech startup at the intersection of Cybersecurity, AI, Geospatial solutions, and more. We drive innovation by delivering transformative technology solutions across industries. As a growing startup, we are looking for passionate and versatile professionals eager to work on cutting-edge projects in a dynamic environment. Job Summary We are seeking a talented Geospatial Analyst to join our R&D team and contribute to the development of next-generation geospatial products. You will be involved in data acquisition, spatial analysis, and visualization, driving innovative solutions across domains like remote sensing, smart cities, precision agriculture, and environmental monitoring. Key Responsibilities: 1. Geospatial Data Acquisition and Processing: Collect and process high-resolution satellite imagery, LiDAR data, and drone-acquired datasets. Use remote sensing software (ENVI, ERDAS) to preprocess data, including radiometric corrections, georeferencing, and orthorectification. 2. Spatial Analysis and Modelling: Develop spatial models and algorithms for applications such as land use classification, change detection, and object recognition in geospatial data. Implement advanced GIS techniques, including spatial interpolation, hydrological modelling, and network analysis. 3. Visualisation and Cartography: Create detailed and interactive maps, 3D models, and geospatial visualisations using ArcGIS, QGIS, and Mapbox. Utilise tools like Blender and Unity for 3D environmental modelling and simulation. 4. Data Integration and Database Management: Integrate geospatial data with other data sources (IoT, GPS, weather data) for comprehensive analysis. Design and manage spatial databases using PostgreSQL/PostGIS, ensuring efficient data storage and retrieval. 5. Advanced Geospatial Analytics: Develop custom scripts in Python or R for spatial data analysis, including machine learning applications like predictive modelling and anomaly detection. Apply geostatistical methods (Kriging, Moran’s I) for environmental impact assessments and resource management. 6. Collaboration and Reporting: Collaborate with AI/ML engineers, software developers, and project managers to integrate geospatial insights into broader tech solutions. Prepare detailed analytical reports, dashboards, and presentations to communicate findings to stakeholders 7. Tool Development and Automation: Develop automated geospatial tools using APIs (Google Earth Engine, OpenStreetMap) to streamline data analysis workflows. Implement automated change detection systems for monitoring environmental changes or urban expansion. Required Skills, Qualification and Experience: Educational Background: Master’s in Geoinformatics, Remote Sensing, or related field; strong preference for candidates from preferably top-tier institutions. Experience: 3-5 years of experience with a strong portfolio of completed projects Technical Skills: Proficiency in GIS software (ArcGIS, QGIS) and remote sensing tools (ENVI, ERDAS). Experience with programming languages (Python, R) for geospatial data manipulation and analysis. Familiarity with cloud-based geospatial platforms like AWS S3, GCP Earth Engine, or Azure Maps. Strong understanding of spatial databases (PostgreSQL/PostGIS) and geospatial data standards (GeoJSON, WMS/WFS). 
Problem-Solving: Ability to solve complex spatial problems using data-driven approaches and innovative techniques. Communication: Strong presentation skills to convey complex geospatial information clearly to technical and non-technical stakeholders. Attention to Detail: High level of precision in geospatial data processing and analysis. Application Process: To apply, please submit your resume and a cover letter detailing your relevant experience and enthusiasm for the role to hr@aaizeltech.com or Bhavik@aaizeltech.com or anju@aaizeltech.com (Contact No- 8493801093)
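As a loose illustration of one spatial-analysis step from the responsibilities above, the GeoPandas sketch below joins point observations to administrative polygons and aggregates per district, a typical precursor to choropleth mapping in ArcGIS, QGIS, or Mapbox. File names, layer contents, column names, and the CRS alignment are assumptions for demonstration only.

```python
# Hedged spatial-join sketch with GeoPandas over placeholder layers.
import geopandas as gpd

districts = gpd.read_file("districts.shp")              # polygon boundaries (assumed)
sensors = gpd.read_file("sensor_readings.geojson")      # point measurements (assumed)

sensors = sensors.to_crs(districts.crs)                 # align coordinate reference systems
joined = gpd.sjoin(sensors, districts, how="inner", predicate="within")

# Mean reading per district, ready to visualise as a choropleth.
summary = joined.groupby("district_name")["reading"].mean().reset_index()
print(summary.head())
```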
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description: EY GDS – Data and Analytics - D and A – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements : Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
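One responsibility above is establishing evaluation metrics for the quality and relevance of generative AI outputs. As one hedged example among many possible approaches, the sketch below computes ROUGE overlap between a generated answer and a reference using the rouge-score package; in practice such automatic scores would sit alongside human review and task-specific checks, and the example strings are placeholders.

```python
# Sketch: ROUGE overlap between a reference and a generated answer.
from rouge_score import rouge_scorer

reference = "The contract requires delivery within 30 days and penalizes late shipments."
generated = "Delivery must happen within 30 days; late shipments incur penalties."

scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], use_stemmer=True)
scores = scorer.score(reference, generated)
print({name: round(s.fmeasure, 3) for name, s in scores.items()})
```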
Posted 3 days ago
0 years
12 Lacs
India
On-site
Job Description: We are looking for a Python AI Developer with expertise in AI/ML model development using data sourced from internal/external databases. The ideal candidate should have experience in building, training, and deploying machine learning models using structured and unstructured data. You will work closely with data engineers and software developers to create AI-driven solutions that enhance our applications and business processes. Key Responsibilities: Develop and deploy AI models using Python and machine learning frameworks (TensorFlow, PyTorch, Scikit-learn, etc.). Work with internal databases (SQL, NoSQL) to extract, clean, and preprocess data for AI models. Design and implement data pipelines for training and inference. Optimise model performance and ensure scalability. Collaborate with software developers to integrate AI solutions into production systems. Ensure the security and integrity of AI-driven applications. Stay updated with the latest AI, ML, and data science trends. Required Skills & Qualifications: Strong proficiency in Python and AI/ML libraries such as TensorFlow, PyTorch, and Scikit-learn. Experience working with structured and unstructured data from relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Firebase) databases. Knowledge of ETL processes and data engineering concepts. Experience with API development for AI model integration. Familiarity with cloud platforms (AWS, Azure, GCP) for AI deployments. Strong understanding of data preprocessing, feature engineering, and model tuning. Ability to write clean, efficient, and scalable code. Experience with version control (Git) and Agile methodologies. Preferred Skills: Knowledge of Natural Language Processing (NLP) and LLM fine-tuning. Experience with MLOps tools and practices. Familiarity with big data technologies such as Spark and Hadoop. Understanding of deep learning architectures (CNNs, RNNs, Transformers). Job Type: Full-time Pay: Up to ₹100,000.00 per month Benefits: Flexible schedule Work Location: In person Application Deadline: 07/04/2025
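To make the "extract, clean, and preprocess data for AI models" step concrete, here is a hedged sketch that pulls rows from a relational database with SQLAlchemy and pandas before handing them to feature engineering. The connection string, table, and column names are hypothetical placeholders, not details of this employer's systems.

```python
# Hedged sketch: load and lightly clean training data from a relational database.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/appdb")  # assumed DSN

query = """
    SELECT customer_id, signup_date, monthly_spend, churned
    FROM customer_activity
    WHERE signup_date >= '2024-01-01'
"""
df = pd.read_sql(query, engine)

# Basic cleaning before feature engineering / model training.
df = df.dropna(subset=["monthly_spend"])
df["signup_month"] = pd.to_datetime(df["signup_date"]).dt.month
print(df.head())
```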
Posted 3 days ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid) Company: WebBoost Solutions by UM Location: Remote Duration: 3 months Opportunity: Full-time based on performance, with a Certificate of Internship About WebBoost Solutions by UM WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career. Responsibilities ✅ Collect, preprocess, and analyze large datasets. ✅ Develop predictive models and machine learning algorithms. ✅ Perform exploratory data analysis (EDA) to extract meaningful insights. ✅ Create data visualizations and dashboards for effective communication of findings. ✅ Collaborate with cross-functional teams to deliver data-driven solutions. Requirements 🎓 Enrolled in or graduate of a program in Data Science, Computer Science, Statistics, or a related field. 🐍 Proficiency in Python for data analysis and modeling. 🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred). 📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib). 🧐 Strong analytical and problem-solving skills. 🗣 Excellent communication and teamwork abilities. Stipend & Benefits 💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based). ✔ Hands-on experience in data science projects. ✔ Certificate of Internship & Letter of Recommendation. ✔ Opportunity to build a strong portfolio of data science models and applications. ✔ Potential for full-time employment based on performance. How to Apply 📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application." 📅 Deadline: 30th July 2025 Equal Opportunity WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
Posted 3 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Work Level : Individual Core : Communication Skills, Problem Solving, Execution Leadership : Decisive, Team Alignment, Working Independently Industry Type : IT Services & Consulting Function : Data Analyst Key Skills : MySQL,Python,Bigdata,Data Science,Data Analytics,Data Analysis,Cloud,AWS,Business Intelligence (BI),Statistical Modeling,R,Big Data Platforms,Tableau Education : Graduate Note: This is a requirement for one of the Workassist Hiring Partner. Responsibilities: This is a Remote Position. Collect, clean, and preprocess data from various sources. Perform exploratory data analysis (EDA) to identify trends and patterns. Develop dashboards and reports using tools like Excel, Power BI, or Tableau. Use SQL to query and manipulate large datasets. Assist in building predictive models and performing statistical analyses. Present insights and recommendations based on data findings. Collaborate with cross-functional teams to support data-driven decision-making. Requirement: Currently pursuing or recently completed a degree in Data Science, Statistics, Mathematics, Computer Science, or a related field. Strong analytical and problem-solving skills. Proficiency in Excel and SQL for data analysis. Experience with data visualization tools like Power BI, Tableau, or Google Data Studio. Basic knowledge of Python or R for data analysis is a plus. Understanding of statistical methods and data modeling concepts. Strong attention to detail and ability to work independently. Excellent communication skills to present insights clearly. Preferred Skills: Experience with big data technologies (Google BigQuery, AWS, etc.). Familiarity with machine learning techniques and predictive modeling. Knowledge of business intelligence (BI) tools and reporting frameworks. Company Description Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on the skills, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 4 days ago
2.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description:
We're seeking a skilled and motivated ML/AI Engineer to join our team and drive end-to-end AI development for cutting-edge healthcare prediction models. As part of our AI delivery team, you will design, develop, and deploy ML models to solve complex business challenges.

Key Responsibilities:
Design, develop, and deploy scalable machine learning models and AI solutions.
Collaborate with engineers and product managers to understand business requirements and translate them into technical solutions.
Analyze and preprocess large datasets for training and testing ML models.
Experiment with different ML algorithms and techniques to improve model performance.

Skills & Qualifications:
Experience: B.Tech with 2-5 years of relevant experience in Machine Learning, AI, or Data Science roles, OR M.Tech / M.Stat with 2-4 years of relevant experience in Machine Learning, AI, or Data Science roles.
Education: Bachelor's/Master's degree in Computer Science, Statistics, or another relevant engineering/AI program from a Tier 1 or Tier 2 college.

Technical Skills:
Proficiency in programming languages such as Python and R.
Strong knowledge of classical machine learning algorithms with hands-on experience in the following:
Supervised (classification models, regression models)
Unsupervised (clustering algorithms, autoencoders)
Ensemble models (stacking, bagging, boosting techniques, Random Forest, XGBoost)
Experience in data preprocessing, feature engineering, and handling large-scale datasets.
Model evaluation techniques such as accuracy, precision, recall, F1 score, and AUC-ROC for classification, and MAE, MSE, RMSE, and R-squared for regression (see the illustrative sketch below).
Explainable AI (XAI) techniques, including SHAP values, LIME, feature importance from decision trees, and partial dependence plots.
Experience with ML frameworks like TensorFlow, PyTorch, Scikit-learn, or Keras.
Building and deploying model APIs using frameworks like Flask, FastAPI, Django, or TensorFlow Serving.
Knowledge of cloud platforms like Azure (preferred), AWS, or GCP, and experience deploying models in such environments.

Nice to Have:
Object-oriented programming.
Familiarity with NLP or time series analysis.
Exposure to deep learning models (RNN, LSTM) and working with GPUs.
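For illustration only (not part of the listing): a minimal sketch of the classification and regression evaluation metrics named above, assuming scikit-learn and small hard-coded example arrays rather than real model outputs.

```python
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score,
                             mean_absolute_error, mean_squared_error, r2_score)

# Classification: true labels, hard predictions, and predicted probabilities.
y_true = [0, 1, 1, 0, 1, 0, 1, 1]
y_pred = [0, 1, 0, 0, 1, 1, 1, 1]
y_prob = [0.2, 0.9, 0.4, 0.1, 0.8, 0.6, 0.7, 0.95]

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))
print("F1       :", f1_score(y_true, y_pred))
print("AUC-ROC  :", roc_auc_score(y_true, y_prob))

# Regression: true values vs. predictions.
y_reg_true = [3.0, 5.5, 2.1, 7.8]
y_reg_pred = [2.8, 5.0, 2.5, 8.1]

mse = mean_squared_error(y_reg_true, y_reg_pred)
print("MAE :", mean_absolute_error(y_reg_true, y_reg_pred))
print("MSE :", mse)
print("RMSE:", mse ** 0.5)
print("R^2 :", r2_score(y_reg_true, y_reg_pred))
```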
Posted 4 days ago
0 years
0 Lacs
India
Remote
Machine Learning Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with Certificate of Internship
Application Deadline: 30th July 2025

About Unified Mentor
Unified Mentor provides students and graduates with hands-on learning opportunities and career growth in Machine Learning and Data Science.

Role Overview
As a Machine Learning Intern, you will work on real-world projects, enhancing your practical skills in data analysis and model development.

Responsibilities
✅ Design, test, and optimize machine learning models (see the illustrative sketch below)
✅ Analyze and preprocess datasets
✅ Develop algorithms and predictive models
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn
✅ Document findings and create reports

Requirements
🎓 Enrolled in or a graduate of a relevant program (Computer Science, AI, Data Science, or related field)
🧠 Knowledge of machine learning concepts and algorithms
💻 Proficiency in Python or R (preferred)
🤝 Strong analytical and teamwork skills

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
✔ Hands-on machine learning experience
✔ Internship Certificate & Letter of Recommendation
✔ Real-world project contributions for your portfolio

Equal Opportunity
Unified Mentor is an equal-opportunity employer, welcoming candidates from all backgrounds.
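For illustration only (not part of the listing): a minimal sketch of training and tuning a model, assuming scikit-learn and its built-in Iris dataset purely for demonstration.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Small built-in dataset used only to keep the example self-contained.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Tune a couple of hyperparameters with cross-validated grid search.
param_grid = {"n_estimators": [50, 100], "max_depth": [None, 3, 5]}
search = GridSearchCV(RandomForestClassifier(random_state=42), param_grid, cv=5)
search.fit(X_train, y_train)

print("best params  :", search.best_params_)
print("test accuracy:", search.score(X_test, y_test))
```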
Posted 4 days ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Application Deadline: 30th July 2025
Opportunity: Full-time role based on performance + Internship Certificate

About Unified Mentor
Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers.

Responsibilities
Collect, preprocess, and analyze large datasets
Develop predictive models and machine learning algorithms
Perform exploratory data analysis (EDA) to extract insights
Create data visualizations and dashboards for effective communication (see the illustrative sketch below)
Collaborate with cross-functional teams to deliver data-driven solutions

Requirements
Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field
Proficiency in Python or R for data analysis and modeling
Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred)
Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib
Strong analytical and problem-solving skills
Excellent communication and teamwork abilities

Stipend & Benefits
Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
Hands-on experience in data science projects
Certificate of Internship & Letter of Recommendation
Opportunity to build a strong portfolio of data science models and applications
Potential for full-time employment based on performance

How to Apply
Submit your resume and a cover letter with the subject line "Data Science Intern Application."

Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.
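For illustration only (not part of the listing): a minimal data-visualization sketch, assuming Matplotlib and a small hard-coded monthly series; the numbers are made up for the example.

```python
import matplotlib.pyplot as plt

# Hypothetical monthly signup counts (values are illustrative only).
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
signups = [120, 150, 170, 160, 210, 240]

fig, ax = plt.subplots(figsize=(6, 3))
ax.plot(months, signups, marker="o")
ax.set_title("Monthly signups")
ax.set_xlabel("Month")
ax.set_ylabel("Signups")
fig.tight_layout()
fig.savefig("monthly_signups.png")  # or plt.show() in an interactive session
```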
Posted 4 days ago
2.0 - 3.0 years
2 Lacs
Cochin
On-site
We are a fast-growing technology startup based in Cochin, Kerala, focused on building innovative AI-powered software solutions for the healthcare, retail, and hospitality industries. We're looking for a passionate AI Developer / Engineer to join our team and help us take our products to the next level.

Key Responsibilities:
Design, develop, and deploy AI/ML models for real-world applications.
Build and optimize NLP, Computer Vision, or Predictive Analytics modules.
Preprocess data and build datasets for training and inference.
Integrate AI models into production-ready software using Python and REST APIs (see the illustrative sketch below).
Collaborate with software developers, product managers, and domain experts.

Required Skills:
2–3 years of experience in AI/ML development.
Proficient in Python and frameworks like TensorFlow, PyTorch, Scikit-learn.
Experience with NLP, OCR, or Computer Vision projects.
Solid understanding of data preprocessing, model training, and evaluation metrics.
Ability to work with APIs, databases (SQL/NoSQL), and cloud tools.
Experience with version control systems (e.g., Git).

Preferred Skills (Bonus):
Experience with AI in healthcare or OCR for documents/prescriptions.
Knowledge of LLMs (e.g., GPT, LLaMA) or Generative AI.
Deployment experience using Docker, Kubernetes, or AWS/GCP/Azure.
Familiarity with Flutter, Node.js, or full-stack environments.

Job Type: Full-time
Pay: From ₹24,000.00 per month
Work Location: In person
Expected Start Date: 19/08/2025
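For illustration only (not part of the listing): a minimal sketch of exposing a trained model over a REST API, assuming FastAPI, pydantic, and scikit-learn; the endpoint path, field names, and demo model are illustrative, and a real service would load a persisted model rather than training at startup.

```python
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a small demo model at startup (for illustration only).
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

app = FastAPI()

class Features(BaseModel):
    sepal_length: float
    sepal_width: float
    petal_length: float
    petal_width: float

@app.post("/predict")
def predict(features: Features):
    row = [[features.sepal_length, features.sepal_width,
            features.petal_length, features.petal_width]]
    return {"predicted_class": int(model.predict(row)[0])}

# Run with: uvicorn app:app --reload   (assuming this file is saved as app.py)
```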
Posted 4 days ago
1.0 - 3.0 years
2 - 4 Lacs
India
On-site
Job Summary:
As an AI Analyst, you will be responsible for analyzing complex data sets, developing algorithms, and implementing AI models to solve business problems and improve decision-making processes. You will work closely with cross-functional teams to identify opportunities for AI applications, interpret results, and communicate findings to stakeholders.

Key Responsibilities:
Data Analysis: Collect, clean, and preprocess large data sets from various sources. Conduct exploratory data analysis to identify trends, patterns, and anomalies.
Model Development: Develop, train, and validate machine learning models tailored to specific business needs. Utilize algorithms and statistical techniques to enhance model performance.
Collaboration: Work closely with client teams to define AI project objectives and scope. Collaborate on the integration of AI models into existing systems and workflows.
Reporting and Visualization: Create dashboards and visualizations to communicate insights derived from data analysis and model results. Present findings and recommendations to non-technical stakeholders in a clear and effective manner.
Continuous Improvement: Stay up to date with the latest advancements in AI and machine learning technologies. Identify opportunities for process improvements and contribute to the development of best practices.

Qualifications:
1. Bachelor's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field. A Master's degree is a plus.
2. Experience: 1 to 3 years (AI work experience on real-time projects is mandatory; intern experience will not be considered).
3. Proven experience in data analysis, statistical modeling, and machine learning.
4. Proficiency in programming languages such as Python and experience with frameworks like TensorFlow.
5. Strong understanding of algorithms, data structures, and software engineering principles.
6. Experience with data visualization tools (e.g., Tableau, Power BI, Matplotlib).
7. Ability to communicate complex technical concepts to non-technical audiences effectively.
8. Strong analytical and problem-solving skills.
9. Familiarity with cloud platforms; exposure to AWS SageMaker is a key requirement.

Preferred Skills:
1. Experience with natural language processing (NLP), computer vision, and reinforcement learning (RL).
2. Understanding of ethical considerations in AI and machine learning.
3. Strong understanding of deep learning principles (CNNs, loss functions, optimization) and expertise in computer vision tasks like object detection and image processing (see the illustrative sketch below). Proficiency in Python programming and relevant deep learning frameworks such as PyTorch or TensorFlow, along with experience in utilizing existing YOLO implementations and adapting them for specific applications.

Job Type: Full-time
Pay: ₹200,000.00 - ₹400,000.00 per year
Benefits: Health insurance
Work Location: In person
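For illustration only (not part of the listing): a minimal sketch of the deep learning building blocks mentioned above (a small CNN, a loss function, and an optimizer), assuming PyTorch and random stand-in image tensors rather than a real dataset or a YOLO model.

```python
import torch
import torch.nn as nn

# A tiny CNN for 3-channel 32x32 images with 10 output classes.
class TinyCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinyCNN()
criterion = nn.CrossEntropyLoss()                          # loss function
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # optimization

# Random stand-in batch (real work would use a DataLoader over labeled images).
images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))

for step in range(5):
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss={loss.item():.4f}")
```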
Posted 4 days ago
0 years
0 Lacs
India
Remote
🤖 Machine Learning Intern – Remote | Learn AI by Building It
📍 Location: Remote / Virtual
💼 Type: Internship (Unpaid)
🎁 Perks: Certificate After Completion || Letter of Recommendation (6 Months)
🕒 Schedule: 5–7 hrs/week | Flexible Timing

Join Skillfied Mentor as a Machine Learning Intern and move beyond online courses. You'll work on real datasets, build models, and see your algorithms in action, all while gaining experience that hiring managers actually look for. Whether you're aiming for a career in AI, data science, or automation, this internship will build your foundation with hands-on learning.

🔧 What You'll Do:
Work with real datasets to clean, preprocess, and transform data
Build machine learning models using Python, NumPy, Pandas, Scikit-learn
Perform classification, regression, and clustering tasks (see the illustrative sketch below)
Use Jupyter Notebooks for experimentation and documentation
Collaborate on mini-projects and model evaluation tasks
Present insights in simple, digestible formats

🎓 What You'll Gain:
✅ Full Python course included during the internship
✅ Hands-on projects to showcase on your resume or portfolio
✅ Certificate of Completion + LOR (6-month internship)
✅ Experience with industry-relevant tools & techniques
✅ Remote flexibility: manage your time with just 5–7 hours/week

🗓️ Application Deadline: 1st August 2025
👉 Apply now to start your ML journey with Skillfied Mentor
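For illustration only (not part of the listing): a minimal clustering sketch of the kind described above, assuming scikit-learn and NumPy with synthetic data generated for the example.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Synthetic 2-D data with three loose groups (purely for demonstration).
X, _ = make_blobs(n_samples=300, centers=3, cluster_std=1.2, random_state=42)

# Fit K-Means and inspect the result.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
labels = kmeans.fit_predict(X)

print("cluster sizes  :", np.bincount(labels))
print("cluster centers:\n", kmeans.cluster_centers_)
print("silhouette     :", silhouette_score(X, labels))
```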
Posted 4 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Razorpay was founded by Shashank Kumar and Harshil Mathur in 2014. Razorpay is building a new-age digital banking hub (Neobank) for businesses in India, with the mission of enabling frictionless banking and payments experiences for businesses of all shapes and sizes. What started as a B2B payments company now processes billions of dollars of payments for lakhs of businesses across India.
We are a full-stack financial services organisation, committed to helping Indian businesses with comprehensive and innovative payment and business banking solutions built over robust technology to address the entire length and breadth of the payment and banking journey for any business. Over the past year, we've disbursed millions of dollars in loans to thousands of businesses. In parallel, Razorpay is reimagining how businesses manage money by simplifying business banking (via Razorpay X) and enabling capital availability for businesses (via Razorpay Capital).

The Role
The Senior Analytics Specialist will work with the central analytics team at Razorpay. This role gives you an opportunity to work in a fast-paced environment aimed at creating very high impact, alongside a diverse team of smart and hardworking professionals from various backgrounds. Responsibilities include working with large, complex data sets, developing strong business and product understanding, and being closely involved in the product life cycle.

Roles And Responsibilities
You will work with large, complex data sets to solve open-ended, high-impact business problems using data mining, experimentation, statistical analysis and related techniques, and machine learning as needed
You will have or develop a strong understanding of the business and product, conduct analysis to derive insights, develop hypotheses and validate them with sound, rigorous methodologies, or formulate the problems for modeling with ML
You will apply excellent problem-solving skills and independently scope, deconstruct and formulate solutions from first principles that bring an outside-in and state-of-the-art view
You will be closely involved with the product life cycle, working on ideation, reviewing Product Requirement Documents, defining success criteria, instrumenting product features, assessing impact, and identifying and recommending improvements to further enhance product features
You will expedite root cause analyses and insight generation for recurring use cases through automation and self-serve platforms
You will develop compelling stories with business insights, focusing on the strategic goals of the organization
You will work with Business, Product and Data Engineering teams for continuous improvement of data accuracy through feedback and scoping on instrumentation quality and completeness
Set high standards in project management; own scope and timelines for the team

Mandatory Qualifications
Bachelor's/Master's degree in Engineering, Economics, Finance, Mathematics, Statistics, Business Administration or a related quantitative field
3+ years of high-quality hands-on experience in analytics and data science
Hands-on experience in SQL, Python and Tableau
Define the business and product metrics to be evaluated, work with engineering on data instrumentation, and create and automate self-serve dashboards to present to relevant stakeholders leveraging tools such as Tableau.
Ability to structure and analyze data leveraging techniques like EDA, cohort analysis and funnel analysis, transform findings into understandable and actionable recommendations, and communicate them effectively across the organization
Hands-on experience working with large-scale structured, semi-structured and unstructured data, and with various approaches to preprocessing/cleansing data and dimensionality reduction
Work experience in consumer-tech organizations would be a plus
A clear understanding of the qualitative and quantitative aspects of the product or strategic initiative, leveraged to identify and act upon existing gaps and opportunities
Hands-on experience with A/B testing and significance testing (see the illustrative sketch below), supervised and unsupervised ML, web analytics and statistical learning

Razorpay believes in and follows an equal employment opportunity policy that doesn't discriminate on gender, religion, sexual orientation, colour, nationality, age, etc. We welcome interest and applications from all groups and communities across the globe.
Follow us on LinkedIn & Twitter
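For illustration only (not part of the listing): a minimal A/B significance-testing sketch comparing the conversion rates of two variants with a two-proportion z-test, assuming statsmodels and made-up counts.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical experiment results (all numbers are illustrative only).
conversions = [480, 530]      # successes in control (A) and treatment (B)
visitors = [10000, 10000]     # sample sizes for A and B

# Two-sided two-proportion z-test: H0 says both variants convert at the same rate.
z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)

print(f"conversion A: {conversions[0] / visitors[0]:.2%}")
print(f"conversion B: {conversions[1] / visitors[1]:.2%}")
print(f"z = {z_stat:.3f}, p = {p_value:.4f}")
print("significant at 5%" if p_value < 0.05 else "not significant at 5%")
```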
Posted 4 days ago