3.0 years
0 - 0 Lacs
Nagpur, Maharashtra, India
Remote
Experience: 3.00+ years
Salary: GBP 1785-2500 / month (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+01:00) Europe/London (BST)
Opportunity Type: Remote
Placement Type: Full-time contract for 6 months (40 hrs/week, 160 hrs/month)

(Note: This is a requirement for one of Uplers' clients, the UK's leading AgriTech company.)

What do you need for this opportunity?
Must-have skills: AgriTech industry, Large Language Models, NVIDIA Jetson, Raspberry Pi, Blender, Computer Vision, OpenCV, Python, PyTorch/TensorFlow, Segmentation, Extraction, Regression

The UK's leading AgriTech company is looking for:
Location: Remote
Type: 6-month contract
Experience Level: 3-5 years
Industry: Agritech | Sustainability | AI for Renewables

About Us
We're an AI-first company transforming the renewable and sustainable agriculture space. Our mission is to harness advanced computer vision and machine learning to enable smart, data-driven decisions across the livestock and agricultural ecosystem. We focus on practical applications such as automated cattle weight estimation, livestock monitoring, and resource optimization to drive a more sustainable food system.

Role Overview
We are hiring a Computer Vision Engineer to develop intelligent image-based systems for livestock management, focusing on cattle weight estimation from images and video feeds. You will build scalable vision pipelines, work with deep learning models, and bring AI to production in real-world farm settings.

Key Responsibilities
- Design and develop vision-based models to predict cattle weight from 2D/3D images, video, or depth data.
- Build image acquisition and preprocessing pipelines using multi-angle camera data.
- Implement classical and deep learning-based feature extraction techniques (e.g., body measurements, volume estimation).
- Conduct camera calibration, multi-view geometry analysis, and photogrammetry for size inference.
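The size-inference responsibility above rests on basic pinhole-camera geometry: once the camera is calibrated (focal length known in pixels) and the distance to the animal is known, e.g. from a depth sensor, a pixel measurement converts to a real-world length. A minimal sketch in plain Python; the numbers are illustrative, not from this posting:

```python
def pixel_to_metric(pixel_length: float, depth_m: float, focal_px: float) -> float:
    """Convert a measured pixel length to metres using the pinhole model.

    Under the pinhole model, an object of real length L at distance Z
    projects to l = f * L / Z pixels, so L = l * Z / f.
    """
    return pixel_length * depth_m / focal_px

# Illustrative values: a body measurement spans 500 px in an image taken
# at 2 m range with a camera whose calibrated focal length is 1000 px.
body_length_m = pixel_to_metric(500.0, 2.0, 1000.0)
print(body_length_m)  # 1.0
```

In practice the focal length would come from a calibration routine such as OpenCV's chessboard-based camera calibration rather than being hard-coded.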
- Apply deep learning architectures (e.g., CNNs, ResNet, UNet, Mask R-CNN) for object detection, segmentation, and keypoint localization.
- Build 3D reconstruction pipelines using stereo imaging, depth sensors, or photogrammetry.
- Optimize and deploy models for edge devices (e.g., NVIDIA Jetson) or cloud environments.
- Collaborate with data scientists and product teams to analyze livestock datasets, refine prediction models, and validate outputs.
- Develop tools for automated annotation, model training pipelines, and continuous performance tracking.

Required Qualifications & Skills
- Computer Vision: object detection, keypoint estimation, semantic/instance segmentation, stereo imaging, and structure-from-motion.
- Weight Estimation Techniques: experience in livestock monitoring, body condition scoring, and volumetric analysis from images/videos.
- Image Processing: noise reduction, image normalization, contour extraction, 3D reconstruction, and camera calibration.
- Data Analysis & Modeling: statistical modeling, regression techniques, and feature engineering for biological data.

Technical Stack
- Programming Languages: Python (mandatory)
- Libraries & Frameworks: OpenCV, PyTorch, TensorFlow, Keras, scikit-learn
- 3D Processing: Open3D, PCL (Point Cloud Library), Blender (optional)
- Data Handling: NumPy, Pandas, DVC
- Annotation Tools: LabelImg, CVAT, Roboflow
- Cloud & DevOps: AWS/GCP, Docker, Git, CI/CD pipelines
- Deployment Tools: ONNX, TensorRT, FastAPI, Flask (for model serving)

Preferred Qualifications
- Prior experience in agritech, animal husbandry, or precision livestock farming.
- Familiarity with Large Language Models (LLMs) and integrating vision + language models for domain-specific insights.
- Knowledge of edge computing for on-farm device deployment (e.g., NVIDIA Jetson, Raspberry Pi).
- Contributions to open-source computer vision projects or publications at CVPR, ECCV, or similar conferences.
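The "regression techniques and feature engineering for biological data" requirement typically means relating vision-derived measurements (e.g., estimated body volume) to ground-truth scale weights. A minimal ordinary-least-squares sketch in pure Python; the data points are invented for illustration, and a real pipeline would use scikit-learn:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (single feature)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical (estimated volume in m^3, scale weight in kg) pairs.
volumes = [0.40, 0.50, 0.60, 0.70]
weights = [400.0, 500.0, 600.0, 700.0]
slope, intercept = fit_line(volumes, weights)
print(slope, intercept)  # slope ≈ 1000 kg per m^3, intercept ≈ 0
```

With the fitted line, a new animal's weight estimate is simply `slope * estimated_volume + intercept`; in production one would also track residual error against periodic scale readings.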
Soft Skills
- Strong problem-solving and critical thinking skills
- Clear communication and documentation practices
- Ability to work independently and collaborate in a remote, cross-functional team

Why Join Us?
- Work at the intersection of AI and sustainability
- Be part of a dynamic, mission-driven team
- Opportunity to lead innovation in an emerging field of agritech
- Flexible remote work environment

How to apply for this opportunity?
Step 1: Click on Apply, then register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help our talent find and apply for relevant contractual opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
5.0 - 10.0 years
12 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Key Responsibilities:
- Design, develop, and maintain scalable, optimized ETL pipelines using Python and SQL.
- Work with Google BigQuery and other cloud-based platforms to build data warehousing solutions.
- Develop and deploy ML models; collaborate with data scientists to productionize models.
- Write efficient, optimized SQL queries for large-scale data processing.
- Build APIs using Flask/Django for machine learning and data applications.
- Work with both SQL and NoSQL databases, including Elasticsearch.
- Implement data ingestion using batch and streaming technologies.
- Ensure data quality, integrity, and governance across the data lifecycle.
- Automate and optimize CI/CD pipelines for data solutions.
- Collaborate with cross-functional teams to gather data requirements and deliver solutions.
- Troubleshoot and monitor data pipelines for seamless operations.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience with Python in a data engineering and/or ML context.
- Strong hands-on experience with SQL, BigQuery, and cloud data platforms (preferably GCP).
- Practical knowledge of ML concepts and experience developing ML models.
- Proficiency in frameworks such as Flask and Django.
- Experience with NoSQL databases and data streaming technologies.
- Solid understanding of data modeling, warehousing, and ETL frameworks.
- Familiarity with CI/CD tools and automation best practices.
- Excellent communication, problem-solving, and collaboration skills.
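The "optimized SQL queries for large-scale data processing" responsibility usually means pushing transformations down into the warehouse engine instead of looping over rows in Python. A minimal sketch using the standard library's sqlite3 as a stand-in for BigQuery; the table and column names are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", 10.0, "ok"), ("u1", 5.0, "ok"),
     ("u2", 7.5, "failed"), ("u2", 2.5, "ok")],
)

# Transform step: filter and aggregate inside SQL, so the engine
# does the heavy lifting over large tables.
rows = conn.execute(
    """
    SELECT user_id, SUM(amount) AS total
    FROM events
    WHERE status = 'ok'
    GROUP BY user_id
    ORDER BY user_id
    """
).fetchall()
print(rows)  # [('u1', 15.0), ('u2', 2.5)]
```

On BigQuery the same shape of query applies, with partitioning and clustering keys added so the `WHERE` clause prunes data before aggregation.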
Posted 1 week ago
0 years
0 Lacs
India
Remote
📊 Data Analyst Intern
📍 Location: Remote (100% Virtual)
📅 Duration: 3 Months
💸 Stipend for Top Interns: ₹15,000
🎁 Perks: Internship Certificate | Letter of Recommendation | Full-Time Offer (Based on Performance)

About INLIGHN TECH
INLIGHN TECH is a growing edtech startup that offers practical, project-driven virtual internships to students and graduates. Our Data Analyst Internship is crafted to help individuals gain hands-on experience with data analytics tools, workflows, and real-world business problems.

🚀 Internship Overview
As a Data Analyst Intern, you'll work with real datasets to extract insights, create dashboards, and support data-driven decision-making. This is a great opportunity to build your analytical thinking, tool proficiency, and storytelling with data.

🔧 Key Responsibilities
- Gather, clean, and analyze data using tools like Excel, SQL, and Python
- Build visual dashboards using Power BI, Tableau, or Google Data Studio
- Interpret trends and patterns to create actionable insights
- Present data findings to the team using reports and visualizations
- Perform exploratory data analysis (EDA) on structured and unstructured datasets
- Collaborate with other interns and departments to support data needs

✅ Qualifications
- Currently pursuing or recently completed a degree in Data Analytics, Statistics, Computer Science, or a related field
- Knowledge of MS Excel, SQL, and Python (Pandas/NumPy)
- Understanding of basic data visualization tools
- Strong analytical and problem-solving skills
- Attention to detail and a passion for working with data
- Eagerness to learn and take initiative in projects

🎓 What You'll Gain
- Hands-on experience in data cleaning, analysis, and visualization
- A professional project portfolio to showcase your data skills
- Internship Certificate upon successful completion
- Letter of Recommendation for top performers
- Opportunity for a full-time role based on your performance
- Improved fluency with industry-standard tools and analytics workflows
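The EDA responsibility listed above starts with simple summary statistics before any dashboarding. A minimal sketch using only the Python standard library; the sample values are invented, and real work would use Pandas:

```python
import statistics

# Hypothetical daily sales figures to explore.
sales = [120, 135, 150, 110, 160, 145, 130]

summary = {
    "count": len(sales),
    "mean": round(statistics.mean(sales), 2),
    "median": statistics.median(sales),
    "stdev": round(statistics.stdev(sales), 2),
    "min": min(sales),
    "max": max(sales),
}
print(summary)
```

Comparing mean against median (and checking min/max) is a quick first test for skew and outliers before choosing a visualization.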
Posted 1 week ago
3.0 years
0 - 0 Lacs
Visakhapatnam, Andhra Pradesh, India
Remote
Experience : 3.00 + years Salary : GBP 1785-2500 / month (based on experience) Expected Notice Period : 15 Days Shift : (GMT+01:00) Europe/London (BST) Opportunity Type : Remote Placement Type : Full Time Contract for 6 Months(40 hrs a week/160 hrs a month) (*Note: This is a requirement for one of Uplers' client - UK's Leading AgriTech Company) What do you need for this opportunity? Must have skills required: AgriTech Industry, Large Language Models, Nvidia Jetson, Raspberry PI, Blender, Computer Vision, opencv, Python, Pytorch/tensorflow, Segmentation, Extraction, Regression UK's Leading AgriTech Company is Looking for: Location: Remote Type: 6 months contract Experience Level : 3–5 Years Industry: Agritech | Sustainability | AI for Renewables About Us We're an AI-first company transforming the renewable and sustainable agriculture space. Our mission is to harness advanced computer vision and machine learning to enable smart, data-driven decisions in the livestock and agricultural ecosystem. We focus on practical applications such as automated weight estimation of cattle , livestock monitoring, and resource optimization to drive a more sustainable food system. Role Overview We are hiring a Computer Vision Engineer to develop intelligent image-based systems for livestock management, focusing on cattle weight estimation from images and video feeds. You will be responsible for building scalable vision pipelines, working with deep learning models, and bringing AI to production in real-world farm settings . Key Responsibilities Design and develop vision-based models to predict cattle weight from 2D/3D images, video, or depth data. Build image acquisition and preprocessing pipelines using multi-angle camera data. Implement classical and deep learning-based feature extraction techniques (e.g., body measurements, volume estimation). Conduct camera calibration, multi-view geometry analysis, and photogrammetry for size inference. 
Apply deep learning architectures (e.g., CNNs, ResNet, UNet, Mask R-CNN) for object detection, segmentation, and keypoint localization. Build 3D reconstruction pipelines using stereo imaging, depth sensors, or photogrammetry. Optimize and deploy models for edge devices (e.g., NVIDIA Jetson) or cloud environments. Collaborate with data scientists and product teams to analyze livestock datasets, refine prediction models, and validate outputs. Develop tools for automated annotation, model training pipelines, and continuous performance tracking. Required Qualifications & Skills Computer Vision: Object detection, keypoint estimation, semantic/instance segmentation, stereo imaging, and structure-from-motion. Weight Estimation Techniques: Experience in livestock monitoring, body condition scoring, and volumetric analysis from images/videos. Image Processing: Noise reduction, image normalization, contour extraction, 3D reconstruction, and camera calibration. Data Analysis & Modeling: Statistical modeling, regression techniques, and feature engineering for biological data. Technical Stack Programming Languages: Python (mandatory) Libraries & Frameworks: OpenCV, PyTorch, TensorFlow, Keras, scikit-learn 3D Processing: Open3D, PCL (Point Cloud Library), Blender (optional) Data Handling: NumPy, Pandas, DVC Annotation Tools: LabelImg, CVAT, Roboflow Cloud & DevOps: AWS/GCP, Docker, Git, CI/CD pipelines Deployment Tools: ONNX, TensorRT, FastAPI, Flask (for model serving) Preferred Qualifications Prior experience working in agritech, animal husbandry, or precision livestock farming. Familiarity with Large Language Models (LLMs) and integrating vision + language models for domain-specific insights. Knowledge of edge computing for on-farm device deployment (e.g., NVIDIA Jetson, Raspberry Pi). Contributions to open-source computer vision projects or relevant publications in CVPR, ECCV, or similar conferences. 
Soft Skills
- Strong problem-solving and critical thinking skills
- Clear communication and documentation practices
- Ability to work independently and collaborate in a remote, cross-functional team

Why Join Us?
- Work at the intersection of AI and sustainability
- Be part of a dynamic and mission-driven team
- Opportunity to lead innovation in an emerging field of agritech
- Flexible remote work environment

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal. Depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
1.5 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
🚀 AI/ML Solution Engineer Internship (On-site, Ahmedabad)
Start Your Real Tech Career - Build. Deploy. Scale.
Location: Ahmedabad (Office-based Only)
Internship Duration: 8 Months Training + 1.5-Year Post-Training Bond

About the Opportunity:
Are you a recent graduate eager to move beyond online courses and actually build and deploy real AI models? Whatmaction is offering a career-launching opportunity as an AI/ML Solution Engineer Intern - not just an internship, but a full-fledged deep-tech journey. If you're serious about working with production-grade AI, this is where your transformation begins.

What You'll Experience:
🧠 8 Months of Intense Training - Like a bootcamp for your brain: hands-on learning, project-based tasks, and direct mentoring from senior engineers.
⚙️ Real-World AI/ML Projects - Build and deploy production-level AI systems, including resume-parsing tools, chatbots, and smart automation systems.
📦 Model Development: From Scratch & Prebuilt - You'll train models using open-source datasets, fine-tune pre-trained models, and learn how to build custom models from scratch - the real way.
🚀 Production Deployment - You'll also gain hands-on experience in model deployment, REST API integration, and making models accessible in real-time applications.
Key Skills & Tech Stack:
Languages: Python (strong foundation required), Bash/shell scripting (basic)
AI/ML Libraries: scikit-learn, spaCy, TensorFlow / PyTorch, Keras, Transformers (Hugging Face), XGBoost / LightGBM, OpenCV (for CV tasks), Pandas / NumPy, Matplotlib / Seaborn
Tools & Platforms: Jupyter Notebooks / Google Colab, Git & GitHub, Postman, Docker (for packaging ML apps), MLflow (for model tracking and versioning), Streamlit / FastAPI / Flask (for ML APIs), Firebase / AWS / Render (for model hosting), PostgreSQL / MongoDB (data handling)
Core Concepts: Supervised / unsupervised learning, data preprocessing & feature engineering, natural language processing (NLP), model evaluation & hyperparameter tuning, working with pre-trained embeddings, ML model deployment (API-first mindset)
Bonus Skills (Good to Have): Working with OpenAI / Hugging Face APIs, building AI chatbots, basic backend API skills (FastAPI preferred), version control with Git, using LangChain / RAG techniques (optional)

Who Should Apply:
- Fresh graduates (2023-2025) from B.Tech / MCA / BCA / M.Sc IT or equivalent.
- Passionate about AI, ML, and solving real-world problems.
- Willing to commit to 8 months of full-time training and a 1.5-year project bond.
- Looking to build an actual career, not just do tasks.

📩 How to Apply:
Send your updated resume to hr@whatmaction.com with the subject "AI/ML Internship Application - [Your Name]"

🔒 Note: This is an on-site-only opportunity. Remote applications will not be considered.

This is your launchpad. Build, deploy, and ship AI products that matter - with us at Whatmaction.
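The posting mentions training models for resume-parsing tools; a minimal sketch of that kind of task is a small text-classification pipeline. The documents and labels below are made-up examples, not Whatmaction data.

```python
# Hypothetical sketch: classify short resume snippets by skill area using a
# TF-IDF + logistic regression pipeline (a common first baseline).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "trained machine learning models with scikit-learn",
    "built neural networks in pytorch for classification",
    "developed rest apis with fastapi and docker",
    "deployed backend services with flask and postgresql",
]
labels = ["ml", "ml", "backend", "backend"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(docs, labels)
pred = clf.predict(["fine-tuned machine learning models in pytorch"])[0]
```

A production resume parser would add entity extraction (e.g., with spaCy) and far more data, but the train/predict shape stays the same.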
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Compelling Opportunity for ML Engineer with Innovative Entity in Insurance Industry
Employment | Immediate
Location: Hyderabad, India
Reporting Manager: Head of Analytics
Work Pattern: Full time, 5 days in the office
Minimum Experience as an ML Engineer: 3 to 5 years

Position Overview:
The Innovative Entity in the Insurance Industry is seeking an experienced Machine Learning Engineer with 3 to 5 years of hands-on experience in designing, developing, and deploying machine learning models and systems. The ideal candidate will work closely with data scientists, software engineers, and product teams to create solutions that drive business value. You will be responsible for building scalable and efficient machine learning pipelines, optimizing model performance, and integrating models into production environments.

Key Responsibilities:
- Model Development & Training: Develop and train machine learning models, including supervised, unsupervised, and deep learning algorithms, to solve business problems.
- Data Preparation: Collaborate with data engineers to clean, preprocess, and transform raw data into usable formats for model training and evaluation.
- Model Deployment & Monitoring: Deploy machine learning models into production environments, ensuring seamless integration with existing systems and monitoring model performance.
- Feature Engineering: Create and test new features to improve model performance, and optimize feature selection to reduce model complexity.
- Algorithm Optimization: Research and implement state-of-the-art algorithms to improve model accuracy, efficiency, and scalability.
- Collaborative Development: Work closely with data scientists, engineers, and other stakeholders to understand business requirements, develop ML models, and integrate them into products and services.
- Model Evaluation: Conduct model evaluation using statistical tests, cross-validation, and A/B testing to ensure reliability and generalizability.
- Documentation & Reporting: Maintain thorough documentation of processes, models, and systems. Provide insights and recommendations based on model results to stakeholders.
- Code Reviews & Best Practices: Participate in peer code reviews and ensure adherence to coding best practices, including version control (Git), testing, and continuous integration.
- Stay Updated on Industry Trends: Keep abreast of new techniques and advancements in the field of machine learning, and suggest improvements for internal processes and models.

Required Skills & Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, or a related field.
- Experience: 3 to 5 years of hands-on experience working as a machine learning engineer or in a related role.
- Programming Languages: Proficiency in Python (preferred), R, or Java. Experience with ML libraries such as TensorFlow, PyTorch, scikit-learn, and Keras.
- Data Manipulation: Strong knowledge of SQL and experience working with large datasets (e.g., using tools like Pandas, NumPy, Spark).
- Cloud Services: Experience with cloud platforms like AWS, Google Cloud, or Azure, particularly with ML services such as SageMaker or AI Platform.
- Model Deployment: Hands-on experience with deploying ML models using Docker, Kubernetes, and CI/CD pipelines.
- Problem-Solving Skills: Strong analytical and problem-solving skills with the ability to understand complex data problems and implement effective solutions.
- Mathematics and Statistics: A solid foundation in mathematical concepts related to ML, such as linear algebra, probability, statistics, and optimization techniques.
- Communication Skills: Strong verbal and written communication skills to collaborate effectively with cross-functional teams and stakeholders.

Preferred Qualifications:
- Experience with deep learning frameworks (e.g., TensorFlow, PyTorch).
- Exposure to natural language processing (NLP), computer vision, or recommendation systems.
- Familiarity with version control systems (e.g., Git) and collaborative workflows.
- Experience with model interpretability and fairness techniques.
- Familiarity with big data tools (e.g., Hadoop, Spark, Kafka).

Screening Criteria
- Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, or a related field.
- 3 to 5 years of hands-on experience working as a machine learning engineer or in a related role.
- Proficiency in Python.
- Experience with ML libraries such as TensorFlow, PyTorch, scikit-learn, and Keras.
- Strong knowledge of SQL.
- Experience working with large datasets (e.g., using tools like Pandas, NumPy, Spark).
- Experience with cloud platforms like AWS, Google Cloud, or Azure, particularly with ML services such as SageMaker or AI Platform.
- Hands-on experience with deploying ML models using Docker, Kubernetes, and CI/CD pipelines.
- A solid foundation in mathematical concepts related to ML, such as linear algebra, probability, statistics, and optimization techniques.
- Available to work from the office in Hyderabad.
- Available to join within 30 days.

Considerations
- Location: Hyderabad
- Working from office
- 5-day work week

Evaluation Process
Round 1 - HR Round
Rounds 2 & 3 - Technical Rounds
Round 4 - Discussion with CEO

Interested profiles, kindly apply.
Note: Additional inputs will be gathered from the candidate to put together the application.
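The model-evaluation responsibility in this posting centers on cross-validation; a minimal sketch (on synthetic data standing in for insurance features) looks like this.

```python
# Illustrative sketch: k-fold cross-validation of a classifier, the kind of
# evaluation step the posting describes. Data is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=10, random_state=42)
scores = cross_val_score(RandomForestClassifier(random_state=42), X, y, cv=5)
mean_acc = scores.mean()   # average held-out accuracy across the 5 folds
```

Reporting the per-fold spread (not just the mean) is what makes the "reliability and generalizability" claim in the responsibilities testable.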
Posted 1 week ago
0 years
0 Lacs
India
Remote
🧠 Data Science Intern (Remote)
Company: Coreline Solutions
Type: Internship (3 to 6 Months)
Mode: 100% Remote
Stipend: Unpaid (a full-time opportunity may be offered upon successful completion)

🌐 About Us
Coreline Solutions is a digital consulting and tech company focused on building smart, data-driven solutions for modern businesses. From AI integration to analytics platforms, we empower companies through technology, data science, and intelligent systems. Our team believes in continuous learning, transparency, and innovation - and we're looking for passionate interns to grow with us.

🎯 About the Internship
We're seeking a Data Science Intern who is eager to explore real-world applications of machine learning, data analysis, and automation. You'll be working alongside our engineering and analytics team to contribute to projects that improve business processes, insights, and outcomes. This internship is entirely remote, giving you the flexibility to learn and contribute from wherever you are.
📌 Key Responsibilities
- Collect, clean, and preprocess structured and unstructured data
- Perform exploratory data analysis (EDA) to extract insights
- Build and evaluate predictive models using Python and ML libraries
- Visualize data through tools such as Matplotlib, Seaborn, or Power BI
- Support teams with statistical analysis, feature engineering, and reporting
- Document models, results, and learnings in a collaborative environment

✅ What We're Looking For
- Currently pursuing or recently completed a degree in Data Science, Computer Science, Statistics, or a related field
- Good understanding of Python, NumPy, pandas, scikit-learn, and basic ML algorithms
- Familiarity with SQL and data visualization tools
- Analytical thinking and a curiosity for solving complex problems
- Ability to work independently and meet project deadlines remotely

💡 Bonus Skills (Preferred but Not Required)
- Exposure to cloud services (AWS, GCP, or Azure)
- Basic knowledge of Git/GitHub for version control
- Interest in NLP, deep learning, or data engineering

🎁 What You'll Gain
- Hands-on experience with live data projects and business use cases
- Mentorship from experienced data scientists and tech leads
- Internship certificate upon completion
- Letter of recommendation for high-performing interns
- Possibility of full-time placement based on performance and company needs

🤝 Our Commitment
We are proud to be an equal-opportunity organization. Coreline Solutions values diversity and is committed to creating an inclusive space where all team members, interns, and applicants feel respected and supported. All internship communications and personal data will be handled responsibly and securely, in alignment with LinkedIn's Privacy Policy and Professional Community Policies.
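The first two responsibilities above (cleaning and EDA) can be sketched in a few lines of pandas. The dataset and column names are invented for illustration.

```python
# Hypothetical sketch: impute missing values, then summarize by group --
# a typical first cleaning + EDA step on a raw extract.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "region": ["north", "south", "north", "south", "north"],
    "sales":  [100.0, 200.0, np.nan, 240.0, 140.0],
})

# Fill missing values with the column mean (one simple imputation strategy)
df["sales"] = df["sales"].fillna(df["sales"].mean())

# Aggregate: average sales per region
by_region = df.groupby("region")["sales"].mean()
```

Real work would also check distributions (`df.describe()`) and justify the imputation choice rather than defaulting to the mean.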
📬 How to Apply
To apply, send your updated resume and a brief introduction to:
📩 hr@corelinesolutions.site
Use the subject line: "Application - Data Science Intern - [Your Full Name]"
📌 Before applying, make sure your LinkedIn profile reflects your latest skills and projects.
Posted 1 week ago
3.0 years
0 Lacs
Sahibzada Ajit Singh Nagar, Punjab, India
On-site
iTechnolabs is a software development company specializing in web applications, mobile apps, and digital marketing services for businesses of all sizes. We help clients with consulting on technology and business strategies to achieve their goals and objectives.

Job Description
We are looking for a passionate, highly talented Python Developer with a minimum of 3+ years of experience to develop web applications using Django (preferred) or Flask. The candidate should be a strong individual contributor who is part of the development team, enjoys coding, and thinks carefully about how things should be done.

Must have:
- Hands-on experience in Python
- Ability to design, build, and maintain efficient, reusable, and reliable Python code
- Hands-on experience with frameworks like Django or Flask
- Adherence to best practices in Python/Django/Flask
- Good problem-solving skills
- Knowledge of databases, relational (PostgreSQL, MySQL) or NoSQL (MongoDB)
- Testing and debugging software applications with Python test framework tools like Robot, PyTest, PyUnit, etc.
- Understanding of web servers, load balancers, deployment processes/activities, queuing mechanisms, and background tasks
- Good communication skills

Good to have:
- Knowledge of front-end interfaces using HTML and CSS
- Familiarity with deployments using AWS
- Experience in data analytics, data science, or machine learning
- Experience with Python libraries like Pandas, NumPy, etc.
- Familiarity with building RESTful APIs
- Experience in monitoring and improving application performance
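The "queuing mechanisms and background tasks" requirement can be illustrated with a stdlib-only sketch: a worker thread draining a queue, a tiny stand-in for Celery/RQ-style background processing.

```python
# Minimal sketch of a background worker: tasks go into a queue, a daemon
# thread processes them, and a sentinel (None) shuts the worker down.
import queue
import threading

tasks = queue.Queue()
results = []

def worker():
    while True:
        item = tasks.get()
        if item is None:              # sentinel: stop the worker
            tasks.task_done()
            break
        results.append(item * item)   # pretend this is slow work
        tasks.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

for n in range(5):
    tasks.put(n)
tasks.put(None)
tasks.join()                          # block until every task is processed
```

In Django/Flask deployments this pattern is usually delegated to a broker (RabbitMQ, Redis) so work survives process restarts, but the producer/consumer shape is the same.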
Posted 1 week ago
30.0 years
0 Lacs
Pune, Maharashtra, India
On-site
NVIDIA has been redefining computer graphics and accelerated computing for 30 years. Hiring the most creative minds and making sure that groundbreaking innovations can be accelerated through NVIDIA's platform have been key to this outstanding legacy of technological innovation. Today, we're tapping into the unlimited potential of AI to define the next era of computing. Doing what's never been done before takes vision, innovation, and the world's best talent. As an NVIDIAN, you'll be immersed in a diverse, supportive environment where everyone is encouraged to do their best work and make a lasting impact on the world!

Are you passionate about inspiring change, building data-driven tools to improve software quality, and ensuring customers have the best experience? If so, we have a phenomenal opportunity for you! NVIDIA is seeking a creative and hands-on software engineer with a test-to-failure approach who is a quick learner, can understand software and hardware specifications, and can build reliable tests and tools in C++/C#/Python to improve quality and accelerate the delivery of NVIDIA products.

What You Will Be Doing
- QA & Prompt Design for Generative AI: Evaluate image/video outputs from generative AI models using thoughtful test prompts and clear quality metrics like realism, coherence, and safety.
- Automation & Tooling: Build and maintain Python-based tools, automation frameworks, and CI/CD test integrations to streamline QA workflows and improve test coverage.
- Graphics and Gaming Validation: Develop and implement detailed test plans for NVIDIA application software, which includes testing leading PC games. Emphasize NVIDIA features that improve visual quality, performance, compatibility, and overall user experience in authentic gaming environments.

What We Need To See
- B.E./B.Tech degree in Computer Science/IT/Electronics engineering with strong academics, or equivalent experience
- 5+ years of programming experience in Python/C#/C++, with experience applying object-oriented programming concepts
- Automation Expertise: Practical experience with test automation tools (e.g., Selenium), Git, and integration into CI/CD pipelines.
- Gaming & Hardware Knowledge: Deep understanding of PC gaming, GPU features, and testing on Windows systems; experienced in solving hardware/software interactions.
- Communication & Partnership: Strong critical thinking, clear documentation, and partnership in hybrid, multi-functional environments.
- QA Leadership: Experience mentoring QA teams or leading structured test planning efforts.
- Generative AI QA Experience: Hands-on testing or usage of image/video models (e.g., Sora, DALL-E, Stable Diffusion), with prompt engineering and evaluation of hallucination, relevance, and safety.

Ways To Stand Out From The Crowd
- Deep AI Evaluation Skills: Designed evaluation protocols for diffusion/video models using structured prompt testing and perceptual metrics.
- Computer Vision Background: Strong understanding of image/video quality, human perception, and content alignment in AI outputs.
- Python & ML Libraries: Hands-on experience with AI/ML libraries like PyTorch, OpenCV, PIL, Transformers, and NumPy.
- GPU & Graphics Expertise: Familiarity with NVIDIA technologies (DLSS, G-SYNC, PhysX, or equivalent), display drivers, and PC visual-quality benchmarks.
- Proactive Problem Solver: Known for creatively identifying root causes of sophisticated QA issues and working to improve processes.

With competitive salaries and a generous benefits package, we are widely considered to be one of the technology world's most desirable employers. Due to outstanding growth, our elite engineering teams are rapidly growing. If you're creative with a real passion for technology, we want to hear from you.
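The perceptual-metrics work described above often starts from simple objective measures. As one hedged example (not NVIDIA's actual QA stack), PSNR compares a generated image against a reference in a few lines of NumPy.

```python
# Illustrative sketch: peak signal-to-noise ratio (PSNR), a basic objective
# image-quality metric a QA pipeline might log alongside human review.
import numpy as np

def psnr(a: np.ndarray, b: np.ndarray, max_val: float = 255.0) -> float:
    """PSNR in dB; higher means the images are closer. Identical -> inf."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 20 * np.log10(max_val) - 10 * np.log10(mse)

ref = np.zeros((8, 8), dtype=np.uint8)
noisy = np.full((8, 8), 10, dtype=np.uint8)   # constant error of 10 -> MSE 100
```

For generative outputs, perceptual metrics (SSIM, LPIPS) and structured prompt rubrics usually matter more than PSNR alone, since there is rarely a pixel-aligned reference.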
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, perform crucial job functions, and receive other benefits and privileges of employment.

JR1997493
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About Us:
Traya is an Indian direct-to-consumer hair care brand whose platform provides holistic treatment for consumers dealing with hair loss. The company provides personalized consultations that help determine the root cause of hair fall among individuals, along with a range of hair care products curated from a combination of Ayurveda, Allopathy, and Nutrition. Traya's secret lies in the power of diagnosis. Our unique platform diagnoses the patient's hair and health history to identify the root cause behind hair fall and delivers customized hair kits right to their doorstep. We have a strong adherence system in place via medically trained hair coaches and proprietary tech, through which we guide customers across their hair growth journey and help them stay on track. Traya was founded by Saloni Anand, a techie-turned-marketer, and Altaf Saiyed, a Stanford Business School alumnus.

Our Vision:
Traya was created with a global vision to create awareness around hair loss and de-stigmatize it while empathizing with customers about its emotional and psychological impact. Most importantly, we combine three different sciences (Ayurveda, Allopathy, and Nutrition) to create the perfect holistic solution for hair loss patients.

Responsibilities:
Data Analysis and Exploration:
- Conduct in-depth analysis of large and complex datasets to identify trends, patterns, and anomalies.
- Perform exploratory data analysis (EDA) to understand data distributions, relationships, and quality.
Machine Learning and Statistical Modeling:
- Develop and implement machine learning models (e.g., regression, classification, clustering, time series analysis) to solve business problems.
- Evaluate and optimize model performance using appropriate metrics and techniques.
- Apply statistical methods to design and analyze experiments and A/B tests.
- Implement and maintain models in production environments.
Data Engineering and Infrastructure:
- Collaborate with data engineers to ensure data quality and accessibility.
- Contribute to the development and maintenance of data pipelines and infrastructure.
- Work with cloud platforms (e.g., AWS, GCP, Azure) and big data technologies (e.g., Spark, Hadoop).
Communication and Collaboration:
- Effectively communicate technical findings and recommendations to both technical and non-technical audiences.
- Collaborate with product managers, engineers, and other stakeholders to define and prioritize projects.
- Document code, models, and processes for reproducibility and knowledge sharing.
- Present findings to leadership.
Research and Development:
- Stay up to date with the latest advancements in data science and machine learning.
- Explore and evaluate new tools and techniques to improve data science capabilities.
- Contribute to internal research projects.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or a related field.
- 3-5 years of experience as a Data Scientist or in a similar role.
- Experience leveraging SageMaker's features, including SageMaker Studio, Autopilot, Experiments, Pipelines, and Inference, to optimize model development and deployment workflows.
- Proficiency in Python and relevant libraries (e.g., scikit-learn, pandas, NumPy, TensorFlow, PyTorch).
- Solid understanding of statistical concepts and machine learning algorithms.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Experience deploying models to production.
- Experience with version control (Git).

Preferred Qualifications:
- Experience with specific industry domains (e.g., e-commerce, finance, healthcare).
- Experience with natural language processing (NLP) or computer vision.
- Experience building recommendation engines.
- Experience with time series forecasting.
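The A/B-testing responsibility in this posting can be sketched with a stdlib-only two-proportion z-test; the conversion counts below are invented for illustration.

```python
# Illustrative sketch: two-proportion z-test for an A/B experiment.
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(120, 1000, 150, 1000)  # 12% vs 15% conversion
```

A |z| above about 1.96 corresponds to significance at the 5% level (two-sided); in practice libraries like `statsmodels` also report the p-value and confidence interval.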
Posted 1 week ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Be a part of India's largest and most admired news network! Network18 is India's most diversified media company in the fast-growing media market. The company has a strong heritage and a strong presence in the magazine, television, and internet domains. Our brands like CNBC, Forbes, and Moneycontrol are market leaders in their respective segments. The company has over 7,000 employees across all major cities in India and has consistently managed to stay ahead of the industry's growth curve. Network18 brings together employees from varied backgrounds under one roof, united by the hunger to create immersive content and ideas. We take pride in our people, who we believe are the key to realizing the organization's potential. We continually strive to enable our employees to realize their own goals by providing opportunities to learn, share, and grow.

Role Overview:
We are seeking a passionate and skilled Data Scientist with over a year of experience to join our dynamic team. You will be instrumental in developing and deploying machine learning models, building robust data pipelines, and translating complex data into actionable insights. This role offers the opportunity to work on cutting-edge projects involving NLP, generative AI, data automation, and cloud technologies to drive business value.

Key Responsibilities:
- Design, develop, and deploy machine learning models, with a strong focus on NLP (including advanced techniques and generative AI) and other AI applications.
- Build, maintain, and optimize ETL pipelines for automated data ingestion, transformation, and standardization from various sources.
- Work extensively with SQL for data extraction, manipulation, and analysis in environments like BigQuery.
- Develop solutions using Python and relevant data science/ML libraries (Pandas, NumPy, Hugging Face Transformers, etc.).
- Utilize Google Cloud Platform (GCP) services for data storage, processing, and model deployment.
- Create and maintain interactive dashboards and reporting tools (e.g., Power BI) to present insights to stakeholders.
- Apply basic Docker concepts for containerization and deployment of applications.
- Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions.
- Stay abreast of the latest advancements in AI/ML and NLP best practices.

Required Qualifications & Skills:
- 2+ years of hands-on experience as a Data Scientist or in a similar role.
- Solid understanding of machine learning fundamentals, algorithms, and best practices.
- Proficiency in Python and relevant data science libraries.
- Good SQL skills for complex querying and data manipulation.
- Demonstrable experience with natural language processing (NLP) techniques, including advanced models (e.g., transformers) and familiarity with generative AI concepts and applications.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.

Preferred Qualifications & Skills:
- Familiarity and hands-on experience with Google Cloud Platform (GCP) services, especially BigQuery, Cloud Functions, and Vertex AI.
- Basic understanding of Docker and containerization for deploying applications.
- Experience with dashboarding tools like Power BI and building web applications with Streamlit.
- Experience with web scraping tools and techniques (e.g., BeautifulSoup, Scrapy, Selenium).
- Knowledge of data warehousing concepts and schema design.
- Experience designing and building ETL pipelines.

Disclaimer: Please note that Network18 and related group companies do not use the services of vendors or agents for recruitment. Please beware of such agents or vendors claiming to provide assistance. Network18 will not be responsible for any losses incurred. "We correspond only from our official email address."
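The ETL transformation-and-standardization responsibility above can be sketched as a single pandas transform step; the raw extract and column names are made up for illustration.

```python
# Illustrative sketch of one ETL standardization step: normalize column
# names, parse dates, and coerce numeric strings from a raw extract.
import pandas as pd

raw = pd.DataFrame({
    "Published Date": ["2024-01-05", "2024-01-06"],
    "Page Views": ["1,200", "980"],
})

def transform(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # snake_case column names so downstream SQL/BigQuery loads are stable
    out.columns = [c.strip().lower().replace(" ", "_") for c in out.columns]
    out["published_date"] = pd.to_datetime(out["published_date"])
    out["page_views"] = out["page_views"].str.replace(",", "").astype(int)
    return out

clean = transform(raw)
```

In a scheduled pipeline this function would sit between the extract (API/scrape/SQL) and the load into the warehouse, with the same transform reused across sources.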
Posted 1 week ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Is your next career move to work in a team which uses data, reporting, and analytical skills to help answer business questions and make DAZN a data-driven company?

DAZN is a tech-first sport streaming platform that reaches millions of users every week. We are challenging a traditional industry and giving power back to the fans. Our new Hyderabad tech hub will be the engine that drives us forward to the future. We're pushing boundaries and doing things no one has done before. Here, you have the opportunity to make your mark and the power to make change happen - to make a difference for our customers. When you join DAZN you will work on projects that impact millions of lives thanks to your critical contributions to our global products.

This is the perfect place to work if you are passionate about technology and want an opportunity to use your creativity to help grow and scale a global range of IT systems, infrastructure, and IT services. Our cutting-edge technology allows us to stream sports content to millions of concurrent viewers globally across multiple platforms and devices. DAZN's cloud-based architecture unifies a range of technologies to deliver a seamless user experience and support a global user base and company infrastructure. This role will be based in our brand-new Hyderabad office. Join us in India's beautiful "City of Pearls" and bring your ambition to life.

Job Title: Data Scientist
Location: Hyderabad (work from office)

Job Summary:
We are seeking a passionate and experienced Data Scientist / Machine Learning Engineer with a strong foundation in machine learning, deep learning, and generative AI (GenAI). The ideal candidate will have hands-on experience with modern AI frameworks and libraries, a proven ability to design, develop, and deploy scalable ML solutions, and expertise in working with Large Language Models (LLMs). You will leverage your skills to drive impactful solutions, staying at the forefront of advancements in ML, DL, and GenAI.
You will work on cutting-edge projects, incorporating the latest technologies into production-ready systems. If you are excited about advancing your career in a dynamic, innovation-driven environment, we invite you to join us!

Key Responsibilities:
- Develop, train, and fine-tune machine learning and deep learning models for diverse business use cases (overall experience of 2-8 years).
- Work with generative AI and LLMs, including customizing, fine-tuning, and deploying models for real-world applications.
- Stay updated with the latest advancements in machine learning, deep learning, and generative AI, and incorporate them into projects.

Required Skills:
- Machine Learning Frameworks: Proficiency in scikit-learn, TensorFlow, or PyTorch.
- Deep Learning: Experience building and training deep learning models, including CNNs, RNNs, Transformers, and other architectures.
- Generative AI: Hands-on experience with generative AI models, including fine-tuning and deploying LLMs.
- Programming: Strong skills in Python; familiarity with libraries like NumPy, pandas, and Matplotlib.
- Data Handling: Experience working with structured and unstructured data, including text, images, and time series.

Good-to-Have Skills:
- Exposure to tools like Hugging Face Transformers for working with LLMs.
- Understanding of MLOps practices, including CI/CD pipelines for ML models.
- Experience with GPU-accelerated computing.
- Experience with distributed training frameworks (e.g., Horovod, PyTorch Lightning).
- Cloud Platforms: Familiarity with cloud services like AWS.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, or a related field.
- 1-2+ years of professional experience in machine learning and deep learning.
- Proven track record of deploying ML/DL models in production environments.
Posted 1 week ago
0.0 - 1.0 years
0 Lacs
Bengaluru, Karnataka
Remote
Job Title: AI/ML Developer (Intern)
Company: VASPP Technologies Pvt. Ltd.
Location: Bengaluru, Karnataka, India
Job Type: Full-Time
Experience: Fresher (0-1 year)
Department: Technology / Development

About VASPP Technologies:
VASPP Technologies Pvt. Ltd. is a fast-growing software company focused on delivering cutting-edge digital transformation solutions for global enterprises. Our innovative projects span AI/ML, data analytics, enterprise solutions, and cloud computing. We foster a collaborative and dynamic environment that encourages learning and growth.

Job Summary:
We are seeking a motivated and enthusiastic AI/ML Developer (Fresher) to join our growing technology team. The ideal candidate will have a foundational understanding of machine learning algorithms, data analysis, and model deployment. You will work closely with senior developers to contribute to real-world AI/ML projects and software applications.

Responsibilities:
- Assist in the design, development, training, and deployment of AI and machine learning models.
- Collaborate with cross-functional teams including software engineers, data scientists, and product managers to build intelligent applications.
- Perform data collection, cleaning, transformation, and exploratory data analysis (EDA).
- Test various ML algorithms (e.g., classification, regression, clustering) and optimize them for performance.
- Implement model evaluation metrics and fine-tune hyperparameters.
- Contribute to integrating ML models into software applications using REST APIs or embedded services.
- Stay updated with the latest AI/ML frameworks, research papers, and industry trends.
- Document all work, including model development, experiments, and deployment steps, in a structured format.

Required Skills:
- Proficiency in Python and libraries such as NumPy, Pandas, scikit-learn, TensorFlow, or PyTorch.
- Solid understanding of machine learning principles: supervised/unsupervised learning, overfitting, cross-validation, etc.
- Familiarity with data visualization tools: Matplotlib, Seaborn, Plotly.
- Basic knowledge of SQL and working with relational databases.
- Good understanding of software development basics, version control (Git), and collaborative tools.
- Strong problem-solving mindset, eagerness to learn, and ability to work in a team environment.

Educational Qualification: Bachelor's degree in Computer Science, Information Technology, Data Science, Artificial Intelligence, or related fields from a recognized institution.

Preferred Qualifications (Optional):
- Internship or academic projects related to AI/ML.
- Participation in online competitions (e.g., Kaggle, DrivenData) or open-source contributions.
- Exposure to cloud platforms like AWS, Google Cloud (GCP), or Microsoft Azure.
- Familiarity with model deployment techniques using Flask/FastAPI, Docker, or Streamlit.

Compensation: Stipend of ₹5,000 or ₹8,000 per month.

How to Apply: Send your updated resume and portfolio to piyush.vs@vaspp.com or aparna.bs@vaspp.com.

Job Type: Internship
Contract length: 2 months
Pay: ₹5,000.00 - ₹8,000.00 per month
Benefits: Paid sick time; Work from home
Schedule: Monday to Friday, Morning shift
Application Question(s): This is a 2-month internship and the stipend will be based on performance and the interview process; is that okay for you?
Education: Bachelor's (Preferred)
Experience: AI: 1 year (Preferred)
Language: English (Preferred)
Location: Bangalore, Karnataka (Required)
Work Location: In person
Application Deadline: 14/06/2025
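The posting above asks interns to test ML algorithms, implement evaluation metrics, and fine-tune hyperparameters. A minimal sketch of that workflow with scikit-learn, using synthetic data (the dataset, model choice, and parameter grid are illustrative assumptions, not part of the posting):

```python
# Illustrative train/tune/evaluate loop on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Hyperparameter tuning with 5-fold cross-validation
grid = GridSearchCV(
    LogisticRegression(max_iter=1000),
    {"C": [0.1, 1.0, 10.0]},  # hypothetical search space
    cv=5,
)
grid.fit(X_train, y_train)
test_accuracy = grid.score(X_test, y_test)  # held-out accuracy
```

`grid.best_params_` then reports the winning regularization strength; the held-out score guards against the overfitting the skills list mentions.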
Posted 1 week ago
10.0 years
0 Lacs
India
Remote
Senior Python Developer (Financial Services)
Minimum experience level required: 10 years
Skills: Python, NumPy, Pandas, SciPy, Django/Flask, asyncio, SQL Server, PostgreSQL, MongoDB, Cassandra, Kafka, RabbitMQ
Level: SSE (Sr. Software Engineer)
Location: Remote currently (Pune & Mumbai for new projects). We are looking for candidates who can relocate to Mumbai or Pune later on.

Required Skills & Experience:
- 10+ years of professional software development experience, with a significant focus on Python.
- Demonstrable experience within the financial services industry.
- Expert-level proficiency in Python and its ecosystem (e.g., NumPy, Pandas, SciPy, Django/Flask, asyncio).
- Strong understanding of data structures, algorithms, and object-oriented design principles.
- Proven experience with relational and/or NoSQL databases (e.g., SQL Server, PostgreSQL, MongoDB, Cassandra).
- Experience with real-time data processing and messaging systems (e.g., Kafka, RabbitMQ).
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes).
- Proficiency with version control systems (Git) and CI/CD pipelines.
- Exceptional problem-solving abilities and a keen analytical mind.
- Demonstrated ability to learn new technologies and financial concepts rapidly and independently.
- Strong communication skills (written and verbal) with the ability to articulate complex technical concepts to both technical and non-technical audiences.
- Proactive, self-motivated, and able to work effectively in a fast-paced, dynamic environment.

Education: Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, Physics, or a related quantitative field.
Posted 1 week ago
0.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Role: Senior Analyst - Data Engineering
Experience: 3 to 6 years
Location: Bengaluru, Karnataka, India (BLR)

Job Description: We are seeking a highly experienced and skilled Senior Data Engineer to join our dynamic team. This role requires hands-on experience with databases such as Snowflake and Teradata, as well as advanced knowledge of various data science and AI techniques. The successful candidate will play a pivotal role in driving data-driven decision-making and innovation within our organization.

Job Responsibilities:
- Design, develop, and implement advanced machine learning models to solve complex business problems.
- Apply AI techniques and generative AI models to enhance data analysis and predictive capabilities.
- Utilize Tableau and other visualization tools to create insightful and actionable dashboards for stakeholders.
- Manage and optimize large datasets using Snowflake and Teradata databases.
- Collaborate with cross-functional teams to understand business needs and translate them into analytical solutions.
- Stay updated with the latest advancements in data science, machine learning, and AI technologies.
- Mentor and guide junior data scientists, fostering a culture of continuous learning and development.
- Communicate complex analytical concepts and results to non-technical stakeholders effectively.

Key Technologies & Skills:
- Machine Learning Models: supervised learning, unsupervised learning, reinforcement learning, deep learning, neural networks, decision trees, random forests, support vector machines (SVM), clustering algorithms, etc.
- AI Techniques: natural language processing (NLP), computer vision, generative adversarial networks (GANs), transfer learning, etc.
- Visualization Tools: Tableau, Power BI, Matplotlib, Seaborn, Plotly, etc.
- Databases: Snowflake, Teradata, SQL, NoSQL databases.
- Programming Languages: Python (essential), R, SQL.
- Python Libraries: TensorFlow, PyTorch, scikit-learn, pandas, NumPy, Keras, SciPy, etc.
- Data Processing: ETL processes, data warehousing, data lakes.
- Cloud Platforms: AWS, Azure, Google Cloud Platform.
- Big Data Technologies: Apache Spark, Hadoop.

Job Snapshot
Updated Date: 11-06-2025
Job ID: J_3679
Location: Bengaluru, Karnataka, India
Experience: 3 - 6 Years
Employee Type: Permanent
Posted 1 week ago
0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Job Description: Data Science | Location: Indore | Experience: 3-6 years | Skills: Data Science, Machine Learning, Deep Learning, NLP, Python, Flask, NumPy, Pandas

Data Science Developer Responsibilities:
- Formulating, suggesting, and managing data-driven projects geared at furthering the business's interests.
- Collating and cleaning data from various entities for later use by junior data scientists.
- Delegating tasks to junior data scientists in order to realise the successful completion of projects.
- Monitoring the performance of junior data scientists and providing them with practical guidance, as needed.
- Selecting and employing advanced statistical procedures to obtain actionable insights.
- Cross-validating models to ensure their generalizability.
- Producing and disseminating non-technical reports that detail the successes and limitations of each project.
- Suggesting ways in which insights obtained might be used to inform business strategies.
- Staying informed about developments in Data Science and adjacent fields to ensure that outputs remain relevant.

Data Science Developer Requirements:
- Advanced degree in data science, statistics, computer science, or similar.
- Extensive experience as a data scientist.
- Proficiency in Python, NumPy, Pandas, Scikit-learn, SQL Stream, or similar AWS services.
- In-depth understanding of SQL.
- Competent in machine learning principles and techniques.
- Demonstrable history of devising and overseeing data-centred projects.
- Outstanding supervision and mentorship abilities.
- Capacity to foster a healthy, stimulating work environment that frequently harnesses teamwork.
- Compliance with prevailing ethical standards.

Skills: Data Science, Machine Learning, Deep Learning, NLP, Python, Flask, NumPy, Pandas. (ref:hirist.tech)
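The responsibilities above include cross-validating models to ensure their generalizability. A short sketch of what that looks like with scikit-learn on synthetic data (model and data are illustrative assumptions):

```python
# 5-fold cross-validation: each fold is held out once while the model
# trains on the rest, giving an estimate of out-of-sample performance.
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=42)
scores = cross_val_score(Ridge(), X, y, cv=5)  # one R^2 score per fold
mean_r2 = scores.mean()
```

A large spread between fold scores would suggest the model does not generalize well, which is exactly what this check is meant to surface.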
Posted 1 week ago
4.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
On-site
Salary: 10 to 25 LPA
Title: Sr. Data Scientist/ML Engineer (4+ years & above)

Required Technical Skillset
- Languages: Python, PySpark
- Frameworks: Scikit-learn, TensorFlow, Keras, PyTorch
- Libraries: NumPy, Pandas, Matplotlib, SciPy, boto3
- Databases: relational (Postgres), NoSQL (MongoDB)
- Cloud: AWS
- Other Tools: Jenkins, Bitbucket, JIRA, Confluence

A machine learning engineer is responsible for designing, implementing, and maintaining machine learning systems and algorithms that allow computers to learn from and make predictions or decisions based on data. The role typically involves working with data scientists and software engineers to build and deploy machine learning models in a variety of applications such as natural language processing, computer vision, and recommendation systems. The key responsibilities of a machine learning engineer include:
- Collecting and preprocessing large volumes of data, cleaning it up, and transforming it into a format that can be used by machine learning models.
- Model building, which includes designing and building machine learning models and algorithms using techniques such as supervised and unsupervised learning, deep learning, and reinforcement learning.
- Evaluating the performance of machine learning models using metrics such as accuracy, precision, recall, and F1 score.
- Deploying machine learning models in production environments and integrating them into existing systems using CI/CD pipelines and AWS SageMaker.
- Monitoring the performance of machine learning models and making adjustments as needed to improve their accuracy and efficiency.
- Working closely with software engineers, product managers, and other stakeholders to ensure that machine learning models meet business requirements and deliver value to the organization.

Requirements And Skills
Mathematics and Statistics: A strong foundation in mathematics and statistics is essential.
They need to be familiar with linear algebra, calculus, probability, and statistics to understand the underlying principles of machine learning algorithms.
Programming Skills: Should be proficient in programming languages such as Python. The candidate should be able to write efficient, scalable, and maintainable code to develop machine learning models and algorithms.
Machine Learning Techniques: Should have a deep understanding of various machine learning techniques, such as supervised learning, unsupervised learning, and reinforcement learning, and should also be familiar with different types of models such as decision trees, random forests, neural networks, and deep learning.
Data Analysis and Visualization: Should be able to analyze and manipulate large data sets. The candidate should be familiar with data cleaning, transformation, and visualization techniques to identify patterns and insights in the data.
Deep Learning Frameworks: Should be familiar with deep learning frameworks such as TensorFlow, PyTorch, and Keras, and should be able to build and train deep neural networks for various applications.
Big Data Technologies: A machine learning engineer should have experience working with big data technologies such as Hadoop, Spark, and NoSQL databases. They should be familiar with distributed computing and parallel processing to handle large data sets.
Software Engineering: A machine learning engineer should have a good understanding of software engineering principles such as version control, testing, and debugging. They should be able to work with software development tools such as Git, Jenkins, and Docker.
Communication and Collaboration: A machine learning engineer should have good communication and collaboration skills to work effectively with cross-functional teams such as data scientists, software developers, and business stakeholders. (ref:hirist.tech)
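The listing above names accuracy, precision, recall, and F1 score as the standard evaluation metrics. A tiny sketch of how they are computed with scikit-learn on toy labels (the label vectors are made up for illustration):

```python
# Toy binary-classification labels: TP=3, TN=3, FP=1, FN=1.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

acc = accuracy_score(y_true, y_pred)    # (TP + TN) / total
prec = precision_score(y_true, y_pred)  # TP / (TP + FP)
rec = recall_score(y_true, y_pred)      # TP / (TP + FN)
f1 = f1_score(y_true, y_pred)           # harmonic mean of precision and recall
```

With these toy labels all four metrics happen to equal 0.75; in practice precision and recall trade off against each other, which is why F1 summarizes both.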
Posted 1 week ago
0 years
0 Lacs
Greater Chennai Area
On-site
Responsibilities
- Proficiency in Azure Data Factory, Azure Databricks (including Spark and Delta Lake), and other Azure data services.
- Strong programming skills in Python, with experience in data processing libraries such as Pandas, PySpark, and NumPy.
- Experience with SQL and relational databases, as well as NoSQL databases.
- Familiarity with data warehousing concepts and tools (e.g., Azure Synapse Analytics).
- Proven experience as an Azure Data Engineer or in a similar role.
- Strong proficiency in Azure Databricks, including Spark and Delta Lake.
- Experience with Azure Data Factory, Azure Data Lake Storage, and Azure SQL Database.
- Proficiency in data integration, ETL processes, and T-SQL.
- Proficiency in data warehousing concepts and Python. (ref:hirist.tech)
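The Pandas-based processing this role calls for boils down to extract-transform steps like the following sketch (the dataset and column names are hypothetical, purely for illustration):

```python
# Toy ETL step: load raw records, impute a missing value, aggregate per region.
import numpy as np
import pandas as pd

raw = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "amount": [100.0, np.nan, 250.0, 80.0],
    "region": ["south", "north", "south", "north"],
})

# Transform: fill the missing amount with the column median (100.0 here)
clean = raw.assign(amount=raw["amount"].fillna(raw["amount"].median()))

# Load-ready summary: total amount per region
summary = clean.groupby("region", as_index=False)["amount"].sum()
```

In a real Databricks pipeline the same shape of logic would typically run on a PySpark DataFrame, but the Pandas version is the easiest way to prototype it.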
Posted 1 week ago
1.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
We are seeking an ML Engineer with a strong background in Machine Learning, Natural Language Processing (NLP), Generative AI, and Retrieval-Augmented Generation (RAG). The ideal candidate will possess 1+ years of hands-on experience in developing and deploying advanced data-driven solutions. You will play a key role in our AI-CoE team, contributing to cutting-edge projects that drive innovation and business value. A special focus area for this role will be building AI-enabled products that create monetizable product differentiators for Tata Communications products and services.

Detailed Job Description & Key Responsibilities
- Design, develop, and deploy machine learning systems, including Generative AI models and LLMs.
- Research and implement state-of-the-art ML algorithms and tools.
- Conduct data preprocessing, feature engineering, and statistical analysis.
- Train, fine-tune, and optimize machine learning models for performance and accuracy.
- Collaborate with cross-functional teams, including data scientists, software engineers, and domain experts.
- Extend existing ML frameworks and libraries to meet project requirements.
- Stay updated with the latest advancements in machine learning and AI.

Skills
- Working knowledge of machine learning and deep learning.
- Strong programming knowledge: Python, SQL, and commonly used frameworks and tools such as PyTorch, scikit-learn, NumPy, and Gen AI tools like LangChain/LlamaIndex.
- Working knowledge of MLOps principles and experience implementing projects with Big Data in batch and streaming mode (must have).
- Experience in handling databases (SQL and NoSQL).
- Exposure to MLflow, Kubeflow, and Git CI/CD.
- Experience with containerization tools like Docker and orchestration tools like Kubernetes.
- Excellent problem-solving skills and a proactive attitude.
- Strong communication and teamwork abilities.
- Ability to manage multiple projects and meet deadlines.
The interview will involve a coding test. (ref:hirist.tech)
Posted 1 week ago
4.0 years
0 Lacs
Greater Kolkata Area
On-site
Company Description
UsefulBI Corporation provides comprehensive solutions across Data Engineering, Data Science, AI/ML, and Business Intelligence. The company's mission is to empower astute business decisions by integrating data insights and cutting-edge AI. UsefulBI excels in data architecture, cloud strategies, Business Intelligence, and Generative AI to deliver outcomes that surpass individual capabilities.

Role Description
We are seeking a skilled R and Python Developer with hands-on experience developing and deploying applications using Posit (formerly RStudio) tools, including Shiny Server, Posit Connect, and R Markdown. The ideal candidate will have a strong background in data analysis, application development, and creating interactive dashboards for data-driven decision-making.

Key Responsibilities:
- Design, develop, and deploy interactive web applications using R Shiny and Posit Connect.
- Write clean, efficient, and modular code in R and Python for data processing and analysis.
- Build and maintain R Markdown reports and Python notebooks for business reporting.
- Integrate R and Python scripts for advanced analytics and automation workflows.
- Collaborate with data scientists, analysts, and business users to gather requirements and deliver scalable solutions.
- Troubleshoot application issues and optimize performance on the Posit platform (RStudio Server, Posit Connect).
- Work with APIs, databases (SQL, NoSQL), and cloud platforms (e.g., AWS, Azure) as part of application development.
- Ensure version control using Git and CI/CD for application deployment.

Required Qualifications:
- 4+ years of development experience using R and Python.
- Strong experience with Shiny apps, R Markdown, and Posit Connect.
- Proficient in using packages like dplyr, ggplot2, plotly, reticulate, and shiny.
- Experience with the Python data stack (pandas, numpy, matplotlib, etc.).
- Hands-on experience deploying apps on Posit Server / Connect.
- Familiarity with Git, Docker, and CI/CD tools.
- Excellent problem-solving and communication skills. (ref:hirist.tech)
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
About The Job
WANTED: ASPIRING AI INNOVATOR & TECHNICAL ENTHUSIAST
Kickstart your AI journey! Send your resume to [email address removed] with "Application for Junior AI/ML Engineer" as the subject line. This role follows a hybrid work model, requiring employees to work from the office twice a week. Candidates can choose to join any of our office locations (Pune, Mumbai, or Coimbatore) based on their preference and convenience. Ready to contribute to the AI revolution and learn from the best? This is your opportunity to be part of a dynamic team shaping the future of technology.

The Mission
Contribute to the development of cutting-edge AI solutions at Fulcrum Digital. As a Junior AI/ML Engineer, you will work alongside experienced professionals to build and implement innovative AI models and applications that solve real-world problems.

Why This Is More Than Just an Entry-Level Role:
- Gain hands-on experience in developing and deploying AI/ML models.
- Collaborate with a team of passionate and experienced AI professionals.
- Work on impactful projects that contribute to the future of technology.
- Learn and grow in a dynamic and innovative environment.
- Transform your passion for AI into practical skills and expertise.

What You'll Do:
- Assist in the development and implementation of AI/ML models and algorithms.
- Contribute to data preprocessing, cleaning, and feature engineering.
- Support the evaluation and optimization of AI/ML models.
- Collaborate with senior team members on research and development initiatives.
- Assist in the deployment and monitoring of AI/ML solutions.
- Stay up-to-date with the latest advancements in AI/ML.

The Ideal Candidate Will Have:
- A foundational understanding of machine learning concepts and algorithms.
- Familiarity with programming languages such as Python.
- Exposure to relevant libraries and frameworks like TensorFlow or PyTorch (preferred).
- A basic understanding of data manipulation and analysis using libraries like Pandas and NumPy.
- A strong analytical and problem-solving aptitude.
- A proactive and eager-to-learn attitude.

Your Developing Superpowers Should Include:
- The ability to identify patterns and insights from data (with guidance).
- The capacity to understand and explain technical concepts clearly (with support).
- An interest in ethical considerations in AI development.
- A strong desire for continuous learning and growth in the field of AI.

Basic Requirements:
- A Bachelor's degree in Computer Science, Data Science, or a related field.
- A basic understanding of the AI/ML landscape.
- Strong communication and teamwork skills.
- A passion for artificial intelligence and machine learning.

The Perks Of Starting Your Legend:
- Work on exciting and challenging AI projects.
- Learn from and be mentored by experienced AI professionals.
- Contribute to the development of innovative technological solutions.
- Be part of a collaborative and supportive team environment.
- Gain invaluable experience in a rapidly evolving field. (ref:hirist.tech)
Posted 1 week ago
Numpy is a widely used library in Python for numerical computing and data analysis. In India, there is a growing demand for professionals with expertise in numpy. Job seekers in this field can find exciting opportunities across various industries. Let's explore the numpy job market in India in more detail.
The average salary range for numpy professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
Typically, a career in numpy progresses as follows:
- Junior Developer
- Data Analyst
- Data Scientist
- Senior Data Scientist
- Tech Lead
In addition to numpy, professionals in this field are often expected to have knowledge of:
- Pandas
- Scikit-learn
- Matplotlib
- Data visualization
Typical numpy interview questions include:
- Explain the np.where() function in numpy. (medium)
- Differentiate between np.array and np.matrix in numpy. (advanced)

As you explore job opportunities in the field of numpy in India, remember to keep honing your skills and stay updated with the latest developments in the industry. By preparing thoroughly and applying confidently, you can land the numpy job of your dreams!
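The two numpy interview topics mentioned above (np.where(), and np.array versus np.matrix) can be illustrated in a few lines; the sample arrays are arbitrary:

```python
import numpy as np

# np.where(condition, a, b): elementwise select from a where the
# condition holds, otherwise from b.
a = np.array([1, -2, 3, -4])
clipped = np.where(a > 0, a, 0)  # keep positives, zero out negatives

# np.array vs np.matrix: with a plain ndarray, * is elementwise
# multiplication and matrix products use the @ operator. The np.matrix
# class (which overloads * as a matrix product) is deprecated, so plain
# ndarrays are the recommended choice.
arr = np.array([[1, 2], [3, 4]])
elementwise = arr * arr  # [[1, 4], [9, 16]]
matmul = arr @ arr       # [[7, 10], [15, 22]]
```

Being able to state that distinction (and why np.matrix is discouraged) covers the "advanced" question directly.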