2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Trademo
Trademo is a Global Supply Chain Intelligence SaaS company headquartered in Palo Alto, US. Trademo collects public and private data on global trade transactions, sanctioned parties, trade tariffs, ESG and other events using its proprietary algorithms. Trademo analyzes and performs advanced data processing on billions of data points (50TB+) using technologies like graph databases, vector databases, Elasticsearch, MongoDB, NLP and machine learning (LLMs) to build end-to-end visibility on global supply chains. Trademo's vision is to build a single source of truth on global supply chains for the different stakeholders in those supply chains - helping them discover new commerce opportunities, ensure compliance with trade regulations, and automate border security. Trademo stands out as one of the rare Indian SaaS startups to secure 12.5 mn in seed funding. It was founded by Shalabh Singhal, a third-time tech entrepreneur and an alumnus of IIT BHU, CFA Institute USA, and Stanford GSB SEED. Trademo is backed by a remarkable team of leaders and entrepreneurs like Amit Singhal (Former Head of Search at Google), Sridhar Ramaswamy (CEO, Snowflake), and Neeraj Arora (MD, General Catalyst & Former CBO, WhatsApp).

Role: SDE 2 - Data
Website: www.trademo.com
Location: Onsite - Gurgaon

What will you be doing here?
- Be responsible for the maintenance and growth of a 50TB+ data pipeline serving global SaaS products for businesses, including onboarding new data and collaborating with pre-sales to articulate technical solutions.
- Solve complex problems across large datasets by applying algorithms, particularly within the domains of Natural Language Processing (NLP) and Large Language Models (LLMs).
- Leverage bleeding-edge technology to work with large volumes of complex data.
- Be hands-on in development: Python, Pandas, NumPy, ETL frameworks. Exposure to distributed computing frameworks like Apache Spark, Kafka, and Airflow is preferred.
- Along with individual data engineering contributions, actively help peers and junior team members on architecture and code to ensure the development of scalable, accurate, and highly available solutions.
- Collaborate with teams, share knowledge via tech talks, and promote tech and engineering best practices within the team.

Requirements
- B.Tech/M.Tech in Computer Science from IIT or equivalent Tier 1 colleges.
- Mandatory experience with a product-based company.
- 2+ years of relevant work experience in data engineering or related roles.
- Proven ability to efficiently work with a high variety and volume of data (50TB+ pipeline experience is a plus).
- Solid understanding of, and preferably exposure to, NoSQL databases, including Elasticsearch, MongoDB, and graph databases.
- Basic understanding of working within cloud infrastructure and cloud-native apps (AWS, Azure, IBM, etc.).
- Exposure to core data engineering concepts and tools: data warehousing, ETL processes, SQL, and NoSQL databases.
- Great problem-solving ability over larger sets of data and the ability to apply algorithms; experience using NLP and LLMs is a plus.
- Willingness to learn and apply new techniques and technologies to extract intelligence from data; prior exposure to machine learning and NLP is a significant advantage.
- Sound understanding of algorithms and data structures.
- Ability to write well-crafted, readable, testable, maintainable, and modular code.

Desired Profile:
- A hard-working, humble disposition.
- Desire to make a strong impact on the lives of millions through your work.
- Capacity to communicate well with stakeholders as well as team members and be an effective interface between the Engineering and Product/Business teams.
- A quick thinker who can adapt to a fast-paced startup environment and work with minimum supervision.

What we offer:
At Trademo, we want our employees to be comfortable with their benefits so they can focus on doing the work they love.
- Parental leave - Maternity and Paternity
- Health Insurance
- Flexible Time Off
- Stock Options
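The hands-on stack this role names (Python, Pandas, NumPy, ETL frameworks) usually shows up as small extract-transform-load steps. A minimal sketch of one such step; the file name and columns are hypothetical, and writing Parquet assumes pyarrow is installed:

```python
import pandas as pd
import numpy as np

# Hypothetical input file and schema, used only to illustrate a typical ETL step.
raw = pd.read_csv("shipments_raw.csv", parse_dates=["shipment_date"])

# Transform: normalize key fields, drop obviously bad rows, derive a metric.
clean = (
    raw.dropna(subset=["shipment_id", "hs_code"])
       .assign(
           hs_code=lambda df: df["hs_code"].astype(str).str.zfill(6),
           value_per_kg=lambda df: np.where(
               df["gross_weight_kg"] > 0,
               df["declared_value_usd"] / df["gross_weight_kg"],
               np.nan,
           ),
       )
       .drop_duplicates(subset=["shipment_id"])
)

# Load: write an analysis-ready output (requires pyarrow or fastparquet).
clean.to_parquet("shipments_clean.parquet", index=False)
```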
Posted 2 weeks ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Job Summary
The Associate Data Scientist will be responsible for developing and implementing machine learning models using computer vision and PyTorch. This individual will work closely with a team of data scientists and engineers to support the development of innovative solutions in a fast-paced, collaborative environment.

Key Responsibilities
- Develop and implement machine learning models using computer vision and PyTorch.
- Collaborate with a team of data scientists and engineers to support the development of innovative solutions.
- Conduct data analysis and feature engineering to support the development of machine learning models.
- Use computer vision techniques to extract and analyze data from images and videos.
- Support the deployment and maintenance of machine learning models in a production environment.
- Contribute to the continuous improvement of machine learning processes and practices.

Key Skills
- Python
- PyTorch, Pandas, NumPy, OpenCV (cv2)
- Experience working on computer vision projects
- Experience with cloud is a plus
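For context on the PyTorch computer-vision skills listed above, here is a minimal, self-contained training-step sketch; the tiny CNN and the random tensors standing in for an image batch are purely illustrative:

```python
import torch
import torch.nn as nn

# Tiny CNN classifier; layer sizes are illustrative, not a production architecture.
class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = SmallCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Random batch standing in for a DataLoader of images and labels.
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, 10, (8,))

optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```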
Posted 2 weeks ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Software Engineer II

Are you interested in bringing your technical expertise to projects? Are you a detail-oriented paralegal with a 'can do' attitude?

About Our Team
LexisNexis Legal & Professional, which serves customers in more than 150 countries with 11,300 employees worldwide, is part of RELX, a global provider of information-based analytics and decision tools for professional and business customers.

About The Role
This position performs moderate research, design, and software development assignments within a specific software functional area or product line.

Responsibilities
- Write and review portions of detailed specifications for the development of system components of moderate complexity.
- Complete simple bug fixes.
- Work closely with other development team members to understand product requirements and translate them into software designs.
- Operate in various development environments (Agile, Waterfall, etc.) while collaborating with key stakeholders.
- Resolve technical issues as necessary.
- Keep abreast of new technological developments.
- All other duties as assigned.

Requirements
- 2+ years of experience as a Python developer.
- Working knowledge of API integration.
- In-depth understanding of the Python software development stack, ecosystem, frameworks, and tools such as NumPy, SciPy, Pandas, FastAPI, and Django.
- Experience with popular Python frameworks such as Django, Flask, or FastAPI.
- Experience with front-end development using HTML, CSS, and JavaScript.
- Familiarity with database technologies such as SQL and NoSQL.
- Excellent problem-solving ability with solid communication and collaboration skills.
- Knowledge of AWS, Docker, and Lambda functions.
- Strong communication and collaboration skills, including the ability to cooperate with multiple parties and align on priorities.
- Knowledge of Agile methodology.
- Ability to work with simple data models.
- Familiarity with industry best practices such as code coverage.

Work in a way that works for you
We promote a healthy work/life balance across the organisation. We offer an appealing working prospect for our people. With numerous wellbeing initiatives, shared parental leave, study assistance and sabbaticals, we will help you meet your immediate responsibilities and your long-term goals.
- Working flexible hours - flexing the times when you work in the day to help you fit everything in and work when you are the most productive.

Working for you
We know that your wellbeing and happiness are key to a long and successful career. These are some of the benefits we are delighted to offer:
- Comprehensive Health Insurance: Covers you, your immediate family, and parents.
- Enhanced Health Insurance Options: Competitive rates negotiated by the company.
- Group Life Insurance: Ensuring financial security for your loved ones.
- Group Accident Insurance: Extra protection for accidental death and permanent disablement.
- Flexible Working Arrangement: Achieve a harmonious work-life balance.
- Employee Assistance Program: Access support for personal and work-related challenges.
- Medical Screening: Your well-being is a top priority.
- Modern Family Benefits: Maternity, paternity, and adoption support.
- Long-Service Awards: Recognizing dedication and commitment.
- New Baby Gift: Celebrating the joy of parenthood.
- Subsidized Meals in Chennai: Enjoy delicious meals at discounted rates.
- Various Paid Time Off: Take time off with Casual Leave, Sick Leave, Privilege Leave, Compassionate Leave, Special Sick Leave, and Gazetted Public Holidays.
- Free Transport: pick-up and drop between home and office (applies in Chennai).

About The Business
LexisNexis Legal & Professional® provides legal, regulatory, and business information and analytics that help customers increase their productivity, improve decision-making, achieve better outcomes, and advance the rule of law around the world. As a digital pioneer, the company was the first to bring legal and business information online with its Lexis® and Nexis® services.

LexisNexis, a division of RELX, is an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law. We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form: https://forms.office.com/r/eVgFxjLmAK , or please contact 1-855-833-5120. Please read our Candidate Privacy Policy.
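For the FastAPI experience the requirements mention, a minimal sketch of a typical service endpoint; the document model and in-memory store are invented placeholders for a real database:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="demo-api")

class Document(BaseModel):
    doc_id: str
    text: str

# In-memory store standing in for a real SQL/NoSQL backend.
DOCS: dict[str, str] = {}

@app.post("/documents")
def create_document(doc: Document) -> dict:
    DOCS[doc.doc_id] = doc.text
    return {"status": "created", "doc_id": doc.doc_id}

@app.get("/documents/{doc_id}")
def read_document(doc_id: str) -> dict:
    return {"doc_id": doc_id, "text": DOCS.get(doc_id, "")}

# Run locally with: uvicorn app:app --reload  (assuming this file is app.py)
```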
Posted 2 weeks ago
3.0 years
0 Lacs
Mohali, Punjab
On-site
Chicmic Studios

Job Role: Data Scientist
Experience Required: 3+ Years
Skills Required: Data Science, Python, Pandas, Matplotlib

Job Description:
We are seeking a Data Scientist with strong expertise in data analysis, machine learning, and visualization. The ideal candidate should be proficient in Python, Pandas, and Matplotlib, with experience in building and optimizing data-driven models. Some experience in Natural Language Processing (NLP) and Named Entity Recognition (NER) models would be a plus.

Roles & Duties:
- Analyze and process large datasets using Python and Pandas.
- Develop and optimize machine learning models for predictive analytics.
- Create data visualizations using Matplotlib and Seaborn to support decision-making.
- Perform data cleaning, feature engineering, and statistical analysis.
- Work with structured and unstructured data to extract meaningful insights.
- Implement and fine-tune NER models for specific use cases (if required).
- Collaborate with cross-functional teams to drive data-driven solutions.

Required Skills & Qualifications:
- Strong proficiency in Python and data science libraries (Pandas, NumPy, Scikit-learn, etc.).
- Experience in data analysis, statistical modeling, and machine learning.
- Hands-on expertise in data visualization using Matplotlib and Seaborn.
- Understanding of SQL and database querying.
- Familiarity with NLP techniques and NER models is a plus.
- Strong problem-solving and analytical skills.

Contact: 9875952836
Office Address: F273, Phase 8B Industrial Area, Mohali, Punjab
Job Type: Full-time
Schedule: Day shift, Monday to Friday
Work Location: In person
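A minimal sketch of the Pandas plus Matplotlib workflow the posting centres on; the CSV name and columns are hypothetical:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical sales dataset; file name and columns are for illustration only.
df = pd.read_csv("sales.csv", parse_dates=["order_date"])

# Basic cleaning and feature engineering.
df = df.dropna(subset=["revenue"])
df["month"] = df["order_date"].dt.to_period("M").dt.to_timestamp()

# Aggregate and visualize a monthly revenue trend.
monthly = df.groupby("month", as_index=False)["revenue"].sum()

fig, ax = plt.subplots(figsize=(8, 4))
ax.plot(monthly["month"], monthly["revenue"], marker="o")
ax.set_title("Monthly revenue")
ax.set_xlabel("Month")
ax.set_ylabel("Revenue")
fig.tight_layout()
fig.savefig("monthly_revenue.png")
```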
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
🧠 Data Science Intern (Remote) 🌐

🔍 Do you love exploring data, building models, and uncovering insights with the power of machine learning and statistics? Ready to begin your data science journey — all from anywhere in the world? This role is for you!

📍 Location: Remote / Virtual
💼 Job Type: Internship (Unpaid)
🕒 Schedule: Flexible working hours

🌟 About the Role:
We’re looking for a curious and driven Data Science Intern to join our remote team! This internship is perfect for students or recent graduates who want hands-on experience with real-world datasets, algorithms, and predictive analytics. You’ll work on data cleaning, exploration, modeling, and visualization to help solve real problems and deliver valuable insights — all while collaborating with a friendly, remote-first team.

🚀 What You’ll Gain:
✅ 100% Remote – Work from anywhere 🌍
✅ Flexible Schedule – Learn and contribute on your time ⏰
✅ Real-World Experience – Apply data science to real projects 📊
✅ Skill Building – Sharpen your Python, ML, and data analysis toolkit 🛠️
✅ Mentorship – Learn from experienced data professionals 👥

👀 Ideal Candidate:
🎓 Currently studying or recently graduated in Data Science, Computer Science, Statistics, or a related field
🧠 Solid understanding of statistics, data analysis, and machine learning concepts
🛠️ Comfortable with Python, Pandas, NumPy, and tools like Jupyter Notebook; knowledge of scikit-learn or TensorFlow is a plus
📈 Passionate about solving problems with data and eager to learn
💬 Self-motivated and able to work independently in a remote setting

📅 Apply By: June 10th

Excited to take your first step into the data science world? We’d love to hear from you! Let’s build models and make discoveries through data — together! 📊🧠💡
Posted 2 weeks ago
5.0 years
0 Lacs
India
On-site
Job Summary:
We are looking for an experienced Database & Data Engineer who can own the full lifecycle of our cloud data systems—from database optimization to building scalable data pipelines. This hybrid role demands deep expertise in SQL performance tuning, cloud-native ETL/ELT, and modern Azure data engineering using tools like Azure Data Factory, Databricks, and PySpark. Ideal candidates will be comfortable working across Medallion architecture, transforming raw data into high-quality assets ready for analytics and machine learning.

Key Responsibilities:

🔹 Database Engineering
- Implement and optimize indexing, partitioning, and sharding strategies to improve performance and scalability.
- Tune and refactor complex SQL queries, stored procedures, and triggers using execution plans and profiling tools.
- Perform database performance benchmarking, query profiling, and resource usage analysis.
- Address query bottlenecks, deadlocks, and concurrency issues using diagnostic tools and SQL optimization.
- Design and implement read/write splitting and horizontal/vertical sharding for distributed systems.
- Automate backup, restore, high availability, and disaster recovery using native Azure features.
- Maintain schema versioning and enable automated deployment via CI/CD pipelines and Git.

🔹 Data Engineering
- Build and orchestrate scalable data pipelines using Azure Data Factory (ADF), Databricks, and PySpark.
- Implement Medallion architecture with Bronze, Silver, and Gold layers in Azure Data Lake.
- Process and transform data using PySpark, Pandas, and NumPy.
- Create and manage data integrations from REST APIs, flat files, databases, and third-party systems.
- Develop and manage incremental loads, SCD Type 1 & 2, and advanced data transformation workflows.
- Leverage Azure services like Synapse, Azure SQL DB, Azure Blob Storage, and Azure Data Lake Gen2.
- Ensure data quality, consistency, and lineage across environments.

🔹 Collaboration & Governance
- Work with cross-functional teams including data science, BI, and business analysts.
- Maintain standards around data governance, privacy, and security compliance.
- Contribute to internal documentation and the team knowledge base using tools like JIRA, Confluence, and SharePoint.
- Participate in Agile workflows and help define sprint deliverables for data engineering tasks.

Required Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience in data engineering and SQL performance optimization in cloud environments.
- Expertise in Azure Data Factory, Azure Data Lake, Azure SQL, Azure Synapse, and Databricks.
- Proficient in SQL, Python, PySpark, Pandas, and NumPy.
- Strong experience in query performance tuning, indexing, and partitioning.
- Familiar with PostgreSQL (PGSQL) and handling NoSQL databases like Cosmos DB or Elasticsearch.
- Experience with REST APIs, flat files, and real-time integrations.
- Working knowledge of version control (Git) and CI/CD practices in Azure DevOps or equivalent.
- Solid understanding of Medallion architecture, lakehouse concepts, and data reliability best practices.

Preferred Qualifications:
- Microsoft Certified: Azure Data Engineer Associate or equivalent.
- Familiarity with Docker, Kubernetes, or other containerization tools.
- Exposure to streaming platforms such as Kafka, Azure Event Hubs, or Azure Stream Analytics.
- Industry experience in supply chain, logistics, or finance is a plus.
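A minimal sketch of the kind of Bronze-to-Silver PySpark step implied by the Medallion responsibilities above; the paths, column names, and the dedup-by-latest-ingest pattern are illustrative assumptions, not this team's actual pipeline:

```python
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Hypothetical lake paths; on ADLS Gen2 these would be abfss:// URIs.
bronze_path = "/lake/bronze/orders"
silver_path = "/lake/silver/orders"

bronze = spark.read.format("parquet").load(bronze_path)

# Keep only the latest record per business key (a simple incremental/dedup pattern).
latest = Window.partitionBy("order_id").orderBy(F.col("ingest_ts").desc())

silver = (
    bronze
    .withColumn("rn", F.row_number().over(latest))
    .filter(F.col("rn") == 1)
    .drop("rn")
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

silver.write.mode("overwrite").format("parquet").save(silver_path)
```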
Posted 2 weeks ago
3.0 years
0 Lacs
India
Remote
Who we are
We are software artisans passionate about what we do: help companies build awesome solutions, with an agile process built on top of the best engineering practices. Our team is comprised of full-stack developers and architects who are versed in the very latest technologies and love what they do! We believe transparent, honest and fluent communication, both remotely and on-site, is a key factor in the success of any project.

What are we looking for?
Our ideal candidate is a seasoned Software Engineer oriented to the Machine Learning and Artificial Intelligence ecosystem (Python, AI agents, MLOps, RAG, NumPy, Pandas), someone who also has wide experience in enterprise infrastructure, on-premise and cloud, and in working in agile teams leveraging the tools that enable success, such as CI/CD pipelines, while following best practices.

Responsibilities
- Own the development of new customer-facing AI solutions and product experiences.
- Develop and use modern software engineering practices to deploy ML solutions at scale, including building CI/CD pipelines and automated testing.
- Work together with a team of professional engineers with the objective of automating processes, deploying and building infrastructure as code, and managing the architecture of multicloud systems.
- Participate in agile ceremonies, weekly demos, and the like.
- Communicate your daily commitments.

Qualifications
- 3+ years of relevant work experience.
- Experience with building and maintaining infrastructure for model performance evaluation, including data collection, preprocessing, and analysis.
- Strong understanding of machine learning concepts and algorithms, and experience with developing and deploying machine learning models in production.
- Solid understanding of DevOps, CI/CD, and monitoring.
- Experience with Agile ceremonies.
- Passionate about good engineering practices and testing.
- Ability to organize, prioritize and communicate daily/weekly goals.
- You like to learn, are curious, humble and like to get things done.
- Passion for working in a customer-facing setting.

Required Skills
- Proficient with Python.
- Proficient with AI agent frameworks (such as LangGraph or LangChain).
- Competent with MLOps.
- Proficient with RAG (Retrieval-Augmented Generation).
- Proficient with Git.

Nice to have
- Proficient with Databricks.
- Proficient with NumPy, Pandas, scikit-learn.
- Proficient with NLP (NET).
- Familiarity with GenAI.
- Familiarity with cohort analysis.

Our solutions support enterprise information management, business intelligence, machine learning and data science. You should excel in working remotely and have outstanding communication skills (transparency is one of our core values).
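RAG, listed as a required skill above, reduces to retrieving relevant context and feeding it to an LLM. A minimal retrieval sketch using TF-IDF similarity as a stand-in for a real embedding model and vector store; the corpus and question are invented for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy document chunks standing in for a real chunked knowledge base.
chunks = [
    "Invoices are processed within 5 business days of receipt.",
    "Refund requests must include the original order number.",
    "Support is available Monday to Friday, 9am to 6pm AEST.",
]

question = "How long does invoice processing take?"

# Retrieve: rank chunks by similarity to the question.
vectorizer = TfidfVectorizer().fit(chunks + [question])
chunk_vecs = vectorizer.transform(chunks)
query_vec = vectorizer.transform([question])
scores = cosine_similarity(query_vec, chunk_vecs)[0]
top_chunk = chunks[scores.argmax()]

# Augment: build the prompt an LLM call would receive.
prompt = (
    "Answer using only the context below.\n"
    f"Context: {top_chunk}\n"
    f"Question: {question}"
)
print(prompt)  # In a real pipeline this prompt is sent to the LLM of choice.
```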
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We’re looking for a Senior Developer to join our growing team. If you’re passionate and deeply curious about finding solutions to new challenges and perfecting the design of our visionary Epicor product suite, you could be the perfect fit for this role. We’re seeking a candidate who has experience in developing AI/ML solutions in Python. Does this sound like you? Good, keep reading!

What You’ll Do
It’s all in the name. Develop our software products to make them the best in the industry.
- Dream up and build new applications.
- Get creative and focus on solving problems for our customers.
- Write, test, and deploy code using established standards and procedures.
- Stay on track by following best practices and established processes for large, complex projects.
- Work on ML models and implement solutions.
- Develop the end-to-end flow for creating and pushing ML models into real applications.
- Maintain, refine, and troubleshoot challenges with existing software to make sure it’s up to code (see what we did there?)

What You Need To Succeed
- 5 to 8 years of experience with great problem-solving skills.
- Collaborative team player with good communication.
- Strong proficiency in Python programming for ML development.
- Hands-on experience working with ML frameworks (TensorFlow, scikit-learn, etc.).
- Knowledge of Azure cloud, especially working with Azure ML studio and cognitive services.
- Knowledge of working with SQL and NoSQL databases and REST APIs.
- Knowledge of Azure OpenAI is good to have and preferred.
- Dataset preparation and cleansing for model creation.
- Working knowledge of different types of data (structured, semi-structured, and unstructured).
- Working knowledge of Python frameworks such as Flask, Django, and Pyramid.
- Working with huge data sets and data analysis with Pandas and NumPy.
- Working with Python ORM libraries.
- Ability to handle large datasets.
- Experience creating custom ML models.
- Practical implementation of NLP modules.
- Experience training enterprise ML models.

Additional Skills That Could Set You Apart
- Development of enterprise applications.
- Knowledge of the ERP/distribution/retail domain.

#Hybrid

About Epicor
At Epicor, we’re truly a team. Join 5,000 talented professionals in creating a world of better business through data, AI, and cognitive ERP. We help businesses stay future-ready by connecting people, processes, and technology. From software engineers who command the latest AI technology to business development reps who help us seize new opportunities, the work we do matters. Together, Epicor employees are creating a more resilient global supply chain. We’re Proactive, Proud, Partners.

Whatever your career journey, we’ll help you find the right path. Through our training courses, mentorship, and continuous support, you’ll get everything you need to thrive. At Epicor, your success is our success. And that success really matters, because we’re the essential partners for the world’s most essential businesses—the hardworking companies who make, move, and sell the things the world needs.

Competitive Pay & Benefits
- Health and Wellness: Comprehensive health and wellness benefits designed to support your overall well-being.
- Internal Mobility: Opportunities for mentorship, continuing education, and focused career goal setting, with 25% of positions filled internally.
- Career Development: Free LinkedIn Learning licenses for everyone, along with our Mentoring Program to boost your personal development.
- Education Support: Geographically specific programs to balance the cost of education with the benefits of continued learning and personal development.
- Inclusive Workplace: Collaborate with a diverse team in an inclusive, global workplace that fosters innovation and celebrates partnership.
- Work-Life Balance: Policies built on mutual trust and support, encouraging time off to rest, recharge, and reconnect.
- Global Mobility: Comprehensive support for international relocations and permanent residency processes.

Equal Opportunities and Accommodations Statement
Epicor is committed to creating a workplace and global community where inclusion is valued; where you bring the whole and real you—that’s who we’re interested in. If you have interest in this or any role, but your experience doesn’t match every qualification of the job description, that’s okay; consider applying regardless. We are an equal-opportunity employer.

Recruiter: Shweta Halyal
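A minimal sketch of the scikit-learn train-evaluate-persist loop behind requirements like "ML frameworks" and "pushing ML models into real applications"; the public toy dataset stands in for prepared enterprise data:

```python
import joblib
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Public toy dataset standing in for prepared enterprise data.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# A pipeline keeps preprocessing and the model together for deployment.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")

# Persist the fitted pipeline so an application or API can load and serve it.
joblib.dump(model, "classifier.joblib")
```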
Posted 2 weeks ago
4.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
On-site
Title: Sr. Data Scientist/ML Engineer (4+ years)

Required Technical Skillset
- Languages: Python, PySpark
- Frameworks: scikit-learn, TensorFlow, Keras, PyTorch
- Libraries: NumPy, Pandas (DataFrame), Matplotlib, SciPy, boto3
- Databases: relational (Postgres), NoSQL (MongoDB)
- Cloud: AWS cloud platform
- Other tools: Jenkins, Bitbucket, JIRA, Confluence

A machine learning engineer is responsible for designing, implementing, and maintaining machine learning systems and algorithms that allow computers to learn from and make predictions or decisions based on data. The role typically involves working with data scientists and software engineers to build and deploy machine learning models in a variety of applications such as natural language processing, computer vision, and recommendation systems.

The key responsibilities of a machine learning engineer include:
- Collecting and preprocessing large volumes of data, cleaning it up, and transforming it into a format that can be used by machine learning models.
- Model building, which includes designing and building machine learning models and algorithms using techniques such as supervised and unsupervised learning, deep learning, and reinforcement learning.
- Evaluating the performance of machine learning models using metrics such as accuracy, precision, recall, and F1 score.
- Deploying machine learning models in production environments and integrating them into existing systems using CI/CD pipelines and AWS SageMaker.
- Monitoring the performance of machine learning models and making adjustments as needed to improve their accuracy and efficiency.
- Working closely with software engineers, product managers and other stakeholders to ensure that machine learning models meet business requirements and deliver value to the organization.

Requirements and Skills
- Mathematics and Statistics: A strong foundation in mathematics and statistics is essential. Candidates need to be familiar with linear algebra, calculus, probability, and statistics to understand the underlying principles of machine learning algorithms.
- Programming Skills: Should be proficient in programming languages such as Python. The candidate should be able to write efficient, scalable, and maintainable code to develop machine learning models and algorithms.
- Machine Learning Techniques: Should have a deep understanding of various machine learning techniques, such as supervised learning, unsupervised learning, and reinforcement learning, and should also be familiar with different types of models such as decision trees, random forests, neural networks, and deep learning.
- Data Analysis and Visualization: Should be able to analyze and manipulate large data sets. The candidate should be familiar with data cleaning, transformation, and visualization techniques to identify patterns and insights in the data.
- Deep Learning Frameworks: Should be familiar with deep learning frameworks such as TensorFlow, PyTorch, and Keras, and should be able to build and train deep neural networks for various applications.
- Big Data Technologies: Should have experience working with big data technologies such as Hadoop, Spark, and NoSQL databases, and should be familiar with distributed computing and parallel processing to handle large data sets.
- Software Engineering: Should have a good understanding of software engineering principles such as version control, testing, and debugging, and should be able to work with software development tools such as Git, Jenkins, and Docker.
- Communication and Collaboration: Should have good communication and collaboration skills to work effectively with cross-functional teams such as data scientists, software developers, and business stakeholders.

(ref:hirist.tech)
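The evaluation metrics named above (accuracy, precision, recall, F1) are one-liners in scikit-learn; a small sketch with toy labels invented for illustration:

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Toy ground-truth labels and model predictions, invented for illustration.
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

print(f"accuracy : {accuracy_score(y_true, y_pred):.2f}")
print(f"precision: {precision_score(y_true, y_pred):.2f}")
print(f"recall   : {recall_score(y_true, y_pred):.2f}")
print(f"f1 score : {f1_score(y_true, y_pred):.2f}")
```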
Posted 2 weeks ago
25.0 years
0 Lacs
India
Remote
Opportunities:
- Full-time remote or work-from-home
- Day shift, AEST
- Health Insurance
- Career Growth

About the Role:
We are looking for a passionate and motivated individual to join our team as an AI & Data Science Engineer. If you have a strong foundation in Python programming, SQL, and working with APIs, and are eager to learn and grow in the field of Artificial Intelligence (AI), Natural Language Processing (NLP), and Machine Learning (ML), this role is perfect for you! As part of our team, you will have the opportunity to work on cutting-edge AI technologies, including generative AI models, and develop solutions that solve real-world problems.

Key Responsibilities:
- Learn and contribute to the design and development of AI and machine learning models.
- Work with structured and unstructured data to uncover insights and build predictive models.
- Assist in creating NLP solutions for tasks like text classification, sentiment analysis, and summarisation.
- Gain hands-on experience in deep learning for image processing, speech recognition, and generative AI.
- Write clean and efficient Python code for data analysis and model development.
- Work with SQL databases to retrieve and analyse data.
- Learn how to integrate APIs into AI workflows.
- Explore generative AI technologies (e.g., GPT, DALL·E) and contribute to innovative solutions.
- Collaborate with senior team members to develop impactful AI-powered applications.
- Document your findings and contribute to knowledge-sharing within the team.

Required Skills & Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field.
- Strong Python programming skills and familiarity with libraries like Pandas, NumPy, and Matplotlib.
- Basic knowledge of SQL for data manipulation and extraction.
- Understanding of machine learning concepts and algorithms.
- Interest in Natural Language Processing (NLP); familiarity with tools like spaCy, NLTK, or Hugging Face is a plus.
- Willingness to learn and work with deep learning frameworks such as TensorFlow or PyTorch.
- Problem-solving mindset with the ability to work independently and within a team.
- Good communication skills and enthusiasm for learning new technologies.

Technical requirements:
- Windows 11 operating system or macOS 13+
- 256 GB storage space minimum
- 16 GB RAM minimum
- Dual-core CPU minimum
- Camera: HD webcam (720p)
- Headset: noise-cancelling (preferably)
- Internet speed: 50 Mbps minimum

Why Join Us?
- Opportunity to work on cutting-edge data science, machine learning, and AI projects.
- A collaborative and inclusive work environment that values continuous learning and innovation.
- Access to resources and mentorship to enhance your skills in NLP, ML, DL, and Generative AI.
- Competitive compensation package and growth opportunities.

Note: Include your LinkedIn account in your resume.

About The Company:
Freedom Property Investors is the largest and number one property investment company in Australia, with its main offices in the Sydney and Melbourne CBDs. We were awarded the 3rd fastest-growing business in Australia across all industries according to the Australian Financial Review. We are privileged to have 25+ years of combined experience between our two founders, who have served over 10,000 valued members with over 300 full-time staff spread across Australia and growing. We pride ourselves on being the industry leaders. It is our mission to serve our valued members, earning over 2,054 positive Google reviews and a 4.8-star rating; this is unheard of in our industry.

We are in need of people who share the same values as we do. This opportunity is open to all driven individuals who are committed to helping people and earning life-changing income. Join Australia’s largest and number 1 property investment team and contribute to our mission to help Australians achieve their goals of financial freedom every day. Apply now!
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description

Key Responsibilities:
- Contribute to the creation of technical specifications for applications, infrastructure, or complete solutions.
- Assist in technical analysis of potential solutions, evaluating the technical fit and viability of commercial off-the-shelf products versus custom-built solutions.
- Deliver solution designs, adhering to standards and leveraging re-use of components.
- Create high-level and detailed designs of infrastructure or applications, interfaces, conversions, extensions, reports, and workflows while meeting architecture, security, performance, scalability, and maintainability requirements.
- Assist technical and infrastructure groups in understanding the solution design and specifications.
- Participate in formal design reviews and code reviews to ensure detailed design specifications are understood and coding standards are adhered to.
- Leverage re-usability of component designs to reduce costs and shorten time to deliver solutions.
- Ensure efficiency in the build and deploy processes, leveraging automation where possible.
- Assist in the test strategy and execution of the master test plan.
- Contribute to the creation of standards, processes, procedures, and guidelines for the IT design and development community, as well as work instructions or ‘runbooks’ used for end-user support.
- Analyze and revise existing systems and documentation to identify remediation or improvements in the application or infrastructure solution.
- Provide level 3 support for critical issues.
- Work closely with IT technical service providers to ensure outsourced work packages are delivered to specifications, meeting key parameters of quality, schedule, cost, security, performance, and scalability.

Qualifications:
- College, university, or equivalent degree in Computer Science, Information Technology, Business, or a related subject, or relevant equivalent experience required.
- This position may require licensing for compliance with export controls or sanctions regulations.

Competencies
- Collaborates: Building partnerships and working collaboratively with others to meet shared objectives.
- Customer Focus: Building strong customer relationships and delivering customer-centric solutions.
- Interpersonal Savvy: Relating openly and comfortably with diverse groups of people.
- Plans and Aligns: Planning and prioritizing work to meet commitments aligned with organizational goals.
- Tech Savvy: Anticipating and adopting innovations in business-building digital and technology applications.
- Solution Configuration: Configures, creates, and tests solutions for commercial off-the-shelf (COTS) applications using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance, and compliance requirements.
- Solution Design: Creates and defines solution designs complete with instrumentation and measurement, using industry standards and tools, version control, and build and test automation to synthesize diagrams, models, and documentation to build solutions that meet buildability, business, technical, security, governance, and compliance requirements.
- Solution Functional Fit Analysis: Composes and decomposes systems into component parts using procedures, tools, and work aides to study how well the component parts were designed, purchased, and configured to interact holistically to meet business, technical, security, governance, and compliance requirements.
- Solution Modeling: Creates, designs, and formulates models, diagrams, and documentation using industry standards, tools, version control, and build and test automation to meet business, technical, security, governance, and compliance requirements.
- Solution Validation Testing: Validates configuration item changes or solutions using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools, and metrics, to ensure they work as designed and meet customer requirements.
- Values Differences: Recognizing the value that different perspectives and cultures bring to an organization.

Skills and Experience Needed:
- 3-5 years of experience.
- Proficiency in SQL, DAX, M Query, and Python.
- Familiarity with C# and R (TensorFlow, PyTorch, and NumPy would be a plus).
- Experience with Azure and Databricks.
- Experience with data pipeline development, maintenance, and improvement with Azure Data Lake.
- Experience with SSIS (SQL Server Integration Services), SSAS (SQL Server Analysis Services), and Power BI Service.
- Experience in developing, maintaining, optimizing, automating, and supporting Power BI reports.
- Experience in data mining, data auditing, and data insight generation.
- Understanding of AI (Artificial Intelligence)/ML (Machine Learning) concepts and tools in Databricks.
- Strong technical writing skills for documentation and communication.
- Knowledge of cybersecurity principles and practices.
- Knowledge of data governance and knowledge-base management techniques.
- Proficiency with Agile tools like Jira and Confluence for project management and collaboration.
- Proficiency in optimizing SSIS, SSAS, AAS (Azure Analysis Services), and Databricks workflows for performance on large datasets.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Hybrid
Job Type: Exempt - Experienced
ReqID: 2414670
Relocation Package: No
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are seeking a highly skilled and motivated Python Developer to join our client's team. The ideal candidate will have hands-on experience in Python programming, AI/ML integration, and working with advanced libraries such as LangChain, Pandas, and NumPy. This role involves developing scalable applications, integrating AI models, and working in an Agile environment to deliver high-quality solutions.

Key Responsibilities:
- Design, develop, and maintain efficient, reusable, and reliable Python code.
- Build and deploy AI-driven applications using LangChain and other relevant frameworks.
- Perform data analysis and transformation using Pandas and NumPy.
- Collaborate with data scientists and ML engineers to implement machine learning models into production.
- Debug and troubleshoot issues across the application stack.
- Utilize Git for version control and participate in code reviews and CI/CD processes.
- Work within Agile methodologies, participating in sprint planning, stand-ups, and retrospectives.
- Write clear technical documentation and contribute to knowledge-sharing sessions.

Key Skills & Qualifications:

Must-Have:
- Proficient in Python with a solid understanding of object-oriented programming.
- Experience with LangChain and building applications using LLMs (Large Language Models).
- Strong knowledge of Pandas and NumPy for data manipulation and analysis.
- Familiarity with machine learning concepts and integration of ML models.
- Understanding of AI frameworks and tools.
- Solid experience in debugging, testing, and optimizing Python code.
- Proficient with Git and collaborative development workflows.
- Experience working in Agile development environments (Scrum/Kanban).

Nice-to-Have:
- Knowledge of vector databases (e.g., FAISS, Pinecone).
- Experience with APIs, microservices, or cloud platforms (AWS, GCP, Azure).
- Familiarity with containerization (Docker, Kubernetes).
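A minimal sketch of an LLM-backed flow of the kind this role describes, chaining a prompt into a chat model with LangChain's expression-language style; it assumes the langchain-openai and langchain-core packages, an OPENAI_API_KEY in the environment, and a placeholder model name:

```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Model name is a placeholder; any chat model supported by langchain-openai works here.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a concise assistant that answers questions about tabular data."),
    ("human", "Summarize the key trend in these monthly totals: {totals}"),
])

# Chain: prompt -> chat model -> plain string output.
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"totals": "Jan 120, Feb 135, Mar 180, Apr 240"}))
```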
Posted 2 weeks ago
5.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Description: Data Scientist

Position Summary
We are seeking a highly motivated Data Scientist to join our advanced analytics team. This role offers the opportunity to lead and contribute to a wide spectrum of machine learning and data-driven projects that influence strategic decisions across the organization. You will be expected to design and deploy innovative solutions in areas such as predictive analytics, customer segmentation, recommendation systems, churn prediction, time series forecasting, and anomaly detection. The ideal candidate is passionate about solving real-world business problems using data and has hands-on experience with Python, SQL, machine learning frameworks, and exploratory data analysis.

Key Responsibilities
- Machine Learning & AI Development: Build, train, and optimize machine learning models using both supervised and unsupervised techniques, including but not limited to classification, regression, clustering, and dimensionality reduction.
- Exploratory Data Analysis (EDA): Analyze complex datasets to discover trends, patterns, and actionable insights.
- Feature Engineering: Transform and optimize raw data into usable features that enhance model performance.
- Model Evaluation & Tuning: Evaluate model accuracy, perform hyperparameter tuning, and monitor performance over time.
- Stakeholder Collaboration: Translate business requirements into analytical solutions and clearly communicate insights and recommendations to both technical and non-technical stakeholders.
- Data Handling: Extract, clean, transform, and manage structured and unstructured datasets using SQL and Python.
- Solution Deployment: Integrate models into production environments and ensure they align with ongoing business goals.

Required Skills & Qualifications
Experience:
- 3-5 years of hands-on experience in Data Science, Machine Learning, or related analytics roles.
- Proven track record in building and deploying predictive models, segmentation algorithms, and clustering techniques.
Technical Proficiency:
- Strong programming skills in Python (NumPy, pandas, scikit-learn, etc.).
- Expert in SQL for data extraction and manipulation.
- Solid grasp of statistical concepts, machine learning theory, and EDA practices.
- Experience with data visualization tools and presenting insights effectively.
Preferred:
- Exposure to cloud platforms like AWS, GCP, or Azure.
- Experience with big data tools (e.g., PySpark) is a plus.
Education:
- Bachelor’s or master’s degree in Computer Science, Statistics, Mathematics, or a related field. Degrees from Tier I/II institutions are preferred.
Soft Skills:
- Excellent communication skills, both written and verbal.
- Strong analytical thinking, problem-solving ability, and a collaborative mindset.

What We Offer
EXL Analytics offers an exciting, fast-paced and innovative environment, which brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions. From your very first day, you get an opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn about many aspects of the businesses that our clients engage in. You will also learn effective teamwork and time management skills, key aspects of personal and professional growth. Analytics requires different skill sets at different levels within the organization. At EXL Analytics, we invest heavily in training you in all aspects of analytics as well as in leading analytical tools and techniques.

We provide guidance and coaching to every employee through our mentoring program, wherein every junior-level employee is assigned a senior-level professional as an advisor. The sky is the limit for our team members. The unique experiences gathered at EXL Analytics set the stage for further growth and development in our company and beyond.
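Customer segmentation, one of the focus areas above, is commonly prototyped as k-means clustering on scaled features. A minimal sketch on synthetic data invented for illustration; in practice the number of clusters would be chosen with elbow or silhouette analysis:

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic customer features invented for illustration (spend, frequency, tenure).
rng = np.random.default_rng(42)
customers = pd.DataFrame({
    "monthly_spend": rng.gamma(shape=2.0, scale=50.0, size=500),
    "orders_per_month": rng.poisson(lam=3, size=500),
    "tenure_months": rng.integers(1, 60, size=500),
})

# Scale features so no single unit dominates the distance metric.
scaled = StandardScaler().fit_transform(customers)

# Segment customers into k groups.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42)
customers["segment"] = kmeans.fit_predict(scaled)

# Per-segment averages give a quick profile of each cluster.
print(customers.groupby("segment").mean().round(1))
```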
Posted 2 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
The Global Data Insights and Analytics (GDI&A) department at Ford Motor Company is looking for qualified people who can develop scalable solutions to complex real-world problems using Machine Learning, Big Data, Statistics, Econometrics, and Optimization. The goal of GDI&A is to drive evidence-based decision making by providing insights from data. Applications for GDI&A include, but are not limited to, Connected Vehicle, Smart Mobility, Advanced Operations, Manufacturing, Supply Chain, Logistics, and Quality Analytics. Potential candidates should have excellent depth and breadth of knowledge in machine learning, data mining, and statistical modeling. They should possess the ability to translate a business problem into an analytical problem, identify the relevant data sets needed for addressing the analytical problem, recommend, implement, and validate the best-suited analytical algorithm(s), and generate/deliver insights to stakeholders. Candidates are expected to regularly refer to research papers and be at the cutting edge with respect to algorithms, tools, and techniques. The role is that of an individual contributor; however, the candidate is expected to work in project teams of 2 to 3 people and interact with business partners on a regular basis.

Responsibilities
- Understand business requirements and analyze datasets to determine suitable approaches to meet analytic business needs and support data-driven decision-making.
- Design and implement data analysis and ML models, hypotheses, algorithms and experiments to support data-driven decision-making.
- Apply various analytics techniques like data mining, predictive modeling, prescriptive modeling, math, statistics, advanced analytics, machine learning models and algorithms, etc., to analyze data and uncover meaningful patterns, relationships, and trends.
- Design efficient data loading, data augmentation and data analysis techniques to enhance the accuracy and robustness of data science and machine learning models, including scalable models suitable for automation.
- Research, study and stay updated in the domain of data science, machine learning, analytics tools and techniques, and continuously identify avenues for enhancing analysis efficiency, accuracy and robustness.

Minimum Qualifications
- Bachelor’s degree in Data Science, Computer Science, Operational Research, Statistics, Applied Mathematics, or any other engineering discipline.
- 3+ years of hands-on experience in Python programming for data analysis and machine learning, with libraries such as NumPy, Pandas, Matplotlib, Scikit-learn, TensorFlow, PyTorch, NLTK, spaCy, and Gensim.
- 2+ years of experience with both supervised and unsupervised machine learning techniques.
- 2+ years of experience with data analysis and visualization using Python packages such as Pandas, NumPy, Matplotlib, Seaborn, or data visualization tools like Dash or QlikSense.
- 1+ years of experience in the SQL programming language and relational databases.

Preferred Qualifications
- An MS/PhD in Computer Science, Operational Research, Statistics, Applied Mathematics, or any other engineering discipline; PhD strongly preferred.
- Experience working with Google Cloud Platform (GCP) services, leveraging its capabilities for ML model development and deployment.
- Experience with Git and GitHub for version control and collaboration.
- Besides Python, familiarity with one additional programming language (e.g., C/C++/Java).
- Strong background and understanding of mathematical concepts relating to probabilistic models, conditional probability, numerical methods, linear algebra, and the under-the-hood details of neural networks.
- Experience working with large language models such as GPT-4, Google PaLM, and Llama 2.
- Excellent problem solving, communication, and data presentation skills.
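A minimal sketch of the supervised-learning validation pattern implied by the qualifications above, using k-fold cross-validation on a public toy dataset:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Public toy dataset standing in for real business data.
X, y = load_iris(return_X_y=True)

model = RandomForestClassifier(n_estimators=200, random_state=0)

# 5-fold cross-validation gives a more honest estimate than a single train/test split.
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"accuracy per fold: {scores.round(3)}")
print(f"mean accuracy    : {scores.mean():.3f} +/- {scores.std():.3f}")
```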
Posted 2 weeks ago
2.0 - 5.0 years
10 - 20 Lacs
Mumbai, Thane, Mumbai (All Areas)
Hybrid
Company Description
Quantanite is a specialist business process outsourcing (BPO) and customer experience (CX) solutions company that helps fast-growing companies and leading global brands to transform and grow. We do this through a collaborative and consultative approach, rethinking business processes and ensuring our clients employ the optimal mix of automation and human intelligence. We are an ambitious team of professionals spread across four continents and looking to disrupt our industry by delivering seamless customer experiences for our clients, backed up with exceptional results. We have big dreams, and are constantly looking for new colleagues to join us who share our values, passion and appreciation for diversity. The company is headquartered in London, with delivery centers in Dhaka (Bangladesh), Johannesburg (South Africa) and Thane (India). The current positions are for our Thane, India delivery center.

Job Description
We are seeking a Python Full Stack Developer to be an extension of our team and play a critical role in the implementation of several solutions in the form of products, or tools and utilities. The ideal candidate will have strong experience in the end-to-end development of small-scale products and utilities independently. As a Full Stack Developer, you will be responsible for front-end and back-end development using various technologies and frameworks, working closely with the operations team as well as business analysts, project management and our technology partners.

Key Responsibilities
- Develop, test, and maintain AI apps and machine learning models using Python.
- Create and maintain AI bots based on custom/application APIs.
- Implement data manipulation and analysis using libraries such as NumPy and Pandas.
- Troubleshoot and debug AI applications and fine-tune models.
- Participate in code reviews to ensure code quality and share knowledge with the team.
- Stay updated with the latest advancements in AI and machine learning technologies.
- Ensure that solutions are scalable, maintainable, and meet best practices for security, performance and data management.
- Stay current with emerging technologies and trends in process automation and make recommendations for incorporating them into client solutions.
- Collaborate, validate, and provide frequent updates to internal stakeholders throughout the project lifecycle.
- Continuously research emerging technologies and determine credible solutions.
- Positively and constructively engage with clients and operations teams where required.

Qualifications
- Experience in Azure.
- Proficiency in Python as the primary programming language.
- Experience building LLM-powered applications in a production environment.
- Knowledge of Python backend frameworks like Django, Flask, FastAPI, etc.
- Knowledge of LLM internals and good knowledge of the LLM ecosystem, both private (OpenAI, Gemini, Anthropic, etc.) and open source.
- Experience with version control systems like Git/GitHub; DevOps knowledge.
- Strong experience with the NumPy and Pandas libraries for data manipulation and analysis.
- Solid understanding of machine learning principles and algorithms.
- Bachelor's degree in Computer Science, Artificial Intelligence, Engineering, or a related field, or equivalent work experience.
- 3-5 years of experience in AI, machine learning, and Python development.
- Proven ability to work independently and as part of a team.

Preferred candidate profile
- Experience with LangChain.
- Knowledge of ReactJS (with Next.js and TypeScript).
- Azure (deployment on Web Apps, Functions).
- Familiarity with Hugging Face Transformers for natural language processing.
- Experience in BPO outsourcing / IT outsourcing.
Posted 2 weeks ago
8.0 - 12.0 years
11 - 16 Lacs
Mumbai
Work from Office
Location: Mumbai - Hiranandani. Posted 10 days ago. Application end date: June 3, 2025. Job requisition ID: R_308144.

Company: Marsh

Description:
We are seeking a talented individual to join our Data Science team at Marsh. This role will be based in Mumbai. This is a hybrid role that has a requirement of working at least three days a week in the office.

Senior Manager - Data Science and Automation

We will count on you to:
- Identify opportunities which add value to the business and make the process more efficient.
- Invest in understanding the core business, including products, processes, documents, and data points, with the objective of identifying efficiency and value-addition opportunities.
- Design and develop end-to-end NLP/LLM solutions for document parsing, information extraction, and summarization from PDFs and scanned text.
- Develop AI applications to automate manual and repetitive tasks using generative AI and machine learning.
- Fine-tune open-source LLMs (like LLaMA, Mistral, Falcon, or similar) or build custom pipelines using APIs (OpenAI, Anthropic, Azure OpenAI).
- Build custom extraction logic using tools like LangChain, Haystack, Hugging Face Transformers, and OCR libraries like Tesseract or Azure Form Recognizer.
- Create pipelines to convert outputs into formatted Microsoft Word or PDF files using libraries like python-docx, PDFKit, ReportLab, or LaTeX.
- Collaborate with data engineers and software developers to integrate AI models into production workflows.
- Ensure model performance, accuracy, scalability, and cost-efficiency across business use cases.
- Stay updated with the latest advancements in generative AI, LLMs, and NLP research to identify innovative solutions.
- Design, develop, and maintain robust data pipelines for extracting, transforming, and loading (ETL) data from diverse sources.
- As the operation scales up, design and implement scalable data storage solutions and integrate them with existing systems.
- Utilize cloud platforms (AWS, Azure, Google Cloud) for data storage and processing.
- Conduct code reviews and provide mentorship to junior developers.
- Stay up-to-date with the latest technology trends and best practices in data engineering and cloud services.
- Lead initiatives and deliver results by engaging with cross-functional teams and resolving data ambiguity issues.
- Be responsible for the professional development of your projects and institute a succession plan.

What you need to have:
- Bachelor's degree in Engineering, Analytics, or a related field; MBA, Computer Applications, IT, Business Analytics, or any discipline.
- Proven experience of 8-12 years in Python development.
- Hands-on experience with frameworks and libraries like Transformers, LangChain, PyTorch/TensorFlow, spaCy, Hugging Face, and Haystack.
- Strong expertise in document parsing, OCR (Tesseract, AWS Textract, Azure Form Recognizer), and entity extraction.
- Proficiency in Python and familiarity with cloud-based environments (Azure, AWS, GCP).
- Experience deploying models as APIs/microservices using FastAPI, Flask, or similar.
- Familiarity with PDF parsing libraries (PDFMiner, PyMuPDF, Apache PDFBox) and Word generation libraries (python-docx, PDFKit).
- Solid understanding of prompt engineering and prompt-tuning techniques.
- Proven experience with data automation and building data pipelines.
- Proven track record in building and maintaining data pipelines and ETL processes.
- Strong knowledge of Python libraries such as Pandas, NumPy, PySpark, and Camelot.
- Familiarity with database management systems (SQL and NoSQL databases).
- Experience in designing and implementing system architecture.
- Ability to operate in a multi-layered technology architecture and shape the technology maturity of the organization.
- Solid understanding of software development best practices, including version control (Git), code reviews, and testing frameworks (PyTest, UnitTest).
- Strong attention to detail and ability to work with complex data sets.
- Effective communication skills to present findings and insights to both technical and non-technical stakeholders.
- Superior listening, verbal and written communication skills.
- Excellent project management and organization skills.
- Superlative stakeholder management skills; ability to positively influence stakeholders.
- Synthesis skills: ability to connect the dots and answer the business question.
- Excellent problem-solving, structuring and critical-thinking skills.
- Ability to work independently and collaboratively in a fast-paced environment.

What makes you stand out:
- Master's degree in Computer Science, Engineering, or related fields.
- Experience in working with large-scale data sets and real-time data processing.
- Familiarity with additional programming languages like Java, C++, or R.
- Strong problem-solving skills and ability to work in a fast-paced environment.

Why join our team:
We help you be your best through professional development opportunities, interesting work and supportive leaders. We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients and communities. Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being.

Marsh, a business of Marsh McLennan (NYSE: MMC), is the world's top insurance broker and risk advisor. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit marsh.com, or follow on LinkedIn and X.

Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law.

Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.
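A minimal sketch of the parse-then-generate pattern in the responsibilities above: extract text from a PDF with PyMuPDF and write a formatted Word summary with python-docx. File names are placeholders, and the "extraction" step is a trivial stand-in for real NLP/LLM logic:

```python
import fitz  # PyMuPDF
from docx import Document

# Placeholder input; in practice this would be a scanned or digital policy document.
pdf_path = "input_contract.pdf"

# Extract: pull plain text page by page.
pdf = fitz.open(pdf_path)
pages = [page.get_text() for page in pdf]
pdf.close()

# A real pipeline would run NLP/LLM extraction here; we simply take the first line
# of each non-empty page as a stand-in for extracted fields.
highlights = [p.strip().splitlines()[0] for p in pages if p.strip()]

# Generate: write a formatted Word document from the extracted content.
doc = Document()
doc.add_heading("Document Summary", level=1)
doc.add_paragraph(f"Source file: {pdf_path}")
for i, line in enumerate(highlights, start=1):
    doc.add_paragraph(f"Page {i}: {line}")
doc.save("summary.docx")
```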
Posted 2 weeks ago
2.0 - 4.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Role Summary: We are seeking a highly analytical and results-oriented Manager, S&OP to join our dynamic team. In the fast-paced world of cloud kitchens, this role is critical for balancing the twin goals of maximizing item availability for our customers and minimizing food wastage across our network. You will be responsible for developing and managing accurate demand forecasts across all product categories, leveraging deep analysis of sales mix and consumption trends. This role requires strong proficiency in Excel, SQL, and Python to handle complex datasets and drive data-driven decision-making within our S&OP process. You will collaborate closely with Pod Operations, Growth, and Marketing teams to ensure our planning aligns with operational realities and commercial strategies.

Key Responsibilities:

1. Demand Planning & Forecasting:
- Develop, maintain, and refine short-to-medium-term demand forecasts for all product categories at relevant granularities (e.g., item, pod/zone, city).
- Utilize statistical methods, historical data analysis (via SQL, Python, Excel), and market intelligence to generate baseline forecasts (a simple baseline-forecast sketch follows this posting).
- Analyze and model the impact of promotions, marketing campaigns, local events, seasonality, and menu changes on demand.
- Monitor forecast accuracy and bias KPIs, identifying root causes for variances and implementing corrective actions to continuously improve forecast quality.
- Prepare and lead demand review discussions within the S&OP cycle, presenting forecasts, assumptions, risks, and opportunities.

2. Sales Mix & Consumption Trend Analysis:
- Perform in-depth analysis of sales data using SQL and Python to understand mix shifts, attachment rates, and cannibalization effects.
- Analyze consumption patterns (e.g., order times, platform trends, customer behaviour) to identify key demand drivers and potential shifts.
- Connect sales and consumption trends directly to their impact on item availability and wastage metrics.
- Provide actionable insights to Marketing, Growth, and Culinary teams regarding product performance and portfolio optimization opportunities.

3. S&OP Process Contribution & KPI Management:
- Play a key role in the monthly and weekly S&OP cycle, ensuring timely inputs and effective collaboration.
- Continuously monitor, analyze, and report on key performance indicators: Item Availability and Wastage %.
- Develop insights and drive actions aimed at optimizing the trade-off between availability and wastage through improved forecasting and cross-functional alignment.
- Contribute to the ongoing improvement and maturity of the S&OP process within the organization.

4. Cross-Functional Collaboration:
- Partner closely with Pod Operations teams to gather localized insights, understand operational constraints, and ensure effective execution of plans.
- Collaborate with Growth & Marketing teams to align demand forecasts with promotional calendars, new product development (NPD), and market expansion plans.
- Engage with Finance and Procurement teams as needed for financial planning and supply feasibility inputs.

5. Data Analysis & Tool Usage:
- Leverage advanced Excel, SQL, and Python for data extraction, manipulation, analysis, modeling, and reporting.
- Develop and maintain dashboards and reports to track performance and communicate insights effectively.

Qualifications & Skills:
Education: Bachelor's degree in Engineering (B.Tech) + MBA (supply chain specialization).
Experience: 2-4 years of relevant experience in Demand Planning, S&OP, Supply Chain Management, or Business Analytics.
Proven experience working in a fast-paced environment such as QSR (Quick Service Restaurants), E-commerce, Food Delivery, FMCG (with short-shelf-life products), or Retail is highly preferred.

Technical Skills:
- Required: Advanced proficiency in Microsoft Excel (complex formulas, pivot tables, modeling).
- Required: Strong proficiency in SQL for querying and data manipulation.
- Required: Proficiency in Python for data analysis and, potentially, forecasting model development (e.g., using libraries like Pandas, NumPy, Scikit-learn).

Analytical & Problem-Solving Skills: Exceptional quantitative and analytical skills with the ability to work through complex problems, manipulate large datasets, and derive actionable insights.

Business Acumen: Strong understanding of business drivers, financial implications (especially cost of goods and wastage), and the P&L impact of planning decisions.
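As a minimal sketch of the baseline forecasting described above, assuming pandas and a hypothetical daily sales table (CSV with columns item_id, date, units_sold), a trailing moving average per item can serve as the naive baseline that statistical or ML models later replace.

# Illustrative baseline demand forecast using a trailing moving average.
# Assumes a CSV with hypothetical columns: item_id, date, units_sold.
import pandas as pd


def baseline_forecast(csv_path: str, window: int = 28) -> pd.DataFrame:
    """Forecast next-day demand per item as the mean of the last `window` days."""
    sales = pd.read_csv(csv_path, parse_dates=["date"])
    sales = sales.sort_values(["item_id", "date"])

    # Trailing moving average per item; the last value is the naive forecast.
    forecast = (
        sales.groupby("item_id")["units_sold"]
        .apply(lambda s: s.rolling(window, min_periods=7).mean().iloc[-1])
        .rename("forecast_units")
        .reset_index()
    )
    return forecast


if __name__ == "__main__":
    print(baseline_forecast("daily_item_sales.csv").head())  # hypothetical file

Forecast accuracy and bias KPIs (e.g., MAPE) would then be computed by comparing this baseline against actuals, with promotions, seasonality, and menu changes layered on top as the process matures.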
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
Job Title: Data Analyst Trainee
Location: Remote
Job Type: Internship (Full-Time)
Duration: 1–3 Months
Stipend: ₹25,000/month
Department: Data & Analytics

Job Summary: We are seeking a motivated and analytical Data Analyst Trainee to join our remote analytics team. This internship is perfect for individuals eager to apply their data skills in real-world projects, generate insights, and support business decision-making through analysis, reporting, and visualization.

Key Responsibilities:
- Collect, clean, and analyze large datasets from various sources
- Perform exploratory data analysis (EDA) and generate actionable insights
- Build interactive dashboards and reports using Excel, Power BI, or Tableau
- Write and optimize SQL queries for data extraction and manipulation
- Collaborate with cross-functional teams to understand data needs
- Document analytical methodologies, insights, and recommendations

Qualifications:
- Bachelor’s degree (or final-year student) in Data Science, Statistics, Computer Science, Mathematics, or a related field
- Proficiency in Excel and SQL
- Working knowledge of Python (Pandas, NumPy, Matplotlib) or R
- Understanding of basic statistics and analytical methods
- Strong attention to detail and problem-solving ability
- Ability to work independently and communicate effectively in a remote setting

Preferred Skills (Nice to Have):
- Experience with BI tools like Power BI, Tableau, or Google Data Studio
- Familiarity with cloud data platforms (e.g., BigQuery, AWS Redshift)
- Knowledge of data storytelling and KPI measurement
- Previous academic or personal projects in analytics

What We Offer:
- Monthly stipend of ₹25,000
- Fully remote internship
- Mentorship from experienced data analysts and domain experts
- Hands-on experience with real business data and live projects
- Certificate of Completion
- Opportunity for a full-time role based on performance
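A short sketch of the kind of exploratory analysis the responsibilities above describe, assuming pandas and Matplotlib and a hypothetical orders.csv with columns order_id, order_date, city, category, revenue.

# Illustrative EDA on a hypothetical orders.csv (column names assumed).
import pandas as pd
import matplotlib.pyplot as plt

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])  # hypothetical file

# Basic profiling: shape, dtypes, missing values, summary statistics.
print(orders.shape)
print(orders.dtypes)
print(orders.isna().sum())
print(orders.describe(include="all"))

# Simple aggregate insight: monthly revenue by category.
monthly = (
    orders
    .assign(month=orders["order_date"].dt.to_period("M"))
    .groupby(["month", "category"])["revenue"]
    .sum()
    .unstack(fill_value=0)
)
print(monthly.tail())

# Quick visualization of the trend for a dashboard or report.
monthly.plot(kind="line", title="Monthly revenue by category")
plt.tight_layout()
plt.savefig("monthly_revenue.png")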
Posted 2 weeks ago
15.0 - 20.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; the work may also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Machine Learning
Good to have skills: NA
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an AI / ML Engineer, you will engage in the development of applications and systems that leverage artificial intelligence tools and cloud AI services. Your typical day will involve designing and implementing production-ready solutions, ensuring that they meet quality standards. You will work with various AI models, including generative AI, deep learning, and neural networks, while also exploring innovative applications such as chatbots and image processing. Collaboration with cross-functional teams will be essential as you contribute to the overall success of projects and drive advancements in AI technology.

Roles & Responsibilities:
- Expected to be a Subject Matter Expert with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities and understanding of AI technologies.
- Mentor junior professionals to foster their growth and development in AI and machine learning.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Machine Learning.
- Experience with cloud-based AI services and deployment strategies.
- Strong understanding of deep learning frameworks such as TensorFlow or PyTorch.
- Familiarity with data preprocessing and feature engineering techniques.
- Ability to design and implement neural networks for various applications.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Machine Learning.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
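For illustration of the neural-network skills listed above, here is a minimal sketch assuming PyTorch and synthetic data: a tiny regression MLP trained for a few epochs.

# Tiny PyTorch regression MLP on synthetic data (illustrative only).
import torch
from torch import nn, optim

model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
)
loss_fn = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=1e-3)

# Synthetic data standing in for real features/targets.
x = torch.randn(256, 10)
y = torch.randn(256, 1)

for epoch in range(5):
    optimizer.zero_grad()
    pred = model(x)
    loss = loss_fn(pred, y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.4f}")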
Posted 2 weeks ago
12.0 - 15.0 years
9 - 14 Lacs
Gurugram
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; the work may also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Data Science
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an AI / ML Engineer, you will engage in the development of applications and systems that leverage artificial intelligence tools and cloud AI services. Your typical day will involve designing and implementing production-ready solutions, ensuring that they meet quality standards. You will work with various AI models, including generative AI, deep learning, and neural networks, while also exploring innovative applications such as chatbots and image processing. Collaboration with cross-functional teams will be essential to integrate these advanced technologies into existing systems and workflows, driving efficiency and enhancing user experiences.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing and training sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with strategic goals.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Science.
- Strong analytical skills with the ability to interpret complex data sets.
- Experience with machine learning frameworks such as TensorFlow or PyTorch.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud.
- Proficient in programming languages such as Python or R.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Science.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; the work may also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Computer Vision
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an AI / ML Engineer, you will engage in the development of applications and systems that leverage artificial intelligence tools and cloud AI services. Your typical day will involve designing and implementing production-ready solutions, ensuring that they meet quality standards. You will work on various aspects of AI, including deep learning, neural networks, and image processing, while also integrating generative AI models into your projects. Collaboration with cross-functional teams will be essential as you contribute to innovative solutions that enhance the capabilities of AI technologies.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior professionals to foster their growth and development.
- Continuously evaluate and improve existing processes to enhance team efficiency.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Computer Vision.
- Strong understanding of deep learning frameworks such as TensorFlow or PyTorch.
- Experience with image processing techniques and algorithms.
- Familiarity with cloud platforms and services for deploying AI solutions.
- Knowledge of software development best practices and version control systems.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Computer Vision.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
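As a small sketch of the image-processing and deep-learning work referenced above, assuming a recent torchvision (0.13+), Pillow, and a hypothetical input image: classify the image with a pretrained ResNet-18.

# Illustrative image-classification inference with a pretrained ResNet-18.
import torch
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

img = Image.open("sample.jpg").convert("RGB")  # hypothetical input image
batch = preprocess(img).unsqueeze(0)

with torch.no_grad():
    logits = model(batch)

# Top-5 class indices and probabilities.
top5 = torch.topk(logits.softmax(dim=1), k=5)
print(top5.indices, top5.values)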
Posted 2 weeks ago
12.0 - 15.0 years
9 - 14 Lacs
Pune
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; the work may also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Data Science
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an AI / ML Engineer, you will engage in the development of applications and systems that leverage artificial intelligence tools and cloud AI services. Your typical day will involve designing and implementing production-ready solutions, ensuring that they meet quality standards. You will work with various AI models, including generative AI, deep learning, and neural networks, while also exploring innovative applications such as chatbots and image processing. Collaboration with cross-functional teams will be essential to integrate these advanced technologies into existing systems and workflows, driving efficiency and enhancing user experiences.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing and training sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with strategic goals.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Science.
- Strong analytical skills to interpret complex data sets.
- Experience with machine learning frameworks such as TensorFlow or PyTorch.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud.
- Ability to design and implement data pipelines for AI applications.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Science.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; the work may also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Machine Learning
Good to have skills: NA
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an AI / ML Engineer, you will develop applications and systems that leverage artificial intelligence tools and cloud AI services. Your typical day will involve designing and implementing production-ready solutions, ensuring that they meet quality standards. You will work on integrating generative AI models into various applications, which may include deep learning, neural networks, chatbots, and image processing. Collaboration with cross-functional teams will be essential to ensure that the solutions you create are effective and innovative, addressing complex challenges in the field of artificial intelligence.

Roles & Responsibilities:
- Expected to be a Subject Matter Expert with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Mentor junior professionals to enhance their skills and knowledge in AI and machine learning.
- Continuously evaluate and improve existing AI models and systems to ensure optimal performance.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Machine Learning.
- Strong understanding of deep learning frameworks such as TensorFlow or PyTorch.
- Experience with cloud platforms like AWS, Azure, or Google Cloud for deploying AI solutions.
- Familiarity with data preprocessing and feature engineering techniques.
- Ability to design and implement scalable machine learning pipelines.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Machine Learning.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; the work may also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Machine Learning
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an AI / ML Engineer, you will engage in the development of applications and systems that leverage artificial intelligence tools and cloud AI services. Your typical day will involve designing and implementing production-ready solutions, ensuring that they meet quality standards. You will work with various AI models, including generative AI, deep learning, and neural networks, while also exploring innovative applications such as chatbots and image processing. Collaboration with cross-functional teams will be essential to integrate these advanced technologies into existing systems and workflows, driving efficiency and enhancing user experiences.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior professionals to enhance their skills and knowledge in AI and machine learning.
- Continuously evaluate and implement new technologies to improve system performance and efficiency.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Machine Learning.
- Strong understanding of deep learning frameworks such as TensorFlow or PyTorch.
- Experience with cloud platforms and services, particularly in deploying AI solutions.
- Familiarity with data preprocessing and feature engineering techniques.
- Ability to design and implement algorithms for various AI applications.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Machine Learning.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago