
816 Preprocess Jobs - Page 29

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Hyderabad, TG, IN, 500081

Let's play together

About Our Company
Fortuna has become an established brand among customers within just a few years. Starting from the first betting shop, we have grown into a proud international family of companies under Fortuna Entertainment Group. We want to go further and be known for having the best tech department, offering our employees modern technologies and involvement in many exciting projects. Our new home is the remarkable Churchill II building, which has a view of Prague. Every detail underlines the company's corporate culture and represents our values. The workplace layout is 100% ecological, providing ideal conditions for everyday work. We all work as one team and treat each other with respect, openness, a sense of honor, and respect for individual and cultural differences.

Position Title: Data Scientist

Key Purpose Statement – Core Mission
The core mission of the Data Scientist position is to drive data science in FEG's business and build statistical and ML-based models and insights.

Responsibilities
- Retrieve data from different sources (i.e., the Data Lake and Data Warehouse) using SQL, Python, or PySpark
- Clean and preprocess data for quality and accuracy
- Perform EDA to identify patterns, trends, and insights using statistical methods
- Develop and implement machine learning models for customer behavior, revenues, and campaigns
- Enhance existing models by adding more features and expand them to other markets and teams
- Work with other teams, such as engineering, product development, and marketing, to understand their data needs and provide actionable insights
- Support the data analytics team in improving analytical models with data science-based insights
- Develop and provide an MLOps framework on Microsoft Azure for deploying ML models (good to have)

Requirements – Knowledge, Skills and Experience
Education (high school/university), language knowledge (level of EN, CZ, ...), length of practice, and experience required:
- A degree in Computer Science / Information Technology
- Fluent communication skills in English, both written and verbal
- Team player who can share experience with the team and grow them technically
- High quantitative and cognitive ability to solve complex problems and think with a vision

Qualifications, knowledge of specific technology, and hard and soft skills required:
- At least 4 years of experience working in data roles – Data Analyst, Business Analyst, or Data Scientist
- 3+ years of experience in programming and data science, MLOps, and automation of models
- Sound practical knowledge of working with advanced statistical algorithms
- Great understanding of machine learning techniques and their application to business problems
- Experience with ML on the MS Azure stack in combination with Python and SQL
- Strong business understanding; has developed analytics solutions in 2–3 domains
- Experience in building and productionizing ML platforms

Work and Personality Characteristics Required
Data-driven, technical thinking, problem-solving, communication skills, critical thinking, task management, proactive.

Offices at FEG
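The first two responsibilities above, retrieving data with SQL and then cleaning it before modeling, can be reduced to a minimal sketch. This uses an in-memory SQLite table as a stand-in for the Data Lake / Data Warehouse; the table name, columns, and values are all hypothetical:

```python
import sqlite3

# Stand-in for a warehouse table; schema and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bets (customer_id INTEGER, stake REAL)")
conn.executemany(
    "INSERT INTO bets VALUES (?, ?)",
    [(1, 10.0), (1, None), (2, 25.0), (2, 25.0), (3, -5.0)],
)

# Retrieve with SQL, then clean in Python: drop NULL stakes, drop
# non-positive stakes, and de-duplicate exact repeats.
rows = conn.execute("SELECT customer_id, stake FROM bets").fetchall()
cleaned = []
seen = set()
for customer_id, stake in rows:
    if stake is None or stake <= 0:
        continue  # missing or invalid value
    if (customer_id, stake) in seen:
        continue  # exact duplicate record
    seen.add((customer_id, stake))
    cleaned.append((customer_id, stake))

print(cleaned)  # [(1, 10.0), (2, 25.0)]
```

In practice the same shape carries over to PySpark or a warehouse client; only the retrieval call changes.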

Posted 1 month ago

Apply

4.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Role: Drupal Developer
Location: Juhi Nagar, Navi Mumbai
Experience: 4+ years
Joining: Immediate Joiners Only
Work Mode: This is a Work from Office role.
Work Schedule: Alternate Saturdays will be working.

About the company: It is an innovative technology company focused on delivering robust web solutions. (Further company details would typically be inserted here once provided by the client.) We are looking for talented individuals to join our team and contribute to cutting-edge projects.

The Opportunity: Drupal Developer
We are seeking an experienced and highly skilled Drupal Developer to join our team. The ideal candidate will have a strong understanding of Drupal's architecture and a proven track record in developing custom modules, implementing sophisticated theming, and integrating with various APIs. This is a hands-on role for an immediate joiner who is passionate about building secure, scalable, and high-performance Drupal applications.

Key Responsibilities
- Develop and maintain custom Drupal modules using Hooks, the Plugin system, Form API, and Entity API.
- Implement and work with REST, JSON:API, and GraphQL within Drupal for seamless data exchange.
- Design and implement Drupal themes using the Twig templating engine and preprocess functions to ensure a consistent and engaging user experience.
- Configure and manage user roles and access control to maintain application security and data integrity.
- Apply best practices in securing Drupal applications, identifying and mitigating potential vulnerabilities.
- Integrate Drupal with various third-party APIs and external systems.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Contribute to all phases of the development lifecycle, from concept to deployment and maintenance.

Requirements
- Experience: 4+ years of professional experience in Drupal development.
- Custom Module Development: Strong understanding and hands-on experience with custom module development (Hooks, Plugin system, Form API, Entity API).
- API Integration (Drupal): Proficiency with REST, JSON:API, and GraphQL in Drupal.
- Drupal Theming: Experience with Drupal theming using Twig and preprocess functions.
- Security & Access Control: Experience with user roles and access control, and a strong understanding of best practices in securing Drupal applications.
- Third-Party Integration: Familiarity with APIs and third-party integration.
- Joining: Immediate Joiners Only.

Preferred Experience
- Experience with Rocket.Chat integration or other messaging tools.
- Exposure to Solr/Elasticsearch using the Drupal Search API.

Skills: Rocket.Chat integration, API integration, security, Drupal development, Hooks, custom module development, JSON:API, Form API, Drupal theming, Plugin system, third-party integration, GraphQL, REST, preprocess functions, Entity API, Twig, access control
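To make the JSON:API requirement above concrete from the consumer side: Drupal's JSON:API module returns documents in the standard JSON:API format. A hedged Python sketch that parses such a response body (the payload below is hand-written for illustration, not output from a real site, and the node IDs are invented):

```python
import json

# Hand-written example of a JSON:API document, shaped like the responses
# Drupal's JSON:API module returns for an article collection.
payload = """
{
  "data": [
    {
      "type": "node--article",
      "id": "11111111-1111-1111-1111-111111111111",
      "attributes": {"title": "Hello Drupal", "status": true}
    },
    {
      "type": "node--article",
      "id": "22222222-2222-2222-2222-222222222222",
      "attributes": {"title": "Draft post", "status": false}
    }
  ]
}
"""

document = json.loads(payload)
# Keep only published nodes and extract their titles.
published_titles = [
    item["attributes"]["title"]
    for item in document["data"]
    if item["attributes"]["status"]
]
print(published_titles)  # ['Hello Drupal']
```

On the Drupal side, the same resources are produced and filtered by the JSON:API module itself; the sketch only shows the document shape an integrator works against.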

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Work Level: Individual
Core: Communication Skills, Problem Solving, Execution
Leadership: Decisive, Team Alignment, Working Independently
Industry Type: IT Services & Consulting
Function: Data Analyst
Key Skills: Data Analytics, Data Analysis, Python, R, MySQL, Cloud, AWS, Big Data Platforms, Business Intelligence (BI), Tableau, Data Science, Statistical Modeling
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners. This is a remote position.

Responsibilities:
- Collect, clean, and preprocess data from various sources.
- Perform exploratory data analysis (EDA) to identify trends and patterns.
- Develop dashboards and reports using tools like Excel, Power BI, or Tableau.
- Use SQL to query and manipulate large datasets.
- Assist in building predictive models and performing statistical analyses.
- Present insights and recommendations based on data findings.
- Collaborate with cross-functional teams to support data-driven decision-making.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities on the portal; depending on your skills, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
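The EDA responsibility above often starts with nothing more than summary statistics and a crude check for unusual periods. A minimal, library-free sketch; the monthly order counts are invented for illustration:

```python
import statistics

# Hypothetical monthly order counts; values are illustrative only.
orders = {"Jan": 120, "Feb": 135, "Mar": 128, "Apr": 210, "May": 142}

mean = statistics.mean(orders.values())
stdev = statistics.stdev(orders.values())

# Flag months more than one standard deviation above the mean,
# a crude first pass at spotting unusual periods.
spikes = [month for month, v in orders.items() if v > mean + stdev]

print(round(mean, 1), spikes)  # 147.0 ['Apr']
```

Real EDA would iterate from here (distributions, correlations, segmentation), but the flag-what-deviates pattern is the same.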

Posted 1 month ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Business Analyst

The Business Analyst is responsible for providing day-to-day support for business strategy projects for the assigned functional business area. Under close supervision, this job supports business leaders by generating metrics and drafting reports to support business strategies. This job helps ensure that the assigned functional business area is optimized and cost-effective.

Key Responsibilities and Duties
- Generates metrics and drafts reports in the assigned functional business area to inform decisions on tactical issues that impact the business.
- Supports implementation of policies and procedures in support of the business area strategy.
- Assists with process improvements with a focus on specific demographics and identifiers from the company's databases.
- Analyzes and reports on area data (financial, headcount, etc.) and performance metrics.
- Supports business management projects by documenting risks, issues, and action items.
- Participates in meeting planning in support of business projects and objectives.

Educational Requirements: University (degree) preferred
Work Experience: No experience required
Physical Requirements: Sedentary work
Career Level: 5IC

The Enterprise Data Steward supports the development, implementation, and execution of business analytics initiatives and projects. This job supports the assessment, improvement, and governance of quality and ongoing fitness-for-purpose of data, ensuring adequate data quality is maintained so that data can effectively support business processes.

A person with 3+ years of experience in a data analytics role with the following abilities:
- Ensures that data within the domain (inclusive of internal systems of record, 3rd-party external data sources, analytic sources, and master data platforms) meets business requirements for both operational and analytical use cases of that data.
- Investigates and drives consistent metric definitions across the enterprise to develop one consistent view of the customer, user experience, and business performance.
- Ensures that data within the domain meets business requirements and data policy and data standards for data quality, and ensures that data quality requirements are included in the product/system development process, both at the source and throughout the data supply chain (acquire/curate/publish/consume), in partnership with IT and data engineering partners.

Proficiency in tools:
- Strong SQL, Snowflake.
- Ability to clean and preprocess data, handling missing values and outliers, and ensuring data quality.
- Proficiency in working with databases, including data extraction, transformation, and loading (ETL) processes; good understanding of different data modeling techniques.
- Knowledge of Python and shell scripting.
- Proficiency in MS Excel and ability to apply formulas.
- Fundamental knowledge of Snowflake/cloud.

Related Skills: Adaptability, Business Acumen, Collaboration, Communication, Consultative Communication, Detail-Oriented, Executive Presence, Financial Acumen, Messaging Effectiveness, Prioritizes Effectively, Problem Solving, Project Management, Relationship Management, Strategic Thinking

Company Overview
TIAA Global Capabilities was established in 2016 with a mission to tap into a vast pool of talent, reduce risk by insourcing key platforms and processes, and contribute to innovation with a focus on enhancing our technology stack. TIAA Global Capabilities is focused on building a scalable and sustainable organization, with a focus on technology, operations, and expanding into the shared services business space. Working closely with our U.S. colleagues and other partners, our goal is to reduce risk, improve the efficiency of our technology and processes, and develop innovative ideas to increase throughput and productivity.

We are an Equal Opportunity Employer. TIAA does not discriminate against any candidate or employee on the basis of age, race, color, national origin, sex, religion, veteran status, disability, sexual orientation, gender identity, or any other legally protected status.

Accessibility Support
TIAA offers support for those who need assistance with our online application process to provide an equal employment opportunity to all job seekers, including individuals with disabilities. If you are a U.S. applicant and desire a reasonable accommodation to complete a job application, please use one of the below options to contact our accessibility support team:
Phone: (800) 842-2755
Email: accessibility.support@tiaa.org

Privacy Notices
- For Applicants of TIAA, Nuveen and Affiliates residing in US (other than California), click here.
- For Applicants of TIAA, Nuveen and Affiliates residing in California, please click here.
- For Applicants of TIAA Global Capabilities, click here.
- For Applicants of Nuveen residing in Europe and APAC, please click here.
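The cleaning skills named in the listing above (handling missing values and outliers) can be sketched without any libraries: median imputation followed by a 1.5 × IQR filter. The balance figures below are invented for illustration:

```python
import statistics

# Hypothetical account balances with a missing value and one extreme outlier.
raw = [100.0, 102.0, None, 98.0, 101.0, 5000.0, 99.0]

# 1) Impute missing values with the median of the observed values.
observed = [v for v in raw if v is not None]
median = statistics.median(observed)
imputed = [median if v is None else v for v in raw]

# 2) Drop outliers using the 1.5 * IQR rule on the observed values.
q1, _, q3 = statistics.quantiles(observed, n=4)
low, high = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
cleaned = [v for v in imputed if low <= v <= high]

print(median, cleaned)
```

In a real pipeline the same two steps would typically run in SQL or Snowflake before data ever reaches Python; the logic is identical.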

Posted 1 month ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Req Number: 94284
Time Type: Full Time

Lead Logistics IT | IT Specialist
Seniority Level: Junior IT Specialist / IT Specialist
Industry: Logistics & Supply Chain
Employment Type: Full-time
Job Function: Information Technology

Essential Responsibilities:
- Collaborate with geographically distributed teams.
- Design, develop, and deploy machine learning models to solve complex business problems.
- Develop, test, and deploy RPA bots following DSV internal governance.
- Implement AI algorithms for predictive analytics and natural language processing (NLP).
- Train, fine-tune, and evaluate machine learning models for accuracy and performance.
- Work with data engineers to gather, preprocess, and clean data for model training.
- Automate business processes across various domains to improve efficiency.
- Identify automation opportunities and requirements in collaboration with different stakeholders.
- Communicate technical solutions effectively to non-technical team members.

Function / Market & Industry Knowledge / Business Acumen / Process Working:
- Understand the challenges faced within an IT organization of a large global company.
- Demonstrate competency in process and system design, delivery, requirements assessment, improvement, and business logic.
- Have a sound understanding of 3PL/4PL/LLP business processes and interdependencies (a plus).

Technical Requirements:
- Strong knowledge of machine learning algorithms.
- Experience with ML/AI frameworks and RPA tools.
- Proficiency in programming languages such as Python, Java, or C#.
- Familiarity with data processing and analysis libraries.
- Experience with cloud platforms.
- Strong understanding of databases (SQL/NoSQL) and data integration techniques.

Behavioral Competencies:
- Understanding of supply chain management principles (a bonus).
- Creativity and ability to develop innovative ideas and solutions.
- Ability to work in a diverse environment and culture.
- Self-motivated and autonomous, capable of moving forward successfully with minimal direction.
- Professional and effective in stressful situations.

Education and Work Experience
General degree with emphasis in IT, Engineering, or Supply Chain Management, or 5+ years of equivalent relevant working experience.

Language Skills
Fluent in English, both spoken and written. Knowledge of other languages is an asset.

DSV – Global Transport and Logistics
DSV is a dynamic workplace that fosters inclusivity and diversity. We conduct our business with integrity, respecting different cultures and the dignity and rights of individuals. When you join DSV, you are working for one of the best-performing companies in the transport and logistics industry. You'll join a talented team of approximately 75,000 employees in over 80 countries, working passionately to deliver great customer experiences and high-quality services. DSV aspires to lead the way towards a more sustainable future for our industry and is committed to trading on nature's terms. We promote collaboration and transparency and strive to attract, motivate, and retain talented people in a culture of respect. If you are driven, talented, and wish to be part of a progressive and versatile organisation, we'll support you and your need to achieve your potential and forward your career. Visit dsv.com and follow us on LinkedIn, Facebook and Twitter.
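Several responsibilities above reduce to training a model and checking its accuracy on held-out data. A minimal, library-free sketch of that loop, using a nearest-centroid classifier on an invented one-dimensional dataset (all names and numbers are hypothetical; real logistics features would be far richer):

```python
# Tiny hand-made dataset: shipment weight (tonnes) -> transport mode.
train = [(1.0, "parcel"), (2.0, "parcel"), (9.0, "freight"), (11.0, "freight")]
test = [(1.5, "parcel"), (10.0, "freight"), (8.0, "freight")]

# "Training" = compute one centroid per class.
centroids = {}
for label in {y for _, y in train}:
    values = [x for x, y in train if y == label]
    centroids[label] = sum(values) / len(values)

def predict(x):
    # Assign the class whose centroid is closest.
    return min(centroids, key=lambda label: abs(x - centroids[label]))

# Evaluation: fraction of held-out examples classified correctly.
accuracy = sum(predict(x) == y for x, y in test) / len(test)
print(accuracy)  # 1.0
```

A production version would swap in an ML framework and cross-validation, but the train / hold-out / score structure is unchanged.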

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Job Title: Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
🎓 Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python or R for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.

Stipend & Benefits
💰 Stipend: ₹7,500 – ₹15,000 (performance-based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.

How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 28th May 2025

Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
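The "develop predictive models" responsibility above can start from something as small as an ordinary least-squares line fit, which has a closed-form solution. A library-free sketch on an invented hours-studied vs. exam-score dataset:

```python
# Fit y = a + b*x by ordinary least squares on a tiny hypothetical dataset.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # exactly y = 1 + 2x, so the fit is perfect

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope = covariance(x, y) / variance(x); intercept from the means.
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

print(a, b)  # 1.0 2.0
predicted = a + b * 5.0
print(predicted)  # 11.0
```

Libraries like scikit-learn generalize this to many features and regularization, but the fit-then-predict shape is the same.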

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Job Title: Machine Learning Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with Certificate of Internship
Application Deadline: 28th May 2025

About Unified Mentor
Unified Mentor provides students and graduates with hands-on learning opportunities and career growth in Machine Learning and Data Science.

Role Overview
As a Machine Learning Intern, you will work on real-world projects, enhancing your practical skills in data analysis and model development.

Responsibilities
✅ Design, test, and optimize machine learning models
✅ Analyze and preprocess datasets
✅ Develop algorithms and predictive models
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn
✅ Document findings and create reports

Requirements
🎓 Enrolled in or a graduate of a relevant program (Computer Science, AI, Data Science, or related field)
🧠 Knowledge of machine learning concepts and algorithms
💻 Proficiency in Python or R (preferred)
🤝 Strong analytical and teamwork skills

Benefits
💰 Stipend: ₹7,500 – ₹15,000 (performance-based, paid)
✔ Hands-on machine learning experience
✔ Internship Certificate & Letter of Recommendation
✔ Real-world project contributions for your portfolio

Equal Opportunity
Unified Mentor is an equal-opportunity employer, welcoming candidates from all backgrounds.

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Machine Learning Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 28th May 2025

About WebBoost Solutions by UM
WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.

Role Overview
As a Machine Learning Intern, you'll work on real-world projects, gaining practical experience in machine learning and data analysis.

Responsibilities
✅ Design, test, and optimize machine learning models.
✅ Analyze and preprocess datasets.
✅ Develop algorithms and predictive models for various applications.
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn.
✅ Document findings and create reports to present insights.

Requirements
🎓 Enrolled in or a graduate of a relevant program (AI, ML, Data Science, Computer Science, or related field).
📊 Knowledge of machine learning concepts and algorithms.
🐍 Proficiency in Python or R (preferred).
🤝 Strong analytical and teamwork skills.

Benefits
💰 Stipend: ₹7,500 – ₹15,000 (performance-based, paid)
✔ Practical machine learning experience.
✔ Internship Certificate & Letter of Recommendation.
✔ Build your portfolio with real-world projects.

How to Apply
📩 Submit your application by 28th May 2025 with the subject: "Machine Learning Intern Application".

Equal Opportunity
WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Data Science Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Application Deadline: 28th May 2025
Opportunity: Full-time role based on performance + Internship Certificate

About Unified Mentor
Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers.

Responsibilities
- Collect, preprocess, and analyze large datasets
- Develop predictive models and machine learning algorithms
- Perform exploratory data analysis (EDA) to extract insights
- Create data visualizations and dashboards for effective communication
- Collaborate with cross-functional teams to deliver data-driven solutions

Requirements
- Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field
- Proficiency in Python or R for data analysis and modeling
- Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred)
- Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib
- Strong analytical and problem-solving skills
- Excellent communication and teamwork abilities

Stipend & Benefits
- Stipend: ₹7,500 – ₹15,000 (performance-based, paid)
- Hands-on experience in data science projects
- Certificate of Internship & Letter of Recommendation
- Opportunity to build a strong portfolio of data science models and applications
- Potential for full-time employment based on performance

How to Apply
Submit your resume and a cover letter with the subject line "Data Science Intern Application."

Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.

Posted 1 month ago

Apply


4.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Job Title: Machine Learning Engineer
Location: Malaviya Nagar, Jaipur (On-site)
Experience Required: 2–4 Years
Industry: Blockchain Technology
Employment Type: Full-Time

About the Company:
Our client is an innovative tech company specializing in cutting-edge blockchain solutions, working on decentralized applications, smart contracts, and fintech platforms. They're now expanding into AI/ML-driven blockchain analytics, fraud detection, and predictive systems, and are looking for a skilled Machine Learning Engineer to join their growing team.

Key Responsibilities:
- Design, develop, and deploy ML models to enhance blockchain data analysis, fraud detection, or smart contract optimization.
- Work with blockchain developers and data engineers to integrate ML solutions into decentralized systems.
- Preprocess large datasets from blockchain networks and external APIs.
- Conduct exploratory data analysis to derive meaningful insights and trends.
- Build and maintain scalable ML pipelines and model deployment workflows.
- Optimize models for performance, scalability, and accuracy in production environments.
- Research and evaluate new technologies at the intersection of AI/ML and blockchain.

Required Skills:
- Solid understanding of core machine learning algorithms (supervised, unsupervised, NLP, etc.).
- Hands-on experience with Python and ML libraries like TensorFlow, PyTorch, and Scikit-learn.
- Strong knowledge of data preprocessing, feature engineering, and model evaluation techniques.
- Experience with REST APIs and data collection from APIs or databases.
- Good understanding of blockchain fundamentals and how decentralized systems work.
- Familiarity with blockchain analytics tools or platforms is a plus.

Good to Have:
- Exposure to smart contracts and Ethereum/Solidity.
- Experience with graph-based ML (e.g., using blockchain transaction graphs).
- Knowledge of tools like Docker, Kubernetes, or cloud services (AWS/GCP/Azure).

What We Offer:
- Opportunity to work on real-world blockchain + AI innovations.
- A collaborative team with a passion for decentralization and disruptive technologies.
- Competitive salary package and career growth in a fast-growing domain.

To Apply: Send your updated resume to ridhamstaffing@gmail.com with the subject line: "ML Engineer – Blockchain | Jaipur"
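The fraud-detection responsibility above often begins with a simple statistical baseline before any learned model: flag transfers whose amounts deviate sharply from the norm. A hedged sketch using a z-score rule on invented on-chain transfer amounts:

```python
import statistics

# Hypothetical on-chain transfer amounts; one value is wildly out of line.
amounts = [0.5, 0.7, 0.6, 0.55, 0.65, 120.0, 0.6]

mean = statistics.mean(amounts)
stdev = statistics.stdev(amounts)

# Flag transfers more than 2 standard deviations from the mean, a crude
# baseline for fraud screening before reaching for a real model.
flagged = [a for a in amounts if abs(a - mean) > 2 * stdev]
print(flagged)  # [120.0]
```

A production fraud system would use per-address features and graph structure, but a threshold baseline like this is a common sanity check against which learned models are compared.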

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

⚠️ Applications without a GitHub or portfolio link in the resume will be automatically rejected. Please include it to be considered.

At NilAi, we're building an AI-powered platform that helps hospitals (starting with the NHS) optimize energy and water consumption, reduce carbon emissions, and meet Net Zero goals, without any new hardware. We're looking for a passionate AI Intern to join our mission-driven team and help us shape the future of sustainable healthcare.

🌍 Company: NilAI
📍 Location: India (Remote)
💼 Position: AI Intern
💰 Stipend: ₹5,000/month

Responsibilities
- Clean, preprocess, and analyze large datasets related to hospital energy usage, water consumption, and operational workflows.
- Develop and implement machine learning models (e.g., regression, time-series forecasting, anomaly detection) using Scikit-learn and TensorFlow/PyTorch to predict and optimize energy consumption.
- Explore the application of LLMs (Large Language Models) for automating reports or extracting insights from unstructured data (e.g., maintenance logs, audit reports).
- Create interactive dashboards and visualizations using Power BI or Tableau to communicate findings to stakeholders.
- Integrate open-source APIs (e.g., the OpenAI API) to enhance data processing or generate sustainability recommendations.
- Assist in deploying lightweight models or prototypes using Flask or Streamlit for internal testing.
- Collaborate with the team to refine AI-driven recommendations for reducing carbon emissions and improving resource efficiency.
- Take ownership of complex challenges, demonstrating a commitment to continuous learning and delivering innovative, scalable solutions.

Required Skills & Qualifications
- Pursuing or recently completed a degree in Data Science, Computer Science, Engineering, Statistics, or a related field.
- Proficiency in Python and experience with data science libraries (e.g., Pandas, NumPy, Scikit-learn).
- Familiarity with machine learning frameworks (TensorFlow/PyTorch) and model deployment.
- Experience with data visualization tools (Power BI, Tableau) and storytelling with data.
- Basic understanding of LLMs and API integrations (e.g., OpenAI, Hugging Face).
- Exposure to time-series forecasting (e.g., Prophet, ARIMA) or anomaly detection techniques.
- Experience with ETL pipelines (e.g., Apache Airflow, Alteryx, or custom Python scripts) and data warehousing concepts.
- Knowledge of SQL for data querying and manipulation.
- Ability to work with messy, real-world datasets and strong problem-solving skills.
- Passion for sustainability, healthcare innovation, or energy efficiency is a plus!

Nice-to-Have Skills
- Experience with cloud platforms (AWS, GCP) or big data tools.

What You'll Gain
- Hands-on experience with AI for sustainability in a high-impact startup.
- Mentorship from experienced data scientists and exposure to real-world energy challenges.
- Opportunity to contribute to a product that directly reduces carbon emissions and saves costs for hospitals.
- Flexible work environment and potential for future full-time roles.

Please note: kindly attach your CV with portfolios for review.

Let's build something that matters. 🌍
#AIforGood #ClimateTech #HealthcareInnovation #NilAi
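The time-series forecasting mentioned above has a standard baseline that every fancier model (Prophet, ARIMA) is judged against: forecast the next value as the mean of the last few observations. A minimal sketch on invented hourly energy readings:

```python
# A minimal time-series baseline: forecast the next value as the mean of
# the last `window` observations. Hypothetical hourly energy use (kWh).
readings = [50.0, 52.0, 51.0, 53.0, 55.0, 54.0]

def moving_average_forecast(series, window=3):
    # Average the most recent `window` points as the next-step forecast.
    recent = series[-window:]
    return sum(recent) / len(recent)

forecast = moving_average_forecast(readings)
print(forecast)  # 54.0
```

If a trained model cannot beat this baseline on held-out data, it is not adding value, which makes the baseline a useful first deliverable for an intern project.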

Posted 1 month ago

Apply

2.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. AI Engineer Role Overview: We are seeking a highly skilled and experienced AI Engineers with a minimum of 2 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Your technical responsibilities: Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI. Design, develop, and maintain efficient, reusable, and reliable Python code Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, Agentic Framework to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. 
Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Write unit tests and conduct code reviews to ensure high-quality, bug-free software. Troubleshoot and debug applications to optimize performance and fix issues. Work with databases (SQL, NoSQL) and integrate third-party APIs. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Minimum 2 years of experience in Python, Data Science, Machine Learning, OCR, and document intelligence. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python and R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. Familiarity with computer vision techniques for image recognition, object detection, or image generation. Strong knowledge of Python frameworks such as Django, Flask, or FastAPI. Experience with RESTful API design and development. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. 
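The similarity-search responsibility above reduces to ranking stored embedding vectors by closeness to a query vector. As an illustrative sketch only (the names and the three-dimensional toy "embeddings" below are invented; in practice a model produces the vectors and a store such as Redis or FAISS accelerates the search), brute-force cosine similarity looks like this:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query, corpus, k=2):
    """Return the k corpus keys whose vectors are most similar to the query."""
    ranked = sorted(corpus, key=lambda key: cosine(query, corpus[key]), reverse=True)
    return ranked[:k]

# Tiny stand-in "embeddings"; a real system would use model outputs.
corpus = {
    "mortgage faq": [0.9, 0.1, 0.0],
    "loan terms":   [0.8, 0.3, 0.1],
    "hr policy":    [0.0, 0.2, 0.9],
}
print(top_k([1.0, 0.2, 0.0], corpus))  # → ['mortgage faq', 'loan terms']
```

Dedicated vector databases replace the O(n) scan with approximate nearest-neighbour indexes, but the scoring function they approximate is exactly this cosine (or dot-product) ranking.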
Good to Have Skills: Understanding of agentic AI concepts and frameworks. Proficiency in designing or interacting with agent-based AI architectures. Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

2.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. AI Engineer Role Overview: We are seeking a highly skilled and experienced AI Engineer with a minimum of 2 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Your technical responsibilities: Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI. Design, develop, and maintain efficient, reusable, and reliable Python code. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs and agentic frameworks, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. 
Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Write unit tests and conduct code reviews to ensure high-quality, bug-free software. Troubleshoot and debug applications to optimize performance and fix issues. Work with databases (SQL, NoSQL) and integrate third-party APIs. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Minimum 2 years of experience in Python, Data Science, Machine Learning, OCR, and document intelligence. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python and R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. Familiarity with computer vision techniques for image recognition, object detection, or image generation. Strong knowledge of Python frameworks such as Django, Flask, or FastAPI. Experience with RESTful API design and development. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. 
Good to Have Skills: Understanding of agentic AI concepts and frameworks. Proficiency in designing or interacting with agent-based AI architectures. Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. AI Engineer Role Overview: We are seeking a highly skilled and experienced AI Engineer with a minimum of 2 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Your technical responsibilities: Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI. Design, develop, and maintain efficient, reusable, and reliable Python code. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs and agentic frameworks, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. 
Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Write unit tests and conduct code reviews to ensure high-quality, bug-free software. Troubleshoot and debug applications to optimize performance and fix issues. Work with databases (SQL, NoSQL) and integrate third-party APIs. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Minimum 2 years of experience in Python, Data Science, Machine Learning, OCR, and document intelligence. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python and R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. Familiarity with computer vision techniques for image recognition, object detection, or image generation. Strong knowledge of Python frameworks such as Django, Flask, or FastAPI. Experience with RESTful API design and development. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. 
Good to Have Skills: Understanding of agentic AI concepts and frameworks. Proficiency in designing or interacting with agent-based AI architectures. Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Experience: 3 - 7 Years Shift timing: 1.00 pm to 10.00 pm Domain: Banking and BFSI Work mode: Hybrid Notice Period: Immediate to 30 days Job Summary: We are seeking a skilled Credit Risk Modeller to develop, validate, and maintain credit risk models that assess the creditworthiness of individuals and organizations. The role involves analyzing financial data, creating predictive models, and supporting the credit decision-making process to minimize potential losses and optimize risk-adjusted returns. Key Responsibilities: Develop and implement credit risk models (e.g., Probability of Default (PD), Loss Given Default (LGD), Exposure at Default (EAD)) for retail and/or corporate portfolios. Conduct statistical analysis and predictive modeling using techniques such as logistic regression, decision trees, machine learning algorithms, and other quantitative methods. Collaborate with data teams to collect, clean, and preprocess data from multiple sources. Perform back-testing and validation of existing credit risk models to ensure accuracy and compliance with regulatory standards (e.g., Basel II/III). Prepare detailed documentation of modeling assumptions, methodology, and results. Provide insights and recommendations to credit risk managers and business stakeholders to improve risk management strategies. Stay up to date with industry best practices, regulatory requirements, and emerging trends in credit risk analytics. Participate in internal and external audits related to credit risk models. Support stress testing and scenario analysis for credit portfolios. Qualifications: Strong experience (3+ years) in credit risk modeling, preferably in banking or financial services. Bachelor’s or Master’s degree in Finance, Economics, Statistics, Mathematics, Data Science, or a related quantitative discipline. Proficiency in statistical and modeling tools such as SAS, R, Python, SQL, or equivalent. 
Good understanding of credit risk concepts and regulatory frameworks (Basel Accords, IFRS 9). Strong analytical skills with attention to detail and problem-solving ability. Excellent communication skills for explaining complex technical information to non-technical stakeholders. Experience with big data tools and machine learning techniques is a plus. Familiarity with credit risk software platforms is advantageous.
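As background for the PD modelling the listing above describes, a probability-of-default model is at heart a logistic regression from borrower features to a default probability. The sketch below is illustrative only, with made-up data and plain-Python gradient descent standing in for the SAS/R/Python tooling the role actually uses; all names are invented for the example:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit feature weights plus an intercept by batch gradient descent."""
    w = [0.0] * (len(X[0]) + 1)  # last slot is the intercept
    for _ in range(epochs):
        grads = [0.0] * len(w)
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + w[-1])
            err = p - yi  # gradient of log-loss w.r.t. the linear score
            for j, xj in enumerate(xi):
                grads[j] += err * xj
            grads[-1] += err
        w = [wj - lr * g / len(X) for wj, g in zip(w, grads)]
    return w

def predict_pd(w, x):
    """Predicted probability of default for one borrower."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + w[-1])

# Single toy feature: debt-to-income ratio; label 1 = defaulted.
X = [[0.1], [0.2], [0.3], [0.7], [0.8], [0.9]]
y = [0, 0, 0, 1, 1, 1]
w = fit_logistic(X, y)
print(predict_pd(w, [0.85]) > predict_pd(w, [0.15]))  # higher leverage, higher PD
```

Real PD models add many features, scorecard binning, and calibration to observed default rates, but the logistic link between features and default probability is the common core.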

Posted 1 month ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Location: Delhi (for projects across India) About Varahe Analytics: Varahe Analytics is one of India’s premier integrated political consulting firms, specialising in building data-driven 360-degree election management. We help our clients with strategic advice and implementation, combining data-backed insights and in-depth ground intelligence into a holistic electoral campaign. We are passionate about our democracy and the politics that shape our world. We draw on some of the sharpest minds from distinguished institutions and diverse professional backgrounds to help us achieve our goal of building electoral strategies that spark conversations, effect change, and help shape electoral and legislative ecosystems in our country. About the Team: As part of the Data Analytics team, you will have the opportunity to contribute to impactful research and insights that drive strategic decisions. Your role will involve analyzing datasets, building dashboards, and generating visual reports using tools like Power BI. You will work closely with cross-functional teams to uncover trends, support data-driven strategies, and provide actionable intelligence. This internship offers a unique chance to be part of high-impact analytical work that informs key decisions and contributes to shaping outcomes at scale. What Would This Role Entail? Report Making and Visualization: Develop, design, and maintain interactive and insightful reports and dashboards using Power BI. Transform raw data into meaningful visualizations that provide actionable insights. Ensure that reports are user-friendly, visually appealing, and accessible to a diverse audience. Data Analysis: Analyze and interpret complex data sets to identify trends, patterns, and key insights. Collaborate with stakeholders to understand their data requirements and deliver customized reporting solutions. Data Management: Extract, clean, and preprocess data from various sources to ensure data integrity and accuracy. 
Maintain and update existing reports and dashboards to reflect new data and evolving business needs. Necessary Qualifications/Skills: Currently pursuing a Bachelor's or Master's degree in Economics, Data Science, Engineering, or a related field. Strong analytical and problem-solving skills with a keen attention to detail. Excellent communication skills to effectively convey data insights to non-technical stakeholders. Proficient in Power BI and Excel, with introductory-level knowledge of Pandas. Ability to integrate Excel with Power BI for enhanced data analysis and reporting. Good to Have Skills: Proficiency in creating interactive and dynamic reports and dashboards using Power BI. Enthusiasm for learning and applying data analysis techniques. How to Apply: If you're a fresh professional looking for a high-impact challenge, interested in joining a team of like-minded and motivated individuals who think strategically, act decisively, and get things done, drop in an email at internship@varaheanalytics.com
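The "extract, clean, and preprocess" step mentioned above can be sketched with a tiny stdlib-only example (the field names and data are invented for illustration; a real pipeline would more likely use Pandas, as the listing suggests, before handing the result to Power BI):

```python
import csv
import io

# Invented raw extract with the usual defects: stray spaces,
# a non-numeric value, and a missing field.
RAW = """region,votes,turnout_pct
North, 1200 ,62.5
South,abc,58.0
East,950,
West,1100,71.2
"""

def clean_rows(text):
    """Parse CSV text, trim whitespace, coerce numeric fields,
    and drop rows whose values fail coercion."""
    cleaned = []
    for row in csv.DictReader(io.StringIO(text)):
        try:
            cleaned.append({
                "region": row["region"].strip(),
                "votes": int(row["votes"].strip()),
                "turnout_pct": float(row["turnout_pct"].strip()),
            })
        except ValueError:
            continue  # e.g. 'abc' or an empty turnout field
    return cleaned

rows = clean_rows(RAW)
print([r["region"] for r in rows])  # → ['North', 'West']
```

Whether bad rows are dropped, imputed, or routed to a review queue is a policy decision; the sketch only shows the mechanical validate-and-coerce step that keeps downstream dashboards trustworthy.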

Posted 1 month ago

Apply

0 years

0 Lacs

India

On-site

Apply now: https://forms.office.com/r/RFESZssevc Key Responsibilities: Data Analysis and Preprocessing: Analyze and preprocess diverse datasets relevant to the mortgage industry, ensuring data quality and relevance for model training. Model Development and Fine-Tuning: Research and implement state-of-the-art NLP models, focusing on pre-training as well as instruction tuning of pre-trained LLMs for mortgage-specific applications. Utilize techniques like RLHF to improve model alignment with human preferences and enhance decision-making capabilities. Algorithm Implementation: Develop and optimize machine learning algorithms to enhance model performance, accuracy, and efficiency. Collaboration: Work with domain experts to incorporate industry knowledge into model development, ensuring outputs are relevant and actionable. Experimentation: Conduct experiments to validate model hypotheses, analyze results, and iterate on model improvements. Documentation: Maintain comprehensive documentation of methodologies, experiments, and results to support transparency and reproducibility. Ethics and Bias Mitigation: Ensure responsible AI practices are followed by identifying potential biases in data and models, and implementing strategies to mitigate them. Required Skills: Technical Expertise: Strong background in machine learning, deep learning, and NLP. Proficiency in Python and experience with ML frameworks such as TensorFlow or PyTorch. NLP Knowledge: Experience with NLP frameworks and libraries (e.g., Hugging Face Transformers) for developing language models. Data Handling: Proficiency in handling large datasets, feature engineering, and statistical analysis. Problem Solving: Strong analytical skills with the ability to solve complex problems using data-driven approaches. Communication: Excellent communication skills to effectively collaborate with technical teams and non-technical stakeholders. Preferred Qualifications: Educational Background: Master’s or Ph.D. 
in Data Science, Computer Science, Statistics, or a related field. Cloud Computing: Familiarity with cloud platforms (e.g., AWS, Azure) for scalable computing solutions. Ethics Awareness: Understanding of ethical considerations in AI development, including bias detection and mitigation. ⚠️ Disclaimer: Firstsource follows a fair, transparent, and merit-based hiring process. We never ask for money at any stage. Beware of fraudulent offers and always verify through our official channels or @firstsource.com email addresses.

Posted 1 month ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

About Firstsource: Firstsource Solutions Limited, an RP-Sanjiv Goenka Group company (NSE: FSL, BSE: 532809, Reuters: FISO.BO, Bloomberg: FSOL:IN), is a specialized global business process services partner, providing transformational solutions and services spanning the customer lifecycle across Healthcare, Banking and Financial Services, Communications, Media and Technology, Retail, and other diverse industries. With an established presence in the US, the UK, India, Mexico, Australia, South Africa, and the Philippines, we make it happen for our clients, solving their biggest challenges with hyper-focused, domain-centered teams and cutting-edge tech, data, and analytics. Our real-world practitioners work collaboratively to deliver future-focused outcomes. Job Summary: We are seeking a skilled AI/ML professional to develop and fine-tune NLP models tailored to the mortgage industry. The role involves end-to-end data analysis, model training (including instruction tuning and RLHF), and algorithm optimization. The ideal candidate will collaborate with domain experts, conduct rigorous experimentation, and uphold ethical AI practices to deliver accurate, relevant, and bias-mitigated solutions. Key Roles & Responsibilities: Data Analysis and Preprocessing: Analyze and preprocess diverse datasets relevant to the mortgage industry, ensuring data quality and relevance for model training. Model Development and Fine-Tuning: Research and implement state-of-the-art NLP models, focusing on pre-training as well as instruction tuning of pre-trained LLMs for mortgage-specific applications. Utilize techniques like RLHF to improve model alignment with human preferences and enhance decision-making capabilities. Algorithm Implementation: Develop and optimize machine learning algorithms to enhance model performance, accuracy, and efficiency. 
Collaboration: Work with domain experts to incorporate industry knowledge into model development, ensuring outputs are relevant and actionable. Experimentation: Conduct experiments to validate model hypotheses, analyze results, and iterate on model improvements. Documentation: Maintain comprehensive documentation of methodologies, experiments, and results to support transparency and reproducibility. Ethics and Bias Mitigation: Ensure responsible AI practices are followed by identifying potential biases in data and models, and implementing strategies to mitigate them. Required Skills and Qualifications: Technical Expertise: Strong background in machine learning, deep learning, and NLP. Proficiency in Python and experience with ML frameworks such as TensorFlow or PyTorch. NLP Knowledge: Experience with NLP frameworks and libraries (e.g., Hugging Face Transformers) for developing language models. Data Handling: Proficiency in handling large datasets, feature engineering, and statistical analysis. Problem Solving: Strong analytical skills with the ability to solve complex problems using data-driven approaches. Communication: Excellent communication skills to effectively collaborate with technical teams and non-technical stakeholders. Educational Background: Master’s or Ph.D. in Data Science, Computer Science, Statistics, or a related field. Cloud Computing: Familiarity with cloud platforms (e.g., AWS, Azure) for scalable computing solutions. Ethics Awareness: Understanding of ethical considerations in AI development, including bias detection and mitigation. ⚠️ Disclaimer: Firstsource follows a fair, transparent, and merit-based hiring process. We never ask for money at any stage. Beware of fraudulent offers and always verify through our official channels or @firstsource.com email addresses.

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Job Title: Machine Learning Intern (Paid) Company: Coreline solutions Location: Remote Duration: 3 months Opportunity: Full-time based on performance, with a Certificate of Internship Application Deadline: 27th May 2025 About Coreline solutions Coreline solutions provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science. Role Overview As a Machine Learning Intern, you’ll work on real-world projects, gaining practical experience in machine learning and data analysis. Responsibilities ✅ Design, test, and optimize machine learning models. ✅ Analyze and preprocess datasets. ✅ Develop algorithms and predictive models for various applications. ✅ Use tools like TensorFlow, PyTorch, and Scikit-learn. ✅ Document findings and create reports to present insights. Requirements 🎓 Enrolled in or graduate of a relevant program (AI, ML, Data Science, Computer Science, or related field) 📊 Knowledge of machine learning concepts and algorithms. 🐍 Proficiency in Python or R (preferred). 🤝 Strong analytical and teamwork skills. Benefits 💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid) ✔ Practical machine learning experience. ✔ Internship Certificate & Letter of Recommendation. ✔ Build your portfolio with real-world projects. How to Apply 📩 Submit your application by 27th May 2025 with the subject: "Machine Learning Intern Application". Equal Opportunity Coreline solutions is an equal opportunity employer, welcoming candidates from all backgrounds.

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Job Title: Data Science Intern (Paid) Company: Coreline solutions Location: Remote Duration: 3 months Opportunity: Full-time based on performance, with a Certificate of Internship About Coreline solutions Coreline solutions provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career. Responsibilities ✅ Collect, preprocess, and analyze large datasets. ✅ Develop predictive models and machine learning algorithms. ✅ Perform exploratory data analysis (EDA) to extract meaningful insights. ✅ Create data visualizations and dashboards for effective communication of findings. ✅ Collaborate with cross-functional teams to deliver data-driven solutions. Requirements 🎓 Enrolled in or graduate of a program in Data Science, Computer Science, Statistics, or a related field. 🐍 Proficiency in Python or R for data analysis and modeling. 🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred). 📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib). 🧐 Strong analytical and problem-solving skills. 🗣 Excellent communication and teamwork abilities. Stipend & Benefits 💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based). ✔ Hands-on experience in data science projects. ✔ Certificate of Internship & Letter of Recommendation. ✔ Opportunity to build a strong portfolio of data science models and applications. ✔ Potential for full-time employment based on performance. How to Apply 📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application." 📅 Deadline: 27th May 2025 Equal Opportunity Coreline solutions is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.

Posted 1 month ago

Apply

2.0 years

0 Lacs

India

Remote

Job Title: AI Full Stack Developer – GenAI & NLP Location: Pune, India (Hybrid) Work Mode: Remote Experience Required: 2+ Years (Relevant AI/ML with GenAI & NLP) Salary: Up to ₹15 LPA (CTC) Employment Type: Full-time Department: AI Research & Development Role Overview We are looking for a passionate AI Developer with strong hands-on experience in Generative AI and Natural Language Processing (NLP) to help build intelligent and scalable solutions. In this role, you will design and deploy advanced AI models for tasks such as language generation, summarization, chatbot development, document analysis, and more. You’ll work with cutting-edge LLMs (Large Language Models) and contribute to impactful AI initiatives. Key Responsibilities Design, fine-tune, and deploy NLP and GenAI models using LLMs like GPT, BERT, LLaMA, or similar. Build applications for tasks like text generation, question-answering, summarization, sentiment analysis, and semantic search. Integrate language models into production systems using RESTful APIs or cloud services. Evaluate and optimize models for accuracy, latency, and cost. Collaborate with product and engineering teams to implement intelligent user-facing features. Preprocess and annotate text data, create custom datasets, and manage model pipelines. Stay updated on the latest advancements in generative AI, transformer models, and NLP frameworks. Required Skills & Qualifications Bachelor’s or Master’s degree in Computer Science, AI/ML, or a related field. Minimum 2 years of experience in full-stack development and AI/ML development, with recent work in NLP or Generative AI. Hands-on experience with models such as GPT, T5, BERT, or similar transformer-based architectures. Proficient in Python and libraries such as Hugging Face Transformers, spaCy, NLTK, or OpenAI APIs. Hands-on experience in any frontend/backend technologies for software development. Experience with deploying models using Flask, FastAPI, or similar frameworks. 
Strong understanding of NLP tasks, embeddings, vector databases (e.g., FAISS, Pinecone), and prompt engineering. Familiarity with MLOps tools and cloud platforms (AWS, Azure, or GCP). Preferred Qualifications Experience with LangChain, RAG (Retrieval-Augmented Generation), or custom LLM fine-tuning. Knowledge of model compression, quantization, or inference optimization. Exposure to ethical AI, model interpretability, and data privacy practices. What We Offer Competitive salary package up to ₹15 LPA. Remote work flexibility with hybrid team collaboration in Pune. Opportunity to work on real-world generative AI and NLP applications. Access to resources for continuous learning and certification support. Inclusive, fast-paced, and innovative work culture. Skills: nltk,computer vision,inference optimization,model interpretability,gpt,bert,mlops,artificial intelligence,next.js,tensorflow,ai development,machine learning,generative ai,ml,openai,node.js,kubernetes,large language models (llms),openai apis,natural language processing,machine learning (ml),fastapi,natural language processing (nlp),java,azure,nlp tasks,model compression,embeddings,vector databases,aws,typescript,r,hugging face transformers,google cloud,hugging face,llama,ai tools,mlops tools,rag architectures,langchain,spacy,docker,retrieval-augmented generation (rag),pytorch,gcp,cloud,large language models,react.js,deep learning,python,ai technologies,flask,ci/cd,data privacy,django,quantization,javascript,ethical ai,nlp

Posted 1 month ago

Apply

8.0 - 11.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: Lead Data Scientist – Healthcare Domain Specialist
Location: Bangalore
Company: RT Global Infosolutions Pvt Ltd (www.rgisol.com)
Employment Type: Full-Time
Industry: Healthcare/AI/Analytics
Domain Expertise: Predictive Analytics, Healthcare Data

As a key leader in our data science team, you will define strategy, lead projects, and collaborate with healthcare professionals, engineers, and product teams to deploy scalable AI solutions.

Key Responsibilities
• AI Model Development: Design, develop, and optimize predictive models for elderly fall risk assessment using advanced machine learning (ML) and deep learning techniques.
• Data Analysis: Work with healthcare-specific data (e.g., patient records, sensor data, clinical data) to uncover patterns and actionable insights.
• Domain Expertise Application: Leverage healthcare domain knowledge to ensure the accuracy, reliability, and ethical use of models in predicting fall risks.
• Collaboration: Work with clinicians, healthcare providers, and cross-functional teams to align AI solutions with clinical workflows and patient care strategies.
• Data Engineering: Develop robust ETL pipelines to preprocess and integrate healthcare data from multiple sources, ensuring data quality and compliance.
• Evaluation & Optimization: Continuously evaluate model performance and refine algorithms to achieve high accuracy and generalizability.
• Compliance & Ethics: Ensure compliance with healthcare data regulations such as HIPAA and GDPR, and implement best practices for data privacy and security.
• Research & Innovation: Stay current with the latest research in healthcare AI, predictive analytics, and elderly care solutions, integrating new techniques as applicable.
• Team Management: Guide team members in technical and domain-specific problem-solving, manage day-to-day deliverables, evaluate individual performance, and coach.
• Stakeholder Management: Present insights, models, and business impact assessments to senior leadership and healthcare stakeholders.

Required Skills & Qualifications
• Education: Master's or PhD in Data Science, Computer Science, Statistics, Bioinformatics, or a related field. A strong academic background in healthcare is preferred.
• Experience:
  o 8–11 years of experience in data science, with at least 2 years in the healthcare domain.
  o Prior experience leading AI projects in healthcare startups, hospitals, or MedTech companies.
  o Ability to work in cross-functional teams.
  o Ability to publish papers and research findings related to healthcare data science.
• Technical Expertise:
  o Proficiency in Python, R, or other programming languages used for ML and data analysis.
  o Hands-on experience with ML/DL frameworks (e.g., TensorFlow, PyTorch, Scikit-learn).
  o Experience with time-series data, wearable/sensor data, or IoT data integration is a plus.
  o Strong knowledge of statistics, probability, and feature engineering.
  o Familiarity with cloud platforms (AWS, Azure, GCP) and tools for scalable ML pipelines.
• Healthcare Domain Knowledge:
  o Understanding of geriatric healthcare challenges, fall risks, and predictive care strategies.
  o Familiarity with Electronic Health Records (EHR), wearable devices, and sensor data.
  o Knowledge of healthcare data compliance (e.g., HIPAA, GDPR).
• Soft Skills:
  o Strong analytical and problem-solving abilities.
  o Excellent communication skills to present findings to non-technical stakeholders.
  o A collaborative mindset for working with interdisciplinary teams.

Preferred Qualifications
• Knowledge of biomechanics or human movement analysis.
• Experience with explainable AI (XAI) and interpretable ML models.

What We Offer
• Opportunity to work on cutting-edge healthcare AI solutions that make a meaningful impact on elderly lives.
• Competitive salary and benefits package.
• Flexible work environment, with options for hybrid work.
• Opportunities for professional growth and leadership.
• Collaborative and inclusive culture that values innovation and teamwork.
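The role pairs fall-risk modeling with wearable/sensor time-series data. One common first step is turning a raw accelerometer stream into rolling-window features for a downstream classifier. The sketch below is a toy illustration of that idea only — the window size, threshold, and feature names are invented assumptions, not a clinical protocol or the company's actual pipeline:

```python
import statistics

def window_features(samples, window=5):
    """Rolling mean/std/range features over accelerometer magnitudes.

    `samples` is a list of acceleration readings (e.g. in g). The
    feature set here is illustrative; real fall-risk pipelines use
    richer, clinically validated features.
    """
    feats = []
    for i in range(len(samples) - window + 1):
        w = samples[i:i + window]
        feats.append({
            "mean": statistics.fmean(w),
            "std": statistics.pstdev(w),
            "range": max(w) - min(w),
        })
    return feats

# Mostly-steady signal with one spike, the kind of pattern a fall produces.
readings = [1.0, 1.01, 0.99, 1.0, 2.4, 0.3, 1.0, 1.0]
feats = window_features(readings, window=4)
# Flag windows whose range exceeds an (assumed) threshold as candidate events.
flags = [f["range"] > 1.0 for f in feats]
print(flags)
```

Features like these would then feed the ML/DL models the posting describes, alongside EHR-derived covariates.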

Posted 1 month ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Razorpay was founded by Shashank Kumar and Harshil Mathur in 2014. Razorpay is building a new-age digital banking hub (Neobank) for businesses in India, with the mission to enable frictionless banking and payments experiences for businesses of all shapes and sizes. What started as a B2B payments company now processes billions of dollars of payments for lakhs of businesses across India.

We are a full-stack financial services organisation committed to helping Indian businesses with comprehensive and innovative payment and business banking solutions, built on robust technology to address the entire length and breadth of the payment and banking journey for any business. Over the past year, we've disbursed loans worth millions of dollars to thousands of businesses. In parallel, Razorpay is reimagining how businesses manage money by simplifying business banking (via Razorpay X) and enabling capital availability for businesses (via Razorpay Capital).

The Role
The Senior Analytics Specialist will work with the central analytics team at Razorpay. This is an opportunity to work in a fast-paced, high-impact environment with a diverse team of smart, hardworking professionals from various backgrounds. Responsibilities include working with large, complex data sets, developing strong business and product understanding, and being closely involved in the product life cycle.

Roles and Responsibilities
• Work with large, complex data sets to solve open-ended, high-impact business problems using data mining, experimentation, statistical analysis, and machine learning as needed.
• Develop a strong understanding of the business and product; conduct analysis to derive insights, develop hypotheses, and validate them with rigorous methodologies, or formulate the problems for ML modeling.
• Apply excellent problem-solving skills to independently scope, deconstruct, and formulate solutions from first principles that bring an outside-in, state-of-the-art view.
• Be closely involved in the product life cycle: ideation, reviewing Product Requirement Documents, defining success criteria, instrumenting product features, assessing impact, and identifying and recommending further improvements.
• Expedite root cause analyses and insight generation for recurring use cases through automation and self-serve platforms.
• Develop compelling stories with business insights, focused on the strategic goals of the organization.
• Work with Business, Product, and Data Engineering teams to continuously improve data accuracy through feedback and scoping of instrumentation quality and completeness.
• Set high standards in project management; own scope and timelines for the team.

Mandatory Qualifications
• Bachelor's/Master's degree in Engineering, Economics, Finance, Mathematics, Statistics, Business Administration, or a related quantitative field.
• 3+ years of high-quality, hands-on experience in analytics and data science.
• Hands-on experience with SQL, Python, and Tableau.
• Ability to define the business and product metrics to be evaluated, work with engineering on data instrumentation, and create and automate self-serve dashboards for stakeholders using tools such as Tableau.
• Ability to structure and analyze data using techniques like EDA, cohort analysis, and funnel analysis, transform findings into understandable, actionable recommendations, and communicate them effectively across the organization.
• Hands-on experience working with large-scale structured, semi-structured, and unstructured data, and with approaches to preprocess/cleanse data and reduce dimensionality.
• Work experience in consumer-tech organizations is a plus.
• A clear understanding of the qualitative and quantitative aspects of the product/strategic initiative, leveraged to identify and act upon existing gaps and opportunities.
• Hands-on experience with A/B testing, significance testing, supervised and unsupervised ML, web analytics, and statistical learning.

Razorpay believes in and follows an equal employment opportunity policy that doesn't discriminate on gender, religion, sexual orientation, colour, nationality, age, etc. We welcome interest and applications from all groups and communities across the globe.

Follow us on LinkedIn & Twitter
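Among the qualifications above are A/B testing and significance testing. The textbook version of the most common case, comparing conversion rates between a control and a variant, is a two-proportion z-test, sketched below with stdlib Python (the sample numbers are made up for illustration; in practice you would typically reach for statsmodels or scipy rather than hand-rolling this):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B experiment.

    Returns (z, p_value) using the pooled-proportion standard error,
    the standard textbook formulation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, computed via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 5.5% vs control's 5.0%,
# with 10,000 users in each arm.
z, p = two_proportion_z(500, 10_000, 550, 10_000)
print(round(z, 3), round(p, 4))
```

With these numbers the p-value lands around 0.11, i.e. a 0.5-point lift on 10,000 users per arm is not significant at the usual 5% level — the kind of result that motivates power analysis before launching an experiment.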

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Work Level: Individual
Core: Communication Skills, Problem Solving, Execution
Leadership: Decisive, Team Alignment, Working Independently
Role:
Industry Type: IT Services & Consulting
Function: Data Analyst
Key Skills: Data Analytics, Data Analysis, Python, R, MySQL, Cloud, AWS, Big Data, Big Data Platforms, Business Intelligence (BI), Tableau, Data Science, Statistical Modeling
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners.

Responsibilities (this is a remote position):
• Collect, clean, and preprocess data from various sources.
• Perform exploratory data analysis (EDA) to identify trends and patterns.
• Develop dashboards and reports using tools like Excel, Power BI, or Tableau.
• Use SQL to query and manipulate large datasets.
• Assist in building predictive models and performing statistical analyses.
• Present insights and recommendations based on data findings.
• Collaborate with cross-functional teams to support data-driven decision-making.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: there are many more opportunities on the portal; depending on your skills, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
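The first responsibility in the listing above — collect, clean, and preprocess data — often starts with parsing raw records and handling missing values. A minimal stdlib sketch of one common choice, median imputation (the column names and data are invented for illustration; real pipelines typically use pandas for this):

```python
import csv
import io
import statistics

# Toy raw extract with missing values in the `age` and `spend` columns.
RAW = """user_id,age,spend
1,34,120.5
2,,80.0
3,29,
4,41,300.0
"""

def load_and_clean(text):
    """Parse CSV, coerce numeric fields to float, and impute missing
    values with the column median — one simple, common preprocessing
    choice (mean or model-based imputation are alternatives)."""
    rows = list(csv.DictReader(io.StringIO(text)))
    for col in ("age", "spend"):
        observed = [float(r[col]) for r in rows if r[col] not in ("", None)]
        median = statistics.median(observed)
        for r in rows:
            r[col] = float(r[col]) if r[col] not in ("", None) else median
    return rows

clean = load_and_clean(RAW)
print(clean)
```

After a pass like this, the data is ready for the EDA and dashboarding steps the listing describes.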

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies