5.0 - 7.0 years
20 - 22 Lacs
Chennai
Remote
Job Summary: We are seeking a highly skilled and mathematically grounded Machine Learning Engineer to join our AI team. The ideal candidate will have 5+ years of ML experience with a deep understanding of machine learning algorithms, statistical modeling, and optimization techniques, along with hands-on experience in building scalable ML systems using modern frameworks and tools.
Key Responsibilities: Design, develop, and deploy machine learning models for real-world applications. Collaborate with data scientists, software engineers, and product teams to integrate ML solutions into production systems. Understand the mathematics behind machine learning algorithms to implement and optimize them effectively. Conduct mathematical analysis of algorithms to ensure robustness, efficiency, and scalability. Optimize model performance through hyperparameter tuning, feature engineering, and algorithmic improvements. Stay updated with the latest research in machine learning and apply relevant findings to ongoing projects.
Required Qualifications
Mathematics & Theoretical Foundations: Strong foundation in Linear Algebra (e.g., matrix operations, eigenvalues, SVD). Proficiency in Probability and Statistics (e.g., Bayesian inference, hypothesis testing, distributions). Solid understanding of Calculus (e.g., gradients, partial derivatives, optimization). Knowledge of Numerical Methods and Convex Optimization. Familiarity with Information Theory, Graph Theory, or Statistical Learning Theory is a plus.
Programming & Software Skills: Proficient in Python (preferred), with experience in libraries such as NumPy, Pandas, Scikit-learn, Matplotlib, and Seaborn. Experience with deep learning frameworks: TensorFlow, PyTorch, Keras, or JAX. Familiarity with MLOps tools: MLflow, Kubeflow, Airflow, Docker, Kubernetes. Experience with cloud platforms (AWS, GCP, Azure) for model deployment.
Machine Learning Expertise: Hands-on experience with supervised, unsupervised, and reinforcement learning. Understanding of model evaluation metrics and validation techniques. Experience with large-scale data processing (e.g., Spark, Dask) is a plus.
Preferred Qualifications: Master's or Ph.D. in Computer Science, Mathematics, Statistics, or a related field. Publications or contributions to open-source ML projects. Experience with LLMs, transformers, or generative models.
Remote (Chennai; possible travel to Chennai for meetings).
Posted 3 weeks ago
15.0 - 20.0 years
9 - 14 Lacs
Pune
Work from Office
About The Role
Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; work could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Large Language Models
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an AI/ML Engineer, you will engage in the development of applications and systems that leverage artificial intelligence tools and cloud AI services. Your typical day will involve designing and implementing production-ready solutions, ensuring that they meet quality standards. You will work with various AI models, including generative AI, deep learning, and neural networks, while collaborating with cross-functional teams to integrate these technologies into existing systems. Your role will also require you to stay updated with the latest advancements in AI and machine learning, applying innovative approaches to solve complex problems and enhance system capabilities.
Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Mentor junior team members to enhance their skills and knowledge. Evaluate and recommend new technologies and tools to improve team efficiency.
Professional & Technical Skills: Must-have: Proficiency in Large Language Models. Strong understanding of deep learning frameworks such as TensorFlow or PyTorch. Experience with cloud platforms like AWS, Azure, or Google Cloud for deploying AI solutions. Familiarity with natural language processing techniques and tools. Ability to design and implement scalable AI applications.
Additional Information: The candidate should have a minimum of 5 years of experience in Large Language Models. This position is based at our Pune office. 15 years of full-time education is required.
Posted 3 weeks ago
3.0 - 8.0 years
9 - 14 Lacs
Pune
Work from Office
About The Role
Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; work could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Large Language Models
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an AI/ML Engineer, you will engage in the development of applications and systems that leverage artificial intelligence tools and cloud AI services. Your typical day will involve collaborating with cross-functional teams to design and implement production-ready solutions, ensuring that the applications meet high-quality standards. You will also explore the integration of generative AI models into various projects, contributing to innovative solutions that may encompass deep learning, neural networks, chatbots, and image processing technologies. Your role will require a proactive approach to problem-solving and a commitment to continuous learning in the rapidly evolving field of AI and machine learning.
Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Assist in the design and implementation of AI-driven applications and systems. Collaborate with team members to troubleshoot and resolve technical challenges.
Professional & Technical Skills: Must-have: Proficiency in Large Language Models. Strong understanding of deep learning frameworks such as TensorFlow or PyTorch. Experience with cloud platforms and services for deploying AI applications. Familiarity with data preprocessing and feature engineering techniques. Knowledge of natural language processing and its applications.
Additional Information: The candidate should have a minimum of 3 years of experience in Large Language Models. This position is based at our Pune office. 15 years of full-time education is required.
Posted 3 weeks ago
0.0 - 4.0 years
0 Lacs
Delhi
On-site
Job Title: Data Scientist Experience: 2 to 4 Years Location: Delhi/NCR Industry: Information Technology & Services Employment Type: Full-time About the Role: We are looking for a passionate and results-driven Data Scientist to join our growing analytics and AI/ML team. The ideal candidate will bring hands-on experience in data exploration, model building, and deployment. You will work closely with cross-functional teams to deliver actionable insights and machine learning solutions that drive business value. Key Responsibilities: Collect, clean, and preprocess large datasets from various sources. Perform exploratory data analysis and visualize patterns and trends. Build and validate machine learning and statistical models. Translate business problems into data science problems and solutions. Communicate findings through reports, dashboards, and visualizations. Collaborate with engineering and product teams to deploy data-driven solutions. Stay updated with the latest tools, technologies, and industry trends. Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, Data Science, or related field. 2 to 4 years of industry experience in data science, machine learning, or applied statistics. Proficient in Python (NumPy, pandas, scikit-learn, matplotlib, etc.). Solid understanding of supervised and unsupervised ML algorithms. Experience with SQL and data querying from relational databases. Familiarity with tools like Jupyter, Git, and cloud platforms (AWS/Azure/GCP) is a plus. Exposure to model deployment and MLOps practices is desirable. Strong analytical thinking and problem-solving skills. Good communication and team collaboration abilities. Nice to Have: Experience with NLP, computer vision, or time-series analysis. Knowledge of Big Data tools (Spark, Hadoop). Experience with data visualization libraries (Seaborn, Plotly, Power BI, Tableau).
Posted 3 weeks ago
12.0 - 15.0 years
3 - 6 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Data Science Practitioner
Project Role Description: Formulate, design, and deliver AI/ML-based decision-making frameworks and models for business outcomes. Measure and justify the value of AI/ML-based solutions.
Must-have skills: Computer Vision
Good-to-have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As a Data Science Practitioner, you will be engaged in formulating, designing, and delivering AI and machine learning-based decision-making frameworks and models that drive business outcomes. Your typical day will involve collaborating with various teams to measure and justify the value of AI and machine learning solutions, ensuring that they align with organizational goals and deliver tangible results. You will also be responsible for analyzing complex data sets, deriving insights, and presenting findings to stakeholders to support informed decision-making processes.
Roles & Responsibilities: Expected to be an SME; collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Facilitate knowledge sharing and best practices among team members to enhance overall team performance. Develop and implement strategies to optimize AI and machine learning models for improved efficiency and effectiveness.
Professional & Technical Skills: Must-have: Proficiency in Computer Vision. Strong understanding of image processing techniques and algorithms. Experience with deep learning frameworks such as TensorFlow or PyTorch. Familiarity with data augmentation and preprocessing methods for computer vision tasks. Ability to implement and optimize convolutional neural networks for various applications.
Additional Information: The candidate should have a minimum of 12 years of experience in Computer Vision. This position is based at our Bengaluru office. 15 years of full-time education is required.
Posted 3 weeks ago
3.0 - 6.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
About This Role BlackRock’s Fixed Income team runs more than $1 trillion in global fixed income assets across index, active long-only, alternative, and liability driven strategies. The platform offers fixed income investors one of the industry's broadest array of investment choices across model-based and fundamental investment styles. With over $400 billion in assets under management, the BlackRock Financial Institutions Group (FIG) is hiring a Technical Program Manager (TPM) to lead and build the technology function supporting FIG’s investment processes. This is a hybrid leadership role that blends hands-on development, team management, and cross-functional stakeholder engagement. The TPM will manage end-to-end technology delivery, drive governance and standardization, and directly mentor both junior developers and business-aligned citizen developers. You’ll be responsible for designing and delivering scalable tools using Python and Streamlit, while embedding best practices across a growing ecosystem of investment-enabling technology. In addition, the TPM will spearhead the exploration and implementation of Generative AI (GenAI) solutions—identifying high-value use cases, building prototypes, and integrating AI assistants or copilots that enhance productivity, insight generation, and user experience within the investment process. Key Responsibilities Hands-On Technical Delivery Design, develop, and deploy internal tooling and analytics using Python and Streamlit. Contribute to and oversee reusable libraries, APIs, and visualizations for investment and operational needs. Maintain high standards in code quality, performance, and documentation. Program & Delivery Leadership Own the roadmap and execution of technical projects, ensuring timely, high-impact delivery. Lead agile processes including sprint planning, prioritization, and retrospectives. Track deliverables and manage risks across multiple workstreams. Team Management & Mentorship Lead a currently small growing technical team, distributing tasks, conducting reviews, and fostering growth. Guide citizen developers and business stakeholders building local tools - offering technical support, guardrails, and integration guidance. Champion a collaborative, learning-oriented environment. Tech Governance & DevOps Define and enforce governance practices for the team’s codebase—version control, testing, modularity, and reuse. Maintain and evolve CI/CD pipelines and infrastructure using Azure and modern DevOps best practices. Ensure integration with enterprise platforms and APIs. Stakeholder Engagement Work closely with portfolio managers, investment strategists, data and risk teams to understand needs and translate them into scalable tech solutions. Present demos, technical updates, and roadmaps to senior stakeholders. Facilitate coordination with enterprise engineering teams (e.g., platform and infra). What We’re Looking For Required Skills And Qualifications 3-6 years of experience in engineering or technical program leadership roles, including both people and project management. Proficiency in Python, with experience building and deploying user-facing tools using Streamlit, and leveraging libraries such as pandas, NumPy, and matplotlib for data analysis and financial modeling. Strong familiarity with CI/CD pipelines, Azure, and modern DevOps practices. Track record of leading delivery from concept to production. Exceptional communication and stakeholder management skills across tech and business. 
Familiarity with Portfolio Management software and tools. Solid understanding of version control systems, preferably Git, and experience in managing a collaborative codebase. Desirable Skills Experience mentoring citizen developers or enabling business-side teams to build responsibly. Knowledge of fixed income or other capital markets; Aladdin platform familiarity is a plus. Familiarity with Tableau, data APIs, or lightweight ETL frameworks. Exposure to risk, performance attribution, or investment workflows. What we offer? A global role at the intersection of finance and technology, with a significant impact on investment strategies and outcomes. Opportunities for professional growth in both technology and financial domains. A collaborative and innovative team environment focused on continuous improvement and excellence. Exposure to cutting-edge Generative AI technologies and the opportunity to design and apply AI-driven solutions in real-world investment workflows. Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Posted 3 weeks ago
0.0 - 6.0 years
4 - 6 Lacs
Nagpur
Work from Office
Responsibilities: Implement computer vision applications using OpenCV, object detection, and segmentation techniques. Train and fine-tune models. Develop machine learning models with Python, Docker, AWS SageMaker, and IAM. Work from home.
Posted 3 weeks ago
1.0 years
7 - 12 Lacs
Bengaluru, Karnataka, India
On-site
This role is for one of Weekday's clients.
Salary range: Rs 700000 - Rs 1200000 (i.e., INR 7-12 LPA)
Min Experience: 1 year
Location: Bengaluru
Job Type: Full-time
We are seeking a detail-oriented and analytical Data Analyst with 1-3 years of hands-on experience in data analytics, database management, and reporting. In this role, you will be responsible for collecting, analyzing, and interpreting large datasets to help drive data-informed decisions across the business. Your expertise in SQL and Python will be key to building scalable data pipelines, running ad-hoc queries, generating actionable insights, and automating reporting processes. This is an exciting opportunity to work closely with cross-functional teams such as Product, Engineering, Marketing, and Operations to solve real business problems using data. The ideal candidate is passionate about data, enjoys working with numbers, and is excited to contribute to data-driven strategy and operational excellence.
Requirements
Key Responsibilities: Extract, clean, and analyze structured and unstructured data from various sources using SQL and Python. Develop and maintain dashboards, reports, and visualizations to communicate key business metrics. Work with stakeholders to define KPIs and create automated data pipelines to track performance over time. Translate business requirements into technical specifications and analytical queries. Perform exploratory data analysis (EDA) to identify trends, correlations, and outliers. Support A/B testing and experimental design by analyzing results and generating insights. Collaborate with engineering teams to ensure data integrity, quality, and accessibility. Document data definitions, processes, and analysis workflows for transparency and reproducibility.
Required Skills & Qualifications: Bachelor's degree in Computer Science, Statistics, Mathematics, Engineering, Economics, or a related field. 1-3 years of professional experience in a data analyst or similar role. Strong proficiency in writing efficient and optimized SQL queries for data extraction and manipulation. Solid experience with Python for data analysis (Pandas, NumPy, Matplotlib/Seaborn, etc.). Experience working with relational databases (e.g., MySQL, PostgreSQL, SQL Server). Understanding of data wrangling, cleaning, and preprocessing techniques. Familiarity with data visualization tools such as Power BI, Tableau, or similar platforms. Strong analytical mindset with the ability to translate data into business insights. Excellent communication skills with the ability to present findings clearly to technical and non-technical stakeholders.
Good to Have (Not Mandatory): Experience with cloud data warehouses (e.g., Snowflake, BigQuery, Redshift). Exposure to version control systems like Git. Understanding of basic statistics and hypothesis testing. Knowledge of APIs and data integration techniques.
Posted 3 weeks ago
0.0 - 3.0 years
0 - 1 Lacs
Chepauk, Chennai, Tamil Nadu
On-site
Position Title: AI Specialist - Impact-Based Forecasting
Due to the operational nature of this role, preference will be given to applicants who are currently based in Chennai, India and possess valid work authorization. RIMES is committed to diversity and equal opportunity in employment.
Open Period: 11 July 2025 – 10 August 2025
Background: The Regional Integrated Multi-Hazard Early Warning System for Africa and Asia (RIMES) is an international and intergovernmental institution, owned and governed by its Member States, for the generation, application, and communication of multi-hazard early warning information. RIMES was formed in the aftermath of the 2004 Indian Ocean tsunami, as a collective response by countries in Africa and Asia to establish a regional early warning system within a multi-hazard framework, to strengthen preparedness and response to trans-boundary hazards. RIMES was formally established on 30 April 2009 and registered with the United Nations on 1 July 2009. It operates from its regional early warning center located at the Asian Institute of Technology (AIT) campus in Pathumthani, Thailand.
Position Description: The AI Specialist – Impact-Based Forecasting designs and implements AI-based solutions to support predictive analytics and intelligent decision support across sectors (e.g., climate services, disaster management). The AI Specialist will play a central role in building robust data pipelines, integrating multi-source datasets, and enabling real-time data-driven decision-making by stakeholders. The role involves drawing from and contributing to multi-disciplinary datasets and working closely with a multi-disciplinary team within RIMES to generate the IBF DSS, develop contingency plans, automate monitoring systems, contribute to Post-Disaster Needs Assessments (PDNA), and apply AI/ML techniques for risk reduction. This position requires a strong understanding of meteorological, hydrological, vulnerability, and exposure patterns, and the ability to translate data into actionable insights for disaster preparedness and resilience planning. The position reports to the Meteorology and Disaster Risk Modeling Specialist and the India Regional Program Adviser, who oversee the AI Specialist's work (or as assigned by RIMES' institutional structure), in close coordination with the Systems Research and Development Specialist and the Project Manager.
Duty station: RIMES Project Office Chennai, India (or other locations as per project requirements).
Type of Contract: Full-time, project-based contract.
Skills and Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
Experience: Minimum of 3 years of experience in data engineering, analytics, or IT systems for disaster management, meteorology, or climate services. Experience in multi-stakeholder projects and facilitating capacity-building programs.
Knowledge, Skills and Abilities: Machine Learning Fundamentals: Deep understanding of various ML algorithms, including supervised, unsupervised, and reinforcement learning; this includes regression, classification, clustering, time series analysis, anomaly detection, etc. Deep Learning: Proficiency with deep learning architectures (e.g., CNNs, RNNs, LSTMs, Transformers) and frameworks (TensorFlow, PyTorch, Keras). Ability to design, train, and optimize complex neural networks.
Strong programming skills in Python and its extensive libraries (NumPy, Pandas, SciPy, Scikit-learn, Matplotlib, Seaborn, GeoPandas). Familiarity with AI tools such as PyTorch, TensorFlow, Keras, MLflow, etc. Data Visualization: Ability to create clear, compelling visualizations to communicate complex data and model outputs. Familiarity with early warning systems, disaster risk frameworks, and sector-specific IBF requirements is a strong plus. Proficiency in technical documentation and user training.
Personal Qualities: Excellent interpersonal skills; team-oriented work style; pleasant personality. Strong desire to learn and undertake new challenges. Creative problem-solver; willing to work hard. Analytical thinker with problem-solving skills. Strong attention to detail and ability to work under pressure. Self-motivated, adaptable, and capable of working in multicultural and multidisciplinary environments. Strong communication skills and the ability to coordinate with stakeholders.
Major Duties and Responsibilities:
Impact-Based Forecasting: Collaborate with other members of the IT team, meteorologists, hydrologists, GIS specialists, and disaster risk management experts within RIMES to ensure the development of the IBF DSS. Develop AI models (e.g., NLP, computer vision, reinforcement learning). Integrate models into applications and dashboards. Ensure model explainability and ethical compliance. Assist the RIMES team in applying AI/ML models to forecast hazards and project likely impacts based on exposure and vulnerability indices. Work with forecasters and domain experts to automate the generation of impact-based products. Ensure data security, backup, and compliance with data governance and interoperability standards. Train national counterparts on the use and management of the AI tools, including analytics dashboards. Collaborate with GIS experts, hydromet agencies, and emergency response teams for integrated service delivery. Prepare technical documentation on data architecture, models, and systems.
Capacity Building and Stakeholder Engagement: Facilitate training programs for team members and stakeholders, focusing on RIMES policies, regulations, and the use of forecasting tools. Develop and implement a self-training plan to enhance personal expertise, obtaining a trainer certificate as required. Prepare and implement training programs to enhance team capacity and submit training outcome reports.
Reporting: Prepare technical reports, progress updates, and outreach materials for stakeholders. Maintain comprehensive project documentation, including strategies, milestones, and outcomes. Prepare capacity-building workshop materials and training reports.
Other Responsibilities: Utilize AI skills to assist in system implementation plans and decision support system (DSS) development. Assist in 24/7 operational readiness for client early warning systems such as SOCs, with backup support from RIMES Headquarters. Undertake additional tasks as assigned by the immediate supervisor or HR manager based on recommendations from RIMES technical team members and organizational needs. The above responsibilities are illustrative and not exhaustive; undertake any other relevant tasks that may be needed from time to time.
Contract Duration: The contract will initially be for one year and may be extended based on the satisfactory completion of a 180-day probationary period and subsequent annual performance reviews.
How to Apply: Interested candidates should send their application letter, resume, salary expectation, and two references to rimeshra@rimes.int by midnight of 10 August 2025, Bangkok time. Please state “AI Specialist—Impact-Based Forecasting: Your Name” in the Subject line of the email. Only short-listed applicants will be contacted.
Ms. Dusadee Padungkul, Head-Department of Operational Support, Regional Integrated Multi-Hazard Early Warning System, AIT Campus, 58 Moo 9 Paholyothin Rd., Klong 1, Klong Luang, Pathumthani 12120, Thailand.
RIMES promotes diversity and inclusion in the workplace. Well-qualified applicants, particularly women, are encouraged to apply.
Job Type: Full-time
Pay: ₹50,000.00 - ₹100,000.00 per month
Schedule: Monday to Friday
Ability to commute/relocate: Chepauk, Chennai, Tamil Nadu: Reliably commute or planning to relocate before starting work (Preferred)
Application Question(s): Kindly specify your salary expectation per month. Do you have any experience or interest in working with international or non-profit organizations? Please explain.
Education: Bachelor's (Required)
Experience: Working with an international organization: 1 year (Preferred); Data engineering: 3 years (Required); Data analytics: 3 years (Required); Disaster management: 3 years (Preferred)
Language: English (Required)
Location: Chepauk, Chennai, Tamil Nadu (Required)
Posted 3 weeks ago
0.0 - 1.0 years
2 - 5 Lacs
Bengaluru
Work from Office
Location: Bangalore
Duration: 3 months
About the Role: We are looking for a passionate and motivated AI/ML Intern to join our team. In this role, you will work closely with our Data Science and Engineering teams to develop, train, and deploy machine learning models that solve real-world problems. This is a hands-on opportunity to gain experience with cutting-edge technologies, including Generative AI, Large Language Models (LLMs), AI Agents, and Retrieval-Augmented Generation (RAG), and to contribute to impactful projects across domains.
Key Responsibilities: Assist in collecting, cleaning, and preprocessing structured and unstructured data for ML models. Build and experiment with machine learning models for classification, regression, clustering, or NLP tasks. Perform exploratory data analysis and visualize insights. Support the deployment and evaluation of models in a production or simulated environment. Collaborate with cross-functional teams including data engineers and software developers. Document experiments, results, and workflows. Stay updated with the latest trends and research in AI/ML, including LLMs, Generative AI, AI Agents, and retrieval-based techniques. Explore the application of LLMs (e.g., GPT, Claude, LLaMA) in areas such as summarization, question answering, and content generation. Contribute to prototyping and evaluation of Retrieval-Augmented Generation (RAG) pipelines using tools like LangChain or LlamaIndex. Support prompt engineering, chaining, and evaluation of agent-based systems for specific business tasks.
Preferred Skills & Qualifications: Pursuing a degree in Computer Science, Data Science, AI/ML, or a related field. Strong foundation in Python and libraries such as NumPy, Pandas, Scikit-learn, TensorFlow, or PyTorch. Basic understanding of GenAI API usage concepts (e.g., OpenAI, Anthropic, Hugging Face). Familiarity with LLM frameworks such as LangChain or LlamaIndex is a plus. Understanding of foundational ML concepts including supervised/unsupervised learning, embeddings, and transfer learning. Familiarity with vector databases like FAISS, Pinecone, or ChromaDB is an advantage. Familiarity with SQL and data visualization tools (e.g., Matplotlib, Seaborn, Plotly). Knowledge of NLP or deep learning is a plus. Excellent problem-solving and communication skills. Eagerness to learn, experiment, and work independently as well as in collaborative teams.
Posted 3 weeks ago
5.0 - 10.0 years
5 - 10 Lacs
Gurgaon
On-site
Senior Manager, EXL/SM/1412746, Services, Gurgaon. Posted On: 09 Jul 2025; End Date: 23 Aug 2025; Required Experience: 5 - 10 Years.
Basic Section: Number Of Positions: 2; Band: C2; Band Name: Senior Manager; Cost Code: D012182; Campus/Non Campus: NON CAMPUS; Employment Type: Permanent; Requisition Type: New; Max CTC: 1500000.0000 - 3500000.0000; Complexity Level: Not Applicable; Work Type: Hybrid – Working Partly From Home And Partly From Office.
Organisational: Group: Analytics; Sub Group: Analytics - UK & Europe; Organization: Services; LOB: Services; SBU: Analytics; Country: India; City: Gurgaon; Center: EXL - Gurgaon Center 38-B.
Skills: SAS; PYTHON; BANKING INDUSTRY KNOWLEDGE; IMPAIRMENT TESTING; REPORTING AND ANALYTICS SKILLS; CREDIT CARD ANALYTICS.
Minimum Qualification: B.TECH/B.E, M TECH. Certification: No data available.
Job Description: We are looking for a Credit Risk Reporting and Impairment Insights Analyst to support risk reporting, monitoring, and insights generation. The ideal candidate will have 3-6 years of experience in the Credit Risk domain, with strong expertise in Python and data analytics. This role requires working with large datasets, developing reports, and providing meaningful risk insights to support decision-making.
Key Responsibilities: Develop, automate, and maintain credit risk reports and dashboards using Python and visualization tools. Analyze credit risk metrics, portfolio trends, and key risk indicators to generate actionable insights. Extract, clean, and process large datasets from various risk data sources. Collaborate with risk teams to improve existing risk monitoring frameworks and enhance reporting automation. Ensure data accuracy, integrity, and compliance with risk governance standards. Work closely with business stakeholders to provide deep dives into portfolio performance and risk trends.
Required Skills & Experience: 3-6 years of experience in credit risk analytics, reporting, or risk monitoring. Hands-on expertise in Python (Pandas, NumPy, SQLAlchemy, etc.) for data manipulation and reporting. Experience with SQL for querying large datasets. Strong understanding of credit risk concepts, including delinquency, PD/LGD, and portfolio performance monitoring. Exposure to risk reporting frameworks and regulatory requirements (Basel, IFRS9, CCAR, etc.) is a plus. Experience with data visualization tools (Power BI, Tableau, Matplotlib, Seaborn) for reporting and insights generation. Strong problem-solving skills and ability to communicate risk insights to stakeholders. Prior experience in automating reports and processes to improve efficiency.
Preferred Qualifications: Experience with cloud-based data platforms (AWS, GCP, or Azure) is a plus. Knowledge of credit bureau data and loan portfolio analysis. Familiarity with SAS or R (optional but beneficial).
Why Join Us? Opportunity to work on impactful credit risk reporting and analytics. Exposure to advanced risk management frameworks and data-driven decision-making. Collaborative work environment with opportunities for growth and upskilling.
Workflow Type: L&S-DA-Consulting
Posted 3 weeks ago
1.0 - 2.0 years
0 Lacs
Noida
On-site
Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry. As a critical member of the team, your work will involve cutting-edge technologies, and you will play a high-impact role in shaping the future of AI-driven enterprise applications. You will work directly with people who've worked at Amazon, Facebook, Google, and other technology companies around the world. With Level AI, you will get to have fun, learn new things, and grow along with us.
What you’ll get to do at Level AI (and more as we grow together): Drive product impact by proposing and conducting quantitative research into key user behaviors and trends. Conduct in-depth analysis and build statistical models to identify trends and key drivers that inform important decisions made by the user. Perform data analysis to identify common trends and, based on that analysis, propose enhancements for automation pipelines. Review the output of AI models to improve performance by identifying common usage gaps. Optimise AI model performance through effective prompt engineering techniques. Establish automated model performance improvement pipelines through prompt engineering. Generate, test, evaluate, and curate high-quality, diverse, and representative data (including synthetic data) for AI model development, training, and performance. Analyse model performance by diving into user feedback data and product usage reports. Suggest ways to improve product usage by specifically targeting each group of users and each feature. Create a weekly business analysis report. Define and monitor key metrics; investigate changes in metrics. Be driven to continuously learn and adapt.
We'd love to explore more about you if you have: A Bachelor's degree or above with a good academic background. 1-2 years of full-time work experience as an AI Analyst. A resourceful mindset and a desire to grow in the SaaS AI field. Experience in Python (Pandas, Matplotlib, NumPy) is a must. Experience in SQL is a must. Experience with Prompt Engineering is a plus. Familiarity with Classical ML Models (scikit-learn) and Deep Learning (Hugging Face, PyTorch) is a plus. Experience with Data Engineering is a plus. Experience employing test findings to do statistical analysis and improve models. Knowledge of common metrics for evaluation of ML models is a plus.
We offer market-leading compensation, based on the skills and aptitude of the candidate.
To learn more, visit: https://thelevel.ai/
Funding: https://www.crunchbase.com/organization/level-ai
LinkedIn: https://www.linkedin.com/company/level-ai/
Posted 3 weeks ago
4.0 years
12 - 20 Lacs
Pune, Maharashtra, India
On-site
About Improzo At Improzo (Improve + Zoe; meaning Life in Greek), we believe in improving life by empowering our customers. Founded by seasoned Industry leaders, we are laser focused for delivering quality-led commercial analytical solutions to our clients. Our dedicated team of experts in commercial data, technology, and operations has been evolving and learning together since our inception. Here, you won't find yourself confined to a cubicle; instead, you'll be navigating open waters, collaborating with brilliant minds to shape the future. You will work with leading Life Sciences clients, seasoned leaders and carefully chosen peers like you! People are at the heart of our success, so we have defined our CARE values framework with a lot of effort, and we use it as our guiding light in everything we do. We CARE! Customer-Centric: Client success is our success. Prioritize customer needs and outcomes in every action. Adaptive: Agile and Innovative, with a growth mindset. Pursue bold and disruptive avenues that push the boundaries of possibilities. Respect: Deep respect for our clients & colleagues. Foster a culture of collaboration and act with honesty, transparency, and ethical responsibility. Execution: Laser focused on quality-led execution; we deliver! Strive for the highest quality in our services, solutions, and customer experiences. About The Role We're looking for a Data Scientist in Pune to drive insights for pharma clients using advanced ML, Gen AI, and LLMs on complex healthcare data. You'll optimize Pharma commercial strategies (forecasting, marketing, SFE) and improve patient outcomes (journey mapping, adherence, RWE). Key Responsibilities Data Exploration & Problem Framing: Proactively engage with client/business stakeholders (e.g., Sales, Marketing, Market Access, Commercial Operations, Medical Affairs, Patient Advocacy teams) to deeply understand their challenges and strategic objectives. Explore, clean, and prepare large, complex, and sometimes messy datasets from various sources, including but not limited to: sales data, prescription data, claims data, Electronic Health Records (EHRs), patient support program data, CRM data, and real-world evidence (RWE) datasets. Translate ambiguous business problems into well-defined data science questions and develop appropriate analytical frameworks. Advanced Analytics & Model Development Design, develop, validate, and deploy robust statistical models and machine learning algorithms (e.g., predictive models, classification, clustering, time series analysis, causal inference, natural language processing). Develop models for sales forecasting, marketing mix optimization, customer segmentation (HCPs, payers, pharmacies), sales force effectiveness (SFE) analysis, incentive compensation modelling, and market access analytics (e.g., payer landscape, formulary impact). Analyze promotional effectiveness and patient persistency/adherence. Build models for patient journey mapping, patient segmentation for personalized interventions, treatment adherence prediction, disease progression modelling, and identifying drivers of patient outcomes from RWE. Contribute to understanding patient behavior, unmet needs, and the impact of interventions on patient health. Generative AI & LLM Solutions Extracting insights from unstructured text data (e.g., clinical notes, scientific literature, sales call transcripts, patient forum discussions). Summarization of complex medical or commercial documents. 
Automated content generation for internal use (e.g., draft reports, competitive intelligence summaries). Enhancing data augmentation or synthetic data generation for model training. Developing intelligent search or Q&A systems for commercial or medical inquiries. Apply techniques like prompt engineering, fine-tuning of LLMs, and retrieval-augmented generation (RAG). Insight Generation & Storytelling Transform complex analytical findings into clear, concise, and compelling narratives and actionable recommendations for both technical and non-technical audiences. Create impactful data visualizations, dashboards, and presentations using tools like Tableau, Power BI, or Python/R/Alteryx visualization libraries. Collaboration & Project Lifecycle Management Collaborate effectively with cross-functional teams including product managers, data engineers, software developers, and other data scientists. Support the entire data science lifecycle, from conceptualization and data acquisition to model development, deployment (MLOps), and ongoing monitoring in production environments. Qualifications Master's or Ph.D. in Data Science, Statistics, Computer Science, Applied Mathematics, Economics, Bioinformatics, Epidemiology, or a related quantitative field. 4+ years progressive experience as a Data Scientist, with demonstrated success in applying advanced analytics to solve business problems, preferably within the healthcare, pharmaceutical, or life sciences industry using pharma dataset extensively (e.g. sales data from Iqvia, Symphony, Komodo, etc., CRM data from Veeva, OCE, etc.) Must-have: Solid understanding of pharmaceutical commercial operations (e.g., sales force effectiveness, marketing, market access, CRM). Must-have: Experience working with real-world patient data (e.g., claims, EHR, pharmacy data, patient registries) and understanding of patient journeys. Strong programming skills in Python (e.g., Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and/or R for data manipulation, statistical analysis, and machine learning. Expertise in SQL for data extraction, manipulation, and analysis from relational databases. Experience with machine learning frameworks and libraries. Proficiency in data visualization tools (e.g., Tableau, Power BI) and/or visualization libraries (e.g., Matplotlib, Seaborn, Plotly). Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Spark, Hadoop) is a significant advantage. Specific experience with Natural Language Processing (NLP) techniques, Generative AI models (e.g., Transformers, diffusion models), Large Language Models (LLMs), and prompt engineering is highly desirable. Experience with fine-tuning LLMs, working with models from Hugging Face, or utilizing major LLM APIs (e.g., OpenAI, Anthropic, Google). Experience with MLOps practices and tools (e.g., MLflow, Kubeflow, Docker, Kubernetes). Knowledge of pharmaceutical or biotech industry regulations and compliance requirements like HIPAA, CCPA, SOC, etc. Excellent communication, presentation, and interpersonal skills, with the ability to effectively interact with both technical and non-technical stakeholders at all levels. Attention to details, biased for quality and client centricity. Ability to work independently and as part of a cross-functional team. Strong leadership, mentoring, and coaching skills. Benefits Competitive salary and benefits package. 
Opportunity to work on cutting-edge Analytics projects, transforming the life sciences industry Collaborative and supportive work environment. Opportunities for professional development and growth. Skills: data manipulation,analytics,llm,generative ai,commercial pharma,mlops,sql,python,natural language processing,data visualization,models,r,machine learning,statistical analysis,genai,data,patient outcomes
Posted 3 weeks ago
5.0 years
0 Lacs
Panipat, Haryana, India
On-site
🚨 Urgent Opening: Freelance Data Science Trainer 🚨 Envision Group is hiring an experienced Data Science Trainer for a project-based assignment starting immediately. 🔹 Experience Required: 3–5 years 🔹 Mode: Offline 🔹 Duration: 120–150 hours (project-based) 🔹 Start Date: Immediate 🔹 Location: Panipat, Haryana 🔹 Accommodation : Will be provided Key Areas to Cover: Python (NumPy, Pandas, Matplotlib, Seaborn) Statistics & Hypothesis Testing Machine Learning (Regression, Classification, Clustering) Jupyter Notebook, Scikit-learn Real-world projects and hands-on assignments We're looking for someone who can deliver practical, outcome-driven training and is passionate about teaching. 📩 Interested? DM me or send your profile to [hr@envisiongroup.in] Let’s make learning impactful!
Posted 3 weeks ago
0.0 years
0 - 0 Lacs
Viman Nagar, Pune, Maharashtra
On-site
We are looking for analytical and innovative AI/ML Engineer Interns who are passionate about building intelligent systems and solving real-world problems with data. You’ll work closely with our product and tech teams to develop, train, and deploy machine learning models in dynamic, hands-on projects. Key Responsibilities Design and implement machine learning models to solve business problems. Perform data cleaning, feature engineering, and exploratory data analysis. Evaluate and optimise model performance using appropriate metrics. Collaborate with engineers to deploy ML models into real-world applications. Document experiments, findings, and processes for clarity and reproducibility. Qualifications Must have a Bachelor’s degree (completed or in final year) in Computer Science, AI/ML, Data Science, or related fields . Strong understanding of core ML concepts, algorithms, and statistics. Proficiency in Python and libraries like scikit-learn, TensorFlow, PyTorch, pandas, and NumPy . Hands-on experience with real datasets and ML pipelines. Familiarity with data visualisation tools (e.g., Matplotlib, Seaborn). What You’ll Gain Practical experience developing ML solutions from the ground up. Mentorship from experienced engineers and data scientists. Exposure to end-to-end ML workflows, from data to deployment. Opportunity to contribute to impactful projects with real users. Consideration for a full-time role based on performance. Job Type: Internship Contract length: 3 months Pay: ₹5,000.00 - ₹8,000.00 per month Benefits: Flexible schedule Schedule: Day shift Night shift Supplemental Pay: Performance bonus Shift allowance Education: Bachelor's (Required) Location: Viman Nagar, Pune, Maharashtra (Required) Shift availability: Night Shift (Preferred) Day Shift (Preferred) Work Location: In person
Posted 3 weeks ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About the Company We are looking for an experienced Python Developer with expertise in using TensorFlow/PyTorch, LangChain, OpenAI API, Elasticsearch and a deep understanding of Natural Language Processing to help us develop and optimize high-performance applications. About the Role You will be responsible for implementing, testing, and maintaining data pipelines, machine learning models, and NLP techniques to extract valuable insights from data. Responsibilities Design, develop, and maintain Python-based data analysis and machine learning applications with clean and well-documented code Develop, optimize and deploy ML models for information retrieval, LLM-based agents, embeddings (FAISS, Pinecone, Weaviate), predictive analytics, and Retrieval-Augmented Generation (RAG) Research and implement NLP algorithms for text classification, sentiment analysis, named entity recognition (NER), and topic modeling, including troubleshooting and debugging to ensure reliable performance at scale Implement data pipelines and ETL processes for big data processing Collaborate with cross-functional teams to understand business requirements and build scalable tech Qualifications Strong proficiency in Python with hands-on experience in libraries like Pandas, NumPy, scikit-learn, TensorFlow, PyTorch Expertise in information retrieval, statistical analysis, data visualization and developing LLM-based agents, embeddings (FAISS, Pinecone, Weaviate), predictive analytics, and Retrieval-Augmented Generation (RAG) Hands-on experience with Natural Language Processing (NLP) libraries such as NLTK, spaCy, Hugging Face, or similar tools Experience with data wrangling techniques, including cleaning, transforming, and merging data sets from various sources Familiarity with machine learning algorithms and frameworks (supervised, unsupervised learning, and deep learning techniques) Solid understanding of text analytics such as text pre-processing, tokenization, stemming, lemmatization, and part-of-speech tagging Experience with cloud platforms (AWS, GCP, or Azure) and containerization (Docker, Kubernetes) is a plus. Knowledge of data visualization tools (Matplotlib, Seaborn, ggplot2, Plotly, etc.) Strong problem-solving skills and attention to detail with ability to work in an agile, fast-paced environment and deliver results under tight deadlines Required Skills 4 year Bachelor’s degree in Computer Science, Information Technology, Data Science, Statistics or related domains, or equivalent qualification 4+ years in developing scalable ML models, NLP models and systems from 0 to 1 and deploying them to production Strong knowledge of RESTful APIs and GraphQL for frontend-backend communication Familiarity with version control using Git, CI/CD tools, and deployment pipelines Knowledge of big data tools and platforms (Spark, Hadoop, etc.) and experience with managing databases
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Python Developer specializing in AI and ML, you will play a crucial role in our team located in Guindy, Chennai. Your primary responsibility will be to create Python-based AI/ML models, establish scalable pipelines, and implement intelligent systems to tackle intricate real-world challenges. Your duties will include designing and executing AI/ML models in Python utilizing libraries like TensorFlow, PyTorch, Scikit-learn, or Keras. Additionally, you will fine-tune pre-trained models, write well-documented Python code, and handle large datasets efficiently using tools like Pandas, NumPy, and PySpark. Furthermore, you will be tasked with developing data pipelines, building Python-based APIs for model integration, deploying models with tools like Docker, and optimizing model efficiency using Python techniques. It is essential to stay updated on Python advancements and collaborate with various stakeholders to deliver Python-driven solutions effectively. To be successful in this role, you should hold a degree in Computer Science or related fields, demonstrate expertise in Python programming, possess knowledge of AI/ML frameworks, and exhibit familiarity with data analysis and visualization libraries. Proficiency in version control tools, deploying ML models, and strong problem-solving skills are also essential. Preferred qualifications include proficiency in Python-based Big Data tools, experience with NLP frameworks, and familiarity with MLOps tools. By joining our team, you will have the opportunity to work on impactful AI/ML projects, grow in a collaborative environment, and lead with Python in cutting-edge technologies. If you are passionate about leveraging Python for AI/ML projects, eager to collaborate with a dynamic team, and excited about shaping the future of AI, we look forward to welcoming you on board. To apply, kindly submit your resume and examples of Python-based AI/ML projects to hr@whitemastery.com. This is a full-time, permanent position with a day shift schedule, and the expected start date is 15/04/2025.,
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data
Qualification: Bachelor's or master's degree in Computer Science, Statistics, Mathematics, or a related field
Experience: 6 to 10 years
Skill set: Artificial Intelligence / Machine Learning
Job knowledge:
Strong working knowledge of the Google Cloud Platform (GCP) and its AI/ML services.
Proven experience in chatbot creation and development using relevant frameworks.
Proven experience in developing and implementing machine learning models.
Strong programming skills in Python, with expertise in Pandas, NumPy, Scikit-learn, TensorFlow, PyTorch, Keras, Matplotlib, and Seaborn.
Proficiency in SQL querying and database management.
Experience with front-end frameworks such as React or Angular and CSS.
Experience with back-end frameworks such as Django, Flask, or FastAPI.
Experience in prompt engineering for large language models (LLMs), including prompt design, optimization, and evaluation.
Strong problem-solving skills and the ability to translate business requirements into technical solutions.
Excellent communication and presentation skills, with the ability to explain complex concepts to non-technical stakeholders.
Job Description:
Deploy and manage AI/ML applications on the Google Cloud Platform (GCP).
Design, develop, and implement conversational AI solutions using various chatbot frameworks and platforms.
Design, develop, and optimize prompts for large language models (LLMs) to achieve desired outputs.
Develop and implement machine learning models using supervised, unsupervised, and reinforcement learning algorithms.
Utilize Python with Pandas, NumPy, Scikit-learn, TensorFlow, PyTorch, and Keras to build and deploy machine learning solutions.
Create visualizations using Matplotlib and Seaborn to communicate insights.
Write and optimize SQL queries to extract and manipulate data from various databases.
Develop and maintain web applications and APIs using Python frameworks such as Django, Flask, or FastAPI.
Build user interfaces using JavaScript frameworks such as React or Angular, along with CSS.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
Communicate complex technical concepts and findings to non-technical stakeholders through presentations and reports.
(ref:hirist.tech)
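Since this posting pairs model development with back-end frameworks such as FastAPI, here is a minimal, hypothetical sketch of serving a persisted model behind a prediction endpoint. The feature schema, model file name, and module name are assumptions, not details from the posting.

```python
# Minimal sketch: expose a persisted scikit-learn model behind a FastAPI endpoint.
# The feature schema, model file name, and module name are assumptions.
from typing import List

import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="prediction-service")
model = joblib.load("model.joblib")  # e.g. a pipeline saved earlier with joblib.dump

class Features(BaseModel):
    values: List[float]  # flat feature vector in the order the model expects

@app.post("/predict")
def predict(features: Features) -> dict:
    prediction = model.predict([features.values])[0]
    return {"prediction": int(prediction)}

# Local run (assuming this file is saved as app.py): uvicorn app:app --reload
```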
Posted 3 weeks ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Selected Intern's Day-to-day Responsibilities Include
Data collection & cleaning:
Assist in sourcing, cleaning, and preprocessing data from diverse platforms to ensure accuracy and usability
Detect and resolve inconsistencies, missing values, and anomalies in datasets
Exploratory Data Analysis (EDA):
Perform exploratory analyses to identify key trends, patterns, and relationships within the data
Leverage visualizations to interpret findings and effectively communicate insights to the team
Model development & testing:
Support the end-to-end development of machine learning models, including data preparation, feature engineering, and performance evaluation
Gain hands-on experience with supervised and unsupervised learning techniques under mentorship
Data visualization & reporting:
Design and develop dashboards and reports to showcase analytical insights and model performance metrics
Utilize tools such as Python (e.g., Matplotlib, Seaborn) or Power BI to build compelling visualizations
Collaboration & documentation:
Work closely with cross-functional teams to ensure data initiatives are aligned with business objectives
Maintain thorough documentation of data workflows, analysis processes, and model outcomes
About Company: Welcome to the forefront of technological education with the International Institute of Data Science and Technology (IIDST), India's premier platform for aspiring data scientists and web developers. IIDST stands as a beacon of excellence, offering a transformative learning experience in the dynamic realms of data science and web development. IIDST takes pride in its innovative "pay after placement" model, ensuring that students can invest in their education without the burden of upfront costs. This revolutionary approach reflects our confidence in the quality of our programs and underscores our dedication to students' success. Whether you aspire to unravel the mysteries of data science or master the intricacies of web development, IIDST is your gateway to a future where knowledge meets opportunity. Join us on this transformative journey and let IIDST empower you to lead the way in the ever-evolving landscape of technology.
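To give a feel for the EDA work described above, here is a minimal sketch of a first pass over a tabular dataset: checking missing values, summarizing columns, and plotting one distribution. The small synthetic DataFrame is a stand-in for whatever real dataset the intern would be given.

```python
# Minimal sketch: first-pass exploratory data analysis on a small synthetic table.
# The columns and values are stand-ins for a real dataset.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.integers(18, 65, 200).astype(float),
    "monthly_spend": rng.gamma(shape=2.0, scale=1500.0, size=200),
    "channel": rng.choice(["web", "app", "store"], size=200),
})
df.loc[rng.choice(200, size=15, replace=False), "monthly_spend"] = np.nan  # inject gaps

print(df.shape)
print(df.isna().mean().sort_values(ascending=False))  # share of missing values per column
print(df.describe(include="all").T)                   # quick per-column summary

df["monthly_spend"].hist(bins=30)
plt.title("Distribution of monthly_spend")
plt.xlabel("monthly_spend")
plt.ylabel("count")
plt.tight_layout()
plt.savefig("monthly_spend_distribution.png")
```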
Posted 3 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Python Software Development Sr. Analyst
Job Description
In these roles, you will be responsible for:
Design, implement, and test generative AI models using Python and various frameworks such as Pandas, TensorFlow, PyTorch, and OpenAI.
Research and explore new techniques and applications of generative AI, such as text, image, audio, and video synthesis, style transfer, data augmentation, and anomaly detection.
Collaborate with other developers, researchers, and stakeholders to deliver high-quality and innovative solutions.
Document and communicate the results and challenges of generative AI projects.
Required skills for this role include:
Technical skills
3+ years of experience developing in Python with ML/DL libraries and frameworks such as Flask.
At least 2 years of experience in developing generative AI models using Python and relevant frameworks.
Good knowledge of RPA.
Strong knowledge of machine learning, deep learning, and generative AI concepts and algorithms.
Proficiency in Python and common libraries such as NumPy, Pandas, Matplotlib, and scikit-learn.
Familiarity with version control, testing, debugging, and deployment tools.
Excellent communication and problem-solving skills.
Curiosity and eagerness to learn new technologies and domains.
Desired skills:
Knowledge of Django and Web API development.
Good exposure to MVC.
Preferences:
Graduate degree in Computer Science with 4 years of Python-based development.
Gen AI framework professional certification.
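One of the tasks named above, anomaly detection, can be illustrated with a short scikit-learn sketch; the synthetic data and contamination rate below are assumptions chosen only to make the example self-contained.

```python
# Minimal sketch: anomaly detection with scikit-learn's IsolationForest.
# The synthetic data and contamination rate are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(3)
normal_points = rng.normal(loc=0.0, scale=1.0, size=(500, 4))
outliers = rng.uniform(low=6.0, high=9.0, size=(10, 4))
X = np.vstack([normal_points, outliers])

detector = IsolationForest(contamination=0.02, random_state=3).fit(X)
labels = detector.predict(X)  # +1 for inliers, -1 for suspected anomalies
print("flagged anomalies:", int((labels == -1).sum()))
```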
Posted 3 weeks ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Key Responsibilities:
• Design and implement predictive models and machine learning algorithms to solve healthcare-specific challenges
• Analyze large, complex healthcare datasets including electronic health records (EHR) and claims data
• Develop statistical models for patient risk stratification, treatment optimization, population health management, and revenue cycle optimization
• Build models for clinical decision support, patient outcome prediction, care quality improvement, and revenue cycle optimization
• Create and maintain automated data pipelines for real-time analytics and reporting
• Work with healthcare data standards (HL7 FHIR, ICD-10, CPT, SNOMED CT) and ensure regulatory compliance
• Develop and deploy models in cloud environments while creating visualizations for stakeholders
• Present findings and recommendations to cross-functional teams including clinicians, product managers, and executives
Qualifications required:
• Bachelor's degree in Data Science, Statistics, Computer Science, Mathematics, or a related quantitative field
• At least 2 years of hands-on experience in data science, analytics, or machine learning roles
• Demonstrated experience working with large datasets and statistical modeling
• Proficiency in Python or R for data analysis and machine learning
• Experience with SQL and database management systems
• Knowledge of machine learning frameworks such as scikit-learn, TensorFlow, PyTorch
• Familiarity with data visualization tools such as Tableau, Power BI, matplotlib, ggplot2
• Experience with version control systems (Git) and collaborative development practices
• Strong foundation in statistics, hypothesis testing, and experimental design
• Experience with supervised and unsupervised learning techniques
• Knowledge of data preprocessing, feature engineering, and model validation
• Understanding of A/B testing and causal inference methods
What You'll Need to Be Successful (Required Skills):
• Large Language Model (LLM) Experience: At least 2 years of hands-on experience working with pre-trained language models (GPT, BERT, T5), including fine-tuning, prompt engineering, and model evaluation techniques
• Generative AI Frameworks: Proficiency with generative AI libraries and frameworks such as Hugging Face Transformers, LangChain, OpenAI API, or similar platforms for building and deploying AI applications
• Prompt Engineering and Optimization: Experience designing, testing, and optimizing prompts for various use cases including text generation, summarization, classification, and conversational AI applications
• Vector Databases and Embeddings: Knowledge of vector similarity search, embedding models, and vector databases (Pinecone, Weaviate, Chroma) for building retrieval-augmented generation (RAG) systems
• AI Model Evaluation: Experience with evaluation methodologies for generative models including BLEU scores, ROUGE metrics, human evaluation frameworks, and bias detection techniques
• Multi-modal AI Systems: Familiarity with multi-modal generative models combining text, images, and other data types, including experience with vision-language models and cross-modal applications
• AI Safety and Alignment: Understanding of responsible AI practices including content filtering, bias mitigation, hallucination detection, and techniques for ensuring AI outputs align with business requirements and ethical guidelines
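Because the posting highlights vector similarity search for RAG systems, here is a minimal sketch of the retrieval step using plain NumPy cosine similarity. The random embeddings are stand-ins for vectors that would normally come from an embedding model, and no particular vector database is implied.

```python
# Minimal sketch of the retrieval step in a RAG system: rank documents by cosine
# similarity between a query embedding and document embeddings. Random vectors
# stand in for real embeddings produced by an embedding model.
import numpy as np

rng = np.random.default_rng(0)
doc_embeddings = rng.normal(size=(1000, 384))   # 1,000 documents, 384-dim vectors
query_embedding = rng.normal(size=384)

def top_k_cosine(query: np.ndarray, docs: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k documents most similar to the query."""
    docs_norm = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    query_norm = query / np.linalg.norm(query)
    scores = docs_norm @ query_norm
    return np.argsort(scores)[::-1][:k]

print(top_k_cosine(query_embedding, doc_embeddings))
```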
Posted 3 weeks ago
0.0 - 1.0 years
5 - 12 Lacs
Bengaluru
Work from Office
Job Description:
We are looking for an enthusiastic and detail-oriented Data Analyst (Fresher) to join our analytics team. If you are passionate about data, eager to work on real-world business problems, and proficient in Python and SQL, we would love to hear from you.
Responsibilities:
Analyze large datasets to identify trends, patterns, and insights
Write efficient SQL queries to extract and manipulate data
Use Python for data cleaning, analysis, and automation (Pandas, NumPy, etc.)
Create visual reports using tools or libraries (Matplotlib, Seaborn, or Excel)
Collaborate with product and business teams to solve data-related problems
Maintain dashboards and ensure data accuracy and consistency
Required Skills:
Proficiency in Python for data analysis (Pandas, NumPy, basic scripting)
Strong command of SQL (joins, aggregations, subqueries)
Good understanding of data structures and basic statistics
Strong problem-solving skills and attention to detail
Effective communication and willingness to learn
Preferred (Good to Have):
Exposure to BI tools like Power BI / Tableau
Understanding of Excel/Google Sheets functions
Internship/project experience in data analysis or related work
Benefits:
Flexible working hours
Mentorship and structured training
Opportunity to work on real business problems
Friendly and collaborative team culture
Certificate of completion and potential for full-time conversion (if applicable)
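As an illustration of the SQL-plus-Python workflow in this posting, here is a minimal sketch that builds a throwaway in-memory SQLite database, runs a join-and-aggregate query, and loads the result into pandas. The schema and rows are invented solely for the example.

```python
# Minimal sketch: run a join + aggregation in SQL and load the result with pandas.
# The schema and rows are invented purely for illustration.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, region TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'South'), (2, 'North');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 200.0);
""")

query = """
    SELECT c.region,
           COUNT(*)      AS num_orders,
           SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY revenue DESC
"""
print(pd.read_sql_query(query, conn))
```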
Posted 3 weeks ago
3.0 years
0 Lacs
Panaji, Goa, India
On-site
About the Project
We are seeking a brilliant and innovative Data Scientist to join the team building "a Stealth Prop-tech Startup," a groundbreaking digital real estate platform in Dubai. This is a complex initiative to build a comprehensive ecosystem integrating long-term sales, short-term stays, and advanced technologies including AI/ML, data analytics, Web3/blockchain, and conversational AI. You will be at the heart of our intelligence engine, transforming vast datasets into the predictive models and insights that will define our competitive edge. This is a pivotal role in a high-impact project, offering the chance to work on challenging problems in the PropTech space and see your models directly influence the user experience and business strategy.
Job Summary
As a Data Scientist, you will be responsible for designing, developing, and deploying the machine learning models that power the platform's most innovative features. You will work on everything from creating a proprietary property valuation model ("TruValue UAE") to building a sophisticated recommendation engine and forecasting market trends. You will collaborate closely with backend engineers, product managers, and business stakeholders to leverage our unique data assets, driving personalization, market intelligence, and strategic decision-making across the platform.
Key Responsibilities
Design, train, and deploy machine learning models for the "TruValue UAE" Automated Valuation Model (AVM) to predict property values.
Develop and implement a personalization and recommendation engine to suggest relevant properties to users based on their behavior and preferences.
Analyze large, complex datasets to identify key business insights, user behavior patterns, and real estate market trends.
Build predictive models to forecast metrics such as user churn, rental yield, and neighborhood demand dynamics.
Collaborate with the backend engineering team to integrate ML models into the production environment via scalable APIs.
Work with the product team to define data-driven hypotheses and conduct experiments to improve platform features.
Communicate complex findings and the results of analyses to non-technical stakeholders through clear visualizations and reports.
Contribute to the design and development of the big data infrastructure and MLOps pipelines.
Required Skills and Experience
3-5+ years of hands-on experience as a Data Scientist, with a proven track record of building and deploying machine learning models in a production environment.
A Master's degree or PhD in a quantitative field such as Computer Science, Statistics, Mathematics, or Engineering.
Expert proficiency in Python and its data science ecosystem (e.g., Pandas, NumPy, Scikit-learn, TensorFlow, PyTorch).
Strong practical knowledge of various machine learning techniques, including regression, classification, clustering, and recommendation systems.
Advanced SQL skills and experience working with relational databases (e.g., PostgreSQL).
Experience with data visualization tools (e.g., Matplotlib, Seaborn, Tableau).
Preferred Qualifications
Experience in the PropTech (Property Technology) or FinTech sectors is highly desirable.
Direct experience building Automated Valuation Models (AVMs) or similar price prediction models.
Experience working with cloud-based data platforms and ML services (e.g., AWS SageMaker, Google AI Platform, BigQuery, Redshift).
Familiarity with MLOps principles and tools for model deployment and monitoring.
Experience with time-series analysis and forecasting.
Experience with Natural Language Processing (NLP) techniques.
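To make the AVM idea concrete, here is a minimal, hypothetical regression sketch in scikit-learn. The synthetic features and the price formula are invented for illustration; a real automated valuation model would be trained on actual transaction and listing data.

```python
# Minimal sketch of an automated-valuation-style regression model.
# Synthetic features and the price formula are invented for illustration only.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2_000
features = pd.DataFrame({
    "area_sqft": rng.uniform(400, 4000, n),
    "bedrooms": rng.integers(1, 6, n),
    "floor": rng.integers(1, 50, n),
    "age_years": rng.uniform(0, 25, n),
})
price = (
    900 * features["area_sqft"]
    + 50_000 * features["bedrooms"]
    + 2_000 * features["floor"]
    - 8_000 * features["age_years"]
    + rng.normal(0, 50_000, n)
)

X_train, X_test, y_train, y_test = train_test_split(features, price, test_size=0.2, random_state=7)
model = GradientBoostingRegressor(random_state=7).fit(X_train, y_train)
print("mean absolute error:", round(mean_absolute_error(y_test, model.predict(X_test)), 2))
```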
Posted 3 weeks ago
3.0 - 7.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Scale an existing RAG code base for a production-grade AI application.
Proficiency in prompt engineering, LLMs, and Retrieval Augmented Generation.
Programming languages such as Python or Java.
Experience with vector databases.
Experience using LLMs in software applications, including prompting, calling, and processing outputs.
Experience with AI frameworks such as LangChain.
Strong troubleshooting skills and creativity in finding new ways to leverage LLMs.
Experience with Azure.
Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations. Help showcase the ability of a Gen AI code assistant to refactor, rewrite, and document code from one language to another, particularly COBOL to Java, through rapid prototypes/PoCs.
Required education
Bachelor's Degree
Preferred education
Master's Degree
Required technical and professional expertise
Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras, or Hugging Face.
Understanding of libraries such as scikit-learn, Pandas, Matplotlib, etc.
Familiarity with cloud platforms and related services (e.g., AWS, Azure, GCP, Kubernetes) is a plus.
Experience and working knowledge of COBOL and Java would be preferred.
Experience in code generation, code matching, and code translation.
Prepare effort estimates, WBS, staffing plans, RACI, RAID, etc.
Excellent interpersonal and communication skills.
Engage with stakeholders for analysis and implementation.
Commitment to continuous learning and staying updated with advancements in the field of AI.
Demonstrate a growth mindset to understand clients' business processes and challenges.
Preferred technical and professional experience
Education: Bachelor's, Master's, or Ph.D. degree in Computer Science, Artificial Intelligence, Data Science, or a related field.
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
Ability to communicate results to technical and non-technical audiences.
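Since the role centers on scaling a RAG code base, here is a minimal, framework-agnostic sketch of the prompt-assembly step: retrieved chunks are folded into a grounded prompt before the model call. The call_llm function is a placeholder for whichever client the project actually uses (Azure OpenAI, a LangChain chain, etc.), not a real library API, and the module names in the example chunks are invented.

```python
# Minimal sketch of prompt assembly in a RAG application. `call_llm` is a
# placeholder for the real LLM client (Azure OpenAI, LangChain, etc.).
from typing import List

def build_rag_prompt(question: str, retrieved_chunks: List[str]) -> str:
    """Compose a prompt that instructs the model to answer only from the given context."""
    context = "\n\n".join(f"[{i + 1}] {chunk}" for i, chunk in enumerate(retrieved_chunks))
    return (
        "Answer the question using only the context below. "
        "If the context is insufficient, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

def call_llm(prompt: str) -> str:
    # Placeholder: swap in the project's actual LLM client here.
    return f"(model response to a {len(prompt)}-character prompt)"

chunks = [
    "COBOL module SETTLE01 handles end-of-day batch settlement.",  # invented example
    "The Java rewrite targets Spring Batch for the same workflow.",  # invented example
]
print(call_llm(build_rag_prompt("What does module SETTLE01 do?", chunks)))
```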
Posted 3 weeks ago