
116 Dask Jobs

JobPe aggregates listings for easy access; you apply directly on the original job portal.

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Position Title: AI/ML Engineer
Company: Cyfuture India Pvt. Ltd.
Industry: IT Services and IT Consulting
Location: Sector 81, NSEZ, Noida (5 days work from office)

About Cyfuture
Cyfuture is a trusted name in IT services and cloud infrastructure, offering state-of-the-art data center solutions and managed services across platforms like AWS, Azure, and VMware. We are expanding rapidly in system integration and managed services, building strong alliances with global OEMs such as VMware, AWS, Azure, HP, Dell, Lenovo, and Palo Alto.

Position Overview
We are hiring an experienced AI/ML Engineer to lead and shape our AI/ML initiatives. The ideal candidate has hands-on experience in machine learning and artificial intelligence, strong leadership capabilities, and a passion for delivering production-ready solutions. The role involves end-to-end ownership of AI/ML projects, from strategy development to deployment and optimization of large-scale systems.

Key Responsibilities
• Lead and mentor a high-performing AI/ML team.
• Design and execute AI/ML strategies aligned with business goals.
• Collaborate with product and engineering teams to identify impactful AI opportunities.
• Build, train, fine-tune, and deploy ML models in production environments.
• Manage operations of LLMs and other AI models using modern cloud and MLOps tools.
• Implement scalable, automated ML pipelines (e.g., with Kubeflow or MLRun).
• Handle containerization and orchestration using Docker and Kubernetes.
• Optimize GPU/TPU resources for training and inference tasks.
• Develop efficient RAG pipelines with low latency and high retrieval accuracy.
• Automate CI/CD workflows for continuous integration and delivery of ML systems.

Key Skills & Expertise

Cloud Computing & Deployment
• Proficiency in AWS, Google Cloud, or Azure for scalable model deployment.
• Familiarity with cloud-native services such as AWS SageMaker, Google Vertex AI, or Azure ML.
• Expertise in Docker and Kubernetes for containerized deployments.
• Experience with Infrastructure as Code (IaC) using tools like Terraform or CloudFormation.

Machine Learning & Deep Learning
• Strong command of frameworks: TensorFlow, PyTorch, Scikit-learn, XGBoost.
• Experience with MLOps tools for integration, monitoring, and automation.
• Expertise in pre-trained models, transfer learning, and designing custom architectures.

Programming & Software Engineering
• Strong skills in Python (NumPy, Pandas, Matplotlib, SciPy) for ML development.
• Backend/API development with FastAPI, Flask, or Django.
• Database handling with SQL and NoSQL (PostgreSQL, MongoDB, BigQuery).
• Familiarity with CI/CD pipelines (GitHub Actions, Jenkins).

Scalable AI Systems
• Proven ability to build AI-driven applications at scale.
• Handle large datasets, high-throughput requests, and real-time inference.
• Knowledge of distributed computing: Apache Spark, Dask, Ray.

Model Monitoring & Optimization
• Hands-on experience with model compression, quantization, and pruning.
• A/B testing and performance tracking in production.
• Knowledge of model retraining pipelines for continuous learning.

Resource Optimization
• Efficient use of compute resources: GPUs, TPUs, CPUs.
• Experience with serverless architectures to reduce cost.
• Auto-scaling and load balancing for high-traffic systems.

Problem-Solving & Collaboration
• Translate complex ML models into user-friendly applications.
• Work effectively with data scientists, engineers, and product teams.
• Write clear technical documentation and architecture reports.

(ref:hirist.tech)
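The RAG pipelines this posting mentions center on one core step: embed a query and retrieve the most similar documents before generation. A minimal, stdlib-only sketch of that retrieval step, with toy hand-made vectors standing in for a real embedding model's output (all data here is illustrative):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "embeddings" standing in for a real embedding model's output.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "api rate limits": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=2):
    # Rank documents by similarity and keep the top k for the prompt context.
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

top = retrieve([0.85, 0.15, 0.05])
```

Production systems swap the dictionary for a vector index and the hand-made vectors for model embeddings, but the ranking logic is the same.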

Posted 3 days ago

Apply

12.0 - 18.0 years

0 Lacs

Tamil Nadu, India

Remote


Join us as we work to create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all. This position requires expertise in designing, developing, debugging, and maintaining AI-powered applications and data engineering workflows for both local and cloud environments. The role involves working on large-scale projects, optimizing AI/ML pipelines, and ensuring scalable data infrastructure.

As a PMTS, you will be responsible for integrating Generative AI (GenAI) capabilities, building data pipelines for AI model training, and deploying scalable AI-powered microservices. You will collaborate with AI/ML, Data Engineering, DevOps, and Product teams to deliver impactful solutions that enhance our products and services. Experience in retrieval-augmented generation (RAG), fine-tuning pre-trained LLMs, AI model evaluation, data pipeline automation, and optimizing cloud-based AI deployments is desirable.

Responsibilities

AI-Powered Software Development & API Integration
• Develop AI-driven applications, microservices, and automation workflows using FastAPI, Flask, or Django, ensuring cloud-native deployment and performance optimization.
• Integrate OpenAI APIs (GPT models, Embeddings, Function Calling) and retrieval-augmented generation (RAG) techniques to enhance AI-powered document retrieval, classification, and decision-making.

Data Engineering & AI Model Performance Optimization
• Design, build, and optimize scalable data pipelines for AI/ML workflows using Pandas, PySpark, and Dask, integrating data sources such as Kafka, AWS S3, Azure Data Lake, and Snowflake.
• Enhance AI model inference efficiency by implementing vector retrieval with FAISS, Pinecone, or ChromaDB, and optimize API latency with tuning techniques (temperature, top-k sampling, max-token settings).

Microservices, APIs & Security
• Develop scalable RESTful APIs for AI models and data services, ensuring integration with internal and external systems while securing API endpoints using OAuth, JWT, and API key authentication.
• Implement AI-powered logging, observability, and monitoring to track data pipelines, model drift, and inference accuracy, ensuring compliance with AI governance and security best practices.

AI & Data Engineering Collaboration
• Work with AI/ML, Data Engineering, and DevOps teams to optimize AI model deployments, data pipelines, and real-time/batch processing for AI-driven solutions.
• Engage in Agile ceremonies, backlog refinement, and collaborative problem-solving to scale AI-powered workflows in areas like fraud detection, claims processing, and intelligent automation.

Cross-Functional Coordination and Communication
• Collaborate with Product, UX, and Compliance teams to align AI-powered features with user needs, security policies, and regulatory frameworks (HIPAA, GDPR, SOC 2).
• Ensure seamless integration of structured and unstructured data sources (SQL, NoSQL, vector databases) to improve AI model accuracy and retrieval efficiency.

Mentorship & Knowledge Sharing
• Mentor junior engineers on AI model integration, API development, and scalable data engineering best practices, and conduct knowledge-sharing sessions.

Education & Experience Required
• 12-18 years of experience in software engineering or AI/ML development, preferably in AI-driven solutions.
• Hands-on experience with Agile development, SDLC, CI/CD pipelines, and AI model deployment lifecycles.
• Bachelor's Degree or equivalent in Computer Science, Engineering, Data Science, or a related field.
• Proficiency in full-stack development, with expertise in Python (preferred for AI) and Java.
• Experience with structured and unstructured data: SQL (PostgreSQL, MySQL, SQL Server), NoSQL (OpenSearch, Redis, Elasticsearch), and vector databases (FAISS, Pinecone, ChromaDB).
• Cloud and AI infrastructure: AWS (Lambda, SageMaker, ECS, S3); Azure (Azure OpenAI, ML Studio).
• GenAI frameworks and tools: OpenAI API, Hugging Face Transformers, LangChain, LlamaIndex, AutoGPT, CrewAI.
• Experience in LLM deployment, retrieval-augmented generation (RAG), and AI search optimization.
• Proficiency in AI model evaluation (BLEU, ROUGE, BERTScore, cosine similarity) and responsible AI deployment.
• Strong problem-solving skills, AI ethics awareness, and the ability to collaborate across AI, DevOps, and data engineering teams.
• Curiosity and eagerness to explore new AI models, tools, and best practices for scalable GenAI adoption.

About athenahealth
Here's our vision: to create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all.

What's unique about our locations? From a historic 19th-century arsenal to a converted landmark power plant, all of athenahealth's offices were carefully chosen to represent our innovative spirit and promote the most positive and productive work environment for our teams. Our 10 offices across the United States and India, plus numerous remote employees, all work to modernize the healthcare experience, together.

Our company culture might be our best feature. We don't take ourselves too seriously. But our work? That's another story. athenahealth develops and implements products and services that support US healthcare: it's our chance to create healthier futures for ourselves, for our family and friends, for everyone. Our vibrant and talented employees, or athenistas, as we call ourselves, spark the innovation and passion needed to accomplish our goal. We continue to expand our workforce with amazing people who bring diverse backgrounds, experiences, and perspectives at every level, and we foster an environment where every athenista feels comfortable bringing their best self to work. Our size makes a difference, too: we are small enough that your individual contributions will stand out, but large enough to grow your career with our resources and established business stability.

Giving back is integral to our culture. Our athenaGives platform strives to support food security, expand access to high-quality healthcare for all, and support STEM education to develop providers and technologists who will provide access to high-quality healthcare for all in the future. As part of the evolution of athenahealth's Corporate Social Responsibility (CSR) program, we've selected nonprofit partners that align with our purpose and let us foster long-term partnerships for charitable giving, employee volunteerism, insight sharing, collaboration, and cross-team engagement.

What can we do for you? Along with health and financial benefits, athenistas enjoy perks specific to each location, including commuter support, employee assistance programs, tuition assistance, employee resource groups, and collaborative workspaces; some offices even welcome dogs. In addition to our traditional benefits and perks, we sponsor events throughout the year, including book clubs, external speakers, and hackathons. And we provide athenistas with a company culture based on learning, the support of an engaged team, and an inclusive environment where all employees are valued.

We also encourage a better work-life balance for athenistas with our flexibility. While we know in-office collaboration is critical to our vision, we recognize that not all work needs to be done within an office environment, full-time. With consistent communication and digital collaboration tools, athenahealth enables employees to find a balance that feels fulfilling and productive for each individual situation.
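The decoding parameters this posting names (temperature, top-k) have a precise meaning: keep only the k most likely next tokens, rescale their logits by the temperature, and renormalize. A toy sketch using made-up logits rather than a real model:

```python
import math

def top_k_probs(logits, k, temperature=1.0):
    # Keep only the k highest logits, scale by temperature,
    # and renormalize with a softmax over the survivors.
    top = sorted(logits.items(), key=lambda kv: kv[1], reverse=True)[:k]
    scaled = {tok: v / temperature for tok, v in top}
    z = sum(math.exp(v) for v in scaled.values())
    return {tok: math.exp(v) / z for tok, v in scaled.items()}

# Invented next-token logits for illustration.
logits = {"the": 4.0, "a": 3.0, "cat": 1.0, "xylophone": -2.0}
probs = top_k_probs(logits, k=2, temperature=0.5)
```

Lower temperatures sharpen the surviving distribution (here, most mass lands on "the"), which is why these knobs trade diversity against determinism.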

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Built systems that power B2B SaaS products? Want to scale them for real-world impact?

Our client is solving some of the toughest data problems in India, powering fintech intelligence, risk engines, and decision-making platforms where structured data is often missing. Their systems are used by leading institutions to make sense of complex, high-velocity datasets in real time. We're looking for a Senior Data Engineer who has helped scale B2B SaaS platforms, built pipelines from scratch, and wants to take complete ownership of data architecture and infrastructure decisions.

What You'll Do:
• Design, build, and maintain scalable ETL pipelines using Python, PySpark, and Airflow
• Architect ingestion and transformation workflows using AWS services like S3, Lambda, Glue, and EMR
• Handle large volumes of structured and unstructured data with a focus on performance and reliability
• Lead data warehouse and schema design across Postgres, MongoDB, DynamoDB, and Elasticsearch
• Collaborate cross-functionally to ensure data infrastructure aligns with product and analytics goals
• Build systems from the ground up and contribute to key architectural decisions
• Mentor junior team members and guide implementation best practices

You're a Great Fit If You Have:
• 3 to 7 years of experience in data engineering, preferably within B2B SaaS/AI environments (mandatory)
• Strong programming skills in Python and experience with PySpark and Airflow
• Strong expertise in designing, building, and deploying data pipelines in product environments
• Mandatory experience with NoSQL databases
• Hands-on experience with AWS data services and distributed data processing tools like Spark or Dask
• Understanding of data modeling, performance tuning, and database design
• Experience working in fast-paced, product-driven teams that have seen the 0-to-1 journey
• Awareness of async programming and how it applies in real-world risk/fraud use cases
• Experience mentoring or guiding junior engineers (preferred)

Role Details:
• Location: Mumbai (on-site, WFO)
• Experience: 3 to 7 years
• Budget: 20 to 30 LPA (max)
• Notice Period: 30 days or less

If you're from a B2B SaaS background and looking to solve meaningful, large-scale data problems, we'd love to talk. Apply now or reach out directly to learn more.
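The ETL work this posting describes reduces to extract, transform, and load stages composed into a pipeline; engines like PySpark or Dask parallelize this same shape over partitions. A stdlib-only sketch of the shape (the records and field names are invented for illustration):

```python
import csv, io

# Raw input standing in for a file landed in S3; values are invented.
RAW = """user_id,amount,status
u1,120.50,ok
u2,99.00,failed
u1,30.00,ok
u3,45.25,ok
"""

def extract(text):
    # Extract: parse raw CSV text into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: keep successful rows and cast amounts to float.
    return [
        {"user_id": r["user_id"], "amount": float(r["amount"])}
        for r in rows if r["status"] == "ok"
    ]

def load(rows):
    # Load: here, aggregate per user; a real pipeline would write
    # the result to a warehouse table instead.
    totals = {}
    for r in rows:
        totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + r["amount"]
    return totals

totals = load(transform(extract(RAW)))
```

An orchestrator like Airflow would schedule each stage as a task and handle retries; the per-stage logic stays this simple.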

Posted 3 days ago

Apply

1.0 - 4.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote


Role: Data Scientist
Experience: 1 to 4 years
Work Mode: WFO / Hybrid / Remote (if applicable)
Immediate joiners preferred.

About the Role
We are building an AI-powered workforce intelligence platform that helps businesses optimize talent strategies, enhance decision-making, and drive operational efficiency. Our software leverages cutting-edge AI, NLP, and data science to extract meaningful insights from vast amounts of structured and unstructured workforce data. As part of our new AI team, you will have the opportunity to work on real-world AI applications, contribute to innovative NLP solutions, and gain hands-on experience in building AI-driven products from the ground up.

Required Skills & Qualifications
• Strong experience in Python programming.
• 1-3 years of experience in Data Science/NLP (freshers with strong NLP projects are welcome).
• Proficiency in Python, PyTorch, Scikit-learn, and NLP libraries (NLTK, SpaCy, Hugging Face).
• Basic knowledge of cloud platforms (AWS, GCP, or Azure).
• Experience with SQL for data manipulation and analysis.
• Familiarity with MLOps tools like Airflow, MLflow, or similar.
• Experience with big data processing (Spark, Pandas, or Dask).
• Expertise in using SQL and Python to clean, preprocess, and analyze large datasets.
• Strong analytical and problem-solving skills.
• Willingness to learn, experiment, and take ownership in a fast-paced startup environment.

Responsibilities
• Assist in designing, training, and optimizing ML/NLP models using PyTorch, NLTK, Scikit-learn, and Transformer models (BERT, GPT, etc.).
• Help deploy AI/ML solutions on AWS, GCP, or Azure.
• Collaborate with engineers to integrate AI models into production systems.
• Stay updated with the latest advancements in NLP, AI, and ML frameworks.

Nice to Have
• Desire to grow within the company.
• Team player and quick learner.
• Performance-driven, with strong networking and outreach skills.
• Exploratory aptitude and a determined attitude.
• Ability to communicate and collaborate with the team at ease.
• Drive to get results and not let anything get in your way.
• Critical and analytical thinking skills, with keen attention to detail.
• Ownership and a drive for excellence in everything you do.
• A high level of curiosity; stays abreast of the latest technologies and tools.
• Ability to pick up new software easily, represent yourself to peers, and coordinate during meetings with customers.

What We Offer
We offer a market-leading salary along with a comprehensive benefits package to support your well-being. Enjoy a hybrid or remote work setup that prioritizes work-life balance and personal well-being. We invest in your career through continuous learning and internal growth opportunities. Be part of a dynamic, inclusive, and vibrant workplace where your contributions are recognized and rewarded. We believe in straightforward policies, open communication, and a supportive work environment where everyone thrives.

(ref:hirist.tech)
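Cleaning and preprocessing text, which the requirements above emphasize, usually starts with normalization, tokenization, and stopword removal before any model sees the data. A stdlib-only sketch of that step (the stopword list and sample sentence are illustrative; libraries like NLTK or SpaCy provide richer versions):

```python
import re
from collections import Counter

# A tiny illustrative stopword list; real lists are much longer.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is"}

def preprocess(text):
    # Normalize case, keep alphabetic tokens, drop stopwords.
    tokens = re.findall(r"[a-z']+", text.lower())
    return [t for t in tokens if t not in STOPWORDS]

def bag_of_words(text):
    # Term-frequency features for a downstream classifier.
    return Counter(preprocess(text))

features = bag_of_words("The engineer tuned the model, and the model improved.")
```

The resulting counts can feed directly into Scikit-learn estimators, which is why this step is usually the first stage of an NLP pipeline.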

Posted 4 days ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

Remote


About the Role:
We are looking for a Generative AI Developer with 6+ years of experience in building AI-driven applications using deep learning and NLP techniques. You will be responsible for designing, fine-tuning, and deploying generative AI models for various use cases, including text generation, image synthesis, and AI-powered automation solutions.

Key Responsibilities:
• Develop and optimize Generative AI models (GPT, LLaMA, Stable Diffusion, DALL·E, MidJourney, etc.).
• Fine-tune LLMs and diffusion models to meet specific business needs.
• Implement retrieval-augmented generation (RAG) and integrate AI-powered applications into production.
• Work with prompt engineering, transfer learning, and custom model training.
• Develop and deploy AI models using cloud platforms (AWS, GCP, Azure) and MLOps best practices.
• Optimize model performance for scalability, efficiency, and cost-effectiveness.
• Work with vector databases (FAISS, Pinecone, Weaviate) to enhance AI applications.
• Stay updated with the latest trends in AI, deep learning, and NLP, and apply research to real-world use cases.

Required Skills & Qualifications:
• 4+ years of hands-on experience in AI/ML development with expertise in Generative AI.
• Proficiency in Python, TensorFlow, PyTorch, or JAX for deep learning model development.
• Strong experience with LLMs (GPT, BERT, T5, Claude, Gemini, etc.) and Transformer architectures.
• Knowledge of computer vision, NLP, and multimodal AI.
• Hands-on experience with Hugging Face, LangChain, OpenAI APIs, and fine-tuning techniques.
• Experience in deploying AI models using cloud platforms, Kubernetes, and Docker.
• Familiarity with MLOps, data pipelines, and vector databases.
• Strong problem-solving and analytical skills to tackle AI challenges.

Preferred Skills:
• Experience in AI-powered chatbots, speech synthesis, or creative AI applications.
• Knowledge of distributed computing frameworks (Ray, Spark, Dask).
• Understanding of Responsible AI practices, model bias mitigation, and explainability.

Why Join Us?
• Work on cutting-edge AI solutions with real-world impact.
• Collaborate with leading AI researchers and engineers.
• Competitive salary, remote work flexibility, and upskilling opportunities.
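Vector databases such as FAISS, Pinecone, or Weaviate, which this posting lists, fundamentally offer two operations: add embeddings and query nearest neighbors. A brute-force, in-memory sketch of that interface (real systems add approximate indexing for scale; the vectors here are toy values):

```python
import math

class TinyVectorStore:
    # Minimal stand-in for a vector database: exact (brute-force)
    # nearest-neighbor search over stored embeddings.
    def __init__(self):
        self.items = []  # list of (id, vector) pairs

    def add(self, item_id, vector):
        self.items.append((item_id, vector))

    def query(self, vector, k=1):
        # Return the ids of the k closest stored vectors
        # by Euclidean distance.
        ranked = sorted(self.items, key=lambda iv: math.dist(vector, iv[1]))
        return [item_id for item_id, _ in ranked[:k]]

store = TinyVectorStore()
store.add("doc-a", [0.0, 1.0])
store.add("doc-b", [1.0, 0.0])
store.add("doc-c", [0.9, 0.1])
nearest = store.query([1.0, 0.2], k=2)
```

Brute force is O(n) per query; libraries like FAISS trade a little recall for sub-linear query time via approximate indexes, which is the main reason to adopt them.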

Posted 4 days ago

Apply

0 years

0 Lacs

India

Remote


Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% NLP engineers for a unique opportunity to work with industry leaders.

Who can be a part of the community?
We are looking for top-tier Natural Language Processing engineers with experience in text analytics, LLMs, and speech processing. If you have experience in this field, this is your chance to collaborate with industry leaders.

What's in it for you?
• Pay above market standards.
• Contract-based roles with project timelines from 2 to 12 months, or freelancing.
• Membership in an elite community of professionals who solve complex AI challenges.
• Work location: remote (most likely), onsite at a client location, or Deccan AI's offices in Hyderabad or Bangalore.

Responsibilities:
• Develop and optimize NLP models (NER, summarization, sentiment analysis) using transformer architectures (BERT, GPT, T5, LLaMA).
• Build scalable NLP pipelines for real-time and batch processing of large text data; optimize models for performance and deploy on cloud platforms (AWS, GCP, Azure).
• Implement CI/CD pipelines for automated training, deployment, and monitoring; integrate NLP models with search engines, recommendation systems, and RAG techniques.
• Ensure ethical AI practices and mentor junior engineers.

Required Skills:
• Expert Python skills with NLP libraries (Hugging Face, SpaCy, NLTK).
• Experience with transformer-based models (BERT, GPT, T5) and deploying them at scale (Flask, Kubernetes, cloud services).
• Strong knowledge of model optimization, data pipelines (Spark, Dask), and vector databases.
• Familiarity with MLOps, CI/CD (MLflow, DVC), cloud platforms, and data privacy regulations.

Nice to Have:
• Experience with multimodal AI, conversational AI (Rasa, OpenAI API), graph-based NLP, knowledge graphs, and A/B testing for model improvement.
• Contributions to open-source NLP projects or a strong publication record.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear the screening rounds: complete the assessments once you are shortlisted. As soon as you pass all screening rounds (assessments and interviews), you will be added to our Expert Community.
4. Profile matching and project allocation: be patient while we align your skills and preferences with available projects.

Skip the noise. Focus on opportunities built for you!
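Summarization, one of the NLP tasks this role lists, has a classic extractive baseline: score each sentence by the frequency of its words across the document and keep the top scorers. A stdlib-only sketch (the sample text is illustrative; transformer models replace the scoring, not the overall select-and-return shape):

```python
import re
from collections import Counter

def summarize(text, n=1):
    # Split into sentences, score each by summed word frequency,
    # and return the n highest-scoring sentences in original order.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    words = re.findall(r"[a-z]+", text.lower())
    freq = Counter(words)

    def score(sentence):
        return sum(freq[w] for w in re.findall(r"[a-z]+", sentence.lower()))

    top = sorted(sentences, key=score, reverse=True)[:n]
    return [s for s in sentences if s in top]

text = ("Transformers changed NLP. Transformers also power search. "
        "Lunch was good.")
summary = summarize(text, n=1)
```

The repeated word "Transformers" pulls its sentences to the top, which is exactly the intuition frequency-based extractive methods exploit.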

Posted 6 days ago

Apply

2.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


About Neo Group:
Neo is a new-age, focused wealth and asset management platform in India, catering to HNIs, UHNIs, and multi-family offices. Neo stands on its three pillars of unbiased advisory, transparency, and cost-efficiency to offer comprehensive, trustworthy solutions. Founded by Nitin Jain (ex-CEO of Edelweiss Wealth), Neo has amassed over USD 3 billion (₹25,000 Cr.) of Assets Under Advice within a short span of 2 years since inception, including USD 360 million (₹3,000 Cr.) of Assets Under Management. We have recently partnered with Peak XV Partners via a USD 35 million growth round. To know more, please visit: www.neo-group.in

Position: Senior Data Scientist
Location: Mumbai
Experience: 4-8 years

Job Description:
You are a data pro with deep statistical knowledge and analytical aptitude. You know how to make sense of massive amounts of data and gather deep insights. You will use statistics, data mining, machine learning, and deep learning techniques to deliver data-driven insights for clients. You will dig deep to understand their challenges and create innovative yet practical solutions.

Responsibilities:
• Meet with the business team to discuss user interface ideas and applications.
• Select features and build and optimize classifiers using machine learning techniques.
• Mine data using state-of-the-art methods.
• Perform ad-hoc analysis and present results clearly.
• Optimize applications for maximum speed and scalability.
• Ensure that all user input is validated before submitting code.
• Collaborate with other team members and stakeholders.
• Take ownership of features, with accountability.

Requirements:
• 4+ years' experience in developing data models.
• Excellent understanding of machine learning techniques and algorithms, such as k-NN, Naive Bayes, SVM, decision forests, etc.
• Excellent understanding of NLP and language processing.
• Proficient understanding of Python or PySpark.
• Good experience with Python and databases such as MongoDB or MySQL.
• Good applied statistics skills, such as distributions, statistical testing, regression, etc.
• Built acquisition scorecard models.
• Built behaviour scorecard models.
• Created threat detection models.
• Created risk profiling or classification models.
• Built threat/fraud triggers from various sources of data.
• Experience with data analysis libraries: NumPy, Pandas, Statsmodels, Dask.
• Good understanding of Word2vec, RNNs, Transformers, BERT, ResNet, MobileNet, U-Net, Mask R-CNN, Siamese networks, Grad-CAM, image augmentation techniques, GANs, TensorBoard.
• Ability to provide accurate estimates for tasks and detailed breakdowns for planning and managing sprints.
• Deployment experience (Flask, TensorFlow Serving, Lambda functions, Docker) is a plus.
• Previous experience leading a DS team is a plus.

Personal Qualities:
• Ability to perform well in a fast-paced environment.
• Excellent analytical and multitasking skills.
• Stays up to date on emerging technologies.
• Data-oriented personality.

Why join us?
We will provide you with the opportunity to challenge yourself and learn new skills as you become an integral part of our growth story. We are a group of ambitious people who believe in building a business environment around new-age concepts, frameworks, and technologies, built on a strong foundation of industry expertise. We promise you the prospect of being surrounded by smart, ambitious, motivated people, day in and day out. That's the kind of work you can expect to do at Neo.
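k-NN, the first algorithm in the requirements above, can be stated end to end in a few lines: classify a point by majority vote among its k nearest labeled neighbors. A stdlib-only sketch on made-up 2-D points (the risk labels are invented to echo the posting's scorecard theme):

```python
import math
from collections import Counter

def knn_predict(train, point, k=3):
    # train: list of (features, label) pairs; classify `point` by
    # majority vote among the k nearest training examples
    # (Euclidean distance).
    nearest = sorted(train, key=lambda fl: math.dist(fl[0], point))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [
    ([1.0, 1.0], "low_risk"),
    ([1.2, 0.8], "low_risk"),
    ([0.9, 1.1], "low_risk"),
    ([5.0, 5.0], "high_risk"),
    ([5.2, 4.8], "high_risk"),
]

label = knn_predict(train, [1.1, 0.9], k=3)
```

Scikit-learn's KNeighborsClassifier implements the same idea with efficient spatial indexes; the brute-force version above is mainly useful for understanding and testing.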

Posted 6 days ago

Apply

2.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


About Neo Group:
Neo is a new-age, focused wealth and asset management platform in India, catering to HNIs, UHNIs, and multi-family offices. Neo stands on its three pillars of unbiased advisory, transparency, and cost-efficiency to offer comprehensive, trustworthy solutions. Founded by Nitin Jain (ex-CEO of Edelweiss Wealth), Neo has amassed over USD 3 billion (₹25,000 Cr.) of Assets Under Advice within a short span of 2 years since inception, including USD 360 million (₹3,000 Cr.) of Assets Under Management. We have recently partnered with Peak XV Partners via a USD 35 million growth round. To know more, please visit: www.neo-group.in

Position: Data Scientist
Location: Mumbai
Experience: 2-5 years

Job Description:
You are a data pro with deep statistical knowledge and analytical aptitude. You know how to make sense of massive amounts of data and gather deep insights. You will use statistics, data mining, machine learning, and deep learning techniques to deliver data-driven insights for clients. You will dig deep to understand their challenges and create innovative yet practical solutions.

Responsibilities:
• Meet with the business team to discuss user interface ideas and applications.
• Select features and build and optimize classifiers using machine learning techniques.
• Mine data using state-of-the-art methods.
• Perform ad-hoc analysis and present results clearly.
• Optimize applications for maximum speed and scalability.
• Ensure that all user input is validated before submitting code.
• Collaborate with other team members and stakeholders.
• Take ownership of features, with accountability.

Requirements:
• 2+ years' experience in developing data models.
• Excellent understanding of machine learning techniques and algorithms, such as k-NN, Naive Bayes, SVM, decision forests, etc.
• Excellent understanding of NLP and language processing.
• Proficient understanding of Python or PySpark.
• Basic understanding of Python and databases such as MongoDB or MySQL.
• Good applied statistics skills, such as distributions, statistical testing, regression, etc.
• Built acquisition scorecard models.
• Built behaviour scorecard models.
• Created threat detection models.
• Created risk profiling or classification models.
• Built threat/fraud triggers from various sources of data.
• Experience with data analysis libraries: NumPy, Pandas, Statsmodels, Dask.
• Good understanding of Word2vec, RNNs, Transformers, BERT, ResNet, MobileNet, U-Net, Mask R-CNN, Siamese networks, Grad-CAM, image augmentation techniques, GANs, TensorBoard.
• Ability to provide accurate estimates for tasks and detailed breakdowns for planning and managing sprints.
• Deployment experience (Flask, TensorFlow Serving, Lambda functions, Docker) is a plus.
• Previous experience leading a DS team is a plus.

Personal Qualities:
• Ability to perform well in a fast-paced environment.
• Excellent analytical and multitasking skills.
• Stays up to date on emerging technologies.
• Data-oriented personality.

Why join us?
We will provide you with the opportunity to challenge yourself and learn new skills as you become an integral part of our growth story. We are a group of ambitious people who believe in building a business environment around new-age concepts, frameworks, and technologies, built on a strong foundation of industry expertise. We promise you the prospect of being surrounded by smart, ambitious, motivated people, day in and day out. That's the kind of work you can expect to do at Neo.

Posted 6 days ago

Apply

4.0 - 7.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Your Team Responsibilities Datadesk is a Go to team for any data requirements, new vendor data on-boardings across business units within MSCI. As part of Datadesk, you are proficient in handling a diverse range of datasets, including (Equity, FI, Crypto, Pharma, Thematic, ESG, Private, etc.) Datadesk is a Centralized team to manage askData service. Datadesk also Participates in integration of recently acquired companies. Early adopter of new technologies (AI, Cloud, DSP, etc.) Your Key Responsibilities Utilize Python and frameworks like pandas, numpy, and dask to process, aggregate, and manipulate large financial datasets. Apply statistical modeling and AI techniques to improve data processing, forecasting, and decision-making. Identify opportunities for AI adoption in data processing, analytics, and decision-making. Ensure data quality, integrity, and consistency across different sources. Create presentations and reports that effectively communicate data findings to stakeholders. Take ownership of assigned tasks with minimal supervision, ensuring timely and high-quality deliverables. Your Skills And Experience That Will Help You Excel Degree in computer science, statistics, Information Technology and/or Finance with 4 -7 years of professional experience. You are proficient in PYTHON and it's various frameworks like pandas , numpy and dask. Good knowledge in statistics as well as statistical modelling which will be used for aggregating big data set , resampling data, and explaining data. Data Visualization Framework like Power Bi , Stream lit or any other Python Frontend framework is a plus. You have strong interest in Finance - work experience in finance and /or capital markets You have experience dealing with providers of financial data products (MSCI, Refinitiv, ICE, S&P, Factset etc.) preferred Good communication skills (written and oral) and proficiency in creating presentations. 
You are an independent worker who can drive certain parts of the work with minimal oversight About MSCI What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion belonging and connection, including eight Employee Resource Groups. All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. 
We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.
To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.
Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.

Posted 6 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description: Data Engineer
We are looking for a highly skilled Data Engineer to design, build, and maintain robust data pipelines and infrastructure. This role requires expertise in Python, PySpark, SQL, and modern cloud platforms such as Snowflake. The ideal candidate will collaborate with business stakeholders and analytics teams to ensure the efficient collection, transformation, and delivery of data to power insights and decision-making.
Responsibilities
Understand business requirements, system designs, and security standards. Collaborate with SMEs to analyze existing processes, gather functional requirements, and identify improvements. Build and streamline data pipelines using Python, PySpark, SQL, and Spark from various data sources. Support data cataloging and knowledge base development. Develop tools for analytics and data science teams to optimize data product consumption. Enhance data system functionality in collaboration with data and analytics experts. Communicate insights using statistical analysis, data visualization, and storytelling techniques. Manage technical and business documentation for all data engineering efforts. Participate in hands-on development and coordinate with onshore/offshore teams.
Requirements
5+ years of experience building data pipelines on on-premise and cloud platforms (e.g., Snowflake). Strong expertise in Python, PySpark, and SQL for data ingestion, transformation, and automation. Experience in developing Python-based applications with visualization libraries such as Plotly and Streamlit. Solid knowledge of data engineering concepts and practices, including metadata management and data governance. Proficiency with cloud-based data warehousing and data lake environments. Familiarity with ELT/ETL tools like DBT and Cribl. Experience with incremental data capture, stream ingestion, and real-time data processing.
Preferred Qualifications
Background in cybersecurity, IT infrastructure, or software systems.
3+ years of experience in cloud-based data warehouse and data lake architectures. Hands-on experience with data visualization tools (e.g., Tableau, Plotly, Streamlit). Strong communication skills and the ability to translate complex data into actionable insights.
Technical Skills
Python, PySpark, SQL, Snowflake (or other cloud data platforms), Plotly, Streamlit, Flask, Dask, ELT/ETL tools (DBT, Cribl), data visualization (Tableau, Plotly), metadata management & data governance, stream processing & real-time data ingestion.
Skills: Python, SQL, Cloud Platform
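The incremental data capture mentioned in the requirements is commonly implemented with a watermark pattern. Below is a minimal stdlib sketch of that idea, with sqlite3 standing in for a warehouse such as Snowflake; all table and column names are invented for illustration.

```python
import sqlite3

# Toy watermark-based incremental load: sqlite3 stands in for a warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, loaded_at INTEGER, value TEXT)")
conn.execute("CREATE TABLE target (id INTEGER, loaded_at INTEGER, value TEXT)")
conn.executemany("INSERT INTO staging VALUES (?, ?, ?)",
                 [(1, 100, "a"), (2, 150, "b"), (3, 200, "c")])

def incremental_load(conn, watermark):
    """Copy only rows newer than the last processed watermark."""
    conn.execute(
        "INSERT INTO target SELECT * FROM staging WHERE loaded_at > ?",
        (watermark,),
    )
    # The new watermark is the max timestamp loaded so far.
    return conn.execute("SELECT MAX(loaded_at) FROM target").fetchone()[0]

wm = incremental_load(conn, 120)   # picks up rows with loaded_at 150 and 200
print(wm)
```

The same pattern scales to PySpark or Snowflake by swapping the storage layer while keeping the watermark bookkeeping.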

Posted 6 days ago

Apply

0.0 - 16.0 years

0 Lacs

Bengaluru, Karnataka

On-site


Category: Software Development/ Engineering Main location: India, Karnataka, Bangalore Position ID: J0625-0079 Employment Type: Full Time Position Description: Company Profile: Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com. Position: Manage Consulting Expert- AI Architect Experience: 13-16 years Category: Software Development/ Engineering Shift Timing: General Shift Location: Bangalore Position ID: J0625-0079 Employment Type: Full Time Education Qualification: Bachelor's degree in Computer Science or related field or higher with minimum 13 years of relevant experience. We are looking for an experienced and visionary AI Architect with a strong engineering background and hands-on implementation experience to lead the development and deployment of AI-powered solutions. The ideal candidate will have a minimum of 13–16 years of experience in software and AI systems design, including extensive exposure to large language models (LLMs), vector databases, and modern AI frameworks such as LangChain. This role requires a balance of strategic architectural planning and tactical engineering execution, working across teams to bring intelligent applications to life. Your future duties and responsibilities: Design robust, scalable architectures for AI/ML systems, including LLM-based and generative AI solutions. 
Lead the implementation of AI features and services in enterprise-grade products with clear, maintainable code. Develop solutions using LangChain, orchestration frameworks, and vector database technologies. Collaborate with product managers, data scientists, ML engineers, and business stakeholders to gather requirements and translate them into technical designs. Guide teams on best practices for AI system integration, deployment, and monitoring. Define and implement architecture governance, patterns, and reusable frameworks for AI applications. Stay current with emerging AI trends, tools, and methodologies to continuously enhance architecture strategy. Oversee development of Proof-of-Concepts (PoCs) and Minimum Viable Products (MVPs) to validate innovative ideas. Ensure systems are secure, scalable, and high-performing in production environments. Mentor junior engineers and architects to build strong AI and engineering capabilities within the team.
Required qualifications to be successful in this role:
Must-have skills: 13–16 years of overall experience in software development, with at least 5+ years in AI/ML system architecture and delivery. Proven expertise in developing and deploying AI/ML models in production environments. Deep knowledge of LLMs, LangChain, prompt engineering, RAG (retrieval-augmented generation), and vector search. Strong programming and system design skills with a solid engineering foundation. Exceptional ability to communicate complex concepts clearly to technical and non-technical stakeholders. Experience with Agile methodologies and cross-functional team leadership.
Programming Languages: Python, Java, Scala, SQL AI/ML Frameworks: LangChain, TensorFlow, PyTorch, Scikit-learn, Hugging Face Transformers Data Processing: Apache Spark, Kafka, Pandas, Dask Vector Stores & Retrieval Systems: FAISS, Pinecone, Weaviate, Chroma Cloud Platforms: AWS (SageMaker, Lambda), Azure (ML Studio, OpenAI), Google Cloud AI MLOps & DevOps: Docker, Kubernetes, MLflow, Kubeflow, Airflow, CI/CD tools (GitHub Actions, Jenkins) Databases: PostgreSQL, MongoDB, Redis, BigQuery, Snowflake Tools & Platforms: Databricks, Jupyter Notebooks, Git, Terraform Good to have Skills- Solution Engineering and Implementation Experience in AI Project. Skills: AWS Machine Learning English GitHub Python Jenkins Kubernetes Prometheus Snowflake What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
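One minimal way to picture the vector-search step behind the RAG systems listed above is brute-force cosine similarity over toy embeddings. A real deployment would use a vector store such as FAISS, Pinecone, Weaviate, or Chroma; the vectors and document names here are invented.

```python
import math

# Toy document "embeddings" (invented 3-dimensional vectors).
docs = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def retrieve(query_vec, k=2):
    """Return the k documents most similar to the query vector."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([1.0, 0.05, 0.0]))
```

In a full RAG pipeline, the retrieved documents would then be injected into the LLM prompt as grounding context.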

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Position Title: AI/ML Engineer Company : Cyfuture India Pvt. Ltd. Industry : IT Services and IT Consulting Location : Sector 81, NSEZ, Noida (5 Days Work From Office) Website : www.cyfuture.com About Cyfuture Cyfuture is a trusted name in IT services and cloud infrastructure, offering state-of-the-art data center solutions and managed services across platforms like AWS, Azure, and VMWare. We are expanding rapidly in system integration and managed services, building strong alliances with global OEMs like VMWare, AWS, Azure, HP, Dell, Lenovo, and Palo Alto. Position Overview We are hiring an experienced AI/ML Engineer to lead and shape our AI/ML initiatives. The ideal candidate will have hands-on experience in machine learning and artificial intelligence, with strong leadership capabilities and a passion for delivering production-ready solutions. This role involves end-to-end ownership of AI/ML projects, from strategy development to deployment and optimization of large-scale systems. Key Responsibilities Lead and mentor a high-performing AI/ML team. Design and execute AI/ML strategies aligned with business goals. Collaborate with product and engineering teams to identify impactful AI opportunities. Build, train, fine-tune, and deploy ML models in production environments. Manage operations of LLMs and other AI models using modern cloud and MLOps tools. Implement scalable and automated ML pipelines (e.g., with Kubeflow or MLRun). Handle containerization and orchestration using Docker and Kubernetes. Optimize GPU/TPU resources for training and inference tasks. Develop efficient RAG pipelines with low latency and high retrieval accuracy. Automate CI/CD workflows for continuous integration and delivery of ML systems. Key Skills & Expertise 1. Cloud Computing & Deployment Proficiency in AWS, Google Cloud, or Azure for scalable model deployment. Familiarity with cloud-native services like AWS SageMaker, Google Vertex AI, or Azure ML. 
Expertise in Docker and Kubernetes for containerized deployments. Experience with Infrastructure as Code (IaC) using tools like Terraform or CloudFormation.
2. Machine Learning & Deep Learning
Strong command of frameworks: TensorFlow, PyTorch, Scikit-learn, XGBoost. Experience with MLOps tools for integration, monitoring, and automation. Expertise in pre-trained models, transfer learning, and designing custom architectures.
3. Programming & Software Engineering
Strong skills in Python (NumPy, Pandas, Matplotlib, SciPy) for ML development. Backend/API development with FastAPI, Flask, or Django. Database handling with SQL and NoSQL (PostgreSQL, MongoDB, BigQuery). Familiarity with CI/CD pipelines (GitHub Actions, Jenkins).
4. Scalable AI Systems
Proven ability to build AI-driven applications at scale, handling large datasets, high-throughput requests, and real-time inference. Knowledge of distributed computing: Apache Spark, Dask, Ray.
5. Model Monitoring & Optimization
Hands-on experience with model compression, quantization, and pruning. A/B testing and performance tracking in production. Knowledge of model retraining pipelines for continuous learning.
6. Resource Optimization
Efficient use of compute resources: GPUs, TPUs, CPUs. Experience with serverless architectures to reduce cost. Auto-scaling and load balancing for high-traffic systems.
7. Problem-Solving & Collaboration
Translate complex ML models into user-friendly applications. Work effectively with data scientists, engineers, and product teams. Write clear technical documentation and architecture reports.
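The quantization technique named under model compression above can be sketched as simple post-training 8-bit affine quantization. This is a toy illustration with invented weight values, not any specific framework's implementation.

```python
# Toy post-training 8-bit affine quantization of a list of weights.
def quantize(weights, bits=8):
    qmax = 2 ** bits - 1
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / qmax if hi != lo else 1.0
    # Map each float weight to an integer in [0, qmax].
    q = [round((w - lo) / scale) for w in weights]
    return q, scale, lo

def dequantize(q, scale, lo):
    return [x * scale + lo for x in q]

w = [-0.51, 0.0, 0.27, 0.98]          # invented weights
q, scale, lo = quantize(w)
w_hat = dequantize(q, scale, lo)
max_err = max(abs(a - b) for a, b in zip(w, w_hat))
print(max_err)  # reconstruction error is bounded by half a quantization step
```

Real model compression applies this per-tensor or per-channel and often combines it with pruning and retraining to recover accuracy.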

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site


Role: AI Engineer
Join AiDP: Revolutionizing Document Automation through AI
At AiDP, we're transforming complex document workflows into seamless experiences with powerful AI-driven automation. We're on a mission to redefine efficiency, accuracy, and collaboration in finance, insurance, and compliance. To continue pushing boundaries, we’re looking for exceptional talent.
Your Mission:
Develop, deploy, and optimize cutting-edge machine learning models for accurate extraction and structuring of data from complex documents. Design and implement scalable NLP pipelines to handle vast quantities of unstructured and structured data. Continuously refine models through experimentation and data-driven analysis to maximize accuracy and efficiency. Collaborate closely with product and engineering teams to deliver impactful, real-world solutions.
We’re looking for:
Proven expertise in NLP, machine learning, and deep learning, with solid knowledge of frameworks such as PyTorch, TensorFlow, Hugging Face, or scikit-learn. Strong proficiency in Python and experience with data processing tools (Pandas, NumPy, Dask). Experience deploying models to production using containerization technologies (Docker, Kubernetes) and cloud platforms (AWS, Azure, GCP). Familiarity with version control systems (Git) and continuous integration/continuous deployment (CI/CD) pipelines. Background in computer science, including an understanding of algorithms, data structures, and software engineering best practices. Strong analytical thinking, problem-solving skills, and a passion for tackling challenging issues in document automation and compliance workflows.
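A deliberately simplified sketch of the extraction-and-structuring task described above: pulling one structured field out of free document text. The pattern and sample document are invented, and production systems would use learned models rather than a single regex.

```python
import re

# Hypothetical pattern for one field of a toy invoice document.
INVOICE_TOTAL = re.compile(r"Total\s*:\s*\$?([0-9]+(?:\.[0-9]{2})?)")

def extract_total(text):
    """Return the invoice total as a float, or None if not found."""
    m = INVOICE_TOTAL.search(text)
    return float(m.group(1)) if m else None

doc = "Invoice 42\nSubtotal: $90.00\nTax: $9.00\nTotal: $99.00\n"
print(extract_total(doc))
```

The hard part of real document automation is that layouts vary; that is where the NLP pipelines in the posting replace hand-written patterns like this one.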

Posted 1 week ago

Apply

8.0 years

0 Lacs

Vadodara, Gujarat, India

On-site


At Rearc, we're committed to empowering engineers to build awesome products and experiences. Success as a business hinges on our people's ability to think freely, challenge the status quo, and speak up about alternative problem-solving approaches. If you're an engineer driven by the desire to solve problems and make a difference, you're in the right place! Our approach is simple — empower engineers with the best tools possible to make an impact within their industry. We're on the lookout for engineers who thrive on ownership and freedom, possessing not just technical prowess, but also exceptional leadership skills. Our ideal candidates are hands-on leaders who don't just talk the talk but also walk the walk, designing and building solutions that push the boundaries of cloud computing. As a Senior Data Engineer at Rearc, you will be at the forefront of driving technical excellence within our data engineering team. Your expertise in data architecture, cloud-native solutions, and modern data processing frameworks will be essential in designing workflows that are optimized for efficiency, scalability, and reliability. You'll leverage tools like Databricks, PySpark, and Delta Lake to deliver cutting-edge data solutions that align with business objectives. Collaborating with cross-functional teams, you will design and implement scalable architectures while adhering to best practices in data management and governance . Building strong relationships with both technical teams and stakeholders will be crucial as you lead data-driven initiatives and ensure their seamless execution. What You Bring 8+ years of experience in data engineering, showcasing expertise in diverse architectures, technology stacks, and use cases. Strong expertise in designing and implementing data warehouse and data lake architectures, particularly in AWS environments. 
Extensive experience with Python for data engineering tasks, including familiarity with libraries and frameworks commonly used in Python-based data engineering workflows. Proven experience with data pipeline orchestration using platforms such as Airflow, Databricks, DBT or AWS Glue. Hands-on experience with data analysis tools and libraries like PySpark, NumPy, Pandas, or Dask. Proficiency with Spark and Databricks is highly desirable. Experience with SQL and NoSQL databases, including PostgreSQL, Amazon Redshift, Delta Lake, Iceberg, and DynamoDB. In-depth knowledge of data architecture principles and best practices, especially in cloud environments. Proven experience with AWS services, including expertise in using AWS CLI, SDK, and Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or AWS CDK. Exceptional communication skills, capable of clearly articulating complex technical concepts to both technical and non-technical stakeholders. Demonstrated ability to quickly adapt to new tasks and roles in a dynamic environment.
Collaboration and Mentorship: Collaborate closely with cross-functional teams to understand requirements and deliver impactful data solutions. Mentor and coach junior team members, fostering their growth and development in data engineering practices. Thought Leadership: Contribute to thought leadership in the data engineering domain through technical articles, conference presentations, and participation in industry forums. Some More About Us Founded in 2016, we pride ourselves on fostering an environment where creativity flourishes, bureaucracy is non-existent, and individuals are encouraged to challenge the status quo. We're not just a company; we're a community of problem-solvers dedicated to improving the lives of fellow software engineers. Our commitment is simple - finding the right fit for our team and cultivating a desire to make things better. If you're a cloud professional intrigued by our problem space and eager to make a difference, you've come to the right place. Join us, and let's solve problems together!

Posted 1 week ago

Apply

6.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description The Risk division is responsible for credit, market and operational risk, model risk, independent liquidity risk, and insurance throughout the firm. RISK BUSINESS The Risk Business identifies, monitors, evaluates, and manages the firm’s financial and non-financial risks in support of the firm’s Risk Appetite Statement and the firm’s strategic plan. Operating in a fast paced and dynamic environment and utilizing the best in class risk tools and frameworks, Risk teams are analytically curious, have an aptitude to challenge, and an unwavering commitment to excellence. Overview To ensure uncompromising accuracy and timeliness in the delivery of the risk metrics, our platform is continuously growing and evolving. Risk Engineering combines the principles of Computer Science, Mathematics and Finance to produce large scale, computationally intensive calculations of risk Goldman Sachs faces with each transaction we engage in. As an Engineer in the Risk Engineering organization, you will have the opportunity to impact one or more aspects of risk management. You will work with a team of talented engineers to drive the build & adoption of common tools, platforms, and applications. The team builds solutions that are offered as a software product or as a hosted service. We are a dynamic team of talented developers and architects who partner with business areas and other technology teams to deliver high profile projects using a raft of technologies that are fit for purpose (Java, Cloud computing, HDFS, Spark, S3, ReactJS, Sybase IQ among many others). A glimpse of the interesting problems that we engineer solutions for, include acquiring high quality data, storing it, performing risk computations in limited amount of time using distributed computing, and making data available to enable actionable risk insights through analytical and response user interfaces. 
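As a toy example of the kind of risk-metric computation this overview describes, here is a one-function historical Value-at-Risk estimate over an invented return series. Real systems distribute such calculations across many positions and scenarios; this is only the core empirical-quantile idea.

```python
# Toy historical Value-at-Risk from a sample of daily returns (invented data).
def historical_var(returns, confidence=0.95):
    """Loss threshold exceeded in (1 - confidence) of historical scenarios."""
    losses = sorted(-r for r in returns)   # convert returns to losses, ascending
    idx = int(confidence * len(losses))    # simple empirical quantile index
    idx = min(idx, len(losses) - 1)
    return losses[idx]

rets = [0.01, -0.02, 0.003, -0.05, 0.02, -0.01, 0.015, -0.03, 0.005, 0.0]
print(historical_var(rets))  # the 95% one-day loss threshold for this sample
```

Production risk engines run this style of computation across millions of positions, which is why the posting emphasizes distributed computing.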
What We Look For
Senior Developer in large projects across a global team of developers and risk managers. Performance tune applications to improve memory and CPU utilization. Perform statistical analyses to identify trends and exceptions related to Market Risk metrics. Build internal and external reporting for the output of risk metric calculation using data extraction tools, such as SQL, and data visualization tools, such as Tableau. Utilize web development technologies to facilitate application development for front end UI used for risk management actions. Develop software for calculations using databases like Snowflake, Sybase IQ and distributed HDFS systems. Interact with business users to resolve issues with applications. Design and support batch processes using scheduling infrastructure for calculation and distributing data to other systems. Oversee junior technical team members in all aspects of the Software Development Life Cycle (SDLC), including design, code review and production migrations.
Skills And Experience
Bachelor’s degree in Computer Science, Mathematics, Electrical Engineering or related technical discipline. 6-9 years’ experience working in a risk technology team at another bank or financial institution. Experience in market risk technology is a plus. Experience with one or more major relational / object databases. Experience in software development, including a clear understanding of data structures, algorithms, software design and core programming concepts. Comfortable multi-tasking, managing multiple stakeholders and working as part of a team. Comfortable working with multiple languages. Technologies: Scala, Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience working with process scheduling platforms like Apache Airflow.
Should be ready to work with GS proprietary technology like Slang/SECDB. An understanding of compute resources and the ability to interpret performance metrics (e.g., CPU, memory, threads, file handles). Knowledge and experience in distributed computing: parallel computation on a single machine with tools like Dask, and distributed processing on public cloud. Knowledge of the SDLC and experience working through the entire life cycle of a project from start to end.
About Goldman Sachs
At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in 1869, we are a leading global investment banking, securities and investment management firm. Headquartered in New York, we maintain offices around the world. We believe who you are makes you better at what you do. We're committed to fostering and advancing diversity and inclusion in our own workplace and beyond by ensuring every individual within our firm has a number of opportunities to grow professionally and personally, from our training and development opportunities and firmwide networks to benefits, wellness and personal finance offerings and mindfulness programs. Learn more about our culture, benefits, and people at GS.com/careers. We’re committed to finding reasonable accommodations for candidates with special needs or disabilities during our recruiting process. Learn more: https://www.goldmansachs.com/careers/footer/disability-statement.html © The Goldman Sachs Group, Inc., 2023. All rights reserved. Goldman Sachs is an equal employment/affirmative action employer Female/Minority/Disability/Veteran/Sexual Orientation/Gender Identity

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Sanas is revolutionizing the way we communicate with the world’s first real-time algorithm, designed to modulate accents, eliminate background noises, and magnify speech clarity. Pioneered by seasoned startup founders with a proven track record of creating and steering multiple unicorn companies, our groundbreaking GDP-shifting technology sets a gold standard. Sanas is a 200-strong team, established in 2020. In this short span, we’ve successfully secured over $100 million in funding. Our innovations have been supported by the industry’s leading investors, including Insight Partners, Google Ventures, Quadrille Capital, General Catalyst, Quiet Capital, and other influential investors. Our reputation is further solidified by collaborations with numerous Fortune 100 companies. With Sanas, you’re not just adopting a product; you’re investing in the future of communication. We’re looking for a sharp, hands-on Data Engineer to help us build and scale the data infrastructure that powers cutting-edge audio and speech AI products. You’ll be responsible for designing robust pipelines, managing high-volume audio data, and enabling machine learning teams to access the right data — fast. As one of the first dedicated data engineers on the team, you'll play a foundational role in shaping how we handle data end-to-end, from ingestion to training-ready features. You'll work closely with ML engineers, research scientists, and product teams to ensure data is clean, accessible, and structured for experimentation and production.
Key Responsibilities : Build scalable, fault-tolerant pipelines for ingesting, processing, and transforming large volumes of audio and metadata Design and maintain ETL workflows for training and evaluating ML models, using tools like Airflow or custom pipelines Collaborate with ML research scientists to make raw and derived audio features (e.g., spectrograms, MFCCs) efficiently available for training and inference Manage and organize datasets, including labeling workflows, versioning, annotation pipelines, and compliance with privacy policies Implement data quality, observability, and validation checks across critical data pipelines Help optimize data storage and compute strategies for large-scale training Qualifications : 2–5 years of experience as a Data Engineer, Software Engineer, or similar role with a focus on data infrastructure Proficient in Python, SQL, and working with distributed data processing tools (e.g., Spark, Dask, Beam) Experience with cloud data infrastructure (AWS/GCP), object storage (e.g.,S3), and data orchestration tools Familiarity with audio data and its unique challenges (large file sizes, time-series features, metadata handling) is a strong plus Comfortable working in a fast-paced, iterative startup environment where systems are constantly evolving Strong communication skills and a collaborative mindset — you’ll be working cross-functionally with ML, infra, and product teams Nice to Have : Experience with data for speech models like ASR, TTS, or speaker verification Knowledge of real-time data processing (e.g., Kafka, WebSockets, or low-latency APIs) Background in MLOps, feature engineering, or supporting model lifecycle workflows Experience with labeling tools, audio annotation platforms, or human-in-the-loop systems Joining us means contributing to the world’s first real-time speech understanding platform revolutionizing Contact Centers and Enterprises alike. 
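The raw-to-derived audio feature step mentioned above (on the way to spectrograms or MFCCs) starts with short-time framing of a waveform. A minimal sketch with a synthetic signal and arbitrary frame sizes:

```python
import math

# Slice a waveform into overlapping frames and compute short-time energy,
# the simplest per-frame feature (real pipelines go on to FFTs and MFCCs).
def frame_energies(signal, frame_size=4, hop=2):
    feats = []
    for start in range(0, len(signal) - frame_size + 1, hop):
        frame = signal[start:start + frame_size]
        feats.append(sum(x * x for x in frame))   # energy of this frame
    return feats

# Synthetic sine wave standing in for real audio samples.
wave = [math.sin(2 * math.pi * 0.1 * n) for n in range(12)]
print(frame_energies(wave))
```

At scale, this per-file transformation is exactly the kind of job the posting's Spark/Dask/Beam pipelines parallelize across millions of recordings.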
Our technology empowers agents, transforms customer experiences, and drives measurable growth. But this is just the beginning. You'll be part of a team exploring the vast potential of an increasingly sonic future.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


We are seeking a highly skilled and motivated Lead DS/ML Engineer to join our team. The role is critical to the development of a cutting-edge reporting, insights, and recommendations platform designed to measure and optimize online marketing campaigns. The ideal candidate has a strong foundation in data engineering (ELT, data pipelines) and advanced machine learning, and will focus on building scalable data pipelines, developing ML models, and deploying solutions in production. The candidate should be comfortable working across data engineering, the ML model lifecycle, and cloud-native technologies.
Job Description:
Key Responsibilities:
Data Engineering & Pipeline Development
Design, build, and maintain scalable ELT pipelines for ingesting, transforming, and processing large-scale marketing campaign data. Ensure high data quality, integrity, and governance using orchestration tools like Apache Airflow, Google Cloud Composer, or Prefect. Optimize data storage, retrieval, and processing using BigQuery, Dataflow, and Spark for both batch and real-time workloads. Implement data modeling and feature engineering for ML use cases.
Machine Learning Model Development & Validation
Develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization. Experiment with different algorithms (regression, classification, clustering, reinforcement learning) to drive insights and recommendations. Leverage NLP, time-series forecasting, and causal inference models to improve campaign attribution and performance analysis. Optimize models for scalability, efficiency, and interpretability.
MLOps & Model Deployment
Deploy and monitor ML models in production using tools such as Vertex AI, MLflow, Kubeflow, or TensorFlow Serving.
Implement CI/CD pipelines for ML models, ensuring seamless updates and retraining. Develop real-time inference solutions and integrate ML models into BI dashboards and reporting platforms. Cloud & Infrastructure Optimization Design cloud-native data processing solutions on Google Cloud Platform (GCP), leveraging services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow. Work on containerized deployment (Docker, Kubernetes) for scalable model inference. Implement cost-efficient, serverless data solutions where applicable. Business Impact & Cross-functional Collaboration Work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives. Translate complex model insights into actionable business recommendations. Present findings and performance metrics to both technical and non-technical stakeholders. Qualifications & Skills: Educational Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field. Certifications in Google Cloud (Professional Data Engineer, ML Engineer) is a plus. Must-Have Skills: Experience: 5-10 years with the mentioned skillset & relevant hands-on experience Data Engineering: Experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer). ML Model Development: Strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP. Programming: Proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing. Cloud & Infrastructure: Expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms. MLOps & Deployment: Hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools). 
Data Warehousing & Real-time Processing: Strong knowledge of modern data platforms for batch and streaming data processing. Nice-to-Have Skills: Experience with Graph ML, reinforcement learning, or causal inference modeling. Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards. Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies. Experience with distributed computing frameworks (Spark, Dask, Ray). Location: Bengaluru Brand: Merkle Time Type: Full time Contract Type: Permanent Show more Show less
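The nice-to-have list above mentions distributed computing frameworks such as Spark, Dask, and Ray. As a rough illustration (not part of the posting), the per-campaign feature engineering such a role involves might look like the following sketch; it uses pandas on toy, invented data, and Dask's largely pandas-compatible DataFrame API would let the same groupby scale out over partitioned files.

```python
import pandas as pd

# Toy campaign data; column names and values are invented for the example.
df = pd.DataFrame({
    "campaign":    ["a", "a", "b", "b"],
    "clicks":      [10, 20, 5, 15],
    "impressions": [100, 200, 50, 300],
})

# Per-row click-through rate, then the mean CTR per campaign.
df["ctr"] = df["clicks"] / df["impressions"]
ctr_by_campaign = df.groupby("campaign")["ctr"].mean()
print(ctr_by_campaign.to_dict())
```

With Dask, `dask.dataframe.read_parquet(...)` would replace the in-memory DataFrame and a final `.compute()` would trigger execution; the transformation code itself stays essentially unchanged.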

Posted 1 week ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote

Linkedin logo

Who We Are Ontic makes software that corporate and government security professionals use to proactively manage threats, mitigate risks, and make businesses stronger. Built by security and software professionals, the Ontic Platform connects and unifies critical data, business processes, and collaborators in one place, consolidating security intelligence and operations. We call this Connected Intelligence. Ontic serves corporate security teams across key functions, including intelligence, investigations, GSOC, executive protection, and security operations. As Ontic employees, we put our mission first and value the trust bestowed upon us by our clients to help keep their people safe. We approach our clients and each other with empathy while focusing on the execution of our strategy. And we have fun doing it. Key Responsibilities Design, develop, and optimize machine learning models for various business applications. Build and maintain scalable AI feature pipelines for efficient data processing and model training. Develop robust data ingestion, transformation, and storage solutions for big data. Implement and optimize ML workflows, ensuring scalability and efficiency. Monitor and maintain deployed models, ensuring performance, reliability, and retraining when necessary. Qualifications And Experience Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field. 
4+ years of experience in machine learning, deep learning, or data science roles. Proficiency in Python and ML frameworks/tools such as PyTorch and LangChain. Experience with data processing frameworks like Spark, Dask, Airflow, and Dagster. Hands-on experience with cloud platforms (AWS, GCP, Azure) and ML services. Experience with MLOps tools like MLflow and Kubeflow. Familiarity with containerisation and orchestration tools like Docker and Kubernetes. Excellent problem-solving skills and ability to work in a fast-paced environment. Strong communication and collaboration skills. Ontic Benefits & Perks Competitive Salary Medical Benefits Internet Reimbursement Home Office Stipend Continued Education Stipend Festive & Achievement Celebrations Dynamic Office Environment Ontic is an equal opportunity employer. We are committed to a work environment that celebrates diversity. We do not discriminate against any individual based on race, color, sex, national origin, age, religion, marital status, sexual orientation, gender identity, gender expression, military or veteran status, disability, or any factors protected by applicable law. 
All Ontic employees are expected to understand and adhere to all Ontic Security and Privacy related policies in order to protect Ontic data and our clients' data.
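One of the responsibilities above is monitoring deployed models and retraining when necessary. A minimal sketch of that idea, using only the standard library and an invented mean-shift threshold (real monitoring would use proper drift statistics and production tooling such as MLflow):

```python
# Hypothetical sketch of a minimal model-input drift check, the kind of
# monitoring task described above. Threshold and data are invented.
from statistics import mean, stdev

def drift_score(baseline, live):
    """Absolute shift of the live mean, in baseline standard deviations."""
    return abs(mean(live) - mean(baseline)) / stdev(baseline)

def needs_retraining(baseline, live, threshold=2.0):
    """Flag retraining when the live feature distribution has shifted."""
    return drift_score(baseline, live) > threshold

baseline = [0.9, 1.0, 1.1, 1.0, 0.95, 1.05]   # feature values at training time
stable   = [1.0, 0.98, 1.02]                  # recent values, no drift
shifted  = [3.0, 3.1, 2.9]                    # recent values, clear drift

print(needs_retraining(baseline, stable))   # expect False
print(needs_retraining(baseline, shifted))  # expect True
```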

Posted 1 week ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Pune

Work from Office

Naukri logo

Job Description: Sr. Software Engineer (Python) Company: Karini AI Location: Pune (Wakad) Experience Required: 3-5 years Compensation: Not Disclosed Role Overview: We are seeking a skilled Sr. Software Engineer with advanced Python skills, a passion for product development, and knowledge of Machine Learning and/or Generative AI. You will collaborate with a talented team of engineers and AI Engineers to design and develop a high-quality Generative AI platform on AWS. Key Responsibilities: Design and develop backend applications and APIs using Python. Work on product development, building robust, scalable, and maintainable solutions. Integrate Generative AI models into production environments to solve real-world problems. Collaborate with cross-functional teams, including data scientists, product managers, and designers, to understand requirements and deliver solutions. Optimize application performance and ensure scalability across cloud environments. Write clean, maintainable, and efficient code while adhering to best practices. Requirements: 3-5 years of hands-on experience in product development. Demonstrable experience with advanced Python concepts for building scalable systems. Demonstrable experience running a FastAPI server in a production environment. Familiarity with unit testing, version control, and CI/CD. Good understanding of Machine Learning concepts and frameworks (e.g., TensorFlow, PyTorch). Experience with integrating and deploying ML models into applications is a plus. Knowledge of database systems (SQL/NoSQL) and RESTful API development. Exposure to containerization (Docker) and cloud platforms (AWS). Strong problem-solving skills and attention to detail. Preferred Qualifications: Bachelor of Engineering in Computer Science, Information Technology, or any other engineering discipline. M.Tech, M.E., or B.E. in Computer Science preferred. Hands-on experience in product-focused organizations.
Experience working with data pipelines or data engineering tasks. Knowledge of CI/CD pipelines and DevOps practices. Familiarity with version control tools like Git. Interest or experience in Generative AI or NLP applications. What We Offer: Top-tier compensation package, aligned with industry benchmarks. Comprehensive employee benefits including Provident Fund (PF) and medical insurance. Experience working with an ex-AWS founding team at a fast-growing company. Work on innovative AI-driven products that solve complex problems. Collaborate with a talented and passionate team in a dynamic environment. Opportunities for professional growth and skill enhancement in Generative AI. A supportive, inclusive, and flexible work culture that values creativity and ownership.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

As a Senior Data Scientist, you will drive data science initiatives from conception to deployment, crafting advanced ML models and providing mentorship to junior colleagues. You will collaborate seamlessly across teams to integrate data-driven solutions, maintain data governance compliance, stay abreast of industry trends, contribute to thought leadership, and innovate solutions for intricate business problems. Responsibilities Lead and manage data science projects from conception to deployment, ensuring alignment with business objectives and deadlines. Develop and deploy AI and statistical algorithms to extract insights and drive actionable recommendations from complex datasets. Provide guidance and mentorship to junior data scientists on advanced analytics techniques, coding best practices, and model interpretation. Design rigorous testing frameworks to evaluate model performance, validate results, and iterate on models to improve accuracy and reliability. Stay updated with the latest advancements in data science methodologies, tools, and technologies, and contribute to the team's knowledge base by sharing insights, attending conferences, and conducting research. Establish and maintain data governance policies, ensuring data integrity, security, and compliance with regulations. Qualifications 5+ years of prior analytics and data science experience driving projects involving AI and Advanced Analytics. 3+ years of experience with Deep Learning frameworks, NLP/text analytics, SVMs, LSTMs, Transformers, and neural networks. In-depth understanding of and hands-on experience working with Large Language Models, along with exposure to fine-tuning open-source models for a variety of use cases. Strong exposure to prompt engineering, plus knowledge of vector databases, the LangChain framework, and data embeddings. Strong problem-solving skills and the ability to iterate and experiment to optimize AI model behavior.
Proficiency in the Python programming language for data analysis, machine learning, and web development. Hands-on experience with machine learning libraries such as NumPy, SciPy, and scikit-learn. Experience with distributed computing frameworks such as PySpark and Dask. Excellent problem-solving skills and attention to detail. Ability to communicate effectively with diverse clients/stakeholders. Education Background Bachelor's in Computer Science, Statistics, Mathematics, or a related field. Tier I/II candidates preferred.
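The qualifications above name scikit-learn among the expected libraries. A small sketch of the basic supervised-learning workflow it supports: fit, predict, evaluate. The data is synthetic and the model is a toy baseline; no claim is made about this employer's actual stack.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in data for a binary classification task.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fit a baseline model and evaluate it on the held-out split.
model = LogisticRegression().fit(X_tr, y_tr)
acc = accuracy_score(y_te, model.predict(X_te))
print(f"holdout accuracy: {acc:.2f}")
```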

Posted 1 week ago

Apply

0 years

1 Lacs

India

On-site

We are looking for a passionate Python IoT developer to join our team at Magneto Dynamics. About Magneto Dynamics: From a humble beginning in 1989, we have come a long way in our quest for innovation. For us, quality has been a way of life, built into our system. Our business profile expanded with our quest for niche and innovative applications, and we ventured out to successfully develop intricate sub-assemblies and parts of flow meters for our OEM customers in the US and Europe. These include complete component assemblies that are unique to the industry. Our parts range from complex assemblies like clutch calibrators and pickup sensors to high-precision turbine rotors and other critical flow meter parts. We now operate on a mixed model of design, manufacturing, assembly, and outsourcing. From aluminum and zinc casting to stainless steel machining, we are involved in the entire gamut of servicing our customers' needs. Our specialization involves casting, molding, machining, and precision fabrication for high-precision, high-end applications. We have special-purpose Wire EDM and CNC machines for making precision parts. We also develop customized testing facilities and fixtures meeting customers' needs. We are geared up to produce precision parts using the most modern technologies: solid modelling, CNC turn-mills, and VMCs. Our infrastructure is well established with lean manufacturing concepts. As an ISO 9001 certified company, we are focused on keeping a good quality system in place. Our policy is to minimize waste in the supply chain, leading to tangible benefits for our customers. A strong management culture is built into our system with a clear objective to delight customers. With a built-up space of 7,000 sq ft on 12,000 sq ft of our own land, and a supportive base of vendors and other business associates, our ecosystem is well prepared for all expansion plans.
Job Profile: You will be responsible for developing and implementing high-quality software solutions, creating complex applications using cutting-edge programming features and frameworks, and collaborating with other teams in the firm to define, design, and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also work on data engineering problems and build data pipelines. You will get ample opportunities to work on challenging and innovative projects using the latest technologies and tools. If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities. Objectives of this role · Develop, test, and maintain high-quality software using Python and Embedded C. · Participate in the entire software development lifecycle, building, testing, and delivering high-quality solutions. · Collaborate with cross-functional teams to identify and solve complex problems. · Write clean and reusable code that can be easily maintained and scaled. Your tasks · Create large-scale IoT / embedded applications. · Participate in code reviews, ensure code quality, and identify areas for improvement to implement practical solutions. · Debug code when required and troubleshoot any Python-related queries. · Keep up to date with emerging trends and technologies in Python development. Required skills and qualifications · Bachelor's degree in Computer Science, Software Engineering, or a related field (freshers welcome). · Strong programming fundamentals. · In-depth understanding of the Python software development stack, ecosystem, frameworks, and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, scikit-learn, and PyTorch.
· Familiarity with front-end development using HTML, CSS, and JavaScript. · Familiarity with database technologies such as SQL and MySQL. · Excellent problem-solving ability with solid communication and collaboration skills. Interview Date: 09-06-2025 Interview Time: 10am onwards Contact Person: Antony Peter - 9962048534 Interview Venue: Talent Pursuits. Magneto Dynamics No 7,8,9 Venkateswar Ngr Main Road, Perungudi, Chennai, Tamil Nadu 600096 Google Map Link: https://goo.gl/maps/TSStgJfA7B7c7DMj7 Job Types: Full-time, Permanent Pay: Up to ₹150,000.00 per year Benefits: Provident Fund Schedule: Day shift Work Location: In person
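As a rough, invented illustration of the Python-plus-IoT flavor of this role (no relation to Magneto's actual products or data formats), here is a tiny telemetry smoother: parse "sensor_id,value" lines and maintain a rolling mean, the sort of building block a sensor data pipeline starts from.

```python
from collections import deque

def parse_reading(line: str) -> tuple[str, float]:
    """Parse a 'sensor_id,value' telemetry line (assumed, invented format)."""
    sensor_id, raw = line.strip().split(",")
    return sensor_id, float(raw)

class RollingMean:
    """Rolling average over the last `window` readings."""
    def __init__(self, window: int):
        self.buf = deque(maxlen=window)

    def update(self, value: float) -> float:
        self.buf.append(value)
        return sum(self.buf) / len(self.buf)

flow = RollingMean(window=3)
for line in ["turbine1,10.0", "turbine1,12.0", "turbine1,14.0", "turbine1,16.0"]:
    sensor, value = parse_reading(line)
    smoothed = flow.update(value)
print(smoothed)  # mean of the last 3 readings: (12 + 14 + 16) / 3 = 14.0
```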

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

As a Senior Data Scientist, you will drive data science initiatives from conception to deployment, crafting advanced ML models and providing mentorship to junior colleagues. You will collaborate seamlessly across teams to integrate data-driven solutions, maintain data governance compliance, stay abreast of industry trends, contribute to thought leadership, and innovate solutions for intricate business problems. Responsibilities Lead and manage data science projects from conception to deployment, ensuring alignment with business objectives and deadlines. Develop and deploy AI and statistical algorithms to extract insights and drive actionable recommendations from complex datasets. Provide guidance and mentorship to junior data scientists on advanced analytics techniques, coding best practices, and model interpretation. Design rigorous testing frameworks to evaluate model performance, validate results, and iterate on models to improve accuracy and reliability. Stay updated with the latest advancements in data science methodologies, tools, and technologies, and contribute to the team's knowledge base by sharing insights, attending conferences, and conducting research. Establish and maintain data governance policies, ensuring data integrity, security, and compliance with regulations. Qualifications 5+ years of prior analytics and data science experience driving projects involving AI and Advanced Analytics. 3+ years of experience with Deep Learning frameworks, NLP/text analytics, SVMs, LSTMs, Transformers, and neural networks. In-depth understanding of and hands-on experience working with Large Language Models, along with exposure to fine-tuning open-source models for a variety of use cases. Strong exposure to prompt engineering, plus knowledge of vector databases, the LangChain framework, and data embeddings. Strong problem-solving skills and the ability to iterate and experiment to optimize AI model behavior.
Proficiency in the Python programming language for data analysis, machine learning, and web development. Hands-on experience with machine learning libraries such as NumPy, SciPy, and scikit-learn. Experience with distributed computing frameworks such as PySpark and Dask. Excellent problem-solving skills and attention to detail. Ability to communicate effectively with diverse clients/stakeholders. Education Background Bachelor's in Computer Science, Statistics, Mathematics, or a related field. Tier I/II candidates preferred.

Posted 1 week ago

Apply

0.0 years

0 Lacs

Perungudi, Chennai, Tamil Nadu

On-site

Indeed logo

We are looking for a passionate Python IoT developer to join our team at Magneto Dynamics. About Magneto Dynamics: From a humble beginning in 1989, we have come a long way in our quest for innovation. For us, quality has been a way of life, built into our system. Our business profile expanded with our quest for niche and innovative applications, and we ventured out to successfully develop intricate sub-assemblies and parts of flow meters for our OEM customers in the US and Europe. These include complete component assemblies that are unique to the industry. Our parts range from complex assemblies like clutch calibrators and pickup sensors to high-precision turbine rotors and other critical flow meter parts. We now operate on a mixed model of design, manufacturing, assembly, and outsourcing. From aluminum and zinc casting to stainless steel machining, we are involved in the entire gamut of servicing our customers' needs. Our specialization involves casting, molding, machining, and precision fabrication for high-precision, high-end applications. We have special-purpose Wire EDM and CNC machines for making precision parts. We also develop customized testing facilities and fixtures meeting customers' needs. We are geared up to produce precision parts using the most modern technologies: solid modelling, CNC turn-mills, and VMCs. Our infrastructure is well established with lean manufacturing concepts. As an ISO 9001 certified company, we are focused on keeping a good quality system in place. Our policy is to minimize waste in the supply chain, leading to tangible benefits for our customers. A strong management culture is built into our system with a clear objective to delight customers. With a built-up space of 7,000 sq ft on 12,000 sq ft of our own land, and a supportive base of vendors and other business associates, our ecosystem is well prepared for all expansion plans.
Job Profile: You will be responsible for developing and implementing high-quality software solutions, creating complex applications using cutting-edge programming features and frameworks, and collaborating with other teams in the firm to define, design, and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also work on data engineering problems and build data pipelines. You will get ample opportunities to work on challenging and innovative projects using the latest technologies and tools. If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities. Objectives of this role · Develop, test, and maintain high-quality software using Python and Embedded C. · Participate in the entire software development lifecycle, building, testing, and delivering high-quality solutions. · Collaborate with cross-functional teams to identify and solve complex problems. · Write clean and reusable code that can be easily maintained and scaled. Your tasks · Create large-scale IoT / embedded applications. · Participate in code reviews, ensure code quality, and identify areas for improvement to implement practical solutions. · Debug code when required and troubleshoot any Python-related queries. · Keep up to date with emerging trends and technologies in Python development. Required skills and qualifications · Bachelor's degree in Computer Science, Software Engineering, or a related field (freshers welcome). · Strong programming fundamentals. · In-depth understanding of the Python software development stack, ecosystem, frameworks, and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, scikit-learn, and PyTorch.
· Familiarity with front-end development using HTML, CSS, and JavaScript. · Familiarity with database technologies such as SQL and MySQL. · Excellent problem-solving ability with solid communication and collaboration skills. Interview Date: 09-06-2025 Interview Time: 10am onwards Contact Person: Antony Peter - 9962048534 Interview Venue: Talent Pursuits. Magneto Dynamics No 7,8,9 Venkateswar Ngr Main Road, Perungudi, Chennai, Tamil Nadu 600096 Google Map Link: https://goo.gl/maps/TSStgJfA7B7c7DMj7 Job Types: Full-time, Permanent Pay: Up to ₹150,000.00 per year Benefits: Provident Fund Schedule: Day shift Work Location: In person

Posted 1 week ago

Apply

0.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Foundit logo

Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Inviting applications for the role of Consultant - PySpark/Python Data Engineer! We are looking for a passionate Python developer to join our team at Genpact. You will be responsible for developing and implementing high-quality software solutions for data transformation and analytics using cutting-edge programming features and frameworks, and for collaborating with other teams in the firm to define, design, and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also work on data engineering problems and build data pipelines. You will get ample opportunities to work on challenging and innovative projects using the latest technologies and tools.
If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities. Responsibilities . Develop, test, and maintain high-quality solutions using PySpark/Python. . Participate in the entire software development lifecycle, building, testing, and delivering high-quality data pipelines. . Collaborate with cross-functional teams to identify and solve complex problems. . Write clean and reusable code that can be easily maintained and scaled. . Keep up to date with emerging trends and technologies in Python development. Qualifications we seek in you! Minimum qualifications . Years of experience as a Python Developer with a strong portfolio of projects. . Bachelor's degree in Computer Science, Software Engineering, or a related field. . Experience developing pipelines on cloud platforms such as AWS or Azure using AWS Glue or ADF. . In-depth understanding of the Python software development stack, ecosystem, frameworks, and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, Great Expectations, Splink, and PyTorch. . Experience with data platforms such as Databricks/Snowflake. . Experience with front-end development using HTML or Python. . Familiarity with database technologies such as SQL and NoSQL. . Excellent problem-solving ability with solid communication and collaboration skills. Preferred skills and qualifications . Experience with popular Python frameworks such as Django, Flask, FastAPI, or Pyramid. . Knowledge of GenAI concepts and LLMs. . Contributions to open-source Python projects or active involvement in the Python community.
Why join Genpact? Lead AI-first transformation: build and scale AI solutions that redefine industries. Make an impact: drive change for global enterprises and solve business challenges that matter. Accelerate your career: gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills. Grow with the best: learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace. Committed to ethical AI: work in an environment where governance, transparency, and security are at the core of everything we build. Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
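The qualifications above mention Great Expectations for data quality. As a toy illustration of the underlying idea only (plain Python with invented column names; the real library provides a far richer expectation suite, profiling, and reporting):

```python
# Minimal expectation-style data-quality checks, in the spirit of
# Great Expectations but written from scratch for illustration.
def expect_no_nulls(rows, column):
    """Every row must have a non-null value in `column`."""
    return all(row.get(column) is not None for row in rows)

def expect_values_between(rows, column, low, high):
    """Every value in `column` must fall within [low, high]."""
    return all(low <= row[column] <= high for row in rows)

# Toy records standing in for a pipeline's output table.
rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": 80.5},
]

checks = {
    "order_id not null": expect_no_nulls(rows, "order_id"),
    "amount in range": expect_values_between(rows, "amount", 0, 10_000),
}
print(all(checks.values()))  # True only when every expectation passes
```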

Posted 1 week ago

Apply

0 years

0 Lacs

Madhya Pradesh, India

On-site

Linkedin logo

Job Overview: We are looking for an AI/ML Developer to join our team of researchers, data scientists, and developers. You will work on cutting-edge AI solutions across industries such as commerce, agriculture, insurance, financial markets, and procurement. Your role involves developing and optimizing machine learning and generative AI models to solve real-world challenges. Key Responsibilities: • Develop and optimize ML, NLP, Deep Learning, and Generative AI models. • Research and implement state-of-the-art algorithms for supervised and unsupervised learning. • Work with large-scale datasets in distributed environments. • Understand business processes to select and apply the best ML approaches. • Ensure scalability and performance of ML solutions. • Collaborate with cross-functional teams, including product owners, designers, and developers. • Solve complex data integration and deployment challenges. • Communicate results effectively using data visualization. • Work in global teams across different time zones. Required Skills & Experience: • Strong experience in Machine Learning, Deep Learning, NLP, and Generative AI. • Hands-on expertise in frameworks like TensorFlow, PyTorch, or Hugging Face Transformers. • Experience with LLMs (Large Language Models), model fine-tuning, and prompt engineering. • Proficiency in Python, R, or Scala for ML development. • Knowledge of cloud-based ML platforms (AWS, Azure, GCP). • Experience with big data processing (Spark, Hadoop, or Dask). • Ability to scale ML models from prototypes to production. • Strong analytical and problem-solving skills. If you're passionate about pushing the boundaries of ML and GenAI, we'd love to hear from you!
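The posting above asks for prompt engineering experience. One small, generic piece of that work is prompt templating; the template text and helper below are invented for illustration and use only the standard library:

```python
from string import Template

# An invented RAG-style prompt template; real prompts are iterated on per task.
PROMPT = Template(
    "You are an assistant for $domain.\n"
    "Answer the question using only the context below.\n"
    "Context: $context\n"
    "Question: $question\n"
    "Answer:"
)

def build_prompt(domain: str, context: str, question: str) -> str:
    """Fill the template; substitute() raises if a placeholder is missing."""
    return PROMPT.substitute(domain=domain, context=context, question=question)

prompt = build_prompt(
    "agriculture",
    "Wheat germinates best between 12 and 25 degrees Celsius.",
    "At what temperature does wheat germinate best?",
)
print(prompt)
```

Keeping templates as data rather than inline f-strings makes them easy to version, test, and A/B-compare, which is most of what day-to-day prompt engineering amounts to.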

Posted 1 week ago

Apply