4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for developing machine learning applications as per the specified requirements. This will involve selecting appropriate datasets and data representation methods, running machine learning tests and experiments, and utilizing machine learning frameworks such as Keras or PyTorch, along with libraries like scikit-learn. Your role will require excellent communication skills, the ability to collaborate effectively within a team, and outstanding analytical and problem-solving capabilities. Additionally, you will be expected to train and retrain systems when necessary. To be successful in this position, you should have a minimum of 4 years of experience in the AI/ML domain, with specific expertise in object recognition and Natural Language Processing. Proficiency in visualizing and manipulating large datasets, as well as familiarity with tools such as OpenCV and NLTK, will be highly beneficial.
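For illustration only: the experiment-running duties above might translate into something like the following minimal Keras/scikit-learn sketch. The dataset, network architecture, and hyperparameters are invented placeholders, not details taken from this posting.

```python
# Minimal sketch of a Keras training experiment evaluated with scikit-learn.
# All data, layer sizes, and hyperparameters are illustrative placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from tensorflow import keras

# Synthetic data standing in for a real, domain-specific dataset
X = np.random.rand(1000, 20).astype("float32")
y = (X.sum(axis=1) > 10).astype("int32")
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = keras.Sequential([
    keras.layers.Input(shape=(20,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=5, batch_size=32, verbose=0)

preds = (model.predict(X_test) > 0.5).astype("int32").ravel()
print("test accuracy:", accuracy_score(y_test, preds))
```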
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
You will be joining Firstsource Solutions Limited as a Data Scientist, where you will utilize your expertise in Machine Learning, Deep Learning, Computer Vision, Natural Language Processing, and Generative AI to develop innovative data-driven solutions and applications. Your role will involve designing and deploying dynamic models and applications using modern web frameworks like Flask and FastAPI, ensuring efficient deployment and ongoing monitoring of these systems. Your responsibilities will include designing and implementing advanced ML and DL models, developing web applications for model deployment to enable real-time data processing and user interaction, performing exploratory data analysis to understand underlying patterns and trends, creating data processing pipelines to prepare large datasets for analysis and modeling, employing Generative AI techniques for content generation, and collaborating with cross-functional teams to integrate AI capabilities into products and systems. To excel in this role, you should have a BE, Master's, or PhD in Computer Science, Data Science, Statistics, or a related field, along with 5 to 7 years of relevant experience in a data science role focusing on ML, DL, and statistical modeling. Strong coding skills in Python, experience with Flask or FastAPI, proficiency in ML and DL frameworks (e.g., PyTorch, TensorFlow), CV (e.g., OpenCV), and NLP libraries (e.g., NLTK, spaCy) are essential. Experience with generative models such as GANs, VAEs, or Transformers, deployment skills with Docker, Kubernetes, and CI/CD pipelines, and excellent analytical and communication skills are also required. Certifications in Data Science, ML, or AI from recognized institutions will be an added advantage. This position is available in multiple locations including Hyderabad, Mumbai, Bangalore, and Chennai. Firstsource follows a fair, transparent, and merit-based hiring process and does not ask for money at any stage. Be cautious of fraudulent offers and always verify through official channels or @firstsource.com email addresses.
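For illustration only: the Flask/FastAPI deployment pattern referenced above could look roughly like this minimal FastAPI sketch. The model file, feature shape, and route name are assumptions, not details from the posting.

```python
# Minimal sketch: serving a pre-trained scikit-learn model with FastAPI.
# The model file name, feature list, and endpoint path are hypothetical.
from typing import List

import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # assumes a previously trained, serialized estimator

class Features(BaseModel):
    values: List[float]  # flat feature vector in the order the model expects

@app.post("/predict")
def predict(features: Features):
    prediction = model.predict([features.values])
    return {"prediction": prediction.tolist()}

# Typical local run (assuming this file is app.py): uvicorn app:app --reload
```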
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
You will be working as a Data Scientist at FSL, utilizing your expertise in Machine Learning, Deep Learning, Computer Vision, Natural Language Processing, and Generative AI to create innovative data-driven solutions and applications. Your role will involve designing and implementing advanced ML and DL models, developing web applications for model deployment using Flask and FastAPI, and ensuring efficient deployment and monitoring of these systems. Additionally, you will be responsible for performing data analysis, including exploratory data analysis to identify patterns and trends, as well as developing data processing pipelines to prepare large datasets for analysis and modeling. Your tasks will also include utilizing Generative AI techniques to create new data points and enhance content generation, collaborating with cross-functional teams to integrate AI capabilities into products, and ensuring that AI solutions align with business goals and user requirements. You will need to stay updated with the latest AI, ML, DL, CV, and NLP developments, explore new technologies, and effectively communicate complex quantitative analysis to senior management and other departments. To qualify for this role, you should have a BE, Master's, or PhD in Computer Science, Data Science, Statistics, or a related field, along with 5 to 7 years of experience in a data science role focusing on ML, DL, and statistical modeling. Your technical skills should include strong coding abilities in Python, experience with Flask or FastAPI, proficiency in ML and DL frameworks (PyTorch, TensorFlow), CV (OpenCV), and NLP libraries (NLTK, spaCy). Experience with generative models like GANs, VAEs, or Transformers, as well as deployment skills with Docker, Kubernetes, and CI/CD pipelines, are also required. Strong analytical skills, excellent communication skills, and certifications in Data Science, ML, or AI from recognized institutions will be advantageous. This position is available in Hyderabad, Mumbai, Bangalore, and Chennai.
Posted 1 week ago
8.0 - 11.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Python Developer – Data Science & GenAI Solutions Experience: 8-11 years Department: AI/ML Engineering Band – C Job Summary We are seeking a highly motivated and skilled Python Developer with a strong foundation in automation, data science, machine learning, and NLP . The ideal candidate will also have hands-on experience or working knowledge of Generative AI (GenAI) techniques, particularly with Retrieval-Augmented Generation (RAG) implementations. This role requires the ability to work collaboratively across teams, write clean and scalable code, and develop intelligent solutions that drive impact. Key Responsibilities Develop and maintain Python-based automation scripts and pipelines for data ingestion, transformation, and model deployment. Build, train, and deploy machine learning models for predictive analytics and classification/regression tasks. Perform text analytics and Natural Language Processing (NLP), including text preprocessing, named entity recognition (NER), sentiment analysis, and topic modeling. Design and implement Generative AI solutions, including RAG pipelines, using tools like LangChain, LlamaIndex, or similar frameworks. Collaborate with data scientists and DevOps engineers to deploy solutions using cloud-native technologies (AWS preferred). Integrate models into production systems and ensure continuous delivery using version control systems like GitHub. Document code, workflows, and modeling decisions for both technical and non-technical stakeholders. Required Skills And Qualifications Strong proficiency in Python and related libraries (e.g., Pandas, NumPy, Scikit-learn, FastAPI/Flask). Experience with automation frameworks and scripting tools for ETL or system processes. Solid background in data science and ML model development, including model evaluation and optimization. Working knowledge of NLP libraries (e.g., SpaCy, NLTK, HuggingFace Transformers). Familiarity with GenAI technologies, including prompt engineering, fine-tuning, and RAG architecture. Hands-on experience with Git/GitHub for version control and collaboration. Understanding of cloud-native architecture and ability to work in cloud environments (AWS). Ability to write modular, reusable, and well-tested code. Preferred Qualifications Exposure to MLOps practices including CI/CD for ML, model monitoring, and pipelines. Experience with vector databases (e.g., Pinecone, FAISS, Weaviate) for RAG-based solutions. Knowledge of REST APIs and microservices architecture.
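For illustration only: the RAG responsibilities above are typically built with LangChain or LlamaIndex, whose APIs change between versions, so the sketch below shows just the core retrieval step using sentence-transformers with FAISS (one of the vector stores named in the posting). The documents, embedding model, and prompt template are invented placeholders.

```python
# Minimal RAG retrieval sketch: embed a tiny corpus, index it in FAISS,
# and build a grounded prompt from the best match. Illustrative only.
import faiss
from sentence_transformers import SentenceTransformer

documents = [
    "Invoices are ingested nightly from the ERP export.",
    "Refund requests older than 90 days need manager approval.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
doc_vectors = encoder.encode(documents, normalize_embeddings=True)

index = faiss.IndexFlatIP(doc_vectors.shape[1])  # inner product on normalized vectors ~ cosine
index.add(doc_vectors)

query = "Who approves old refund requests?"
query_vec = encoder.encode([query], normalize_embeddings=True)
scores, ids = index.search(query_vec, 1)

context = documents[ids[0][0]]
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)  # this grounded prompt would then be passed to an LLM of choice
```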
Posted 1 week ago
5.0 - 9.0 years
6 - 10 Lacs
Chennai
Work from Office
Educational Requirements: Bachelor of Engineering. Service Line: Application Development and Maintenance. Responsibilities: Design and develop Python-based applications with integrated AI capabilities. Work on AI-driven solutions such as chatbots, recommendation systems, intelligent agents, or knowledge-based systems. Collaborate with cross-functional teams to gather requirements and deliver AI-powered features. Integrate third-party AI APIs and services (e.g., OpenAI, Azure Cognitive Services). Optimize performance and scalability of AI components. Maintain high standards of code quality and documentation. Additional Responsibilities: Experience: 6+ years; Location: Bangalore. Technical and Professional Requirements: Strong proficiency in Python and its standard libraries. Experience with AI concepts such as expert systems, symbolic AI, NLP, or heuristic algorithms. Familiarity with AI tools and libraries like spaCy, NLTK, OpenAI API, or LangChain. Experience with REST APIs, microservices, and cloud platforms (AWS, Azure, GCP). Strong understanding of data structures, algorithms, and software design principles. Excellent communication and problem-solving skills. Experience with AI integration in enterprise applications. Exposure to knowledge representation, semantic web, or ontology modeling. Experience working in the Airline domain will be an added advantage. Certifications in AI or cloud technologies. Preferred Skills: Technology-Machine Learning-Python; Technology-Machine Learning-Responsible AI
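For illustration only: integrating a third-party AI API such as OpenAI, as mentioned above, might look like the following sketch with the openai Python client (1.x style). The model name and prompts are placeholders, the exact client interface depends on the installed library version, and an API key is assumed to be present in the OPENAI_API_KEY environment variable.

```python
# Minimal sketch of calling the OpenAI chat completions API.
# Model choice and messages are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name for illustration
    messages=[
        {"role": "system", "content": "You are a helpful airline-domain assistant."},
        {"role": "user", "content": "Summarise the fare rules for a refundable ticket."},
    ],
)
print(response.choices[0].message.content)
```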
Posted 1 week ago
3.0 years
4 - 5 Lacs
Chennai
On-site
Job Title: Freelance Technical Trainer Job Location: Chennai Job type: Freelance Job Description: We are looking for Freelance Technical Trainers to conduct hands-on training sessions and workshops for engineering college students in the following domains: Java Full Stack Development, Ethical Hacking & Cybersecurity, Natural Language Processing (NLP), Applied Soft Computing, and Image & Video Processing. The trainer should deliver practical, project-based content aligned with academic learning and industry expectations. Responsibilities: Deliver interactive classroom training to college students. Provide industry-focused explanations with real-time tools and project guidance. Design course material, lab manuals & assignments. Assist students in understanding concepts and applying them in practice. Support final-year or minor project mentorship related to the domain. Domain Expertise Required: Java Full Stack Development: Java, Spring Boot, REST APIs, React/Angular, MySQL, Git, Node.js. Ethical Hacking: Kali Linux, Footprinting, Scanning, Web App Security, Penetration Testing, OWASP. NLP (Natural Language Processing): Python, NLTK, spaCy, Text Mining, Sentiment Analysis, Chatbots. Applied Soft Computing: Neural Networks, Fuzzy Systems, Genetic Algorithms (MATLAB/Python). Image & Video Processing: OpenCV, Image Filtering, Object Detection, Motion Tracking, MATLAB or Python. *We are looking for offline-only candidates. The project started on 25/07/2025.* Job Type: Freelance Contract length: 3 months Pay: ₹35,000.00 - ₹45,000.00 per month Experience: Technical Trainer: 3 years (Required) Location: Chennai, Tamil Nadu (Required) Work Location: In person Application Deadline: 27/07/2025 Expected Start Date: 25/07/2025
Posted 1 week ago
4.0 years
0 Lacs
India
Remote
Job Title: Data Research Engineer Location: Remote (Hybrid for Chennai & Mumbai) Experience: 4 Years Responsibilities Develop methods to leverage the potential of LLMs and AI within the team. Be proactive in finding new solutions to engage the team with AI/LLM and streamline team processes. Be a visionary with AI/LLM tools, anticipating early on how emerging technologies could be harnessed so that the team is ahead of the game when they arrive. Assist in acquiring and integrating data from various sources, including web crawling and API integration. Stay updated with emerging technologies and industry trends. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Contribute to cross-functional teams in understanding data requirements. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Leverage online resources such as StackOverflow, ChatGPT, Bard, etc. effectively, while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Data Science, or a related field; higher qualifications are a plus. Think proactively and creatively about upcoming AI/LLM technologies and how to use them to the team's and company's benefit; a "think outside the box" mentality. Experience prompting LLMs in a streamlined way, taking into account how an LLM can potentially "hallucinate" and return wrong information. Experience building agentic AI platforms with modular capabilities and autonomous task execution (CrewAI, LangChain, etc.). Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (ChromaDB, Pinecone, etc.). Experience managing a team of AI/LLM experts is a plus: this includes setting up goals and objectives for the team and fine-tuning complex models. Strong proficiency in Python programming. Proficiency in SQL and data querying is a plus. Familiarity with web crawling techniques and API integration is a plus but not a must. Experience in AI/ML engineering and data extraction. Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.). Strong understanding of machine learning frameworks (TensorFlow, PyTorch). Design and build AI models using LLMs. Integrate LLM solutions with existing systems via APIs. Collaborate with the team to implement and optimize AI solutions. Monitor and improve model performance and accuracy. Familiarity with Agile development methodologies is a plus. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Ability to work collaboratively in a team environment. Good and effective communication skills. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Comfortable with autonomy and ability to work independently.
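For illustration only: the RAG tooling named above (ChromaDB, Pinecone) can be sketched roughly as below using an in-memory ChromaDB collection. The collection name, documents, and query are invented, and the client API may differ slightly between library versions.

```python
# Minimal ChromaDB sketch for a RAG-style lookup.
# Collection name, documents, and query text are placeholders.
import chromadb

client = chromadb.Client()  # in-memory client; persistent clients also exist
collection = client.create_collection(name="research_notes")

collection.add(
    documents=[
        "Crawler A refreshes the funding database every Monday.",
        "API rate limits reset at midnight UTC.",
    ],
    ids=["note-1", "note-2"],
)

results = collection.query(query_texts=["When do rate limits reset?"], n_results=1)
print(results["documents"][0][0])  # top matching document for the query
```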
Posted 1 week ago
0.0 - 3.0 years
0 - 0 Lacs
Chennai, Tamil Nadu
On-site
Job Title: Freelance Technical Trainer Job Location: Chennai Job type: Freelance Job Description: We are looking for Freelance Technical Trainers to conduct hands-on training sessions and workshops for engineering college students in the following domains: Java Full Stack Development, Ethical Hacking & Cybersecurity, Natural Language Processing (NLP), Applied Soft Computing, and Image & Video Processing. The trainer should deliver practical, project-based content aligned with academic learning and industry expectations. Responsibilities: Deliver interactive classroom training to college students. Provide industry-focused explanations with real-time tools and project guidance. Design course material, lab manuals & assignments. Assist students in understanding concepts and applying them in practice. Support final-year or minor project mentorship related to the domain. Domain Expertise Required: Java Full Stack Development: Java, Spring Boot, REST APIs, React/Angular, MySQL, Git, Node.js. Ethical Hacking: Kali Linux, Footprinting, Scanning, Web App Security, Penetration Testing, OWASP. NLP (Natural Language Processing): Python, NLTK, spaCy, Text Mining, Sentiment Analysis, Chatbots. Applied Soft Computing: Neural Networks, Fuzzy Systems, Genetic Algorithms (MATLAB/Python). Image & Video Processing: OpenCV, Image Filtering, Object Detection, Motion Tracking, MATLAB or Python. *We are looking for offline-only candidates. The project started on 25/07/2025.* Job Type: Freelance Contract length: 3 months Pay: ₹35,000.00 - ₹45,000.00 per month Experience: Technical Trainer: 3 years (Required) Location: Chennai, Tamil Nadu (Required) Work Location: In person Application Deadline: 27/07/2025 Expected Start Date: 25/07/2025
Posted 1 week ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Title: Senior Data Scientist (NLP) Primary Skills: Data Science, AI/ML, GenAI, LLM, NLP, Any Cloud (Azure/AWS/GCP) Roles & Responsibilities: Develop custom data models and algorithms to apply to data sets. Develop and implement advanced machine learning models to address complex problems. Communicate findings, recommendations, and data-driven insights to PMs and executives. Qualifications: Bachelor's/Master's degree in Computer Science, Engineering, or a related technical field with a minimum of 4 years' experience. Experience in performing prompt engineering and fine-tuning of AI/ML, GenAI, and LLM models. Practical, hands-on fine-tuning/transfer learning/optimisation of Transformer-architecture-based deep learning models. Experience with NLP tools such as Word2Vec, TextBlob, NLTK, SpaCy, Gensim, CoreNLP, BERT, GloVe, etc. Experience with AWS / Azure / GCP cloud and deploying APIs using the latest frameworks like FastAPI/gRPC, etc. Experience with Docker for deploying containers. Deployment of ML models on Kubernetes clusters. Experience with NoSQL/SQL databases. Good programming skills in Python. Domain knowledge in Online Reputation Management or experience in a product-based company (added advantage). Expertise in delivering end-to-end analytical solutions, covering multiple technologies and tools, to multiple business problems. Interested candidates may send their resumes to iqbal.kaur@birdeye.com Regards, Iqbal Kaur
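For illustration only: a minimal spaCy named-entity extraction sketch, touching the NLP tooling listed above. The input sentence is invented, and the small English pipeline is assumed to have been downloaded separately (python -m spacy download en_core_web_sm).

```python
# Minimal spaCy NER sketch; the input sentence is an invented example.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the model has been installed
doc = nlp("The analytics team opened a new engineering office in Gurugram in 2023.")

for ent in doc.ents:
    print(ent.text, ent.label_)  # e.g. GPE and DATE spans found in the text
```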
Posted 1 week ago
6.0 - 12.0 years
0 Lacs
Maharashtra
On-site
Automation Anywhere is a leader in AI-powered process automation, utilizing AI technologies to drive productivity and innovation across organizations. The company's Automation Success Platform offers a comprehensive suite of solutions including process discovery, RPA, end-to-end process orchestration, document processing, and analytics, all with a security and governance-first approach. By empowering organizations globally, Automation Anywhere aims to unleash productivity gains, drive innovation, enhance customer service, and accelerate business growth. Guided by the vision to enable the future of work through AI-powered automation, the company is committed to unleashing human potential. Learn more at www.automationanywhere.com. Qualifications: - Bachelor's or Master's degree in Computer Science, Engineering, or a related field. - 6 to 12 years of relevant experience. - Proven track record as a Solution Architect or Lead, with a focus on integrating Generative AI, or exposure to Machine Learning. - Expertise in at least one RPA tool such as Automation Anywhere, UiPath, Blue Prism, Power Automate, and proficiency in programming languages like Python or Java. Skills: - Proficiency in Python or Java for programming and architecture. - Strong analytical and problem-solving skills to translate business requirements into technical solutions. - Experience with statistical packages and machine learning libraries (e.g., R, Python scikit-learn, Spark MLlib). - Familiarity with RDBMS, NoSQL, and Cloud Platforms like AWS/AZURE/GCP. - Knowledge of ethical considerations and data privacy principles related to Generative AI for responsible integration within RPA solutions. - Experience in process analysis, technical documentation, and workflow diagramming. - Designing and implementing scalable, optimized, and secure automation solutions for enterprise-level AI applications. - Expertise in Generative AI technologies such as RAG, LLM, and AI Agent. - Advanced Python programming skills with specialization in Deep Learning frameworks, ML libraries, NLP libraries, and LLM frameworks. Responsibilities: - Lead the design and architecture of complex RPA solutions incorporating Generative AI technologies. - Collaborate with stakeholders to align automation strategies with organizational goals. - Develop high-level and detailed solution designs meeting scalability, reliability, and security standards. - Take technical ownership of end-to-end engagements and mentor a team of senior developers. - Assess the applicability of Generative AI algorithms to optimize automation outcomes. - Stay updated on emerging technologies, particularly in Generative AI, to evaluate their impact on RPA strategies. - Demonstrate adaptability, flexibility, and willingness to work from client locations or office environments as needed. Kindly note that all unsolicited resumes submitted to any @automationanywhere.com email address will not be eligible for an agency fee.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Jaipur, Rajasthan
On-site
As a Senior Data Scientist, you will utilize your extensive experience and expertise in AI/ML technologies to drive innovation and solve real-world problems. Your responsibilities will include designing, developing, and optimizing recommendation systems to enhance user experience, building advanced NLP chatbots for automating customer interactions, leading Generative AI solutions, and applying Large Language Models like GPT and BERT to create innovative business-specific solutions. Collaboration with cross-functional teams is key as you integrate machine learning models into production systems, perform data exploration and feature engineering, and stay updated on the latest advancements in AI and ML technologies. Mentoring junior data scientists and engineers, collaborating with product managers and business stakeholders, and ensuring ethical and responsible model development are also crucial aspects of this role. To excel in this position, you should have 5+ years of experience in data science or machine learning, with strong expertise in recommendation systems, chatbots, Generative AI, and Large Language Models. Proficiency in machine learning frameworks like TensorFlow and PyTorch, as well as experience with cloud platforms and data pipelines, will be essential. Your problem-solving skills, statistical knowledge, and ability to convey technical concepts to non-technical stakeholders will further contribute to your success in this role. Joining our rapidly growing product-based company will provide you with the opportunity to collaborate with a talented team, work on high-quality software solutions, and enjoy a competitive compensation and benefits package. If you are passionate about AI/ML technologies and eager to make a significant impact, this role offers an exciting and rewarding career opportunity.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Ciklum is looking for a Data Science Engineer to join our team full-time in India. We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live. About the role: As a Data Science Engineer, become a part of a cross-functional development team engineering experiences of tomorrow. The prospective team you will be working with is responsible for the design, development, and deployment of innovative enterprise technology, tools, and standard processes to support the delivery of tax services. The team focuses on the ability to deliver comprehensive, value-added, and efficient tax services to our clients. It is a dynamic team with professionals of varying backgrounds from tax technical, technology development, change management, and project management. The team consults and executes on a wide range of initiatives involving process and tool development and implementation including training development, engagement management, tool design, and implementation. Responsibilities: Collaborate with engineers, data scientists, and business analysts to understand requirements, refine models, and integrate LLMs into AI solutions Incorporate RLHF and advanced techniques for tax-specific AI outputs Embed generative AI solutions into consolidation, reconciliation, and reporting processes Leverage LLMs to interpret unstructured tax documentation Develop and implement deep learning algorithms for AI solutions Stay updated with recent trends in GenAI and apply the latest research and techniques to projects Preprocess raw data, including text normalization, tokenization, and other techniques, to make it suitable for use with NLP models Set up and train large language models and other state-of-the-art neural networks Conduct thorough testing and validation to ensure accuracy and reliability of model implementations Perform statistical analysis of results and optimize model performance for various computational environments, including cloud and edge computing platforms Explore and propose innovative AI use cases to enhance tax functions Partner with tax, finance, and IT teams to integrate AI workflows Collaborate with legal teams to meet regulatory standards for tax data Perform model audits to identify and mitigate risks Monitor and optimize generative models for performance and scalability Requirements: Solid understanding of object-oriented design patterns, concurrency/multithreading, and scalable AI and GenAI model deployment Strong programming skills in Python, PyTorch, TensorFlow, and related libraries Proficiency in RegEx, Spacy, NLTK, and NLP techniques for text representation and semantic extraction Hands-on experience in developing, training, and fine-tuning LLMs and AI models Practical understanding and experience in implementing techniques like CNN, RNN, GANs, RAG, Langchain, and Transformers Expertise in Prompt Engineering techniques and various vector databases Familiarity with Azure Cloud Computing Platform Experience with Docker, Kubernetes, CI/CD pipelines Experience with Deep learning, Computer Vision, CNN, RNN, LSTM Experience with Vector Databases (Milvus, Postgres) What's in it for you?
Strong community: Work alongside top professionals in a friendly, open-door environment Growth focus: Take on large-scale projects with a global impact and expand your expertise Tailored learning: Boost your skills with internal events (meetups, conferences, workshops), Udemy access, language courses, and company-paid certifications Endless opportunities: Explore diverse domains through internal mobility, finding the best fit to gain hands-on experience with cutting-edge technologies Care: We’ve got you covered with company-paid medical insurance, mental health support, and financial & legal consultations About us: At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you’ll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress. India is a strategic innovation hub for Ciklum, with growing teams in Chennai and Pune leading advancements in EdgeTech, AR/VR, IoT, and beyond. Join us to collaborate on game-changing solutions and take your career to the next level. Want to learn more about us? Follow us on Instagram , Facebook , LinkedIn . Explore, empower, engineer with Ciklum! Interested already? We would love to get to know you! Submit your application. We can’t wait to see you at Ciklum.
Posted 1 week ago
2.0 years
2 - 3 Lacs
India
On-site
Key Responsibilities: Develop and deploy machine learning and deep learning models Work on NLP, computer vision, or recommendation systems Optimize models for performance and scalability Stay updated with the latest AI research and trends Skills We’re Looking For: Strong Python programming skills Experience with ML frameworks (TensorFlow, PyTorch, Scikit-learn) Solid understanding of data preprocessing, model evaluation, and MLOps Hands-on with tools like Pandas, NumPy, OpenCV, NLTK/spaCy Exposure to cloud platforms (AWS, GCP, Azure) is a plus Job Type: Full-time Pay: ₹20,000.00 - ₹30,000.00 per month Benefits: Provident Fund Ability to commute/relocate: Palarivattom, Kochi, Kerala: Reliably commute or planning to relocate before starting work (Preferred) Experience: AI: 2 years (Preferred) Work Location: In person Expected Start Date: 25/07/2025
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Coditas Solutions is seeking a highly skilled and motivated Data Scientist to join our dynamic team. As a Data Scientist, you will play a key role in designing, implementing, and optimizing machine learning models and algorithms to solve complex business challenges. If you have a passion for leveraging AI and ML technologies to drive innovation, this is an exciting opportunity to contribute to groundbreaking projects. Roles and Responsibilities Design, implement, and optimize machine learning algorithms using R and Python. Work on developing predictive models and decision-making systems. Conduct exploratory data analysis to understand data patterns and insights. Collaborate with data engineers to ensure the availability and quality of data for model training. Deploy machine learning models into production environments. Collaborate with cross-functional teams to integrate models into existing systems. Continuously optimize and improve the performance of machine learning models. Stay updated on the latest advancements in ML algorithms and technologies. Work closely with software engineers to ensure seamless integration of AI/ML solutions. Collaborate with clients to understand their business requirements and customize solutions accordingly. Technical Skills Excellent programming skills with the ability to implement complex algorithms in Python or R. Experience with cloud-based platforms (AWS, Azure, GCP) for deploying machine learning models. A minimum of 5 years of strong experience in developing and implementing machine learning algorithms. Experience with model deployment and integration into production systems. Hands-on experience with standard classical machine learning libraries such as scikit-learn, NLTK, and OpenCV, as well as deep learning libraries such as TensorFlow, PyTorch, and Keras. Understanding of machine learning algorithms, techniques, and concepts (linear regression, logistic regression, decision trees, random forests, neural networks, etc.). Experience with data preprocessing, feature engineering, and model evaluation techniques for structured and unstructured data. Proven experience with identifying, creating, and selecting relevant features or variables to enhance model performance. Ability to collaborate effectively with cross-functional teams. Previous experience working on real-world AI/ML projects. A strong focus on linear algebra, machine learning, and statistics & probability is preferred. Basic knowledge of LLMs and of the optimal use of GenAI models. Strong problem-solving and critical-thinking skills. Excellent communication and collaboration skills.
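For illustration only: the preprocessing, feature-engineering, and evaluation requirements above could be prototyped with a small scikit-learn pipeline like the one below. The synthetic dataset and hyperparameters are placeholders.

```python
# Minimal scikit-learn sketch: scaling + model in one pipeline,
# evaluated with cross-validation. Data and parameters are placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=500, n_features=15, random_state=0)

pipeline = Pipeline([
    ("scale", StandardScaler()),                        # simple feature scaling step
    ("clf", RandomForestClassifier(n_estimators=100)),  # illustrative model choice
])

scores = cross_val_score(pipeline, X, y, cv=5, scoring="f1")
print("mean F1 across folds:", scores.mean())
```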
Posted 1 week ago
0.0 - 2.0 years
0 - 0 Lacs
Palarivattom, Kochi, Kerala
On-site
Key Responsibilities: Develop and deploy machine learning and deep learning models Work on NLP, computer vision, or recommendation systems Optimize models for performance and scalability Stay updated with the latest AI research and trends Skills We’re Looking For: Strong Python programming skills Experience with ML frameworks (TensorFlow, PyTorch, Scikit-learn) Solid understanding of data preprocessing, model evaluation, and MLOps Hands-on with tools like Pandas, NumPy, OpenCV, NLTK/spaCy Exposure to cloud platforms (AWS, GCP, Azure) is a plus Job Type: Full-time Pay: ₹20,000.00 - ₹30,000.00 per month Benefits: Provident Fund Ability to commute/relocate: Palarivattom, Kochi, Kerala: Reliably commute or planning to relocate before starting work (Preferred) Experience: AI: 2 years (Preferred) Work Location: In person Expected Start Date: 25/07/2025
Posted 1 week ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Inviting applications for the role of Senior Principal Consultant, Data Scientist for one of our clients (an MNC). In this role, we are looking for candidates who have relevant years of experience in Text Mining / Natural Language Processing (NLP) tools, data science, Big Data, and algorithms. Full-cycle experience is desirable in at least one large-scale Text Mining/NLP project, from creating a business use case through text analytics assessment/roadmap, technology and analytics solutioning, implementation, and change management, along with considerable experience in Hadoop, including development in the MapReduce framework. The Text Mining Scientist (TMS) is expected to play a pivotal bridging role between enterprise database teams and business/functional resources. At a broad level, the TMS will leverage his/her solutioning expertise to translate the customer's business need into a techno-analytic problem and appropriately work with database teams to bring large-scale text analytic solutions to fruition. The right candidate should have prior experience in developing text mining and NLP solutions using open-source tools. Responsibilities Develop transformative AI/ML solutions to address our clients' business requirements and challenges Project Delivery - This would entail successful delivery of projects involving data pre-processing, model training and evaluation, and parameter tuning Manage stakeholder/customer expectations Project blueprinting and project documentation Creating the project plan Understand and research cutting-edge industrial and academic developments in AI/ML with NLP/NLU applications in diverse industries such as CPG, Finance, etc. Conceptualize, design, build, and develop solution algorithms that demonstrate the minimum required functionality within tight timelines Interact with clients to collect, synthesize, and propose requirements and create an effective analytics/text mining roadmap Work with digital development teams to integrate and transform these algorithms into production-quality applications Do applied research on a wide array of text analytics and machine learning projects, file patents, and publish papers Collaborate with service line teams to design, implement, and manage Gen-AI solutions Familiarity with generative models, prompt engineering, and fine-tuning techniques to develop innovative AI solutions Designing, developing, and implementing solutions tailored to meet client needs Understanding business requirements and translating them into technical solutions using GenAI Work closely with service line teams to design, implement, and manage Generative AI solutions Qualifications we seek in you! Minimum Qualifications / Skills MS in Computer Science, Information Systems, or Computer Engineering Systems engineering with relevant experience in Text Mining / Natural Language Processing (NLP) tools, data science, Big Data, and algorithms Familiarity with Generative AI technologies; design and implement GenAI solutions Technology Open-source text mining paradigms such as NLTK, OpenNLP, OpenCalais, StanfordNLP, GATE, UIMA, Lucene, and cloud-based NLU tools such as DialogFlow, MS LUIS Exposure to statistical toolkits such as R, Weka, S-Plus, Matlab, SAS Text Miner Strong core Java experience in large-scale product development and functional knowledge of RDBMS Hands-on programming in the Hadoop ecosystem and concepts in distributed computing Very good Python/R programming skills.
Java programming skills a plus GenAI Tools Certifications in AI/ML or GenAI Methodology Solutioning and consulting experience in verticals such as BFSI and CPG, with experience in delivering text analytics on large structured and unstructured data A solid foundation in AI methodologies like ML, DL, NLP, Neural Networks, Information Retrieval and Extraction, NLG, NLU Exposure to concepts in Natural Language Processing and statistics, especially in applications such as Sentiment Analysis, Contextual NLP, Dependency Parsing, Parsing, Chunking, Summarization, etc. Demonstrated ability to conduct look-ahead client research with a focus on supplementing and strengthening the client's analytics agenda with newer tools and techniques Preferred Qualifications / Skills Technology Expert-level understanding of NLP, NLU, and machine learning/deep learning methods OpenNLP, OpenCalais, StanfordNLP, GATE, UIMA, Lucene, NoSQL UI development paradigms that would enable Text Mining Insights Visualization, e.g., Adobe Flex Builder, HTML5, CSS3 GenAI AI/ML Tools Linux, Windows, GPU experience Spark, Scala for distributed computing Deep learning frameworks such as TensorFlow, Keras, Torch, Theano Certifications in AI/ML or GenAI Methodology Social network modeling paradigms, tools, and techniques Text analytics using Natural Language Processing tools such as Support Vector Machines and Social Network Analysis Previous experience with text analytics implementations, using open-source packages and/or SAS Text Miner Ability to prioritize, a consultative mindset, and time management skills
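For illustration only: the NLP application areas above (sentiment analysis, parsing, chunking, summarization) are often prototyped with toolkits such as NLTK; a tiny tokenize/tag/chunk sketch follows. The sentence is invented, and the relevant NLTK data packages (tokenizer, POS tagger, and named-entity chunker models) must be downloaded with nltk.download before this runs.

```python
# Minimal NLTK sketch: tokenization, POS tagging, and named-entity chunking.
# Assumes the required NLTK data packages have already been downloaded.
import nltk

sentence = "The analytics team in Kolkata shipped a text-mining roadmap for a CPG client."
tokens = nltk.word_tokenize(sentence)
tagged = nltk.pos_tag(tokens)
tree = nltk.ne_chunk(tagged)

print(tagged[:5])  # first few (token, POS) pairs
print(tree)        # shallow parse with named-entity chunks
```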
Posted 1 week ago
2.0 - 5.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Job Title: Data Scientist Job Location: Jaipur Experience: 2 to 5 years Job Description: We are seeking a highly skilled and innovative Data Scientist to join our dynamic and forward-thinking team. This role is ideal for someone who is passionate about advancing the fields of Classical Machine Learning, Conversational AI, and Deep Learning Systems, and thrives on translating complex mathematical challenges into actionable machine learning models. The successful candidate will focus on developing, designing, and maintaining cutting-edge AI-based systems, ensuring seamless and engaging user experiences. Additionally, the role involves active participation in a wide variety of Natural Language Processing (NLP) tasks, including refining and optimizing prompts to enhance the performance of Large Language Models (LLMs). Key Responsibilities: • Generative AI Solutions: Develop innovative Generative AI solutions using machine learning and AI technologies, including building and fine-tuning models such as GANs, VAEs, and Transformers. • Classical ML Models: Design and develop machine learning models (regression, decision trees, SVMs, random forests, gradient boosting, clustering, dimensionality reduction) to address complex business challenges. • Deep Learning Systems: Train, fine-tune, and deploy deep learning models such as CNNs, RNNs, LSTMs, GANs, and Transformers to solve AI problems and optimize performance. • NLP and LLM Optimization: Participate in Natural Language Processing activities, refining and optimizing prompts to improve outcomes for Large Language Models (LLMs), such as GPT, BERT, and T5. • Data Management & Feature Engineering: Work with large datasets, perform data preprocessing, augmentation, and feature engineering to prepare data for machine learning and deep learning models. • Model Evaluation & Monitoring: Fine-tune models through hyperparameter optimization (grid search, random search, Bayesian optimization) to improve performance metrics (accuracy, precision, recall, F1-score). Monitor model performance to address drift, overfitting, and bias. • Code Review & Design Optimization: Participate in code and design reviews, ensuring quality and scalability in system architecture and development. Work closely with other engineers to review algorithms, validate models, and improve overall system efficiency. • Collaboration & Research: Collaborate with cross-functional teams including data scientists, engineers, and product managers to integrate machine learning solutions into production. Stay up to date with the latest AI/ML trends and research, applying cutting-edge techniques to projects. Qualifications: • Educational Background: Bachelor’s or Master’s degree in Computer Science, Mathematics, Statistics, Data Science, or any related field. • Experience in Machine Learning: Extensive experience in both classical machine learning techniques (e.g., regression, SVM, decision trees) and deep learning systems (e.g., neural networks, transformers). Experience with frameworks such as TensorFlow, PyTorch, or Keras. • Natural Language Processing Expertise: Proven experience in NLP, especially with Large Language Models (LLMs) like GPT, BERT, or T5. Experience in prompt engineering, fine-tuning, and optimizing model outcomes is a strong plus. • Programming Skills: Proficiency in Python and relevant libraries such as NumPy, Pandas, Scikit-learn, and natural language processing libraries (e.g., Hugging Face Transformers, NLTK, SpaCy).
• Mathematical & Statistical Knowledge: Strong understanding of statistical modeling, probability theory, and mathematical optimization techniques used in machine learning. • Model Deployment & Automation: Experience with deploying machine learning models into production environments using platforms such as AWS SageMaker, Azure ML, GCP AI, or similar. Familiarity with MLOps practices is an advantage. • Code Review & System Design: Experience in code review, design optimization, and ensuring quality in large-scale AI/ML systems. Understanding of distributed computing and parallel processing is a plus. Soft Skills & Behavioural Qualifications: • Must be a good team player and self-motivated to achieve positive results. • Must have excellent communication skills in English. • Exhibits strong presentation skills with attention to detail. • It’s essential to have a strong aptitude for learning new techniques. • Takes ownership of responsibilities. • Demonstrates a high degree of reliability, integrity, and trustworthiness. • Ability to manage time, display an appropriate sense of urgency, and meet/exceed all deadlines. • Ability to accurately process high volumes of work within established deadlines. Interested candidates can share their CV or a reference at sulabh.tailang@celebaltech.com
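For illustration only: the hyperparameter-optimization duties above (grid search, random search, Bayesian optimization scored on metrics such as F1) can be sketched with a small scikit-learn grid search. The estimator, grid values, and synthetic data are placeholders.

```python
# Minimal grid-search sketch over a small SVM parameter grid, scored on F1.
# Dataset and grid values are illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, random_state=1)

param_grid = {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}
search = GridSearchCV(SVC(), param_grid, cv=5, scoring="f1")
search.fit(X, y)

print("best params:", search.best_params_)
print("best cross-validated F1:", search.best_score_)
```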
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Mumbai, Maharashtra
On-site
Job Information Date Opened 07/23/2025 Industry AEC Job Type Permanent Work Experience 3 - 5 Years City Mumbai State/Province Maharashtra Country India Zip/Postal Code 400093 About Us Axium Global (formerly XS CAD), established in 2002, is a UK-based MEP (M&E) and architectural design and BIM Information Technology Enabled Services (ITES) provider with an ISO 9001:2015 and ISO 27001:2022 certified Global Delivery Centre in Mumbai, India. With additional presence in the USA, Australia and UAE, our global reach allows us to provide services to customers with the added benefit of local knowledge and expertise. Axium Global is established as one of the leading pre-construction planning services companies in the UK and India, serving the building services (MEP), retail, homebuilder, architectural and construction sectors with high-quality MEP engineering design and BIM solutions. Job Description We are looking for a hands-on and visionary AI Lead to spearhead all AI initiatives within our organization. You will lead a focused team comprising 1 Data Scientist, 1 ML Engineer, and 1 Intern, while also being directly involved in designing and implementing AI solutions. The role involves identifying impactful AI use cases, conducting research, proposing tools and deploying AI models into production to enhance products, processes and user experiences. You will work across diverse domains such as NLP, computer vision, recommendation systems, predictive analytics and generative AI. The position also covers conversational AI, intelligent automation, and AI-assisted workflows for the AEC industry. A strong understanding of ethical and responsible AI practices is expected. Key Responsibilities: Lead AI research, tool evaluation and strategy aligned with business needs Build and deploy models for NLP, computer vision, generative AI, recommendation systems and time-series forecasting Guide the development of conversational AI, intelligent automation and design-specific AI tools Mentor and manage a small team of AI/ML professionals Collaborate with cross-functional teams to integrate AI into products and workflows. Ensure ethical use of AI and compliance with data governance standards. Oversee lifecycle of AI models from prototyping to deployment and monitoring. Qualifications and Experience Required: Educational Qualification: BE/BTech or ME/MTech degree in Computer Science, Data Science, Artificial Intelligence or related field Certifications in AI/ML, cloud AI platforms or responsible AI practices are a plus Technical Skills: 4–5 years of experience in AI/ML projects Strong programming skills in Python (must-have); R is a plus Experience with TensorFlow, PyTorch, Scikit-learn, OpenCV Familiarity with NLP tools like spaCy, NLTK and Hugging Face Transformers Backend integration using FastAPI or Flask Experience deploying models using Docker, Kubernetes and cloud services like AWS, GCP or Azure ML Use of MLflow, DVC for experiment tracking and model versioning Strong data handling with Pandas, NumPy, and visualization using Matplotlib, Seaborn Working knowledge of SQL, NoSQL and BI tools like Power BI or Tableau Preferred Exposure (Nice to Have): Familiarity with AEC, design workflows or other data-rich industries Experience collaborating with domain experts to frame and solve AI problems Leadership and Strategic Skills: Proven ability to lead small AI/ML teams.
Strong communication and stakeholder management Familiarity with ethical AI principles and data privacy frameworks Ability to translate business problems into AI solutions and deliver results Compensation: The selected candidate will receive competitive compensation and remuneration policies in line with qualifications and experience. Compensation will not be a constraint for the right candidate. What We Offer: A fulfilling working environment that is respectful and ethical A stable and progressive career opportunity State-of-the-art office infrastructure with the latest hardware and software for professional growth In-house, internationally certified training division and innovation team focusing on training and learning the latest tools and trends. Culture of discussing and implementing a planned career growth path with team leaders Transparent fixed and variable compensation policies based on team and individual performances, ensuring a productive association.
Posted 1 week ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Company Description: WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees. Job Description / Job Summary: We are seeking a highly skilled and forward-thinking Senior Data Scientist to join our Automation Centre of Excellence within the Research & Analytics team, with expertise in Generative AI and Machine Learning, adept at leading end-to-end development of high-performance GenAI/ML solutions that streamline complex business workflows and elevate analytical precision. This role demands deep expertise in Data Science, Generative AI (GenAI), Python programming and automation. The ideal candidate will lead the development of intelligent, scalable solutions that automate workflows, enhance decision-making, and unlock business value through advanced AI techniques. Awareness of Microsoft Power Platform is good to have. Roles and Responsibilities: Collaborate with cross-functional teams to identify automation opportunities and deliver AI-driven solutions. Design and implement end-to-end data science workflows using Python, integrating diverse data sources (on-premise and cloud). Lead the transformation of manual, Excel-based processes into robust, governed Python-based automation pipelines. Apply advanced data science techniques including data preprocessing, feature engineering, and model development. Leverage GenAI models (e.g., GPT, DALL·E, LLaMA) for content generation, data exploration, and intelligent automation. Build and deploy applications using Microsoft Power Platform (Power BI, Power Apps, and Power Automate). Integrate systems and automate workflows using WTW Unify. Ensure high standards of code quality, modularity, and scalability in Python development. Implement CI/CD pipelines using Azure DevOps for seamless deployment of data science solutions. Maintain data governance, quality, and compliance with organizational standards. Stay abreast of emerging trends in AI, GenAI, and data engineering to drive innovation. Technical Skills & Tools: Mandatory: Key Skills: Generative AI, Machine Learning, Deep Learning, NLP. Python (Data Processing, Engineering, Automation). Libraries: Pandas, NumPy, Seaborn, Matplotlib, Scikit-learn, TensorFlow, Keras, OpenCV, NLTK, SpaCy, Gensim, TextBlob, FastText, FastAPI. GenAI frameworks (e.g., OpenAI, Hugging Face, Meta AI, LangChain, LangGraph). Version Control & DevOps Tools: GitHub (CI/CD Actions), Azure DevOps; version control systems (ADO/Bitbucket). Preferred: R Programming, Posit Workbench, R Shiny; knowledge of Microsoft Power Platform, WTW Unify. Functional Expertise: 8+ years of experience in data science and GenAI projects. Proven track record in deploying GenAI solutions in enterprise environments. Experience in the Insurance domain (e.g., Finance, Actuarial) is a plus. Strong communication skills to engage with technical and non-technical stakeholders.
Qualifications Educational Qualifications: Master’s degree in Statistics, Mathematics, Economics, or Econometrics from Tier 1 institutions, OR BE/B-Tech, MCA, or MBA from Tier 1 institutions.
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Title: Data Scientist Position Summary: - As a Data Scientist at FSL, you will leverage your expertise in Machine Learning, Deep Learning, Computer Vision, Natural Language Processing and Generative AI to develop innovative data-driven solutions and applications. You will play a key role in designing and deploying dynamic models and applications using modern web frameworks like Flask and FastAPI, ensuring efficient deployment and ongoing monitoring of these systems. Key Responsibilities: - Model Development and Application: Design and implement advanced ML and DL models. Develop web applications for model deployment using Flask and FastAPI to enable real-time data processing and user interaction. - Data Analysis: Perform exploratory data analysis to understand underlying patterns, correlations and trends. Develop comprehensive data processing pipelines to prepare large datasets for analysis and modeling. - Generative AI: Employ Generative AI techniques to create new data points, enhance content generation and innovate within the field of synthetic data production. - Collaborative Development: Work with cross-functional teams to integrate AI capabilities into products and systems. Ensure that all AI solutions are aligned with business goals and user needs. - Research and Innovation: Stay updated with the latest developments in AI, ML, DL, CV and NLP. Explore new technologies and methodologies that can impact our products and services positively. - Communication: Effectively communicate complex quantitative analysis in a clear, precise and actionable manner to senior management and other departments. Required Skills and Qualifications: - Education: BE or Master’s or PhD in Computer Science, Data Science, Statistics or a related field. - Experience: 5 to 7 years of relevant experience in a data science role with a strong focus on ML, DL and statistical modeling. - Technical Skills: Strong coding skills in Python, including experience with Flask or FastAPI. Proficiency in ML, DL frameworks (e.g., PyTorch, TensorFlow), CV (e.g., OpenCV) and NLP libraries (e.g., NLTK, spaCy). - Generative AI: Experience with generative models such as GANs, VAEs or Transformers. - Deployment Skills: Experience with Docker, Kubernetes and continuous integration/continuous deployment (CI/CD) pipelines. - Strong Analytical Skills: Ability to translate complex data into actionable insights. - Communication: Excellent written and verbal communication skills. - Certifications: Certifications in Data Science, ML or AI from recognized institutions is added advantage. Location: Hyderabad, Mumbai, Bangalore and Chennai
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Python Developer – Web Scraping & Data Processing About the Role We are seeking a skilled and detail-oriented Python Developer with hands-on experience in web scraping, document parsing (PDF, HTML, XML), and structured data extraction. You will be part of a core team working on aggregating biomedical content from diverse sources, including grant repositories, scientific journals, conference abstracts, treatment guidelines, and clinical trial databases. Key Responsibilities • Develop scalable Python scripts to scrape and parse biomedical data from websites, pre-print servers, citation indexes, journals, and treatment guidelines. • Build robust modules for splitting multi-record documents (PDFs, HTML, etc.) into individual content units. • Implement NLP-based field extraction pipelines using libraries like spaCy, NLTK, or regex for metadata tagging. • Design and automate workflows using schedulers like cron, Celery, or Apache Airflow for periodic scraping and updates. • Store parsed data in relational (PostgreSQL) or NoSQL (MongoDB) databases with efficient schema design. • Ensure robust logging, exception handling, and content quality validation across all processes. Required Skills and Qualifications • 3+ years of hands-on experience in Python, especially for data extraction, transformation, and loading (ETL). • Strong command over web scraping libraries: BeautifulSoup, Scrapy, Selenium, Playwright. • Proficiency in PDF parsing libraries: PyMuPDF, pdfminer.six, PDFPlumber. • Experience with HTML/XML parsers: lxml, XPath, html5lib. • Familiarity with regular expressions, NLP, and field extraction techniques. • Working knowledge of SQL and/or NoSQL databases (MySQL, PostgreSQL, MongoDB). • Understanding of API integration (RESTful APIs) for structured data sources. • Experience with task schedulers and workflow orchestrators (cron, Airflow, Celery). • Version control using Git/GitHub and comfortable working in collaborative environments. Good to Have • Exposure to biomedical or healthcare data parsing (e.g., abstracts, clinical trials, drug labels). • Familiarity with cloud environments like AWS (Lambda, S3). • Experience with data validation frameworks and building QA rules. • Understanding of ontologies and taxonomies (e.g., UMLS, MeSH) for content tagging. Why Join Us • Opportunity to work on cutting-edge biomedical data aggregation for large-scale AI and knowledge graph initiatives. • Collaborative environment with a mission to improve access and insights from scientific literature. • Flexible work arrangements and access to industry-grade tools and infrastructure.
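For illustration only: the two extraction paths described above (HTML scraping with BeautifulSoup, PDF parsing with PyMuPDF) might be prototyped roughly as below. The URL, CSS selector, and file name are invented placeholders.

```python
# Minimal sketch of HTML scraping with requests/BeautifulSoup and
# PDF text extraction with PyMuPDF. URL, selector, and file name are placeholders.
import requests
from bs4 import BeautifulSoup
import fitz  # PyMuPDF

# --- HTML path ---
resp = requests.get("https://example.org/abstracts", timeout=30)
soup = BeautifulSoup(resp.text, "html.parser")
titles = [h.get_text(strip=True) for h in soup.select("h2.abstract-title")]

# --- PDF path ---
doc = fitz.open("guideline.pdf")
pages_text = [page.get_text() for page in doc]

print(f"{len(titles)} abstract titles, {len(pages_text)} PDF pages extracted")
```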
Posted 1 week ago
7.0 - 12.0 years
22 - 25 Lacs
India
On-site
TECHNICAL ARCHITECT

Key Responsibilities
1. Designing technology systems: Plan and design the structure of technology solutions, and work with design and development teams to assist with the process.
2. Communicating: Communicate system requirements to software development teams, and explain plans to developers and designers. Also communicate the value of a solution to stakeholders and clients.
3. Managing stakeholders: Work with clients and stakeholders to understand their vision for the systems, and manage stakeholder expectations.
4. Architectural oversight: Develop and implement robust architectures for AI/ML and data science solutions, ensuring scalability, security, and performance. Oversee architecture for data-driven web applications and data science projects, providing guidance on best practices in data processing, model deployment, and end-to-end workflows.
5. Problem solving: Identify and troubleshoot technical problems in existing or new systems, and assist with solving technical problems when they arise.
6. Ensuring quality: Ensure systems meet security and quality standards, and monitor systems to ensure they meet both user needs and business goals.
7. Project management: Break down project requirements into manageable pieces of work, and organise the workloads of technical teams.
8. Tool & framework expertise: Utilise relevant tools and technologies, including but not limited to LLMs, TensorFlow, PyTorch, Apache Spark, cloud platforms (AWS, Azure, GCP), web app development frameworks, and DevOps practices.
9. Continuous improvement: Stay current on emerging technologies and methods in AI, ML, data science, and web applications, bringing insights back to the team to foster continuous improvement.

Technical Skills
1. Proficiency in AI/ML frameworks such as TensorFlow, PyTorch, Keras, and scikit-learn for developing machine learning and deep learning models.
2. Knowledge of or experience working with self-hosted or managed LLMs.
3. Knowledge of or experience with NLP tools and libraries (e.g., spaCy, NLTK, Hugging Face Transformers) and familiarity with computer vision frameworks such as OpenCV and related libraries for image processing and object recognition.
4. Experience or knowledge in back-end frameworks (e.g., Django, Spring Boot, Node.js, Express) and building RESTful and GraphQL APIs.
5. Familiarity with microservices, serverless, and event-driven architectures. Strong understanding of design patterns (e.g., Factory, Singleton, Observer) to ensure code scalability and reusability.
6. Proficiency in modern front-end frameworks such as React, Angular, or Vue.js, with an understanding of responsive design, UX/UI principles, and state management (e.g., Redux).
7. In-depth knowledge of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra), as well as caching solutions (e.g., Redis, Memcached).
8. Expertise in tools such as Apache Spark, Hadoop, Pandas, and Dask for large-scale data processing.
9. Understanding of data warehouses and ETL tools (e.g., Snowflake, BigQuery, Redshift, Airflow) to manage large datasets.
10. Familiarity with visualisation tools (e.g., Tableau, Power BI, Plotly) for building dashboards and conveying insights.
11. Knowledge of deploying models with TensorFlow Serving, Flask, FastAPI, or cloud-native services (e.g., AWS SageMaker, Google AI Platform); a minimal serving sketch follows this posting.
12. Familiarity with MLOps tools and practices for versioning, monitoring, and scaling models (e.g., MLflow, Kubeflow, TFX).
13. Knowledge or experience in CI/CD, IaC, and cloud-native toolchains.
14. Understanding of security principles, including firewalls, VPCs, IAM, and TLS/SSL for secure communication.
15. Knowledge of API Gateway, service mesh (e.g., Istio), and NGINX for API security, rate limiting, and traffic management.

Experience Required
Technical Architect with 7 - 12 years of experience

Salary
22 - 25 LPA

Job Types: Full-time, Permanent
Pay: ₹2,200,000.00 - ₹2,500,000.00 per year
Work Location: In person
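As an illustration of the model-serving item above (point 11), here is a minimal sketch of a FastAPI inference endpoint; the model file, feature schema, and endpoint path are assumptions made for the example, not a prescribed design.

import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="inference-service")
model = joblib.load("model.joblib")  # hypothetical pre-trained scikit-learn model

class Features(BaseModel):
    values: list[float]  # flat feature vector; the real schema depends on the model

@app.post("/predict")
def predict(features: Features):
    # Run inference on a single feature vector and return a JSON-serialisable result.
    prediction = model.predict([features.values])
    return {"prediction": prediction.tolist()}

Such a service would typically be containerised and run with, for example, "uvicorn main:app" behind an API gateway, which ties into the security and traffic-management points above.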
Posted 1 week ago
3.0 - 5.0 years
6 - 11 Lacs
India
On-site
Experience Required: 3-5 years of hands-on experience in full-stack development, system design, and supporting AI/ML data-driven solutions in a production environment.

Key Responsibilities
Implementing Technical Designs: Collaborate with architects and senior stakeholders to understand high-level designs and break them down into detailed engineering tasks. Implement system modules and ensure alignment with architectural direction.
Cross-Functional Collaboration: Work closely with software developers, data scientists, and UI/UX teams to translate system requirements into working code. Clearly communicate technical concepts and implementation plans to internal teams.
Stakeholder Support: Participate in discussions with product and client teams to gather requirements. Provide regular updates on development progress and raise flags early to manage expectations.
System Development & Integration: Develop, integrate, and maintain components of AI/ML platforms and data-driven applications. Contribute to scalable, secure, and efficient system components based on guidance from architectural leads.
Issue Resolution: Identify and debug system-level issues, including deployment and performance challenges. Proactively collaborate with DevOps and QA to ensure resolution.
Quality Assurance & Security Compliance: Ensure that implementations meet coding standards, performance benchmarks, and security requirements. Perform unit and integration testing to uphold quality standards.
Agile Execution: Break features into technical tasks, estimate effort, and deliver components in sprints. Participate in sprint planning, reviews, and retrospectives with a focus on delivering value.
Tool & Framework Proficiency: Use modern tools and frameworks in your daily workflow, including AI/ML libraries, backend APIs, front-end frameworks, databases, and cloud services, contributing to robust, maintainable, and scalable systems.
Continuous Learning & Contribution: Keep up with evolving tech stacks and suggest optimizations or refactoring opportunities. Bring learnings from the industry into internal knowledge-sharing sessions.
Proficiency in Using AI Copilots for Coding: Adaptation to emerging tools and knowledge of prompt engineering to use AI effectively for day-to-day coding needs.

Technical Skills
Hands-on experience with Python-based AI/ML development using libraries such as TensorFlow, PyTorch, scikit-learn, or Keras.
Hands-on exposure to self-hosted or managed LLMs, supporting integration and fine-tuning workflows as per system needs while following architectural blueprints.
Practical implementation of NLP/CV modules using tools like spaCy, NLTK, Hugging Face Transformers, and OpenCV, contributing to feature extraction, preprocessing, and inference pipelines.
Strong backend experience using Django, Flask, or Node.js, and API development (REST or GraphQL).
Front-end development experience with React, Angular, or Vue.js, with a working understanding of responsive design and state management.
Development and optimization of data storage solutions using SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra), with hands-on experience configuring indexes, optimizing queries, and using caching tools like Redis and Memcached.
Working knowledge of microservices and serverless patterns, participating in building modular services, integrating event-driven systems, and following best practices shared by architectural leads.
Application of design patterns (e.g., Factory, Singleton, Observer) during implementation to ensure code reusability, scalability, and alignment with architectural standards.
Exposure to big data tools such as Apache Spark and Kafka for processing datasets.
Familiarity with ETL workflows and cloud data warehouses, using tools such as Airflow, dbt, BigQuery, or Snowflake.
Understanding of CI/CD, containerization (Docker), IaC (Terraform), and cloud platforms (AWS, GCP, or Azure).
Implementation of cloud security guidelines, including setting up IAM roles, configuring TLS/SSL, and working within secure VPC setups, with support from cloud architects.
Exposure to MLOps practices, model versioning, and deployment pipelines using MLflow, FastAPI, or AWS SageMaker (a brief experiment-tracking sketch follows this posting).
Configuration and management of cloud services such as AWS EC2, RDS, S3, Load Balancers, and WAF, supporting scalable infrastructure deployment and reliability engineering efforts.

Personal Attributes
Proactive Execution and Communication: Able to take architectural direction and implement it independently with minimal rework, while communicating regularly with stakeholders.
Collaboration: Comfortable working across disciplines with designers, data engineers, and QA teams.
Responsibility: Owns code quality and reliability, especially in production systems.
Problem Solver: Demonstrated ability to debug complex systems and contribute to solutioning.

Key skills: Python, Django, Django ORM, HTML, CSS, Bootstrap, JavaScript, jQuery, multi-threading, multi-processing, database design, database administration, cloud infrastructure, data science, self-hosted LLMs

Qualifications
Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field. Relevant certifications in cloud or machine learning are a plus.

Package: 6-11 LPA
Job Types: Full-time, Permanent
Pay: ₹600,000.00 - ₹1,100,000.00 per year
Work Location: In person
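As a small illustration of the MLOps and model-versioning skills listed above, the following sketch logs parameters, a metric, and a model artefact with MLflow; the experiment name, dataset, and hyperparameters are arbitrary examples chosen only to keep the snippet self-contained.

import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_experiment("demo-experiment")  # hypothetical experiment name

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    clf = RandomForestClassifier(**params).fit(X_train, y_train)
    accuracy = accuracy_score(y_test, clf.predict(X_test))
    mlflow.log_params(params)                 # record hyperparameters
    mlflow.log_metric("accuracy", accuracy)   # record the evaluation metric
    mlflow.sklearn.log_model(clf, "model")    # version the trained model artefact

Runs logged this way can later be compared in the MLflow UI and promoted into a deployment pipeline (FastAPI or SageMaker), as the skills list suggests.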
Posted 1 week ago
0 years
0 - 1 Lacs
Mohali
On-site
We are looking for a passionate and motivated Python Intern to join our development team. This role is ideal for candidates who have completed 3–6 months of prior training in Python development and are eager to work on real-world projects involving Django, Flask, NLP, and SQL.

Key Responsibilities:
Assist in the development of web applications using the Django and Flask frameworks.
Work with SQL databases to store, retrieve, and manage data efficiently.
Collaborate with the team to integrate Natural Language Processing (NLP) features into existing products.
Write clean, maintainable, and well-documented code.
Participate in team meetings, code reviews, and project planning.
Research and suggest improvements for existing workflows and codebases.

Required Skills:
3 to 6 months of hands-on training or project experience in Python.
Strong understanding of the Django and/or Flask frameworks.
Basic to intermediate knowledge of SQL and database operations.
Exposure to NLP concepts and libraries (like NLTK, spaCy, or similar); a small illustrative example follows at the end of this posting.
Familiarity with REST APIs and JSON.
Good problem-solving skills and a willingness to learn.

Nice to Have:
Understanding of front-end technologies like HTML, CSS, or JavaScript.
Experience working on mini-projects.

Salary: ₹8,000 - ₹10,000 per month
Location: Mohali (Punjab)
Interview Mode: In person only
Job Types: Full-time, Permanent, Fresher
Schedule: Day shift / Morning shift
Work Location: In person
Pay: ₹8,000.00 - ₹10,000.00 per month
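For candidates wondering how the Flask, NLP, and JSON skills above fit together, here is a small illustrative sketch of a named-entity extraction endpoint; the route name is a placeholder and spaCy's small English model (en_core_web_sm) is assumed to be installed.

import spacy
from flask import Flask, jsonify, request

app = Flask(__name__)
nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

@app.route("/entities", methods=["POST"])
def entities():
    # Accept {"text": "..."} and return the named entities spaCy finds in it.
    text = request.get_json(force=True).get("text", "")
    doc = nlp(text)
    return jsonify([{"text": ent.text, "label": ent.label_} for ent in doc.ents])

if __name__ == "__main__":
    app.run(debug=True)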
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
AI/ML Engineer – Core Algorithm and Model Expert

1. Role Objective:
The engineer will be responsible for designing, developing, and optimizing advanced AI/ML models for computer vision, generative AI, audio processing, predictive analysis, and NLP applications. Must possess deep expertise in algorithm development and model deployment as production-ready products for naval applications. Also responsible for ensuring models are modular, reusable, and deployable in resource-constrained environments.

2. Key Responsibilities:
2.1. Design and train models using Naval-specific data and deliver them as end products.
2.2. Fine-tune open-source LLMs (e.g., LLaMA, Qwen, Mistral, Whisper, Wav2Vec, Conformer models) for Navy-specific tasks (a hedged LoRA fine-tuning sketch follows this posting).
2.3. Preprocess, label, and augment datasets.
2.4. Implement quantization, pruning, and compression for deployment-ready AI applications.
2.5. Develop, train, fine-tune, and optimize Large Language Models (LLMs) and translation models for mission-critical AI applications of the Indian Navy. The candidate must possess a strong foundation in transformer-based architectures (e.g., BERT, GPT, LLaMA, mT5, NLLB) and hands-on experience with pretraining and fine-tuning methodologies such as Supervised Fine-Tuning (SFT), Instruction Tuning, Reinforcement Learning from Human Feedback (RLHF), and Parameter-Efficient Fine-Tuning (LoRA, QLoRA, Adapters).
2.6. Proficiency in building multilingual and domain-specific translation systems using techniques such as back-translation, domain adaptation, and knowledge distillation is essential.
2.7. Demonstrate practical expertise with libraries such as Hugging Face Transformers, PEFT, Fairseq, and OpenNMT. Knowledge of model compression, quantization, and deployment on GPU-enabled servers is highly desirable. Familiarity with MLOps, version control using Git, and cross-team integration practices is expected to ensure seamless interoperability with other AI modules.
2.8. Collaborate with the backend engineer for integration via standard formats (ONNX, TorchScript).
2.9. Generate reusable inference modules that can be plugged into microservices or edge devices.
2.10. Maintain reproducible pipelines (e.g., with MLflow, DVC, Weights & Biases).

3. Educational Qualifications
Essential Requirements:
3.1. B.Tech / M.Tech in Computer Science, AI/ML, Data Science, Statistics, or a related field with an exceptional academic record.
3.2. Minimum 75% marks or 8.0 CGPA in relevant engineering disciplines.
Desired Specialized Certifications:
3.3. Professional ML certifications from Google, AWS, Microsoft, or NVIDIA.
3.4. Deep Learning Specialization.
3.5. Computer Vision or NLP specialization certificates.
3.6. TensorFlow/PyTorch professional certification.

4. Core Skills & Tools:
4.1. Languages: Python (must), C++/Rust.
4.2. Frameworks: PyTorch, TensorFlow, Hugging Face Transformers.
4.3. ML concepts: Transfer learning, RAG, XAI (SHAP/LIME), reinforcement learning, LLM fine-tuning, SFT, RLHF, LoRA, QLoRA, and PEFT.
4.4. Optimized inference: ONNX Runtime, TensorRT, TorchScript.
4.5. Data tooling: Pandas, NumPy, scikit-learn, OpenCV.
4.6. Security awareness: Data sanitization, adversarial robustness, model watermarking.

5. Core AI/ML Competencies:
5.1. Deep learning architectures: CNNs, RNNs, LSTMs, GRUs, Transformers, GANs, VAEs, diffusion models.
5.2. Computer vision: Object detection (YOLO, R-CNN), semantic segmentation, image classification, optical character recognition, facial recognition, anomaly detection.
5.3. Natural language processing: BERT, GPT models, sentiment analysis, named entity recognition, machine translation, text summarization, chatbot development.
5.4. Generative AI: Large Language Models (LLMs), prompt engineering, fine-tuning, quantization, RAG systems, multimodal AI, stable diffusion models.
5.5. Advanced algorithms: Reinforcement learning, federated learning, transfer learning, few-shot learning, meta-learning.

6. Programming & Frameworks:
6.1. Languages: Python (expert level), R, Julia, C++ for performance optimization.
6.2. ML frameworks: TensorFlow, PyTorch, JAX, Hugging Face Transformers, OpenCV, NLTK, spaCy.
6.3. Scientific computing: NumPy, SciPy, Pandas, Matplotlib, Seaborn, Plotly.
6.4. Distributed training: Horovod, DeepSpeed, FairScale, PyTorch Lightning.

7. Model Development & Optimization:
7.1. Hyperparameter tuning using Optuna, Ray Tune, Weights & Biases, or similar tools.
7.2. Model compression techniques (quantization, pruning, distillation).
7.3. ONNX model conversion and optimization.

8. Generative AI & NLP Applications:
8.1. Intelligence report analysis and summarization.
8.2. Multilingual radio communication translation.
8.3. Voice command systems for naval equipment.
8.4. Automated documentation and report generation.
8.5. Synthetic data generation for training simulations.
8.6. Scenario generation for naval training exercises.
8.7. Maritime intelligence synthesis and briefing generation.

9. Experience Requirements
9.1. Hands-on experience with at least two major AI domains.
9.2. Experience deploying models in production environments.
9.3. Contribution to open-source AI projects.
9.4. Led development of multiple end-to-end AI products.
9.5. Experience scaling AI solutions for large user bases.
9.6. Track record of optimizing models for real-time applications.
9.7. Experience mentoring technical teams.

10. Product Development Skills
10.1. End-to-end ML pipeline development (data ingestion to model serving).
10.2. User feedback integration for model improvement.
10.3. Cross-platform model deployment (cloud, edge, mobile).
10.4. API design for ML model integration.

11. Cross-Compatibility Requirements:
11.1. Define model interfaces (input/output schema) for frontend/backend use.
11.2. Build CLI- and REST-compatible inference tools.
11.3. Maintain shared code libraries (Git) that backend/frontend teams can directly call.
11.4. Joint debugging and model-in-the-loop testing with UI and backend teams.
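As a hedged sketch of the parameter-efficient fine-tuning (LoRA) workflow referenced in points 2.2 and 2.5, the snippet below attaches LoRA adapters to a causal language model with Hugging Face PEFT; the base checkpoint and hyperparameters are illustrative assumptions only, not a prescribed configuration.

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

BASE_MODEL = "meta-llama/Llama-2-7b-hf"  # placeholder open-source checkpoint

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

lora_config = LoraConfig(
    r=8,                                   # low-rank adapter dimension
    lora_alpha=16,                         # scaling factor for adapter updates
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the small adapter matrices are trainable

Training would then proceed with a standard Trainer loop on the domain dataset, after which the adapters can be merged and the model exported (e.g., to ONNX or TorchScript) for the integration points described in 2.8.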
Posted 1 week ago