5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution; work may also include deep learning, neural networks, chatbots, and image processing.
Must-have skills: Machine Learning
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: These roles have many overlapping skills with GenAI engineers and architects. The description may scale up or down based on expected seniority.
Roles & Responsibilities:
- Implement generative AI models and identify insights that can be used to drive business decisions. Work closely with multi-functional teams to understand business problems, develop hypotheses, and test those hypotheses with data, collaborating with cross-functional teams to define AI project requirements and objectives and ensuring alignment with overall business goals.
- Conduct research to stay up to date with the latest advancements in generative AI, machine learning, and deep learning techniques, and identify opportunities to integrate them into our products and services.
- Optimize existing generative AI models for improved performance, scalability, and efficiency.
- Ensure data quality and accuracy.
- Lead the design and development of prompt engineering strategies and techniques to optimize the performance and output of our GenAI models.
- Implement cutting-edge NLP techniques and prompt engineering methodologies to enhance the capabilities and efficiency of our GenAI models.
- Determine the most effective prompt generation processes and approaches to drive innovation and excellence in the field of AI technology, collaborating with AI researchers and developers.
- Experience working with cloud-based platforms (for example, AWS, Azure, or related).
- Strong problem-solving and analytical skills.
- Proficiency in handling various data formats and sources through omni-channel speech and voice applications, as part of conversational AI.
- Prior statistical modelling experience.
- Demonstrable experience with deep learning algorithms and neural networks.
- Develop clear and concise documentation, including technical specifications, user guides, and presentations, to communicate complex AI concepts to both technical and non-technical stakeholders.
- Contribute to the establishment of best practices and standards for generative AI development within the organization.
Professional & Technical Skills:
- Must have solid experience developing and implementing generative AI models, with a strong understanding of deep learning techniques such as GPT, VAE, and GANs.
- Must be proficient in Python and have experience with machine learning libraries and frameworks such as TensorFlow, PyTorch, or Keras.
- Must have strong knowledge of data structures, algorithms, and software engineering principles.
- Must be familiar with cloud-based platforms and services, such as AWS, GCP, or Azure.
- Needs experience with natural language processing (NLP) techniques and tools, such as spaCy, NLTK, or Hugging Face.
- Must be familiar with data visualization tools and libraries, such as Matplotlib, Seaborn, or Plotly.
- Needs knowledge of software development methodologies, such as Agile or Scrum.
- Possesses excellent problem-solving skills, with the ability to think critically and creatively to develop innovative AI solutions.
Additional Information:
- Must have a degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field; a Ph.D. is highly desirable.
- Strong communication skills, with the ability to effectively convey complex technical concepts to a diverse audience.
- A proactive mindset, with the ability to work independently and collaboratively in a fast-paced, dynamic environment.
- The candidate should have a minimum of 5 years of hands-on experience in Machine Learning.
Qualifications: 15 years of full-time education
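As a purely illustrative aside (not part of the job description), the sketch below shows the kind of prompt-driven text generation this GenAI-focused role revolves around, using the Hugging Face transformers pipeline with the small public distilgpt2 checkpoint. The prompt and sampling settings are hypothetical choices, not anything specified by the employer.

```python
# Minimal prompt-driven generation sketch using Hugging Face transformers.
# Assumes `pip install transformers torch` and internet access to download the
# small public distilgpt2 checkpoint; prompt and parameters are illustrative only.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

prompt = "Summarize the key risks of deploying a chatbot in production:"
outputs = generator(
    prompt,
    max_new_tokens=60,   # cap the length of the generated continuation
    do_sample=True,      # sample instead of greedy decoding
    temperature=0.7,     # lower temperature -> more focused output
    num_return_sequences=1,
)
print(outputs[0]["generated_text"])
```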
Posted 2 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Join us as a "BA4 - Control Data Analytics and Reporting" at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. To be successful as a "BA4 - Control Data Analytics and Reporting", you should have experience with:
Basic/Essential Qualifications:
- Graduate in any discipline.
- Experience in Controls, Governance, Reporting and Risk Management, preferably in a financial services organization.
- Proficient in MS Office – PPT, Excel, Word & Visio.
- Proficient in SQL, Tableau and Python.
- Generating data insights and dashboards from large and diverse data sets.
- Excellent experience with Tableau, Alteryx and MS Office (i.e. advanced Excel, PowerPoint).
- Automation skills using VBA, Power Query, PowerApps, etc.
- Experience in using ETL tools.
- Good understanding of Risk and Control.
- Excellent communication skills (verbal and written).
- Good understanding of governance and control frameworks and processes.
- Highly motivated, business-focused and forward thinking.
- Experience in senior stakeholder management.
- Ability to manage relationships across multiple disciplines.
- Self-driven; proactively participates in team initiatives.
- Demonstrated initiative in identifying and resolving problems.
Desirable Skillsets/Good to Have:
- Experience in data crunching/analysis, including automation.
- Experience in handling RDBMS (i.e. SQL/Oracle).
- Experience in Python, data science and data analytics tools and techniques (e.g. Matplotlib, data wrangling, low-code/no-code environment development), preferably on actual use cases in a large bank.
- Understanding of data management principles and data governance.
- Designing and managing SharePoint.
- Financial services experience.
Location: Noida.
You may be assessed on the key critical skills relevant for success in the role, such as experience with MS Office, MS Power Platforms, Python and Tableau, as well as job-specific skillsets. Additional experience in Alteryx would be an added advantage.
Purpose of the role: To design, develop and consult on the bank's internal controls framework and supporting policies and standards across the organisation, ensuring it is robust, effective, and aligned to the bank's overall strategy and risk appetite.
Accountabilities:
- Identification and analysis of emerging and evolving risks across functions to understand their potential impact and likelihood.
- Communication of the purpose, structure, and importance of the control framework to all relevant stakeholders, including senior management and audit.
- Support for the development and implementation of the bank's internal controls framework and principles tailored to the bank's specific needs and risk profile, including design, monitoring, and reporting initiatives.
- Monitoring and maintenance of the control frameworks, to ensure compliance and to adjust and update them as internal and external requirements change.
- Embedding of the control framework across the bank through cross-collaboration, training sessions and awareness campaigns that foster a culture of knowledge sharing and improvement in risk management and the importance of internal control effectiveness.
Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
Requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. Alternatively, an individual contributor develops technical expertise in the work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
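As an illustrative aside, the sketch below shows the flavour of the SQL-plus-Python reporting work this controls analytics role describes: pull control test results with SQL, then summarise failure rates with pandas. The in-memory SQLite table, column names and values are all hypothetical.

```python
# Illustrative controls-reporting sketch: load control test results via SQL,
# then summarise failure rates per control with pandas. Data is made up.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE control_tests (control_id TEXT, business_unit TEXT, passed INTEGER);
    INSERT INTO control_tests VALUES
        ('CTRL-001', 'Retail', 1), ('CTRL-001', 'Retail', 0),
        ('CTRL-002', 'Markets', 1), ('CTRL-002', 'Markets', 1);
    """
)

df = pd.read_sql_query("SELECT * FROM control_tests", conn)

# Failure rate per control: share of tests that did not pass.
summary = (
    df.groupby("control_id")["passed"]
    .agg(tests="count", failures=lambda s: (s == 0).sum())
    .assign(failure_rate=lambda d: d["failures"] / d["tests"])
)
print(summary)
```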
Posted 2 weeks ago
5.0 - 10.0 years
35 - 60 Lacs
Bengaluru
Work from Office
Job Role: Data Scientist
Experience: 5-8 years
Location: Bangalore, India
Employment Type: Full Time, Hybrid
About Cognizant: Join a rapidly growing consulting and IT services Fortune 500 company with around 350,000 employees worldwide, a very flexible international business, customers that are leaders in their respective sectors, and a high level of commitment.
Cognizant Advanced AI Lab / Neuro AI team: The Cognizant AI Labs were created to pioneer scientific innovation and bridge the gap to commercial applications. The AI Labs collaborate with institutions, academia and technology partners to develop groundbreaking AI solutions responsibly. The Lab's mission is to maximize human potential with decision AI, a combination of multi-agent architectures, generative AI, deep learning, evolutionary AI and trustworthy techniques to create sophisticated decision-making systems. Through scientific publications, open-source software, AI for Good projects and the Cognizant Neuro® AI decisioning platform and Multi-Agent Accelerator, the AI Labs support our goal of improving everyday life.
Your role: As a data scientist and software engineer, you will develop the Neuro AI platform and use it on a variety of projects related to optimizing data-driven decision making. You are a data scientist, Python developer and AI researcher with knowledge of multiple technologies; you are driven, curious and passionate about your work; you are innovative, creative and focused on excellence; and you want to be part of an ego-free work environment where we value honest, healthy interactions and collaboration.
Key responsibilities:
- Design, implement and deploy software applications that analyze datasets, train predictive and prescriptive models, assess uncertainties and interactively present results to end users
- Monitor and analyze the performance of software applications and infrastructure
- Collaborate with cross-functional teams to identify and prioritize business requirements
- Research, design and implement novel AI systems to support decision-making processes
- Work with the research team to publish papers and patents
- Communicate research findings to both technical and non-technical audiences
- Provide guidance on our Neuro AI offering and AI best practices
- Work in a highly collaborative, fast-paced environment with your peers on the Neuro AI platform and research teams
Your profile:
- PhD or Master's in Data Science, Computer Science, Statistics, Mathematics, Engineering or a related field
- 5-8 years of experience in artificial intelligence, machine learning, data science and software engineering
- Strong programming skills in Python with Pandas, NumPy, TensorFlow, PyTorch, Jupyter Notebook, GitHub
- Experience with handling large datasets, data engineering, statistical analysis, and building predictive models
- Experience developing AI software platforms and tools
- Knowledge of data visualization tools (e.g., Matplotlib, Tableau)
- Knowledge of Generative AI and LLMs is a plus
- Ability to utilize cloud platforms for data processing and analytics, and to optimize cloud-based solutions for performance, cost, and scalability
- Strong problem-solving and analytical skills
- Strong attention to detail and ability to work independently
- Ability to leverage design thinking, business process optimization, and stakeholder management skills
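As an illustrative aside, the sketch below shows one simple way the "train predictive models and assess uncertainties" responsibility can look in practice, using scikit-learn on a synthetic dataset. The data, model choice and uncertainty heuristic are assumptions for illustration, not the team's actual approach.

```python
# Illustrative sketch: train a predictive model and surface a crude
# per-prediction uncertainty estimate. The synthetic dataset is made up.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=12, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

proba = model.predict_proba(X_test)[:, 1]          # predicted probability of the positive class
print("AUC:", round(roc_auc_score(y_test, proba), 3))

# Crude uncertainty signal: disagreement (spread of votes) across the forest's trees.
tree_votes = np.stack([t.predict(X_test) for t in model.estimators_])
print("Mean vote std:", tree_votes.std(axis=0).mean().round(3))
```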
Posted 2 weeks ago
5.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
AI Data Scientist
Location: Pune, India
Buildings are getting smarter with connected technologies. With more connectivity, there is access to more data from sensors installed in buildings. Johnson Controls is leading the way in providing AI-enabled enterprise solutions that contribute to optimized energy utilization and auto-generation of building insights, and enable predictive maintenance for installed devices. Our Data Strategy & Intelligence team is looking for a Data Scientist to join our growing team. You will play a critical role in developing and deploying machine learning/Generative AI and time series analysis models in production.
The Role: To be successful in this role, the Data Scientist should have deep knowledge of machine learning concepts and Large Language Models (LLMs), including their training, optimization and deployment, and of time series models, as well as experience in developing and deploying ML/Generative AI/time series models in production.
What you will do: As an AI Data Scientist at Johnson Controls, you will help develop and maintain the AI algorithms and capabilities within our digital products. These applications will use data from commercial buildings and apply machine learning, GenAI or other advanced algorithms to provide value in the following ways:
- Optimize building energy consumption and occupancy, reduce CO2 emissions, enhance users' comfort, etc.
- Generate actionable insights to improve building operations
- Translate data into direct recommendations for various stakeholders
Your efforts will ensure that our AI solutions deliver robust and repeatable outcomes through well-designed algorithms and well-written software. To be successful in this role, the AI Data Scientist should be comfortable applying machine-learning concepts to practical applications while handling the inherent challenges of real-world datasets.
How you will do it:
- Contribute as a member of the AI team with assigned tasks
- Collaborate with product managers to design new AI capabilities
- Explore and analyze available datasets for potential applications
- Write Python code to develop ML/Generative AI/time series prediction solutions that address complex business requirements
- Research and implement state-of-the-art techniques in Generative AI solutions
- Pre-train and fine-tune ML models over CPU/GPU clusters while optimizing for trade-offs
- Follow code-quality standards and best practices in software development
- Develop and maintain test cases to validate algorithm correctness
- Assess failures to identify causes and plan fixes for bugs
- Communicate key results to stakeholders
- Leverage JIRA to plan work and track issues
What we look for:
- Bachelor's/Master's degree in Computer Science, Statistics, Mathematics, or a related field
- 5+ years of experience developing and deploying ML models, with a proven record of delivering production-ready ML models
- Proficiency with Python and standard ML libraries, e.g., PyTorch, TensorFlow, Keras, NumPy, Pandas, scikit-learn, Matplotlib, Transformers
- Strong understanding of ML algorithms and techniques, e.g., regression, classification, clustering, deep learning, NLP/Transformer models, LLMs and time series prediction models
- Experience in developing state-of-the-art LLM frameworks and models (Azure OpenAI, Meta Llama, etc.), advanced prompt engineering techniques, and LLM fine-tuning/training
- Experience working with cloud-based (AWS/GCP/Azure) ML/GenAI model development and deployment
- Excellent verbal and written communication skills
Preferred:
- Prior domain experience in smart buildings and building operations optimization
- Experience working with Microsoft Azure Cloud
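As an illustrative aside, the sketch below shows a minimal version of the time series prediction work this posting describes: forecasting hourly building energy use from lagged values. The data is synthetic and the model choice is an assumption made purely for illustration.

```python
# Illustrative time-series sketch: forecast next-hour building energy use
# from lagged consumption values. The data here is synthetic/hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=24 * 60, freq="h")   # 60 days, hourly
hours = idx.hour.to_numpy()
energy = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, len(idx))
df = pd.DataFrame({"kwh": energy}, index=idx)

# Lag features: consumption 1, 2 and 24 hours earlier.
for lag in (1, 2, 24):
    df[f"lag_{lag}"] = df["kwh"].shift(lag)
df = df.dropna()

train, test = df.iloc[:-168], df.iloc[-168:]        # hold out the final week
features = [c for c in df.columns if c.startswith("lag_")]

model = Ridge().fit(train[features], train["kwh"])
pred = model.predict(test[features])
print("MAE (kWh):", round(mean_absolute_error(test["kwh"], pred), 2))
```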
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Job Role: Data Scientist
Experience: 5-8 years
Location: Bangalore, India
Employment Type: Full Time, Hybrid
About Cognizant: Join a rapidly growing consulting and IT services Fortune 500 company with around 350,000 employees worldwide, a very flexible international business, customers that are leaders in their respective sectors, and a high level of commitment.
Cognizant Advanced AI Lab / Neuro AI team: The Cognizant AI Labs were created to pioneer scientific innovation and bridge the gap to commercial applications. The AI Labs collaborate with institutions, academia and technology partners to develop groundbreaking AI solutions responsibly. The Lab's mission is to maximize human potential with decision AI, a combination of multi-agent architectures, generative AI, deep learning, evolutionary AI and trustworthy techniques to create sophisticated decision-making systems. Through scientific publications, open-source software, AI for Good projects and the Cognizant Neuro® AI decisioning platform and Multi-Agent Accelerator, the AI Labs support our goal of improving everyday life.
Your Role: As a data scientist and software engineer, you will develop the Neuro AI platform and use it on a variety of projects related to optimizing data-driven decision making. You are a data scientist, Python developer and AI researcher with knowledge of multiple technologies; you are driven, curious and passionate about your work; you are innovative, creative and focused on excellence; and you want to be part of an ego-free work environment where we value honest, healthy interactions and collaboration.
Key Responsibilities:
- Design, implement and deploy software applications that analyze datasets, train predictive and prescriptive models, assess uncertainties and interactively present results to end users
- Monitor and analyze the performance of software applications and infrastructure
- Collaborate with cross-functional teams to identify and prioritize business requirements
- Research, design and implement novel AI systems to support decision-making processes
- Work with the research team to publish papers and patents
- Communicate research findings to both technical and non-technical audiences
- Provide guidance on our Neuro AI offering and AI best practices
- Work in a highly collaborative, fast-paced environment with your peers on the Neuro AI platform and research teams
Your Profile:
- PhD or Master's in Data Science, Computer Science, Statistics, Mathematics, Engineering or a related field
- 5-8 years of experience in artificial intelligence, machine learning, data science and software engineering
- Strong programming skills in Python with Pandas, NumPy, TensorFlow, PyTorch, Jupyter Notebook, GitHub
- Experience with handling large datasets, data engineering, statistical analysis, and building predictive models
- Experience developing AI software platforms and tools
- Knowledge of data visualization tools (e.g., Matplotlib, Tableau, …)
- Knowledge of Generative AI and LLMs is a plus
- Ability to utilize cloud platforms for data processing and analytics, and to optimize cloud-based solutions for performance, cost, and scalability
- Strong problem-solving and analytical skills
- Strong attention to detail and ability to work independently
- Ability to leverage design thinking, business process optimization, and stakeholder management skills
Posted 2 weeks ago
5.0 - 10.0 years
8 - 13 Lacs
Bengaluru
Work from Office
At Boeing, we innovate and collaborate to make the world a better place. We're committed to fostering an environment for every teammate that's welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.
Overview: As a leading global aerospace company, Boeing develops, manufactures and services commercial airplanes, defense products and space systems for customers in more than 150 countries. As a top U.S. exporter, the company leverages the talents of a global supplier base to advance economic opportunity, sustainability and community impact. Boeing's team is committed to innovating for the future, leading with sustainability, and cultivating a culture based on the company's core values of safety, quality and integrity.
Technology for today and tomorrow: The Boeing India Engineering & Technology Center (BIETC) is a 5500+ engineering workforce that contributes to global aerospace growth. Our engineers deliver cutting-edge R&D, innovation, and high-quality engineering work in global markets, and leverage new-age technologies such as AI/ML, IIoT, Cloud, Model-Based Engineering, and Additive Manufacturing, shaping the future of aerospace.
People-driven culture: At Boeing, we believe creativity and innovation thrive when every employee is trusted, empowered, and has the flexibility to choose, grow, learn, and explore. We offer variable arrangements depending upon business and customer needs, and professional pursuits that offer greater flexibility in the way our people work. We also believe that collaboration, frequent team engagements, and face-to-face meetings bring together different perspectives and thoughts – enabling every voice to be heard and every perspective to be respected. No matter where or how our teammates work, we are committed to positively shaping people's careers and being thoughtful about employee wellbeing.
Position Overview: The Boeing Test and Evaluation team is currently looking for an Associate BI Analyst to join their team in Bengaluru, KA. BI Analysts at Boeing make sure that products at the world's largest aerospace company continue to meet the highest standards. From quality and reliability to safety and performance, their expertise is vital to the concept, design and certification of a wide variety of commercial and military systems.
Position Responsibilities:
- Develop and maintain high-quality Python code for data analysis and data visualization.
- Utilize Tableau to create interactive and insightful dashboards and reports for data-driven decision-making.
- Collaborate closely with data scientists and business analysts to understand requirements and translate them into effective software and visualization solutions.
- Participate in code reviews, ensuring adherence to best practices and standards.
- Optimize existing algorithms and systems for improved performance and scalability.
- Contribute to the integration of machine learning models into production systems.
- Troubleshoot and resolve issues related to data quality, performance, and visualization.
- Stay abreast of new technologies and methodologies in software development, data science, and business intelligence.
- Document software developments and maintain software documentation.
- Prepare data for analysis, including cleansing, conditioning, transforming, handling missing fields, identifying new feature variables, and handling multivariate data.
- Monitor production and deployment.
- Prepare decision support visualizations and reports, algorithms, models, dashboards, and/or tools.
- Support the development of software applications integrated with insights obtained from data science and business analysis activities.
Employer will not sponsor applicants for employment visa status.
Basic Qualifications (Required Skills/Experience):
- Bachelor's degree or higher in Computer Science, Engineering, Business Analytics, or a related field is required.
- Minimum 5+ years of professional experience in Python development and Tableau.
- Position requires hands-on experience working with SQL, Python, R, data modeling, and Tableau.
- Solid understanding of software development principles and lifecycle.
- Familiarity with data structures, algorithms, system design, and business intelligence concepts.
- Experience with Python libraries such as NumPy, Pandas, Matplotlib, and scikit-learn.
- Knowledge of version control systems, preferably Git.
- Strong problem-solving skills and ability to work in a team environment.
- Excellent verbal and written communication skills.
Preferred Qualifications (Desired Skills/Experience):
- Candidate must be a self-starter with a positive attitude, high ethics, and a track record of working independently in developing analytics solutions.
- Must be able to work collaboratively with very strong teaming skills.
- Must be willing to work flexible hours (early or late as needed) to interface with Boeing personnel around the world.
- Develop and maintain relationships/partnerships with customers, stakeholders, peers, and partners to develop collaborative plans and execute on projects.
- Proactively seek information and direction to successfully complete the statement of work.
- Demonstrate strong written, verbal and interpersonal communication skills.
- Be fluent in written and spoken English, and have a high degree of proficiency with MS Office tools to prepare comprehensive reports, presentations, proposals, and Statements of Work.
- Preferred experience in handling engineering data sets such as component failure data, engineering production process data, engineering test data, time series data, etc.
- Preferred experience in deploying data science solutions on cloud platforms like PCF, GCP, etc.
- Experience in the C# language or ReactJS is a huge plus.
Typical Education & Experience: Bachelor's or Master's degree in Computer Science/Engineering (Software/Instrumentation/Electronics/Electrical/Mechanical or equivalent discipline).
Relocation: This position offers relocation based on candidate eligibility within India.
Applications for this position will be accepted until Jun. 06, 2025.
Export Control: This is not an Export Control position.
Education: Bachelor's Degree or Equivalent Required.
Relocation: This position offers relocation based on candidate eligibility.
Visa Sponsorship: Employer will not sponsor applicants for employment visa status.
Shift: Not a Shift Worker (India).
Equal Opportunity Employer: We have teams in more than 65 countries, and each person plays a role in helping us become one of the world's most innovative, diverse and inclusive companies. We are proud members of the Valuable 500 and welcome applications from candidates with disabilities. Applicants are encouraged to share with our recruitment team any accommodations required during the recruitment process. Accommodations may include but are not limited to conducting interviews in accessible locations that accommodate mobility needs, encouraging candidates to bring and use any existing assistive technology such as screen readers, and offering flexible interview formats such as virtual or phone interviews.
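As an illustrative aside, the sketch below shows the kind of data-preparation work the responsibilities above describe (cleansing, handling missing fields, deriving a new feature) using pandas. The dataset, column names and thresholds are hypothetical.

```python
# Illustrative data-preparation sketch: cleansing, missing-value handling and
# a simple derived feature. The engineering dataset here is made up.
import numpy as np
import pandas as pd

raw = pd.DataFrame(
    {
        "component": ["pump", "valve", None, "pump"],
        "hours_in_service": [1200, np.nan, 300, 4500],
        "failed": ["Y", "N", "N", "Y"],
    }
)

clean = (
    raw.dropna(subset=["component"])                          # drop rows with no component id
    .assign(
        hours_in_service=lambda d: d["hours_in_service"].fillna(d["hours_in_service"].median()),
        failed=lambda d: d["failed"].map({"Y": 1, "N": 0}),   # encode the label numerically
        is_high_usage=lambda d: (d["hours_in_service"] > 2000).astype(int),  # derived feature
    )
)
print(clean)
```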
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
Data Science Intern (Remote)
Company: Coreline Solutions
Location: Remote / Pune, India
Duration: 3 to 6 months
Stipend: Unpaid (full-time offer based on performance)
Work Mode: Remote
About Coreline Solutions: We're a tech and consulting company focused on digital transformation, custom software development, and data-driven solutions.
Role Overview: We're looking for a Data Science Intern to work on real-world data projects involving analytics, modeling, and business insights. This is a great opportunity for students or freshers to gain practical experience in the data science domain.
Key Responsibilities:
- Collect, clean, and analyze large datasets using Python, SQL, and Excel.
- Develop predictive and statistical models using libraries like scikit-learn or statsmodels.
- Visualize data and present insights using tools like Matplotlib, Seaborn, or Power BI.
- Support business teams with data-driven recommendations.
- Collaborate with data analysts, ML engineers, and developers.
Requirements:
- Pursuing or completed degree in Data Science, Statistics, CS, or a related field.
- Proficient in Python, with a basic understanding of machine learning.
- Familiarity with data handling tools (Pandas, NumPy) and SQL.
- Good analytical and problem-solving skills.
Perks: Internship Certificate; Letter of Recommendation (top performers); mentorship and real-time project experience; potential full-time role.
To Apply: Email your resume to 📧 hr@corelinesolutions.site with the subject "Application for Data Science Intern – [Your Full Name]"
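As an illustrative aside, the sketch below shows the kind of intern task the responsibilities above point to: a quick exploratory plot plus a simple statsmodels regression. The dataset and column names are synthetic assumptions for illustration only.

```python
# Illustrative intern-level sketch: exploratory plot + simple OLS regression
# on synthetic data, in the spirit of the responsibilities listed above.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import statsmodels.api as sm

rng = np.random.default_rng(42)
df = pd.DataFrame({"ad_spend": rng.uniform(0, 100, 200)})
df["sales"] = 5 + 0.8 * df["ad_spend"] + rng.normal(0, 5, 200)

# Exploratory scatter plot saved to disk, e.g. for a short report.
df.plot.scatter(x="ad_spend", y="sales")
plt.savefig("ad_spend_vs_sales.png")

# Ordinary least squares fit: sales explained by ad spend.
model = sm.OLS(df["sales"], sm.add_constant(df[["ad_spend"]])).fit()
print(model.summary())
```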
Posted 2 weeks ago
5.0 - 10.0 years
6 - 16 Lacs
Vadodara
Work from Office
Job Title: Python Developer
Location: Vadodara, Gujarat
Time: 2 pm – 11 pm
Job description: The role of Python Developer will be to develop and maintain various software applications related to North American power markets. The developer will work closely with front-office analysts and traders to provide seamless support to all day-to-day operations and provide analytical excellence. Job responsibilities include writing and testing applications, debugging existing applications, and creating analytical views in visualization software like Power BI. To be successful in this role, you should have core Python expertise and strong analytical skills, and be able to work well in a team environment. Ultimately, you will build and maintain highly responsive applications and reports that align with our business needs.
Tasks/Responsibilities:
- Build new tools using Python based on requirements and maintain features in our existing Python applications.
- Review, maintain and enhance code (Python) written by current developers.
- Communicate with the team to gather requirements, document the applications, and debug production issues on an ongoing basis.
- Enhance data collection procedures to include information that is relevant for building analytic systems.
- Process, cleanse, and verify the integrity of the data used for analysis.
- Develop and maintain cash flow analysis tools for weekly and monthly accounting.
- Develop and maintain automated reporting, charting, and/or dashboard tools that communicate market insights, risks, and opportunities using data visualization software such as Power BI.
Qualifications:
- Minimum of a Bachelor's in Engineering, Mathematics, Statistics, or a related field of study; a focus on data analytics or operations research is preferred.
- Strong quantitative and analytics background, including advanced-level skills in simulation, optimization, statistics, and big data management.
- Expertise in quantitative modelling using Python; relevant coding experience in data mining is a plus.
- Experience with Python, Django and React JS.
- Familiarity with Python modules like Pandas, NumPy, Matplotlib and PyQt is preferred.
- Broad understanding of databases (e.g., SQL Server) is preferred.
- Basic understanding of source control (Git) is preferred.
- Hands-on experience with data visualization software such as Tableau or Power BI is a plus.
- Strong written and verbal communication skills.
- Team-oriented and collaborative.
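As an illustrative aside, the sketch below shows a minimal version of the automated reporting task described above: aggregate daily cash flow data with pandas and export a table and chart that a Power BI report or analyst could consume. The hubs, values and file names are hypothetical.

```python
# Illustrative automated-reporting sketch: aggregate daily cash flows and
# export a tabular extract plus a chart. All data and names are made up.
import pandas as pd
import matplotlib.pyplot as plt

trades = pd.DataFrame(
    {
        "date": pd.to_datetime(["2024-05-01", "2024-05-01", "2024-05-02", "2024-05-02"]),
        "hub": ["PJM", "ERCOT", "PJM", "ERCOT"],
        "cashflow": [12500.0, -3400.0, 8900.0, 1500.0],
    }
)

daily = trades.pivot_table(index="date", columns="hub", values="cashflow", aggfunc="sum")
daily.to_csv("daily_cashflow.csv")          # tabular extract, e.g. for Power BI

ax = daily.plot(kind="bar", title="Daily cash flow by hub")
ax.set_ylabel("USD")
plt.tight_layout()
plt.savefig("daily_cashflow.png")
```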
Posted 2 weeks ago
2.0 - 7.0 years
6 - 9 Lacs
Noida
Work from Office
We are looking for an experienced and passionate Data Science Trainer who can deliver high-quality training to students, working professionals, and corporate clients. The ideal candidate should be proficient in the latest tools and technologies in the data science field and have a passion for teaching and mentoring.
Key Responsibilities:
- Deliver Engaging Training Sessions: Conduct in-depth, interactive training sessions (online/offline) on core data science topics, including Python programming, statistics, machine learning, and data visualization.
- Curriculum Development: Design, structure, and regularly update course content, projects, and assessments based on current industry standards and student needs.
- Hands-On Project Guidance: Mentor students on capstone and real-time projects using real-world datasets to strengthen their practical skills and portfolios.
- Technical Evaluation: Develop and assess assignments, quizzes, and case studies to measure students' progress and provide constructive feedback.
- Interview Preparation: Conduct mock interviews, technical tests, and soft skills sessions to prepare students for job placements in data-related roles.
- Technology Upgradation: Stay updated with evolving tools and technologies like TensorFlow, Power BI, Tableau, and Spark, and integrate them into training modules as needed.
- Mentorship & Support: Provide personalized mentorship and career guidance to help learners overcome challenges and reach their goals.
- Corporate & Workshop Training (Optional): Conduct specialized workshops, webinars, and corporate training sessions based on demand.
Key Skills Required:
- Strong knowledge of Python, NumPy, Pandas, Matplotlib, Seaborn
- Proficiency in machine learning (supervised and unsupervised algorithms)
- Hands-on experience with scikit-learn, TensorFlow, or Keras
- Familiarity with SQL, statistics, data wrangling, and data visualization
- Exposure to deep learning, NLP, and big data tools is a plus
- Experience with platforms like Jupyter Notebook and Google Colab
- Strong communication and presentation skills
Why Join Uncodemy?
- Work with a passionate and skilled team of trainers.
- Opportunity to shape the careers of aspiring professionals.
- Flexible working hours and a positive work environment.
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Requirements
Role/Job Title: Data Scientist
Function/Department: Data & Analytics
Job Purpose: In this specialized role, you will leverage your expertise in machine learning and statistics to derive valuable insights from data. Your role will include developing predictive models, interpreting data and working closely with our ML engineers to ensure the effective deployment and functioning of these models.
Key/Primary Responsibilities:
- Lead cross-functional teams in the design, development, and deployment of Generative AI solutions, with a strong focus on Large Language Models (LLMs).
- Architect, train, and fine-tune state-of-the-art LLMs (e.g., GPT, BERT, T5) for various business applications, ensuring alignment with project goals.
- Deploy and scale LLM-based solutions, integrating them seamlessly into production environments and optimizing for performance and efficiency.
- Develop and maintain machine learning workflows and pipelines for training, evaluating, and deploying Generative AI models, using Python or R, and leveraging libraries like Hugging Face Transformers, TensorFlow, and PyTorch.
- Collaborate with product, data, and engineering teams to define and refine use cases for LLM applications such as conversational agents, content generation, and semantic search.
- Design and implement fine-tuning strategies to adapt pre-trained models to domain-specific tasks, ensuring high relevance and accuracy.
- Evaluate and optimize LLM performance, including handling challenges such as prompt engineering, inference time, and model bias.
- Manage and process large, unstructured datasets using SQL and NoSQL databases, ensuring smooth integration with AI models.
- Build and deploy AI-driven APIs and services, providing scalable access to LLM-based solutions.
- Use data visualization tools (e.g., Matplotlib, Seaborn, Tableau) to communicate AI model performance, insights, and results to non-technical stakeholders.
Secondary Responsibilities:
- Contribute to data analysis projects, with a strong emphasis on text analytics, natural language understanding, and Generative AI applications.
- Build, validate, and deploy predictive models specifically tailored to text data, including models for text generation, classification, and entity recognition.
- Handle large, unstructured text datasets, performing essential preprocessing and data cleaning steps, such as tokenization, lemmatization, and noise removal, for machine learning and NLP tasks.
- Work with cutting-edge text data processing techniques, ensuring high-quality input for training and fine-tuning Large Language Models (LLMs).
- Collaborate with cross-functional teams to develop and deploy scalable AI-powered solutions that process and analyze textual data at scale.
Key Success Metrics: Ensure timely deliverables. Spot training infrastructure fixes. Lead technical aspects of the projects. Error-free deliverables.
Education Qualification:
Graduation: Bachelor of Science (B.Sc) / Bachelor of Technology (B.Tech) / Bachelor of Computer Applications (BCA)
Post-Graduation: Master of Science (M.Sc) / Master of Technology (M.Tech) / Master of Computer Applications (MCA)
Experience: 5-10 years of relevant experience
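As an illustrative aside, the sketch below shows a minimal version of the text preprocessing and tokenization step mentioned in the secondary responsibilities, using a regex-based noise-removal helper and a public Hugging Face tokenizer. The example text, cleaning rules and model choice are assumptions for illustration only.

```python
# Illustrative preprocessing/tokenization sketch for unstructured text.
# Assumes `pip install transformers` and internet access to fetch the public
# bert-base-uncased tokenizer files; the cleaning rules are hypothetical.
import re
from transformers import AutoTokenizer

def clean(text: str) -> str:
    """Light noise removal: strip URLs, collapse whitespace, lowercase."""
    text = re.sub(r"https?://\S+", " ", text)
    return re.sub(r"\s+", " ", text).strip().lower()

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

raw = "Loan approved!!!   See https://example.com for terms."
encoded = tokenizer(clean(raw), truncation=True, max_length=32)

print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))
```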
Posted 2 weeks ago
2.0 years
0 Lacs
Thiruvananthapuram, Kerala, India
On-site
Sr. Product Engineer - AI/ML: We are seeking a highly skilled and experienced Sr. Product Engineer - AI/ML with 2+ years of experience to join our dynamic team. As a Sr. Product Engineer, you will be responsible for designing, developing, and implementing AI/ML solutions that will drive the success of our products. This is a challenging and rewarding role that requires a strong understanding of AI/ML technologies, as well as excellent problem-solving and communication skills.
Duties and Responsibilities:
- Collaborate with cross-functional teams to define product requirements and develop AI/ML solutions.
- Design and implement machine learning algorithms and models to solve complex business problems.
- Conduct data analysis and visualization to identify patterns and trends in large datasets.
- Build and maintain scalable data pipelines for data ingestion, processing, and storage.
- Research and evaluate new AI/ML technologies and tools to improve product performance and efficiency.
- Work closely with product managers to prioritize and plan the product roadmap based on market trends and customer needs.
- Collaborate with software engineers to integrate AI/ML models into production systems.
- Provide technical guidance and mentorship to junior team members.
- Stay updated with the latest advancements and best practices in AI/ML and apply them to improve product offerings.
- Ensure compliance with data privacy and security regulations in all AI/ML solutions.
Skills and Qualifications:
- Strong understanding of AI/ML concepts and algorithms
- Proficient in programming languages such as Python, Java, or C++
- Experience with machine learning frameworks such as TensorFlow, PyTorch, or Keras
- Familiarity with data analysis and visualization tools like Pandas, NumPy, and Matplotlib
- Knowledge of cloud computing platforms like AWS, Azure, or Google Cloud
- Experience with natural language processing (NLP) and computer vision
- Ability to design and implement AI/ML models from scratch
- Strong problem-solving and critical thinking skills
- Excellent communication and collaboration abilities
- Experience with agile development methodologies
- Ability to work independently and in a team environment
- Knowledge of the software development lifecycle (SDLC)
- Experience with version control systems like Git or SVN
- Understanding of software testing and debugging processes
- Ability to adapt to new technologies and learn quickly
Notice Period: Immediate to 15 days
Locations: Kochi, Trivandrum or Kozhikode.
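As an illustrative aside, the sketch below shows one common way an ML model gets integrated into a production system, as the responsibilities above mention: wrapping it behind a small FastAPI service. The toy iris model, endpoint and field names are assumptions for illustration, not the company's actual architecture.

```python
# Illustrative model-serving sketch: expose a trained model behind a small API.
# Assumes `pip install fastapi uvicorn scikit-learn pydantic`; names are made up.
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a toy model at startup; a real service would load a persisted artifact.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

app = FastAPI()

class Features(BaseModel):
    sepal_length: float
    sepal_width: float
    petal_length: float
    petal_width: float

@app.post("/predict")
def predict(f: Features) -> dict:
    row = [[f.sepal_length, f.sepal_width, f.petal_length, f.petal_width]]
    return {"predicted_class": int(model.predict(row)[0])}

# Run locally with:  uvicorn app:app --reload   (assuming this file is app.py)
```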
Posted 2 weeks ago
3.0 - 4.0 years
5 - 6 Lacs
Pune
Work from Office
Job Type: Full time
Job Description
Key Responsibilities:
- Analyze large datasets to extract actionable insights and drive data-driven decision-making.
- Develop predictive models and machine learning algorithms to solve business problems.
- Design and implement data preprocessing, feature engineering, and model training pipelines.
- Collaborate with cross-functional teams to understand business requirements and translate them into analytical solutions.
- Communicate findings and recommendations to stakeholders through data visualization and storytelling.
- Stay abreast of the latest developments in data science, machine learning, and AI technologies.
Required Skills:
- 3-4 years of hands-on experience in data analysis and machine learning.
- Proficiency in programming languages such as Python or R for data manipulation and analysis.
- Experience with machine learning libraries/frameworks such as scikit-learn, TensorFlow, or PyTorch.
- Strong understanding of statistical concepts and techniques.
- Experience with data visualization tools such as Matplotlib, Seaborn, or Tableau.
- Familiarity with SQL and relational databases for data querying and manipulation.
- Excellent problem-solving and analytical skills, with attention to detail.
- Good communication and collaboration skills, with the ability to work effectively in a team environment.
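As an illustrative aside, the sketch below shows a minimal version of the preprocessing, feature-engineering and model-training pipeline mentioned above, using scikit-learn's Pipeline and ColumnTransformer. The columns and tiny dataset are hypothetical.

```python
# Illustrative preprocessing + training pipeline sketch; data is made up.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame(
    {
        "age": [25, 40, None, 31],
        "city": ["Pune", "Mumbai", "Pune", "Delhi"],
        "churned": [0, 1, 0, 1],
    }
)

preprocess = ColumnTransformer(
    [
        ("num", Pipeline([("impute", SimpleImputer()), ("scale", StandardScaler())]), ["age"]),
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["city"]),
    ]
)

pipe = Pipeline([("prep", preprocess), ("model", GradientBoostingClassifier())])
pipe.fit(df[["age", "city"]], df["churned"])
print("Training accuracy:", pipe.score(df[["age", "city"]], df["churned"]))
```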
Posted 2 weeks ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Hiring Now: Expert Teacher for MATLAB, Simulink, Java & Python
📅 Job Type: Part-Time / Full-Time / Freelance (Flexible Options)
🧠 Subject Expertise Required: We are looking for a passionate and experienced educator who can teach the following:
- MATLAB: Advanced mathematical modeling, simulations, and engineering applications
- Simulink: Block diagram environment, dynamic systems modeling
- Java: OOP concepts, GUI programming, application development
- Python: Core Python, data structures, libraries (NumPy, Pandas, Matplotlib, etc.)
🎓 Eligibility & Qualifications:
- Bachelor's/Master's/Ph.D. in Computer Science, Engineering, or related fields
- Strong command over at least two of the mentioned subjects (all four is a plus)
- Prior teaching experience (online or offline) will be an added advantage
- Excellent communication skills and a passion for teaching
🧾 Roles & Responsibilities:
- Deliver clear, concept-based lessons to students
- Design assignments, quizzes, and project tasks
- Conduct doubt-clearing and revision sessions
- Guide students in project work and real-world problem-solving
- Track student progress and provide feedback
💸 Salary & Perks:
- Attractive pay package (based on subject and experience)
- Flexible working hours
- Opportunity to work with a reputed educational brand
- Chance to impact the lives of aspiring engineers and programmers
📨 How to Apply: Submit your CV via WhatsApp at 8981679014. 📞 For queries or more information, call 8981679014.
🔔 Limited vacancies. Apply soon to become a part of our dynamic teaching team!
Posted 2 weeks ago
4.0 - 8.0 years
18 - 25 Lacs
Chennai, Bengaluru
Work from Office
Data Scientist with NLP, AI/ML and testing
Experience: 4-8 years
Location: Chennai/Bangalore
Senior Engineer – Key Responsibilities:
- Develop, test, and maintain Python-based applications for AI/ML and NLP use cases
- Design, build, and train models using ML/NLP techniques and GenAI frameworks
- Implement unit, integration, and functional tests for model pipelines and services
- Fine-tune large language models (LLMs) or integrate pre-trained APIs (e.g., OpenAI, Hugging Face)
- Collaborate with data scientists and MLOps teams to deploy models in production
- Write reusable, testable, and efficient code with proper documentation
- Design data pipelines and preprocessing functions for unstructured textual data
- Participate in code reviews and follow CI/CD best practices
- Perform exploratory data analysis (EDA) and model validation
Must-Have Technical Skills (by category):
- Languages: Python (advanced), shell scripting (basic)
- Testing: PyTest, unittest, integration testing, test coverage tools (e.g., coverage.py), mocking (pytest-mock)
- GenAI Platforms: OpenAI API, LangChain, Hugging Face Transformers, LlamaIndex
- ML Frameworks: scikit-learn, TensorFlow, PyTorch, XGBoost
- NLP Libraries: spaCy, NLTK, Hugging Face, Gensim, fastText
- Data Manipulation: Pandas, NumPy, Dask
- Visualization: Matplotlib, Seaborn, Plotly
- DevOps / CI-CD: Git, Docker, Jenkins, GitHub Actions
- MLOps / Model Serving: MLflow, FastAPI, Streamlit, Flask, ONNX, TorchServe
- Cloud Platforms: AWS (S3, SageMaker, Lambda), Azure ML, GCP (Vertex AI)
- Databases: PostgreSQL, MongoDB, Redis, SQLite
Nice-to-Have Skills:
- Experience working with vector databases (e.g., FAISS, Pinecone, Weaviate)
- Prompt engineering for LLMs
- Familiarity with LLMOps tools (e.g., LangSmith, PromptLayer, BentoML)
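As an illustrative aside, the sketch below shows the testing style this stack implies: a pytest unit test for a function that wraps an LLM client, with the client mocked out via pytest-mock so no API call is made. The wrapper function, client interface and return values are hypothetical.

```python
# Illustrative testing sketch (run with `pytest`, assumes `pip install pytest pytest-mock`).
# File: test_summarizer.py. The summarize() wrapper and its client are made up.
import pytest

def summarize(text: str, client) -> str:
    """Toy wrapper: ask an injected LLM client for a one-line summary."""
    if not text.strip():
        raise ValueError("empty input")
    return client.complete(f"Summarize in one line: {text}")

def test_summarize_calls_client(mocker):
    fake_client = mocker.Mock()                       # stand-in for a real LLM client
    fake_client.complete.return_value = "A short summary."

    result = summarize("Long policy document ...", fake_client)

    assert result == "A short summary."
    fake_client.complete.assert_called_once()

def test_summarize_rejects_empty_input():
    with pytest.raises(ValueError):
        summarize("   ", client=None)
```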
Posted 2 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us: Jar is India's leading Daily Saving app that helps people build strong saving habits—one small step at a time. Our goal is to make saving simple, rewarding, and truly life-changing. Founded in 2021 by Misbah Ashraf and Nishchay AG, Jar is a Bengaluru-based startup with one simple belief: saving a little every day in 24K Digital Gold can truly transform your future. Today, 20 million+ Indians trust Jar as their saving partner. With flexible saving options—Daily, Weekly, Monthly, and Instant Saving—we have made it easy for everyone to save in their own way and withdraw anytime. We are one of the leaders in UPI autopay transactions, crossing more than 1 million transactions per day. In 2023, we expanded our vision with Nek, our jewelry brand crafted to bring together luxury and affordability; it has since surpassed ₹100 crore in revenue. We have a big dream of bringing "Har Ghar Sona". Small, consistent savings are just the start. We're here to walk alongside our users, helping Indians secure their financial future every step of the way. Backed by Tiger Global Management, Arkam Ventures, and WEH Ventures, among others, we have raised 50 million+ in funding. In January 2025, we hit a huge milestone of becoming profitable. Now, we're charging ahead, focused on sustainable growth and scaling impact. And this is just the beginning!
What will be your responsibilities?
Data Analysis & Insights:
- Perform deep dives on large, structured, and unstructured datasets to identify trends, irregularities, and actionable insights that drive product development and business decisions.
- Provide actionable insights; develop and maintain dashboards and reports for key stakeholders.
- Continuously monitor transactions to ensure accuracy, completeness, and integrity, and ensure data accuracy and consistency across multiple sources.
- Identify discrepancies, gaps, and failures in transactions and escalate for immediate resolution.
- Analyze historical sales data and market trends to develop accurate demand forecasts.
Strategic Decision-Making:
- Collaborate with product, engineering, and business teams to solve problems using data.
- Lead analysis to improve key retention and renewal metrics such as churn rate, renewal rate, transaction success rate, and GMV growth.
- Support A/B testing and experiments to optimize product and feature performance.
- Analyze transaction failures, payment declines, and retry success rates to optimize the auto-debit payment funnel.
Data Management & Modeling:
- Design and optimize data models that support real-time transaction monitoring, churn prediction, and cohort analysis for subscription-based customers.
- Partner with data engineers to ensure data accuracy and to improve data collection, completeness, and accessibility.
- Build and optimize reports that track business performance over time; automate recurring reports and processes to improve efficiency.
Leadership & Mentorship:
- Guide and mentor junior analysts in best practices, technical skills, and storytelling through data.
- Lead projects end-to-end, ensuring clarity, timeliness, and high-quality outcomes.
Metrics, Reporting & Ownership:
- Own reporting of subscription transactions, payment success rates, churn trends, and GMV impact, ensuring timely insights to business and product teams.
- Automate and optimize reporting to provide timely insights to leadership.
- Support other teams in setting up success metrics for particular products/features.
What's required from you?
Technical Skills:
- Strong proficiency in Python and MongoDB for data analysis, including Pandas, NumPy, scikit-learn, Matplotlib/Seaborn, Dask, statsmodels, re (regular expressions for text cleaning), TextBlob (sentiment analysis and text processing) and automation. Object-oriented programming is a plus.
- SQL: Expertise in writing complex SQL (Postgres) queries for data extraction, transformation, and reporting.
- Process, clean, and wrangle large datasets using Python to ensure data quality and readiness for analysis.
- Strong understanding of Excel/Google Sheets for quick ad-hoc analysis.
- Experience working with large, structured/unstructured datasets.
- Able to develop KPIs related to retention, acquisition, and A/B experiments.
- Visualization Tools: Data exploration and visualization with tools like Amplitude, CleverTap, MS Excel, Tableau, Power BI, etc.
Soft Skills:
- High sense of ownership, accountability, and a proactive problem-solving mindset.
- Strong problem-solving, critical thinking, and business acumen.
- Excellent communication skills with the ability to translate complex findings into clear insights for non-technical stakeholders.
Experience:
- 3+ years of experience in data analysis, preferably in fintech or startups.
- Proven experience leading high-impact projects independently.
- A desire to work in a fast-paced environment.
What makes us different? We're not just building a product—we're shaping the future of savings in India. We seek people who bring passion, energy, and fresh ideas to help us make that happen. Experience matters, but we are a potential-first organisation. We move fast, learn from our mistakes, and take bold risks to solve problems that haven't been attempted before. If you're excited about working in an environment where people inspire and truly support each other, you've found the right fit.
What do we stand for? The five values that we live by:
- Passion: At Jar, we strive to create an environment where people love what they do, are motivated and equipped to do their best work.
- Equality: We bring diverse skills, ideas, and experiences to the table, supporting and challenging each other across teams to create something bigger than ourselves.
- Growth: When our people grow, Jar grows. We create opportunities for learning, development, and meaningful impact.
- Accountability: The core of our work ethic is taking ownership of our work, showing initiative, and having the freedom to ask questions.
- Consistency: We believe in doing the right things consistently. Big change doesn't happen overnight—it's built one step at a time.
Join us and let's build something amazing together!
What employee benefits do we have? Glad you asked! Among other things, we have: medical insurance for employees and their families, ESOP allocation, Pluxee meal card, Swish club card for exclusive employee discounts, advance salary plans, relocation assistance, and L&D programmes.
Skills: Python, Tableau, SQL, NumPy, data wrangling, data exploration, scikit-learn, Power BI, mentoring, CleverTap, report writing, analytical skills, NLP, Pandas, MongoDB, Excel, Matplotlib, data analysis, business sense, Amplitude, MS Excel, Google Sheets, data visualization, querying, ETL, data analysis packages
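As an illustrative aside, the sketch below shows a minimal version of the A/B-test analysis this role mentions: comparing renewal rates between two payment-retry strategies with a chi-square test. The counts, group names and conclusion threshold are hypothetical.

```python
# Illustrative A/B-test readout sketch: compare renewal rates with a chi-square
# test. All counts are made up for demonstration.
import pandas as pd
from scipy.stats import chi2_contingency

counts = pd.DataFrame(
    {"renewed": [1180, 1260], "churned": [820, 740]},
    index=["control", "variant"],
)
counts["renewal_rate"] = counts["renewed"] / (counts["renewed"] + counts["churned"])
print(counts)

chi2, p_value, dof, _ = chi2_contingency(counts[["renewed", "churned"]])
print(f"chi2={chi2:.2f}, p={p_value:.4f}")   # small p suggests the rates genuinely differ
```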
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Requirements
Role/Job Title: Data Scientist
Function/Department: Data & Analytics
Job Purpose: In this specialized role, you will leverage your expertise in machine learning and statistics to derive valuable insights from data. Your role will include developing predictive models, interpreting data and working closely with engineers to ensure the effective deployment and functioning of these models.
Key/Primary Responsibilities:
- Deploy and scale solutions, integrating them seamlessly into production environments and optimizing for performance and efficiency.
- Develop and maintain machine learning workflows and pipelines for training, evaluating, and deploying models, using Python or R, and leveraging libraries like Hugging Face Transformers, TensorFlow, and PyTorch.
- Collaborate with product, data, and engineering teams to define and refine use cases for applications such as conversational agents, content generation, and semantic search.
- Design and implement fine-tuning strategies to adapt pre-trained models to domain-specific tasks, ensuring high relevance and accuracy.
- Evaluate and optimize performance, including handling challenges such as prompt engineering, inference time, and model bias.
- Manage and process large, unstructured datasets using SQL and NoSQL databases, ensuring smooth integration with models.
- Build and deploy APIs and services, providing scalable solutions.
- Use data visualization tools (e.g., Matplotlib, Seaborn, Tableau) to communicate model performance, insights, and results to non-technical stakeholders.
Key Success Metrics: Ensure timely deliverables. Spot training infrastructure fixes. Lead technical aspects of the projects. Error-free deliverables.
Education Qualification:
Graduation: Bachelor of Science (B.Sc) / Bachelor of Technology (B.Tech) / Bachelor of Computer Applications (BCA)
Post-Graduation: Master of Science (M.Sc) / Master of Technology (M.Tech) / Master of Computer Applications (MCA)
Experience: 5-10 years of relevant experience
Posted 2 weeks ago
0 years
0 - 0 Lacs
Calicut
On-site
Data Science & Analytics Trainer – Only Passionate Trainers Need to Apply
Maitexa Technologies is Hiring – Calicut (Offline/Hybrid)
Are you someone who finds joy in turning data into insights—and dreams of guiding others to do the same? We are looking for a Data Science and Analytics Trainer who is deeply passionate about teaching and ready to mentor students through real-world data challenges.
You Should Have:
- Strong foundation in Python, NumPy, Pandas, Matplotlib, scikit-learn
- Expertise in data cleaning, exploratory data analysis, machine learning, and statistics
- Experience with Power BI / Tableau, Excel, SQL
- Ability to design capstone projects, case studies, and industry-aligned training
- Bonus: Exposure to AI concepts, ChatGPT, or big data tools is a plus!
Location: Calicut (offline preferred / hybrid possible)
If you're not truly passionate about training, please don't apply. Apply now and help students unlock the power of data!
Job Type: Full-time
Pay: ₹20,000.00 - ₹58,065.17 per month
Schedule: Day shift
Supplemental Pay: Performance bonus
Work Location: In person
Posted 2 weeks ago
15.0+ years
2 - 6 Lacs
Hyderābād
On-site
Country/Region: IN
Requisition ID: 25995
Location: INDIA - HYDERABAD - BIRLASOFT OFFICE
Title: Sr Architect
Description – Area(s) of responsibility
About Us: Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company's consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group's 170-year heritage of building sustainable communities.
Location: Any BSL location
Job Description:
- Bachelor's/Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field, with 15+ years of experience
- 2+ years of experience in developing and implementing Generative AI models, with a strong understanding of techniques such as GPT, T5 and BERT
- Experience with LLMs: OpenAI GPT and others
- Experience with the latest GenAI application development frameworks
- Project management and governance experience preferred; candidates with both technical and management skills given stronger preference
- Life sciences experience preferred
- Experience handling customers at SLT and VP level
- Proficiency in Python and 8+ years of experience with machine learning libraries and frameworks such as TensorFlow, PyTorch, or Keras
- Experience in developing FastAPI services and documentation
- Experience working with SQL, NoSQL, Elastic, files, etc.
- Strong knowledge of data structures, algorithms, and software engineering principles
- Familiarity with cloud-based platforms and services, such as AWS (must), GCP, or Azure
- 8+ years of experience with advanced natural language processing (NLP) techniques and tools, such as SpaCy, NLTK, or Hugging Face
- Familiarity with data visualization tools and libraries, such as Matplotlib, Seaborn, or Plotly
- Significant experience architecting cutting-edge MLOps systems in enterprise environments
- Knowledge of software development methodologies, such as Agile or Scrum
- Excellent problem-solving skills, with the ability to think critically and creatively to develop innovative AI solutions
- Strong communication skills, with the ability to effectively convey complex technical concepts to a diverse audience
- Proactive mindset, with the ability to work independently and collaboratively in a fast-paced, dynamic environment; able to provide thought leadership
Responsibilities:
- Design, develop, and implement Generative AI models and algorithms, using techniques such as GPT, T5 and BERT
- Collaborate with cross-functional teams to define AI project requirements and objectives, ensuring alignment with overall business goals
- Conduct research to stay up to date with the latest advancements in Generative AI, machine learning, and deep learning techniques, and identify opportunities to integrate them into customer products and services
- Optimize Generative AI models for improved performance, scalability, and efficiency
- Develop and maintain AI pipelines, including data preprocessing, feature extraction, model training, and evaluation
- Contribute to the establishment of best practices and standards for Generative AI development within the organization
Posted 2 weeks ago
0 years
2 - 4 Lacs
Hyderābād
On-site
Location: Hyderabad, IN Employment type: Employee Place of work: Office Offshore/Onshore: Onshore TechnipFMC is committed to driving real change in the energy industry. Our ambition is to build a sustainable future through relentless innovation and global collaboration – and we want you to be part of it. You’ll be joining a culture that values curiosity, expertise, and ideas as well as diversity, inclusion, and authenticity. Bring your unique energy to our team of more than 20,000 people worldwide, and discover a rewarding, fulfilling, and varied career that you can take anywhere you want to go. Job Purpose Seeking a skilled Python Developer to join our team and help us develop applications and tooling to streamline in-house engineering design processes with a continuous concern for quality, targets, and customer satisfaction. Job Description 1. Write clean and maintainable Python code using PEP guidelines 2. Build and maintain software packages for scientific computing 3. Build and maintain command line interfaces (CLIs) 4. Build and maintain web applications and dashboards 5. Design and implement data analysis pipelines 6. Create and maintain database schemas and queries 7. Optimise code performance and scalability 8. Develop and maintain automated tests to validate software 9. Contribute and adhere to team software development practices, e.g., Agile product management, source code version control, continuous integration/deployment (CI/CD) 10. Build and maintain machine learning models (appreciated, but not a prerequisite) Technical Stack 1. Languages: Python, SQL 2. Core libraries: SciPy, Pandas, NumPy 3. Web frameworks: Streamlit, Dash, Flask 4. Visualisation: Matplotlib, Seaborn, Plotly 5. Automated testing: pytest 6. CLI development: Click, Argparse 7. Source code version control: Git 8. Agile product management: Azure DevOps, GitHub 9. CI/CD: Azure Pipelines, GitHub Actions, Docker 10. Database systems: PostgreSQL, Snowflake, SQLite, HDF5 11. Performance: Numba, Dask 12. Machine Learning: Scikit-learn, TensorFlow, PyTorch (Desired) You are meant for this job if: • Bachelor's degree in computer science or software engineering • Master's degree is a plus • Strong technical basis in engineering • Presentation skills • Good organizational and problem-solving skills • Service/Customer oriented • Ability to work in a team-oriented environment • Good command of English Skills Spring Boot Data Modelling CI/CD Internet of Things (IoT) Jira/Confluence React/Angular SAFe Scrum Kanban Collaboration SQL Bash/Shell/PowerShell AWS S3 AWS Lambda Cypress/Playwright Material Design Empirical Thinking Agility GitHub HTML/CSS JavaScript/TypeScript GraphQL Continuous Learning Cybersecurity Computer Programming Java/Kotlin Test Driven Development Being a global leader in the energy industry requires an inclusive and diverse environment. TechnipFMC promotes diversity, equity, and inclusion by ensuring equal opportunities to all ages, races, ethnicities, religions, sexual orientations, gender expressions, disabilities, or all other pluralities. We celebrate who you are and what you bring. Every voice matters and we encourage you to add to our culture. TechnipFMC respects the rights and dignity of those it works with and promotes adherence to internationally recognized human rights principles for those in its value chain. Date posted: Jun 2, 2025 Requisition number: 13580
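As a rough illustration of the Click and pytest items in the technical stack above, the following self-contained sketch defines a small CLI and an in-process test for it; the command name and options are hypothetical.

```python
# Minimal, illustrative sketch of a Click CLI with a pytest-style test.
import click
from click.testing import CliRunner

@click.command()
@click.option("--count", default=1, show_default=True, help="Number of greetings.")
@click.argument("name")
def greet(count: int, name: str) -> None:
    """Print NAME a COUNT number of times."""
    for _ in range(count):
        click.echo(f"Hello, {name}!")

def test_greet():
    # CliRunner invokes the command in-process, so the test needs no subprocess.
    result = CliRunner().invoke(greet, ["--count", "2", "Ada"])
    assert result.exit_code == 0
    assert result.output.count("Hello, Ada!") == 2

if __name__ == "__main__":
    greet()
```

Saved as a `test_*.py` module, the test function is collected by pytest automatically; the same file can also be executed directly as a CLI.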
Posted 2 weeks ago
5.0 years
2 - 8 Lacs
Gurgaon
On-site
Experience: 5 - 8 Years Job Location: Gurgaon, Hyderabad Purpose of the Job – A simple statement to identify clearly the objective of the job. The Senior Machine Learning Engineer is responsible for designing, implementing, and deploying scalable and efficient machine learning algorithms to solve complex business problems. The Machine Learning Engineer is also responsible for the lifecycle of models, once deployed in production environments, through monitoring performance and model evolution. The position is highly technical and requires an ability to collaborate with multiple technical and non-technical profiles (data scientists, data engineers, data analysts, product owners, business experts), and actively take part in a large data science community. Key Responsibilities and Expected Deliverables – This details what actually needs to be done; the duties and expected outcomes. Managing the lifecycle of machine learning models Develop and implement machine learning models to solve complex business problems. Ensure that models are accurate, efficient, reliable, and scalable. Deploy machine learning models to production environments, ensuring that models are integrated with software systems. Monitor machine learning models in production, ensuring that models are performing as expected and that any errors or performance issues are identified and resolved quickly. Maintain machine learning models over time. This includes updating models as new data becomes available, retraining models to improve performance, and retiring models that are no longer effective. Develop and implement policies and procedures for ensuring the ethical and responsible use of machine learning models. This includes addressing issues related to bias, fairness, transparency, and accountability. Continuous Improvements Stay up to date with the latest developments in the field: read research papers, attend conferences, and participate in trainings to expand their knowledge and skills. Identify and evaluate new technologies and tools that can improve the efficiency and effectiveness of machine learning projects. Propose and implement optimizations for current machine learning workflows and systems. Proactively identify areas of improvement within the pipelines. Make sure that created code is compliant with our set of engineering standards. Collaboration with other data experts (Data Engineers, Platform Engineers, and Data Analysts) Participate in pull request reviews coming from other team members. Ask for reviews and comments when submitting their own work. Actively participate in the day-to-day life of the project (Agile rituals), the data science team (DS meeting), and the rest of the Global Engineering team Education & Experience Engineering Master’s degree or PhD in Data Science, Statistics, Mathematics, or related fields 5+ years of experience in a Machine Learning Engineer role in large corporate organizations Experience working with ML models in a cloud ecosystem Statistics & Machine Learning Statistics: Strong understanding of statistical analysis and modelling techniques (e.g., regression analysis, hypothesis testing, time series analysis) Classical ML: Very strong knowledge of classical ML algorithms for regression & classification, supervised and unsupervised machine learning, both theoretical and practical (e.g.
using scikit-learn, xgboost) ML niche: Expertise in at least one of the following ML specialisations: Time series forecasting / Natural Language Processing / Computer Vision Deep Learning: Good knowledge of Deep Learning fundamentals (CNN, RNN, transformer architecture, attention mechanism, …) and one of the deep learning frameworks (PyTorch, TensorFlow, Keras) Generative AI: Good understanding of Generative AI specifics and previous experience working with Large Language Models is a plus (e.g. with openai, langchain) MLOps Model strategy: Expertise in designing, implementing, and testing machine learning strategies. Model integration: Very strong skills in integrating a machine learning algorithm into a data science application in production. Model performance: Deep understanding of model performance evaluation metrics and existing libraries (e.g., scikit-learn, evidently) Model deployment: Experience in deploying and managing machine learning models in production using a specific cloud platform, model serving frameworks, or containerization. Model monitoring: Experience with model performance monitoring tools is a plus (Grafana, Prometheus) Software Engineering Python: Very strong coding skills in Python, including modularity, OOP, data and config manipulation frameworks (e.g., pandas, pydantic), etc. Python ecosystem: Strong knowledge of tooling in the Python ecosystem, such as dependency management tooling (venv, poetry), documentation frameworks (e.g. sphinx, mkdocs, jupyter-book), testing frameworks (unittest, pytest) Software engineering practices: Experience in putting in place good software engineering practices such as design patterns, testing (unit, integration), clean code, code formatting, etc. Debugging: Ability to troubleshoot and debug issues within machine learning pipelines Data Science Experimentation and Analytics Data Visualization: Knowledge of data visualization tools such as plotly, seaborn, matplotlib, etc. to visualise, interpret, and communicate the results of machine learning models to stakeholders. Basic knowledge of Power BI is a plus Data Cleaning: Experience with data cleaning and preprocessing techniques such as feature scaling, dimensionality reduction, and outlier detection (e.g. with pandas, scikit-learn). Data Science Experiments: Understanding of experimental design and A/B testing methodologies Data Processing: Databricks/Spark: Basic knowledge of PySpark for big data processing Databases: Basic knowledge of SQL to query data in internal systems Data Formats: Familiarity with different data storage formats such as Parquet and Delta DevOps Azure DevOps: Experience using a DevOps platform such as Azure DevOps for Boards, Repositories, and Pipelines Git: Experience working with code versioning (git), branch strategies, and collaborative work with pull requests. Proficient with the most basic git commands. CI / CD: Experience in implementing/maintaining pipelines for continuous integration (including execution of testing strategy) and continuous deployment is preferable. Cloud Platform: Azure Cloud: Previous experience with services like Azure Machine Learning Services and/or Azure Databricks on Azure is preferable. Soft skills Strong analytical and problem-solving skills, with attention to detail Excellent verbal and written communication and pedagogical skills with technical and non-technical teams Excellent teamwork and collaboration skills Adaptability and responsiveness to new technologies, tools, and techniques Fluent in English
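To ground the classical-ML and model-evaluation expectations above, here is a minimal scikit-learn sketch of the fit/predict/evaluate loop; the dataset and model choice are illustrative only.

```python
# Minimal sketch of the classical-ML loop: split, fit, predict, evaluate.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# classification_report prints precision, recall, and F1 per class.
print(classification_report(y_test, model.predict(X_test)))
```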
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid) Company: Unified Mentor Location: Remote Duration: 3 months Application Deadline: 4th June 2025 Opportunity: Full-time role based on performance + Internship Certificate About Unified Mentor Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers. Responsibilities Collect, preprocess, and analyze large datasets Develop predictive models and machine learning algorithms Perform exploratory data analysis (EDA) to extract insights Create data visualizations and dashboards for effective communication Collaborate with cross-functional teams to deliver data-driven solutions Requirements Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field Proficiency in Python or R for data analysis and modeling Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred) Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib Strong analytical and problem-solving skills Excellent communication and teamwork abilities Stipend & Benefits Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid) Hands-on experience in data science projects Certificate of Internship & Letter of Recommendation Opportunity to build a strong portfolio of data science models and applications Potential for full-time employment based on performance How to Apply Submit your resume and a cover letter with the subject line "Data Science Intern Application." Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.
Posted 2 weeks ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job description Job Title: AI Engineer Salary: 4 - 5.4 LPA Experience: Minimum 2 years Location: Hinjewadi, Pune Work Mode: Work from Office Availability: Immediate Joiner About Us: Rasta.AI, a product of AI Unika Technologies (P) Ltd, is a pioneering technology company based in Pune. We specialize in road infrastructure monitoring and maintenance using cutting-edge AI, computer vision, and 360-degree imaging. Our platform delivers real-time insights into road conditions to improve safety, efficiency, and sustainability. We collaborate with government agencies, private enterprises, and citizens to enhance road management through innovative tools and solutions. The Role This is a full-time, on-site role. As an AI Engineer, you will be responsible for developing innovative AI models and software solutions to address real-world challenges. You will collaborate with cross-functional teams to identify business opportunities and provide customized solutions. You will also work alongside talented engineers, designers, and data scientists to implement and maintain these models and solutions. Technical Skills Programming Languages: Python (and other AI-supported languages) Databases: SQL, Cassandra, MongoDB Python Libraries: NumPy, Pandas, Scikit-learn Deep Neural Networks: CNN, RNN, and LLM Data Analysis Libraries: TensorFlow, Pandas, NumPy, Scikit-learn, Matplotlib, TensorBoard Frameworks: Django, Flask, Pyramid, and CherryPy Operating Systems: Ubuntu, Windows Tools: Jupyter Notebook, PyCharm IDE, Excel, Roboflow Big Data (Bonus): Hadoop (Hive, Sqoop, Flume), Kafka, Spark Code Repository Tools: Git, GitHub DevOps-AWS: Docker, Kubernetes, Instance hosting and management Analytical Skills Exploratory Data Analysis Predictive Modeling Text Mining Natural Language Processing Machine Learning Image Processing Object Detection Instance Segmentation Deep Learning DevOps AWS Knowledge Expertise Proficiency in the TensorFlow library with RNN and CNN Familiarity with pre-trained models like VGG-16, ResNet-50, and MobileNet Knowledge of Spark Core, Spark SQL, Spark Streaming, Cassandra, and Kafka Designing and Architecting Hadoop Applications Experience with chatbot platforms (a bonus) Responsibilities The entire lifecycle of model development: Data Collection and Preprocessing Model Development Model Training Model Testing Model Validation Deployment and Maintenance Collaboration and Communication Qualifications Bachelor's or Master's degree in a relevant field (AI, Data Science, Computer Science, etc.) Minimum 2 years of experience developing and deploying AI-based software products Strong programming skills in Python (and potentially C++ or Java) Experience with machine learning libraries (TensorFlow, PyTorch, Keras, scikit-learn) Experience with computer vision, natural language processing, or recommendation systems Experience with cloud computing platforms (Google Cloud, AWS) Problem-solving skills Excellent communication and presentation skills Experience with data infrastructure and tools (SQL, NoSQL, and big data platforms) Teamwork skills Join Us! If you are passionate about AI and want to contribute to groundbreaking projects in a dynamic startup environment, we encourage you to apply! Be part of our mission to drive technological advancement in India. Drop Your CV - hr@aiunika.com
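As an illustration of working with pre-trained models such as ResNet-50 mentioned above, the following hedged Keras sketch classifies a single image; the image path is a placeholder, and the exact utility functions may differ slightly between TensorFlow versions.

```python
# Illustrative sketch: classifying one image with a pre-trained ResNet-50 in Keras.
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.resnet50 import ResNet50, decode_predictions, preprocess_input

model = ResNet50(weights="imagenet")  # downloads ImageNet weights on first use

# "road.jpg" is a placeholder path; ResNet-50 expects 224x224 RGB input.
img = tf.keras.utils.load_img("road.jpg", target_size=(224, 224))
x = preprocess_input(np.expand_dims(tf.keras.utils.img_to_array(img), axis=0))

preds = model.predict(x)
print(decode_predictions(preds, top=3)[0])  # top-3 (class_id, label, score) tuples
```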
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Head - Python Engineering Job Summary: We are looking for a skilled Python, AI/ML Developer with 8 to 12 years of experience to design, develop, and maintain high-quality back-end systems and applications. The ideal candidate will have expertise in Python and related frameworks, with a focus on building scalable, secure, and efficient software solutions. This role requires a strong problem-solving mindset, collaboration with cross-functional teams, and a commitment to delivering innovative solutions that meet business objectives. Responsibilities Application and Back-End Development: Design, implement, and maintain back-end systems and APIs using Python frameworks such as Django, Flask, or FastAPI, focusing on scalability, security, and efficiency. Build and integrate scalable RESTful APIs, ensuring seamless interaction between front-end systems and back-end services. Write modular, reusable, and testable code following Python’s PEP 8 coding standards and industry best practices. Develop and optimize robust database schemas for relational and non-relational databases (e.g., PostgreSQL, MySQL, MongoDB), ensuring efficient data storage and retrieval. Leverage cloud platforms like AWS, Azure, or Google Cloud for deploying scalable back-end solutions. Implement caching mechanisms using tools like Redis or Memcached to optimize performance and reduce latency. AI/ML Development: Build, train, and deploy machine learning (ML) models for real-world applications, such as predictive analytics, anomaly detection, natural language processing (NLP), recommendation systems, and computer vision. Work with popular machine learning and AI libraries/frameworks, including TensorFlow, PyTorch, Keras, and scikit-learn, to design custom models tailored to business needs. Process, clean, and analyze large datasets using Python tools such as Pandas, NumPy, and PySpark to enable efficient data preparation and feature engineering. Develop and maintain pipelines for data preprocessing, model training, validation, and deployment using tools like MLflow, Apache Airflow, or Kubeflow. Deploy AI/ML models into production environments and expose them as RESTful or GraphQL APIs for integration with other services. Optimize machine learning models to reduce computational costs and ensure smooth operation in production systems. Collaborate with data scientists and analysts to validate models, assess their performance, and ensure their alignment with business objectives. Implement model monitoring and lifecycle management to maintain accuracy over time, addressing data drift and retraining models as necessary. Experiment with cutting-edge AI techniques such as deep learning, reinforcement learning, and generative models to identify innovative solutions for complex challenges. Ensure ethical AI practices, including transparency, bias mitigation, and fairness in deployed models. Performance Optimization and Debugging: Identify and resolve performance bottlenecks in applications and APIs to enhance efficiency. Use profiling tools to debug and optimize code for memory and speed improvements. Implement caching mechanisms to reduce latency and improve application responsiveness. Testing, Deployment, and Maintenance: Write and maintain unit tests, integration tests, and end-to-end tests using Pytest, Unittest, or Nose. Collaborate on setting up CI/CD pipelines to automate testing, building, and deployment processes. 
Deploy and manage applications in production environments with a focus on security, monitoring, and reliability. Monitor and troubleshoot live systems, ensuring uptime and responsiveness. Collaboration and Teamwork: Work closely with front-end developers, designers, and product managers to implement new features and resolve issues. Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives, to ensure smooth project delivery. Provide mentorship and technical guidance to junior developers, promoting best practices and continuous improvement. Required Skills and Qualifications Technical Expertise: Strong proficiency in Python and its core libraries, with hands-on experience in frameworks such as Django, Flask, or FastAPI. Solid understanding of RESTful API development, integration, and optimization. Experience working with relational and non-relational databases (e.g., PostgreSQL, MySQL, MongoDB). Familiarity with containerization tools like Docker and orchestration platforms like Kubernetes. Expertise in using Git for version control and collaborating in distributed teams. Knowledge of CI/CD pipelines and tools like Jenkins, GitHub Actions, or CircleCI. Strong understanding of software development principles, including OOP, design patterns, and MVC architecture. Preferred Skills: Experience with asynchronous programming using libraries like asyncio, Celery, or RabbitMQ. Knowledge of data visualization tools (e.g., Matplotlib, Seaborn, Plotly) for generating insights. Exposure to machine learning frameworks (e.g., TensorFlow, PyTorch, scikit-learn) is a plus. Familiarity with big data frameworks like Apache Spark or Hadoop. Experience with serverless architecture using AWS Lambda, Azure Functions, or Google Cloud Run. Soft Skills: Strong problem-solving abilities with a keen eye for detail and quality. Excellent communication skills to effectively collaborate with cross-functional teams. Adaptability to changing project requirements and emerging technologies. Self-motivated with a passion for continuous learning and innovation. Education: Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
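To illustrate the caching pattern this posting mentions (Redis or Memcached to reduce latency), here is a minimal, assumption-laden sketch of a Flask endpoint backed by redis-py; the route, key format, and TTL are hypothetical.

```python
# Hedged sketch of response caching: a Flask endpoint backed by Redis.
import json
import redis
from flask import Flask, jsonify

app = Flask(__name__)
cache = redis.Redis(host="localhost", port=6379, db=0)  # connection details are illustrative

def expensive_report(user_id: str) -> dict:
    # Placeholder for a slow database query or model call.
    return {"user_id": user_id, "score": 0.87}

@app.route("/report/<user_id>")
def report(user_id: str):
    key = f"report:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return jsonify(json.loads(cached))      # cache hit: skip the expensive call
    result = expensive_report(user_id)
    cache.setex(key, 300, json.dumps(result))   # cache for 5 minutes to reduce latency
    return jsonify(result)
```

The same idea applies to FastAPI or Django endpoints; the cache key simply has to identify the request uniquely.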
Posted 2 weeks ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Experience: Junior and senior data scientist (5+ and 7+ years exp) Location: PAN INDIA Responsibilities Analyze large, complex datasets to identify trends, patterns, and opportunities. Develop and deploy machine learning models to solve business challenges. Communicate findings through data visualizations and reports. Collaborate with data engineers, analysts, and product teams to turn insights into action. Design and implement A/B tests to evaluate feature performance or business hypotheses. Translate business problems into data science problems Work with stakeholders to understand data needs and deliver custom solutions. Must-have Skills Proficiency in Python or R for data analysis and modeling. Strong knowledge of SQL and working with relational databases. Experience with machine learning algorithms and libraries (e.g., scikit-learn, TensorFlow, XGBoost, etc.). Experience in forecasting will be an added advantage Solid understanding of statistics, probability, and hypothesis testing. Experience with data visualization tools (e.g., Matplotlib, Seaborn, Tableau, Power BI). Experience with big data tools (e.g., Spark, Hadoop). Knowledge of deep learning frameworks (e.g., PyTorch, TensorFlow). Familiarity with cloud platforms like AWS, GCP, or Azure. Experience in MLOps and CI/CD workflows Experience in developing end-to-end pipelines on any cloud platform (AWS, Azure, etc.)
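As a small worked example of the A/B-testing and hypothesis-testing skills listed above, the sketch below runs Welch's two-sample t-test with SciPy on synthetic data; the metric, sample sizes, and significance threshold are illustrative.

```python
# Minimal sketch of an A/B-test evaluation using a two-sample t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=10.0, scale=2.0, size=500)    # e.g. session length, variant A
treatment = rng.normal(loc=10.4, scale=2.0, size=500)  # e.g. session length, variant B

t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)  # Welch's t-test
print(f"t={t_stat:.2f}, p={p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
```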
Posted 2 weeks ago
4.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
On-site
Title: Sr. Data Scientist/ML Engineer (4+ years & above) Required Technical Skillset Language: Python, PySpark Framework: Scikit-learn, TensorFlow, Keras, PyTorch Libraries: NumPy, Pandas, Matplotlib, SciPy, Scikit-learn - DataFrame, NumPy, boto3 Database: Relational database (Postgres), NoSQL database (MongoDB) Cloud: AWS cloud platform Other Tools: Jenkins, Bitbucket, JIRA, Confluence A machine learning engineer is responsible for designing, implementing, and maintaining machine learning systems and algorithms that allow computers to learn from and make predictions or decisions based on data. The role typically involves working with data scientists and software engineers to build and deploy machine learning models in a variety of applications such as natural language processing, computer vision, and recommendation systems. The key responsibilities of a machine learning engineer include: Collecting and preprocessing large volumes of data, cleaning it up, and transforming it into a format that can be used by machine learning models. Model building, which includes designing and building machine learning models and algorithms using techniques such as supervised and unsupervised learning, deep learning, and reinforcement learning. Evaluating the performance of machine learning models using metrics such as accuracy, precision, recall, and F1 score. Deploying machine learning models in production environments and integrating them into existing systems using CI/CD pipelines and AWS SageMaker. Monitoring the performance of machine learning models and making adjustments as needed to improve their accuracy and efficiency. Working closely with software engineers, product managers, and other stakeholders to ensure that machine learning models meet business requirements and deliver value to the organization. Requirements and Skills Mathematics and Statistics: A strong foundation in mathematics and statistics is essential. They need to be familiar with linear algebra, calculus, probability, and statistics to understand the underlying principles of machine learning algorithms. Programming Skills Should be proficient in programming languages such as Python. The candidate should be able to write efficient, scalable, and maintainable code to develop machine learning models and algorithms. Machine Learning Techniques Should have a deep understanding of various machine learning techniques, such as supervised learning, unsupervised learning, and reinforcement learning, and should also be familiar with different types of models such as decision trees, random forests, neural networks, and deep learning. Data Analysis and Visualization Should be able to analyze and manipulate large data sets. The candidate should be familiar with data cleaning, transformation, and visualization techniques to identify patterns and insights in the data. Deep Learning Frameworks Should be familiar with deep learning frameworks such as TensorFlow, PyTorch, and Keras and should be able to build and train deep neural networks for various applications. Big Data Technologies A machine learning engineer should have experience working with big data technologies such as Hadoop, Spark, and NoSQL databases. They should be familiar with distributed computing and parallel processing to handle large data sets. Software Engineering A machine learning engineer should have a good understanding of software engineering principles such as version control, testing, and debugging.
They should be able to work with software development tools such as Git, Jenkins, and Docker. Communication and Collaboration A machine learning engineer should have good communication and collaboration skills to work effectively with cross-functional teams such as data scientists, software developers, and business stakeholders. (ref:hirist.tech)
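To ground the data collection and preprocessing responsibility described above, here is a brief pandas sketch of cleaning and encoding a small, hypothetical dataset; the column names and imputation choices are illustrative.

```python
# Illustrative sketch of preprocessing: cleaning, imputing, and encoding with pandas.
import pandas as pd

# Hypothetical raw records with missing values and mixed types.
raw = pd.DataFrame({
    "age": [25, None, 47, 31],
    "city": ["Pune", "Hyderabad", None, "Pune"],
    "income": ["52000", "61000", "58000", "49000"],
})

clean = (
    raw.assign(income=lambda d: pd.to_numeric(d["income"], errors="coerce"))
       .assign(age=lambda d: d["age"].fillna(d["age"].median()))
       .dropna(subset=["city"])                      # drop rows missing a category
)
features = pd.get_dummies(clean, columns=["city"])   # one-hot encode for ML models
print(features)
```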
Posted 2 weeks ago
Matplotlib is a popular data visualization library in Python that is widely used in various industries. Job opportunities for matplotlib professionals in India are on the rise due to the increasing demand for data visualization skills. In this article, we will explore the job market for matplotlib in India and provide insights for job seekers looking to build a career in this field.
Here are 5 major cities in India actively hiring for matplotlib roles: 1. Bangalore 2. Delhi 3. Mumbai 4. Hyderabad 5. Pune
The average salary range for matplotlib professionals in India varies based on experience levels. Entry-level positions can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can earn upwards of INR 10 lakhs per annum.
Career progression in the field of matplotlib typically follows a path from Junior Developer to Senior Developer to Tech Lead. As professionals gain more experience and expertise in data visualization using matplotlib, they can take on more challenging roles and responsibilities.
In addition to proficiency in matplotlib, professionals in this field are often expected to have knowledge and experience in the following areas: - Python programming - Data analysis - Data manipulation - Statistics - Machine learning
Here are 25 interview questions for matplotlib roles:
- What is matplotlib and how is it used in data visualization? (basic)
- What are the different types of plots that can be created using matplotlib? (basic)
- How would you customize the appearance of a plot in matplotlib? (medium)
- Explain the difference between plt.show() and plt.savefig() in matplotlib. (medium)
- How do you handle missing data in a dataset before visualizing it using matplotlib? (medium)
- What is the purpose of the matplotlib.pyplot.subplots() function? (advanced)
- How would you create a subplot with multiple plots in matplotlib? (medium)
- Explain the use of the matplotlib.pyplot.bar() and matplotlib.pyplot.hist() functions. (medium)
- How can you annotate a plot in matplotlib? (basic)
- Describe the process of creating a 3D plot in matplotlib. (advanced)
- How do you set the figure size in matplotlib? (basic)
- What is the purpose of the matplotlib.pyplot.scatter() function? (medium)
- How would you create a line plot with multiple lines using matplotlib? (medium)
- Explain the difference between plt.plot() and plt.scatter() in matplotlib. (medium)
- How do you add a legend to a plot in matplotlib? (basic)
- Describe the use of color maps in matplotlib. (medium)
- How can you save a plot as an image file in matplotlib? (basic)
- What is the purpose of the matplotlib.pyplot.subplots_adjust() function? (medium)
- How do you create a box plot in matplotlib? (medium)
- Explain the use of the matplotlib.pyplot.pie() function for creating pie charts. (medium)
- How would you create a heatmap in matplotlib? (advanced)
- What are the different types of coordinate systems in matplotlib? (advanced)
- How do you handle axis limits and ticks in matplotlib plots? (medium)
- Explain the role of the matplotlib.pyplot.imshow() function. (medium)
- How would you create a bar plot with error bars in matplotlib? (advanced)
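Several of the questions above revolve around specific pyplot calls (subplots, figure size, legends, bar plots with error bars, subplots_adjust, savefig). The short sketch below exercises them together; the data and output file name are illustrative.

```python
# A short, illustrative sketch touching several of the pyplot calls named in the questions above.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 2 * np.pi, 50)
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))  # figure size and subplots

ax1.plot(x, np.sin(x), label="sin(x)")
ax1.scatter(x[::5], np.cos(x[::5]), color="tab:orange", label="cos samples")
ax1.set_xlim(0, 2 * np.pi)          # axis limits
ax1.legend()                        # legend

heights = [3, 5, 2]
errors = [0.4, 0.6, 0.3]
ax2.bar(["A", "B", "C"], heights, yerr=errors, capsize=4)  # bar plot with error bars
ax2.set_title("Bar plot with error bars")

fig.subplots_adjust(wspace=0.3)     # spacing between subplots
fig.savefig("demo.png", dpi=150)    # save to file instead of (or before) plt.show()
plt.show()
```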
As the demand for data visualization skills continues to grow, mastering matplotlib can open up exciting job opportunities in India. By preparing thoroughly and showcasing your expertise in matplotlib, you can confidently apply for roles and advance your career in this dynamic field. Good luck with your job search!