
162 Plotly Jobs - Page 2

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

Data Scientist – Roles and Responsibilities

We are seeking a skilled Data Scientist to join our team and leverage data to create actionable insights and innovative solutions. The ideal candidate will have strong analytical skills, expertise in statistical modeling, and proficiency in programming and machine learning techniques. You will work closely with cross-functional teams to identify business opportunities, optimize processes, and develop data-driven strategies.

Key Responsibilities
- Data Collection & Preparation: Gather, clean, and preprocess large datasets from various sources to ensure data quality and usability.
- Exploratory Data Analysis: Perform in-depth analysis to identify trends, patterns, and correlations that inform business decisions.
- Model Development: Design, build, and deploy machine learning models and statistical algorithms to solve complex problems, such as predictive analytics, classification, or recommendation systems.
- Data Visualization: Create compelling visualizations and dashboards to communicate insights to stakeholders using tools like Tableau, Power BI, or Python libraries (e.g., Matplotlib, Seaborn).
- Collaboration: Work with team leads, engineers, and business leaders to understand requirements, define key metrics, and translate insights into actionable strategies.
- Experimentation: Design and analyze A/B tests or other experiments to evaluate the impact of business initiatives.
- Automation: Develop pipelines and scripts to automate data processing and model deployment.
- Keep up with advancements in data science, machine learning, and industry trends to implement cutting-edge techniques.

Preferred Qualifications
- Experience with deep learning, natural language processing (NLP), or computer vision.
- Knowledge of software engineering practices, such as version control (Git) and CI/CD pipelines.
- Contributions to open-source projects or publications in data science.

Technical Skills
- Proficiency in programming languages like Python.
- Experience with SQL for querying and managing databases.
- Knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn).
- Familiarity with big data tools (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is a plus.
- Experience with data visualization tools (e.g., Tableau, Power BI, or Plotly).
- Strong understanding of statistics, probability, and experimental design.
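The visualization duties above center on tools like Plotly. As a rough illustration of the kind of exploratory chart such a role produces, here is a minimal Plotly Express sketch; the CSV path and column names are hypothetical placeholders, not details from the posting.

```python
# Minimal exploratory-visualization sketch using Plotly Express.
# The input file and columns ("region", "monthly_spend", "tenure_months")
# are illustrative assumptions, not details from the job posting.
import pandas as pd
import plotly.express as px

df = pd.read_csv("customers.csv")          # hypothetical input file

# Scatter plot of spend vs. tenure, colored by region.
fig = px.scatter(
    df,
    x="tenure_months",
    y="monthly_spend",
    color="region",
    title="Monthly spend vs. customer tenure",
)
fig.write_html("spend_vs_tenure.html")     # shareable interactive chart
```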

Posted 5 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Data Scientist – Roles and Responsibilities

We are seeking a skilled Data Scientist to join our team and leverage data to create actionable insights and innovative solutions. The ideal candidate will have strong analytical skills, expertise in statistical modeling, and proficiency in programming and machine learning techniques. You will work closely with cross-functional teams to identify business opportunities, optimize processes, and develop data-driven strategies.

Key Responsibilities
- Data Collection & Preparation: Gather, clean, and preprocess large datasets from various sources to ensure data quality and usability.
- Exploratory Data Analysis: Perform in-depth analysis to identify trends, patterns, and correlations that inform business decisions.
- Model Development: Design, build, and deploy machine learning models and statistical algorithms to solve complex problems, such as predictive analytics, classification, or recommendation systems.
- Data Visualization: Create compelling visualizations and dashboards to communicate insights to stakeholders using tools like Tableau, Power BI, or Python libraries (e.g., Matplotlib, Seaborn).
- Collaboration: Work with team leads, engineers, and business leaders to understand requirements, define key metrics, and translate insights into actionable strategies.
- Experimentation: Design and analyze A/B tests or other experiments to evaluate the impact of business initiatives.
- Automation: Develop pipelines and scripts to automate data processing and model deployment.
- Keep up with advancements in data science, machine learning, and industry trends to implement cutting-edge techniques.

Preferred Qualifications
- Experience with deep learning, natural language processing (NLP), or computer vision.
- Knowledge of software engineering practices, such as version control (Git) and CI/CD pipelines.
- Contributions to open-source projects or publications in data science.

Technical Skills
- Proficiency in programming languages like Python.
- Experience with SQL for querying and managing databases.
- Knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn).
- Familiarity with big data tools (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is a plus.
- Experience with data visualization tools (e.g., Tableau, Power BI, or Plotly).
- Strong understanding of statistics, probability, and experimental design.

Posted 5 days ago

Apply

1.5 - 6.5 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

POSITION SUMMARY
Zoetis, Inc. is the world's largest producer of medicine and vaccinations for pets and livestock. The Zoetis Tech & Digital (ZTD) Global ERP organization is a key building block of ZTD, comprising enterprise applications and systems platforms.

Join us at Zoetis India Capability Center (ZICC) in Hyderabad, where innovation meets excellence. As part of the world's leading animal healthcare company, ZICC is at the forefront of driving transformative advancements and applying technology to solve the most complex problems. Our mission is to ensure sustainable growth and maintain a competitive edge for Zoetis globally by leveraging the exceptional talent in India. At ZICC, you'll be part of a dynamic team that partners with colleagues worldwide, embodying the true spirit of One Zoetis. Together, we ensure seamless integration and collaboration, fostering an environment where your contributions can make a real impact. Be a part of our journey to pioneer innovation and drive the future of animal healthcare.

Responsibilities:
Data Analysis and Interpretation
- Perform exploratory and advanced data analysis using Python, SQL, and relevant statistical techniques.
- Identify trends, patterns, and actionable insights to support business decisions.
- Cleanse, transform, and validate large datasets from diverse sources, including Azure-based platforms.
Data Visualization
- Design and build clear, interactive dashboards and visual reports using Excel, Power BI, Tableau, or similar tools.
- Translate complex datasets into easy-to-understand visual narratives for stakeholders.
- Ensure visualizations effectively highlight key metrics and business drivers.
Problem-Solving and Attention to Detail
- Apply strong analytical thinking to identify anomalies and resolve data inconsistencies.
- Maintain accuracy and completeness in data reporting, adhering to defined SLA timelines.
- Provide ongoing support and troubleshooting for business users and stakeholders.
Deployment and Maintenance
- Deploy reports and dashboards in secure and scalable environments, including Azure services (e.g., Azure Synapse, Azure Data Factory).
- Monitor performance and data refresh processes, ensuring reliability and efficiency.
- Implement feedback-based enhancements and maintain documentation for data products.
Collaboration
- Collaborate cross-functionally with product teams, data engineers, and business users to align on data needs and outcomes.
- Participate in data reviews and contribute to shared standards and best practices.
- Communicate findings clearly and effectively, both verbally and in writing.
Continuous Learning and Innovation
- Stay current with advancements in data analytics, cloud technologies, and BI tools.
- Pursue ongoing learning and certifications to deepen technical expertise.
- Explore and pilot new tools, methodologies, or frameworks to improve data processes.

POSITION RESPONSIBILITIES (Percent of Time)
- Design, develop, deploy, and support data solutions: 60%
- Code reviews: 20%
- Cross-team collaboration and learning new technologies to stay up to date: 10%
- Global manufacturing supply process understanding (production planning, quality, inventory, and supply chain), MES (execution system) understanding, and the SAP ERP landscape: 10%

ORGANIZATIONAL RELATIONSHIPS
- Interact with business stakeholders to gather integration requirements, understand business processes, and ensure that integration solutions align with organizational goals and objectives.
- Work with implementation partners who may be responsible for deploying, configuring, or maintaining integrated solutions within the Zoetis IT landscape.
- Coordinate with developers and other members of the team to implement integration solutions, share knowledge, and address technical challenges.

EDUCATION AND EXPERIENCE
Education: Bachelor's/Master's degree in computer science/applications.
Experience:
- 1.5-6.5 years of overall experience in data analysis/science and business intelligence.
- Solid knowledge of SQL and Python for data analysis, transformation, and automation.
- Strong analytical mindset with excellent communication skills and a proactive, problem-solving attitude.
- Familiarity with CI/CD processes for automating report deployments and data workflows.
- Experience using Git for version control and collaborative development.
- Understanding of API integration to extract, manipulate, or serve data from cloud platforms or databases like Azure Data Lake and PostgreSQL.
- Knowledge of data visualization best practices and libraries (e.g., matplotlib, seaborn, Plotly) is a plus.
- Proficiency in Power BI is required; experience with Tableau is a strong advantage.

TECHNICAL SKILLS REQUIREMENTS
Python, R, Ruby, SQL, CI/CD, data visualization, Power BI

PHYSICAL POSITION REQUIREMENTS
Regular working hours are from 11 AM to 8 PM IST. Occasionally, more overlap with the EST time zone is required during production go-lives.

This description indicates the general nature and level of work expected. It is not designed to cover or contain a comprehensive listing of activities or responsibilities required of the incumbent. The incumbent may be asked to perform other duties as required. Additional position-specific requirements/responsibilities are contained in approved training curricula.

About Zoetis
At Zoetis, our purpose is to nurture the world and humankind by advancing care for animals. As a Fortune 500 company and the world leader in animal health, we discover, develop, manufacture and commercialize vaccines, medicines, diagnostics and other technologies for companion animals and livestock. We know our people drive our success. Our award-winning culture, built around our Core Beliefs, focuses on our colleagues' careers, connection and support. We offer competitive healthcare and retirement savings benefits, along with an array of benefits, policies and programs to support employee well-being in every sense, from health and financial wellness to family and lifestyle resources.

Global Job Applicant Privacy Notice
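Since the role leans heavily on Python and SQL for cleansing and exploratory analysis, the following is a small, hedged pandas sketch of that kind of workflow. The file name and column names are assumptions chosen purely for illustration.

```python
# Minimal data-cleansing and EDA sketch with pandas.
# The file name and columns ("site", "batch_size", "recorded_at") are
# hypothetical, chosen only to illustrate the workflow.
import pandas as pd

raw = pd.read_csv("production_extract.csv", parse_dates=["recorded_at"])

# Basic cleansing: drop exact duplicates, standardize text, handle missing values.
clean = (
    raw.drop_duplicates()
       .assign(site=lambda d: d["site"].str.strip().str.upper())
       .dropna(subset=["batch_size"])
)

# Simple exploratory summary: record volume and average batch size per site per month.
summary = (
    clean.groupby(["site", clean["recorded_at"].dt.to_period("M")])["batch_size"]
         .agg(["count", "mean"])
         .reset_index()
)
print(summary.head())
```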

Posted 5 days ago

Apply

0.0 - 1.0 years

0 Lacs

Bengaluru, Karnataka

Remote

Source: Indeed

Job Title: AI/ML Developer (Intern)
Company: VASPP Technologies Pvt. Ltd.
Location: Bengaluru, Karnataka, India
Job Type: Full-Time
Experience: Fresher (0-1 year)
Department: Technology / Development

About VASPP Technologies:
VASPP Technologies Pvt. Ltd. is a fast-growing software company focused on delivering cutting-edge digital transformation solutions for global enterprises. Our innovative projects span AI/ML, data analytics, enterprise solutions, and cloud computing. We foster a collaborative and dynamic environment that encourages learning and growth.

Job Summary:
We are seeking a motivated and enthusiastic AI/ML Developer (fresher) to join our growing technology team. The ideal candidate will have a foundational understanding of machine learning algorithms, data analysis, and model deployment. You will work closely with senior developers to contribute to real-world AI/ML projects and software applications.

Responsibilities:
- Assist in the design, development, training, and deployment of AI and machine learning models.
- Collaborate with cross-functional teams, including software engineers, data scientists, and product managers, to build intelligent applications.
- Perform data collection, cleaning, transformation, and exploratory data analysis (EDA).
- Test various ML algorithms (e.g., classification, regression, clustering) and optimize them for performance.
- Implement model evaluation metrics and fine-tune hyperparameters.
- Contribute to integrating ML models into software applications using REST APIs or embedded services.
- Stay updated with the latest AI/ML frameworks, research papers, and industry trends.
- Document all work, including model development, experiments, and deployment steps, in a structured format.

Required Skills:
- Proficiency in Python and libraries such as NumPy, Pandas, Scikit-learn, TensorFlow, or PyTorch.
- Solid understanding of machine learning principles: supervised/unsupervised learning, overfitting, cross-validation, etc.
- Familiarity with data visualization tools: Matplotlib, Seaborn, Plotly.
- Basic knowledge of SQL and working with relational databases.
- Good understanding of software development basics, version control (Git), and collaborative tools.
- Strong problem-solving mindset, eagerness to learn, and ability to work in a team environment.

Educational Qualification:
Bachelor's degree in Computer Science, Information Technology, Data Science, Artificial Intelligence, or related fields from a recognized institution.

Preferred Qualifications (Optional):
- Internship or academic projects related to AI/ML.
- Participation in online competitions (e.g., Kaggle, DrivenData) or open-source contributions.
- Exposure to cloud platforms like AWS, Google Cloud (GCP), or Microsoft Azure.
- Familiarity with model deployment techniques using Flask/FastAPI, Docker, or Streamlit.

Compensation:
CTC/Stipend: ₹5,000 or ₹8,000 per month

How to Apply:
Send your updated resume and portfolio to piyush.vs@vaspp.com or aparna.bs@vaspp.com

Job Type: Internship
Contract length: 2 months
Pay: ₹5,000.00 - ₹8,000.00 per month
Benefits: Paid sick time; work from home
Schedule: Monday to Friday, morning shift
Application Question(s): This is a 2-month internship and the stipend will be based on performance and the interview process; is that acceptable to you?
Education: Bachelor's (Preferred)
Experience: AI: 1 year (Preferred)
Language: English (Preferred)
Location: Bangalore, Karnataka (Required)
Work Location: In person
Application Deadline: 14/06/2025
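As a concrete, hedged illustration of the supervised-learning and cross-validation skills this internship asks for, here is a minimal scikit-learn sketch using a bundled toy dataset; the model choice and parameters are assumptions, not project specifics.

```python
# Minimal supervised-learning sketch: train, cross-validate, and evaluate.
# Uses scikit-learn's bundled breast-cancer dataset so it runs self-contained.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.metrics import classification_report

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)

# 5-fold cross-validation on the training split to sanity-check generalization.
cv_scores = cross_val_score(model, X_train, y_train, cv=5, scoring="f1")
print("CV F1 scores:", cv_scores.round(3))

# Fit on the full training split and report held-out performance.
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```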

Posted 6 days ago

Apply

0.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Source: Indeed

Role: Senior Analyst - Data Engineering
Experience: 3 to 6 years
Location: Bengaluru, Karnataka, India (BLR)

Job Description:
We are seeking a highly experienced and skilled Senior Data Engineer to join our dynamic team. This role requires hands-on experience with databases such as Snowflake and Teradata, as well as advanced knowledge of various data science and AI techniques. The successful candidate will play a pivotal role in driving data-driven decision-making and innovation within our organization.

Job Responsibilities:
- Design, develop, and implement advanced machine learning models to solve complex business problems.
- Apply AI techniques and generative AI models to enhance data analysis and predictive capabilities.
- Utilize Tableau and other visualization tools to create insightful and actionable dashboards for stakeholders.
- Manage and optimize large datasets using Snowflake and Teradata databases.
- Collaborate with cross-functional teams to understand business needs and translate them into analytical solutions.
- Stay updated with the latest advancements in data science, machine learning, and AI technologies.
- Mentor and guide junior data scientists, fostering a culture of continuous learning and development.
- Communicate complex analytical concepts and results to non-technical stakeholders effectively.

Key Technologies & Skills:
- Machine learning models: supervised learning, unsupervised learning, reinforcement learning, deep learning, neural networks, decision trees, random forests, support vector machines (SVM), clustering algorithms, etc.
- AI techniques: natural language processing (NLP), computer vision, generative adversarial networks (GANs), transfer learning, etc.
- Visualization tools: Tableau, Power BI, Matplotlib, Seaborn, Plotly, etc.
- Databases: Snowflake, Teradata, SQL, NoSQL databases.
- Programming languages: Python (essential), R, SQL.
- Python libraries: TensorFlow, PyTorch, scikit-learn, pandas, NumPy, Keras, SciPy, etc.
- Data processing: ETL processes, data warehousing, data lakes.
- Cloud platforms: AWS, Azure, Google Cloud Platform.
- Big data technologies: Apache Spark, Hadoop.

Job Snapshot
Updated Date: 11-06-2025
Job ID: J_3679
Location: Bengaluru, Karnataka, India
Experience: 3 - 6 Years
Employee Type: Permanent
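Given the emphasis on Snowflake alongside Python-based analysis, a minimal sketch of pulling a query result into pandas via the Snowflake Python connector might look like the following. The account, credentials, and table are placeholders, and fetch_pandas_all requires the connector's pandas extras.

```python
# Hedged sketch: query Snowflake into a pandas DataFrame.
# Account, credentials, and the ORDERS table are placeholders, not real values.
# Requires: pip install "snowflake-connector-python[pandas]"
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="***",            # placeholder; prefer key-pair auth or SSO in practice
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    cur.execute("SELECT region, SUM(amount) AS total FROM orders GROUP BY region")
    df = cur.fetch_pandas_all()   # returns a pandas DataFrame
    print(df.head())
finally:
    conn.close()
```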

Posted 6 days ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Company Description
UsefulBI Corporation provides comprehensive solutions across Data Engineering, Data Science, AI/ML, and Business Intelligence. The company's mission is to empower astute business decisions by integrating data insights and cutting-edge AI. UsefulBI excels in data architecture, cloud strategies, Business Intelligence, and Generative AI to deliver outcomes that surpass individual capabilities.

Role Description
We are seeking a skilled R and Python Developer with hands-on experience developing and deploying applications using Posit (formerly RStudio) tools, including Shiny Server, Posit Connect, and R Markdown. The ideal candidate will have a strong background in data analysis, application development, and creating interactive dashboards for data-driven decision-making.

Key Responsibilities
- Design, develop, and deploy interactive web applications using R Shiny and Posit Connect.
- Write clean, efficient, and modular code in R and Python for data processing and analysis.
- Build and maintain R Markdown reports and Python notebooks for business reporting.
- Integrate R and Python scripts for advanced analytics and automation workflows.
- Collaborate with data scientists, analysts, and business users to gather requirements and deliver scalable solutions.
- Troubleshoot application issues and optimize performance on the Posit platform (RStudio Server, Posit Connect).
- Work with APIs, databases (SQL, NoSQL), and cloud platforms (e.g., AWS, Azure) as part of application development.
- Ensure version control using Git and CI/CD for application deployment.

Required Qualifications
- 4+ years of development experience using R and Python.
- Strong experience with Shiny apps, R Markdown, and Posit Connect.
- Proficient in using packages like dplyr, ggplot2, plotly, reticulate, and shiny.
- Experience with the Python data stack (pandas, numpy, matplotlib, etc.).
- Hands-on experience deploying apps on Posit Server/Connect.
- Familiarity with Git, Docker, and CI/CD tools.
- Excellent problem-solving and communication skills.

(ref:hirist.tech)

Posted 6 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Machine Learning
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
These roles have many overlapping skills with GenAI engineers and architects. The description may scale up or down based on the expected seniority.

Roles & Responsibilities:
- Implement generative AI models and identify insights that can be used to drive business decisions. Work closely with multi-functional teams to understand business problems, develop hypotheses, and test those hypotheses with data, collaborating with cross-functional teams to define AI project requirements and objectives and ensuring alignment with overall business goals.
- Conduct research to stay up to date with the latest advancements in generative AI, machine learning, and deep learning techniques, and identify opportunities to integrate them into our products and services.
- Optimize existing generative AI models for improved performance, scalability, and efficiency.
- Ensure data quality and accuracy.
- Lead the design and development of prompt engineering strategies and techniques to optimize the performance and output of our GenAI models.
- Implement cutting-edge NLP techniques and prompt engineering methodologies to enhance the capabilities and efficiency of our GenAI models.
- Determine the most effective prompt generation processes and approaches to drive innovation and excellence in the field of AI technology, collaborating with AI researchers and developers.
- Experience working with cloud-based platforms (for example, AWS, Azure, or related).
- Strong problem-solving and analytical skills.
- Proficiency in handling various data formats and sources through omni-channel speech and voice applications, as part of conversational AI.
- Prior statistical modelling experience.
- Demonstrable experience with deep learning algorithms and neural networks.
- Develop clear and concise documentation, including technical specifications, user guides, and presentations, to communicate complex AI concepts to both technical and non-technical stakeholders.
- Contribute to the establishment of best practices and standards for generative AI development within the organization.

Professional & Technical Skills:
- Solid experience developing and implementing generative AI models, with a strong understanding of deep learning techniques such as GPT, VAEs, and GANs.
- Proficiency in Python and experience with machine learning libraries and frameworks such as TensorFlow, PyTorch, or Keras.
- Strong knowledge of data structures, algorithms, and software engineering principles.
- Familiarity with cloud-based platforms and services, such as AWS, GCP, or Azure.
- Experience with natural language processing (NLP) techniques and tools, such as spaCy, NLTK, or Hugging Face.
- Familiarity with data visualization tools and libraries, such as Matplotlib, Seaborn, or Plotly.
- Knowledge of software development methodologies, such as Agile or Scrum.
- Excellent problem-solving skills, with the ability to think critically and creatively to develop innovative AI solutions.

Additional Information:
- A degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field is required; a Ph.D. is highly desirable.
- Strong communication skills, with the ability to effectively convey complex technical concepts to a diverse audience.
- A proactive mindset, with the ability to work independently and collaboratively in a fast-paced, dynamic environment.
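For the prompt-engineering responsibilities above, a minimal, hedged sketch of comparing prompt phrasings against an open model via the Hugging Face transformers pipeline could look like this. The model name and prompts are illustrative choices, not tools mandated by the role.

```python
# Hedged prompt-experimentation sketch using the transformers text-generation pipeline.
# "distilgpt2" is a small open model chosen only so the example runs quickly;
# it is an assumption, not a model named in the posting.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

# Compare two phrasings of the same task and inspect the outputs.
prompts = [
    "Summarize in one sentence: Q3 churn rose because onboarding emails failed.",
    "You are a support analyst. In one sentence, explain why Q3 churn rose: onboarding emails failed.",
]
for prompt in prompts:
    out = generator(prompt, max_new_tokens=40, do_sample=False)
    print(out[0]["generated_text"])
```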

Posted 6 days ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Role Description
React JS Developer – Lead II, Software Engineering. Act creatively to develop applications by selecting appropriate technical options, optimizing application development, maintenance and performance by employing design patterns and reusing proven solutions. Account for others' developmental activities; assist the Project Manager in day-to-day project execution.

Outcomes
Interpret the application feature and component designs and develop them in accordance with specifications. Code, debug, test, document and communicate product component and feature development stages. Validate results with user representatives, integrating and commissioning the overall solution. Select and create appropriate technical options for development, such as reusing, improving or reconfiguring existing components, while creating own solutions for new contexts. Optimise efficiency, cost and quality. Influence and improve customer satisfaction. Influence and improve employee engagement within the project teams. Set FAST goals for self/team; provide feedback on FAST goals of team members.

Measures of Outcomes
Adherence to engineering process and standards (coding standards); adherence to project schedule/timelines; number of technical issues uncovered during the execution of the project; number of defects in the code; number of defects post delivery; number of non-compliance issues; percent of voluntary attrition; on-time completion of mandatory compliance trainings.

Outputs Expected
- Code: Code as per the design; define coding standards, templates and checklists; review code for team and peers.
- Documentation: Create/review templates, checklists, guidelines and standards for design/process/development; create/review deliverable documents, design documentation, requirements, test cases and results.
- Configure: Define and govern the configuration management plan; ensure compliance from the team.
- Test: Review/create unit test cases, scenarios and execution; review the test plan created by the testing team; provide clarifications to the testing team.
- Domain Relevance: Advise software developers on the design and development of features and components with a deeper understanding of the business problem being addressed for the client; learn more about the customer domain and identify opportunities to provide value addition to customers; complete relevant domain certifications.
- Manage Project: Support the Project Manager with inputs for the projects; manage delivery of modules; manage complex user stories.
- Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
- Estimate: Create and provide input for effort and size estimation and plan resources for projects.
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries and client universities; review the reusable documents created by the team.
- Release: Execute and monitor the release process.
- Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, features, business components and data models.
- Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos; work closely with customer architects to finalize the design.
- Manage Team: Set FAST goals and provide feedback; understand the aspirations of team members and provide guidance, opportunities, etc.; ensure team members are upskilled and engaged in the project; proactively identify attrition risks and work with BSE on retention measures.
- Certifications: Obtain relevant domain and technology certifications.

Skill Examples
Explain and communicate the design/development to the customer. Perform and evaluate test results against product specifications. Break down complex problems into logical components. Develop user interfaces and business software components. Use data models. Estimate the time, effort and resources required for developing/debugging features/components. Perform and evaluate tests in the customer or target environments. Make quick decisions on technical/project-related challenges. Manage a team, mentor and handle people-related issues in the team. Maintain high motivation levels and positive dynamics within the team. Interface with other teams, designers and other parallel practices. Set goals for self and team; provide feedback to team members. Create and articulate impactful technical presentations. Follow a high level of business etiquette in emails and other business communication. Drive conference calls with customers and answer customer questions. Proactively ask for and offer help. Work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks. Build confidence with customers by meeting deliverables on time with a quality product.

Knowledge Examples
Appropriate software programs/modules; functional and technical designing; programming languages (proficient in multiple skill clusters); DBMS; operating systems and software platforms; Software Development Life Cycle; Agile (Scrum or Kanban) methods; integrated development environments (IDE); rapid application development (RAD); modelling technology and languages; interface definition languages (IDL); broad knowledge of the customer domain and deep knowledge of the sub-domain where the problem is solved.

Additional Comments
Who we are: At UST, we help the world's best organizations grow and succeed through transformation. Bringing together the right talent, tools, and ideas, we work with our clients to co-create lasting change. Together, with over 30,000 employees in over 25 countries, we build for boundless impact, touching billions of lives in the process. Visit us at UST.com.

Key Responsibilities
Understand the key requirements to augment the system and application architecture as needed. Be a team player and interact with different stakeholders as required. Quickly learn new skills required to perform the job role effectively. Provide accurate estimates on work items and effectively communicate any bottlenecks on time. Deliver the assigned work items on schedule. Follow coding standards and guidelines set by the team and write secure, reliable, testable and readable code. Participate in technical discussions with the software development team. Participate in planning, design, development, and implementation of multiple initiatives. Develop applications following agile software development methodologies and principles.

Essential Skills
- 5-8 years of professional front-end development experience, with a minimum of 3 years of recent hands-on experience in React JS.
- Good experience and knowledge of React component libraries (e.g., Bootstrap, Material UI).
- Good experience with CSS toolkits like SASS, SCSS or styled-components and BEM guidelines for CSS.
- Experience with React performance testing, performance optimization and debugging (React Profiler, server-side rendering, code splitting/lazy loading).
- Strong experience in HTML, CSS, and JavaScript.
- Strong knowledge of data structures and algorithms.
- Strong understanding of SQL and NoSQL databases (e.g., MongoDB, MS SQL Server).
- Proficient in software development design patterns (e.g., Singleton, Factory).
- Experience in micro-frontend development using the Module Federation plugin or similar (e.g., single-spa).
- Experience in building dynamic visualizations using charting libraries like D3.js, Plotly.js or similar (e.g., Highcharts, Chart.js).
- Strong analytical and problem-solving skills.
- Comfortable using IDEs like VS Code or JetBrains WebStorm/PyCharm/Rider.
- Experience using version control systems (e.g., Git).
- Experience with front-end dev tools like Webpack, Vite, Prettier, ESLint, Rollup, Babel, etc.

Desired Skills
- Experience in other JavaScript frameworks is an added advantage (e.g., Vue.js, Angular, Node.js).
- Good understanding of data grids and other relevant component libraries (e.g., AG Grid, Handsontable).
- Hands-on experience testing, debugging, and troubleshooting REST APIs implemented using Python FastAPI or .NET Core Web API.
- Familiarity with data science and ML frameworks.
- Data caching and related technologies (e.g., Redis or Memcached).
- Understanding of queues and tasks (e.g., RabbitMQ).
- Knowledge of SOLID design principles.
- Experience with at least one cloud platform (e.g., AWS, Azure).
- Experience building progressive web apps using React JS or Flutter (Dart).
- Knowledge of containerization and scaling using Docker and/or Kubernetes.
- Knowledge of CI/CD pipelines and build tools like Jenkins, JFrog, OpenShift, etc.

Educational Qualifications
Engineering degree, preferably in CS or ECE.

What we believe: We're proud to embrace the same values that have shaped UST since the beginning. Since day one, we've been building enduring relationships and a culture of integrity. And today, it's those same values that are inspiring us to encourage innovation from everyone, to champion diversity and inclusion and to place people at the centre of everything we do.
Humility: We will listen, learn, be empathetic and help selflessly in our interactions with everyone.
Humanity: Through business, we will better the lives of those less fortunate than ourselves.
Integrity: We honour our commitments and act with responsibility in all our relationships.

Equal Employment Opportunity Statement
UST is an Equal Opportunity Employer. We believe that no one should be discriminated against because of their differences, such as age, disability, ethnicity, gender, gender identity and expression, religion, or sexual orientation. All employment decisions shall be made without regard to age, race, creed, colour, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law. UST reserves the right to periodically redefine your roles and responsibilities based on the requirements of the organization and/or your performance. To support and promote the values of UST, comply with all Company policies and procedures.

Skills: Design, CSS, HTML, JavaScript

Posted 6 days ago

Apply

2.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Source: LinkedIn

Growexx is seeking a talented and motivated Software Engineer to join our growing engineering team. You will play a key role in designing, developing, and maintaining scalable software solutions that power our analytics platform. This is an exciting opportunity to work on impactful projects in a collaborative, fast-paced environment.

Key Responsibilities
- Design, develop, test, and deploy high-quality software solutions.
- Collaborate with product managers, designers, and data scientists to deliver new features and enhancements.
- Write clean, maintainable, and efficient code following best practices.
- Participate in code reviews and contribute to the continuous improvement of engineering processes.
- Troubleshoot and resolve technical issues across the stack.
- Stay current with emerging technologies and propose innovative solutions.

Key Skills
- Proficiency in one or more programming languages (e.g., Python, JavaScript, TypeScript, Go, Java).
- Experience with modern web frameworks (e.g., React, Angular, Vue).
- Familiarity with RESTful APIs, microservices, and cloud platforms (e.g., AWS, Azure, GCP).
- Strong problem-solving skills and attention to detail.

Preferred
- Experience with data visualization libraries (e.g., D3.js, Plotly).
- Knowledge of data pipelines, ETL processes, or big data technologies.
- Familiarity with containerization (Docker, Kubernetes).
- Exposure to machine learning or AI-driven applications.

Education and Experience
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2+ years of professional software development experience.

Analytical and Personal Skills
- Good logical reasoning and analytical skills.
- Ability to break big goals into small incremental actions.
- Excellent communication skills in English, both written and verbal.
- Ownership and accountability of their work, with great attention to detail.
- Self-critical, with a positive and cheerful outlook on life.

Posted 6 days ago

Apply

5.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

We are looking for a Python Data Engineer with expertise in real-time data monitoring, extraction, transformation, and visualization. The ideal candidate will have experience working with Oracle SQL databases, multithreading, and AI/ML techniques, and should be proficient in deploying Python applications on IIS servers. The role involves developing a system to monitor live files and folders, extract data, transform it using various techniques, and display insights on a Plotly Dash-based dashboard.

Responsibilities
- Backend & Frontend Development: Build end-to-end solutions using Python for both backend and frontend functionality.
- Data Extraction & Transformation: Implement data cleaning, regex, formatting, and data handling to process extracted information.
- Database Management: Insert and update records in an Oracle SQL database, ensuring data integrity and efficiency.
- Live File & Folder Monitoring: Develop Python scripts using Watchdog to monitor logs, detect new files/folders, and extract data in real time. Fetch live data from the database using multithreading for smooth real-time updates.
- Data Visualization: Develop an interactive dashboard using Plotly Dash or React for real-time data representation.
- Data Analytics & Pattern Finding: Perform exploratory data analysis (EDA) to identify trends, anomalies, and key insights.
- Cloud & AI/ML Integration: Leverage AI/ML techniques for data processing.
- Deployment & Maintenance: Deploy applications on an IIS server or the cloud and ensure system scalability and security.

Qualifications
BE/BTech degree in Computer Science, EE, or a related field.

Essential Skills
- Strong Python programming skills.
- Experience with Watchdog for real-time monitoring.
- Expertise in Oracle SQL (data insertion, updates, query optimization).
- Knowledge of AI/ML techniques and their practical applications.
- Hands-on experience with Plotly Dash, React, Angular, or any UI framework for dashboard development.
- Familiarity with IIS deployment and troubleshooting.
- Good understanding of data cleaning, ETL pipelines, and real-time data streaming.
- Strong debugging and problem-solving skills.
- Prior experience working on real-time monitoring systems.

Experience
Years of experience: 5 - 6 years
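Since the role centers on watching live folders with Watchdog and pushing extracted records onward, here is a minimal, hedged sketch of that monitoring loop. The watched path and the handling logic are placeholders; the Oracle insertion and the Dash layer described above are omitted.

```python
# Hedged sketch: monitor a folder for new files with watchdog and hand each
# new file to a processing step. The path and the handling logic are placeholders.
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

WATCH_DIR = "/data/incoming"   # placeholder path

class NewFileHandler(FileSystemEventHandler):
    def on_created(self, event):
        if event.is_directory:
            return
        # In the real system this would parse the file and insert rows into
        # Oracle; here we only log the detection.
        print(f"Detected new file: {event.src_path}")

observer = Observer()
observer.schedule(NewFileHandler(), WATCH_DIR, recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)          # keep the main thread alive
except KeyboardInterrupt:
    observer.stop()
observer.join()
```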

Posted 6 days ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like you for a unique job opportunity to work with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract-based, with project timelines from 2 to 6 months, or freelancing.
- Be part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote; onsite at a client location (US, UAE, UK, India, etc.); or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users.
- Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python.
- Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports.
- Mentor junior analysts and establish best practices for data visualization.

Required Skills:
- Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker).
- Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake.
- Experience with data governance, lineage tracking, and big data tools (Spark, Kafka).
- Exposure to machine learning and AI-powered analytics.

Nice to Have:
- Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly).
- Hands-on experience with BI automation and AI-driven analytics.

Who can be a part of the community?
We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus. If you have experience in this field, this is your chance to collaborate with industry leaders.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted. As soon as you pass all the screening rounds (assessments, interviews), you will be added to our Expert Community.
4. Profile matching: be patient while we align your skills and preferences with available projects.
5. Project allocation: you'll be deployed on your preferred project.

Skip the noise. Focus on opportunities built for you!

Posted 6 days ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Associate Analyst, R Programmer-3

Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard's efforts to build a more inclusive and sustainable digital economy. The Economics Institute was launched in 2020 to analyze economic trends through the lens of the consumer, delivering tailored and actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists that utilize and synthesize the anonymized and aggregated data from the Mastercard network together with public data to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company's product suites.

About The Role
We are looking for an R programmer to join Mastercard's Economics Institute, reporting to the team lead for Economics Technology. An individual who will:
- create clear, compelling data visualisations that communicate economic insights to diverse audiences
- develop reusable R functions and packages to support analysis and automation
- create and format analytical content using R Markdown and/or Quarto
- design and build scalable Shiny apps
- develop interactive visualisations using JavaScript charting libraries (e.g., Plotly, Highcharts, D3.js) or front-end frameworks (e.g., React, Angular, Vue.js)
- work with databases and data platforms (e.g., SQL, Hadoop)
- write clear, well-documented code that others can understand and maintain
- collaborate using Git for version control

All About You
- proficient in R and the RStudio IDE
- proficient in R packages like dplyr for data cleaning, transformation, and aggregation
- familiarity with dependency management and documentation in R (e.g., roxygen2)
- familiar with version control concepts and tools (e.g., Git, GitHub, Bitbucket) for collaborative development
- experience writing SQL and working with relational databases
- creative and passionate about data, coding, and technology
- strong collaborator who can also work independently
- organized and able to prioritise work across multiple projects
- comfortable working with engineers, product owners, data scientists, economists

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization, and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-250450

Posted 6 days ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Associate Analyst, R Programmer-2

Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard's efforts to build a more inclusive and sustainable digital economy. The Economics Institute was launched in 2020 to analyze economic trends through the lens of the consumer, delivering tailored and actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists that utilize and synthesize the anonymized and aggregated data from the Mastercard network together with public data to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company's product suites.

About The Role
We are looking for an R programmer to join Mastercard's Economics Institute, reporting to the team lead for Economics Technology. An individual who will:
- create clear, compelling data visualisations that communicate economic insights to diverse audiences
- develop reusable R functions and packages to support analysis and automation
- create and format analytical content using R Markdown and/or Quarto
- design and build scalable Shiny apps
- develop interactive visualisations using JavaScript charting libraries (e.g., Plotly, Highcharts, D3.js) or front-end frameworks (e.g., React, Angular, Vue.js)
- work with databases and data platforms (e.g., SQL, Hadoop)
- write clear, well-documented code that others can understand and maintain
- collaborate using Git for version control

All About You
- proficient in R and the RStudio IDE
- proficient in R packages like dplyr for data cleaning, transformation, and aggregation
- familiarity with dependency management and documentation in R (e.g., roxygen2)
- familiar with version control concepts and tools (e.g., Git, GitHub, Bitbucket) for collaborative development
- experience writing SQL and working with relational databases
- creative and passionate about data, coding, and technology
- strong collaborator who can also work independently
- organized and able to prioritise work across multiple projects
- comfortable working with engineers, product owners, data scientists, economists

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization, and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-250449

Posted 6 days ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Associate Analyst, R Programmer-1

Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard's efforts to build a more inclusive and sustainable digital economy. The Economics Institute was launched in 2020 to analyze economic trends through the lens of the consumer, delivering tailored and actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists that utilize and synthesize the anonymized and aggregated data from the Mastercard network together with public data to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company's product suites.

About The Role
We are looking for an R programmer to join Mastercard's Economics Institute, reporting to the team lead for Economics Technology. An individual who will:
- create clear, compelling data visualisations that communicate economic insights to diverse audiences
- develop reusable R functions and packages to support analysis and automation
- create and format analytical content using R Markdown and/or Quarto
- design and build scalable Shiny apps
- develop interactive visualisations using JavaScript charting libraries (e.g., Plotly, Highcharts, D3.js) or front-end frameworks (e.g., React, Angular, Vue.js)
- work with databases and data platforms (e.g., SQL, Hadoop)
- write clear, well-documented code that others can understand and maintain
- collaborate using Git for version control

All About You
- proficient in R and the RStudio IDE
- proficient in R packages like dplyr for data cleaning, transformation, and aggregation
- familiarity with dependency management and documentation in R (e.g., roxygen2)
- familiar with version control concepts and tools (e.g., Git, GitHub, Bitbucket) for collaborative development
- experience writing SQL and working with relational databases
- creative and passionate about data, coding, and technology
- strong collaborator who can also work independently
- organized and able to prioritise work across multiple projects
- comfortable working with engineers, product owners, data scientists, economists

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization, and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-250448

Posted 6 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Source: LinkedIn

Role Description: Data Engineer

We are looking for a highly skilled Data Engineer to design, build, and maintain robust data pipelines and infrastructure. This role requires expertise in Python, PySpark, SQL, and modern cloud platforms such as Snowflake. The ideal candidate will collaborate with business stakeholders and analytics teams to ensure the efficient collection, transformation, and delivery of data to power insights and decision-making.

Responsibilities
- Understand business requirements, system designs, and security standards.
- Collaborate with SMEs to analyze existing processes, gather functional requirements, and identify improvements.
- Build and streamline data pipelines using Python, PySpark, SQL, and Spark from various data sources.
- Support data cataloging and knowledge base development.
- Develop tools for analytics and data science teams to optimize data product consumption.
- Enhance data system functionality in collaboration with data and analytics experts.
- Communicate insights using statistical analysis, data visualization, and storytelling techniques.
- Manage technical and business documentation for all data engineering efforts.
- Participate in hands-on development and coordinate with onshore/offshore teams.

Requirements
- 5+ years of experience building data pipelines on on-premise and cloud platforms (e.g., Snowflake).
- Strong expertise in Python, PySpark, and SQL for data ingestion, transformation, and automation.
- Experience developing Python-based applications with visualization libraries such as Plotly and Streamlit.
- Solid knowledge of data engineering concepts and practices, including metadata management and data governance.
- Proficient in using cloud-based data warehousing and data lake environments.
- Familiarity with ELT/ETL tools like DBT and Cribl.
- Experience with incremental data capture, stream ingestion, and real-time data processing.

Preferred Qualifications
- Background in cybersecurity, IT infrastructure, or software systems.
- 3+ years of experience in cloud-based data warehouse and data lake architectures.
- Hands-on experience with data visualization tools (e.g., Tableau, Plotly, Streamlit).
- Strong communication skills and the ability to translate complex data into actionable insights.

Technical Skills
Python; PySpark; SQL; Snowflake (or other cloud data platforms); Plotly, Streamlit, Flask, Dask; ELT/ETL tools (DBT, Cribl); data visualization (Tableau, Plotly); metadata management and data governance; stream processing and real-time data ingestion

Skills: Python, SQL, Cloud Platform
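For the Plotly/Streamlit application work mentioned in the requirements, a minimal, hedged sketch of a Streamlit page rendering a Plotly chart could look like this. The inline data is a stand-in; a real pipeline would read from Snowflake or another warehouse.

```python
# Hedged sketch of a small Streamlit app with a Plotly chart.
# Run with: streamlit run app.py
# The inline DataFrame is a stand-in for data pulled from a warehouse.
import pandas as pd
import plotly.express as px
import streamlit as st

st.title("Daily ingestion volume (sample)")

df = pd.DataFrame(
    {
        "day": pd.date_range("2025-01-01", periods=14, freq="D"),
        "rows_ingested": [1200, 1340, 1280, 1500, 1475, 1600, 900,
                          1700, 1650, 1580, 1720, 1810, 1790, 950],
    }
)

source = st.selectbox("Pipeline", ["all", "logs", "events"])  # placeholder filter
fig = px.line(df, x="day", y="rows_ingested", title=f"Rows ingested ({source})")
st.plotly_chart(fig, use_container_width=True)
```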

Posted 6 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

What You'll Be Doing:
Dashboard Development:
- Design, develop, and maintain interactive and visually compelling dashboards using Power BI.
- Implement DAX queries and data models to support business intelligence needs.
- Optimize the performance and usability of dashboards for various stakeholders.
Python & Streamlit Applications:
- Build and deploy lightweight data applications using Streamlit for internal and external users.
- Integrate Python libraries (e.g., Pandas, NumPy, Plotly, Matplotlib) for data processing and visualization.
Data Integration & Retrieval:
- Connect to and retrieve data from RESTful APIs, cloud storage (e.g., Azure Data Lake, Cognite Data Fusion), and SQL/NoSQL databases.
- Automate data ingestion pipelines and ensure data quality and consistency.
Collaboration & Reporting:
- Work closely with business analysts, data engineers, and stakeholders to gather requirements and deliver insights.
- Present findings and recommendations through reports, dashboards, and presentations.

Requirements:
- Bachelor's or master's degree in computer science, data science, information systems, or a related field.
- 3+ years of experience in data analytics or business intelligence roles.
- Proficiency in Power BI, including DAX, Power Query, and data modeling.
- Strong Python programming skills, especially with Streamlit, Pandas, and API integration.
- Experience with REST APIs, JSON/XML parsing, and cloud data platforms (Azure, AWS, or GCP).
- Familiarity with version control systems like Git.
- Excellent problem-solving, communication, and analytical skills.

Preferred Qualifications:
- Experience with CI/CD pipelines for data applications.
- Knowledge of DevOps practices and containerization (Docker).
- Exposure to machine learning or statistical modeling is a plus.
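Because the data-integration duties above involve pulling JSON from REST APIs into pandas before visualizing, here is a small, hedged sketch of that retrieval step. The endpoint URL, token, and response fields are hypothetical placeholders.

```python
# Hedged sketch: fetch JSON from a REST endpoint and load it into pandas.
# The URL, token, and field names are hypothetical placeholders.
import pandas as pd
import requests

API_URL = "https://example.com/api/v1/measurements"   # placeholder endpoint
headers = {"Authorization": "Bearer <token>"}          # placeholder auth

resp = requests.get(API_URL, headers=headers, params={"limit": 100}, timeout=30)
resp.raise_for_status()

records = resp.json()                 # assumed to be a list of flat JSON objects
df = pd.json_normalize(records)       # flatten into a DataFrame
print(df.dtypes)
print(df.head())
```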

Posted 6 days ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

About Us: Athena is India's largest institution in the "premium undergraduate study abroad" space. Founded 10 years ago by two Princeton graduates, Poshak Agrawal and Rahul Subramaniam, Athena is headquartered in Gurgaon, with offices in Mumbai and Bangalore, and caters to students from 26 countries. Athena’s vision is to help students become the best version of themselves. Athena’s transformative, holistic life coaching program embraces both depth and breadth, sciences and the humanities. Athena encourages students to deepen their theoretical knowledge and apply it to address practical issues confronting society, both locally and globally. Through our flagship program, our students have gotten into various, universities including Harvard University, Princeton University, Yale University, Stanford University, University of Cambridge, MIT, Brown, Cornell University, University of Pennsylvania, University of Chicago, among others. Learn more about Athena: https://www.athenaeducation.co.in/article.aspx Role Overview We are looking for an AI/ML Engineer who can mentor high-potential scholars in creating impactful technology projects. This role requires a blend of strong engineering expertise, the ability to distill complex topics into digestible concepts, and a deep passion for student-driven innovation. You’ll help scholars explore the frontiers of AI—from machine learning models to generative AI systems—while coaching them in best practices and applied engineering. Key Responsibilities: Guide scholars through the full AI/ML development cycle—from problem definition, data exploration, and model selection to evaluation and deployment. Teach and assist in building: Supervised and unsupervised machine learning models. Deep learning networks (CNNs, RNNs, Transformers). NLP tasks such as classification, summarization, and Q&A systems. Provide mentorship in Prompt Engineering: Craft optimized prompts for generative models like GPT-4 and Claude. Teach the principles of few-shot, zero-shot, and chain-of-thought prompting. Experiment with fine-tuning and embeddings in LLM applications. Support scholars with real-world datasets (e.g., Kaggle, open data repositories) and help integrate APIs, automation tools, or ML Ops workflows. Conduct internal training and code reviews, ensuring technical rigor in projects. Stay updated with the latest research, frameworks, and tools in the AI ecosystem. Technical Requirements: Proficiency in Python and ML libraries: scikit-learn, XGBoost, Pandas, NumPy. Experience with deep learning frameworks : TensorFlow, PyTorch, Keras. Strong command of machine learning theory , including: Bias-variance tradeoff, regularization, and model tuning. Cross-validation, hyperparameter optimization, and ensemble techniques. Solid understanding of data processing pipelines , data wrangling, and visualization (Matplotlib, Seaborn, Plotly). Advanced AI & NLP Experience with transformer architectures (e.g., BERT, GPT, T5, LLaMA). Hands-on with LLM APIs : OpenAI (ChatGPT), Anthropic, Cohere, Hugging Face. Understanding of embedding-based retrieval , vector databases (e.g., Pinecone, FAISS), and Retrieval-Augmented Generation (RAG). Familiarity with AutoML tools , MLflow, Weights & Biases, and cloud AI platforms (AWS SageMaker, Google Vertex AI). Prompt Engineering & GenAI Proficiency in crafting effective prompts using: Instruction tuning Role-playing and system prompts Prompt chaining tools like LangChain or LlamaIndex Understanding of AI safety , bias mitigation, and interpretability. 
Required Qualifications: Bachelor’s degree from a Tier-1 Engineering College in Computer Science, Engineering, or a related field. 2-5 years of relevant experience in ML/AI roles. Portfolio of projects or publications in AI/ML (GitHub, blogs, competitions, etc.). Passion for education, mentoring, and working with high school scholars. Excellent communication skills, with the ability to convey complex concepts to a diverse audience. Preferred Qualifications: Prior experience in student mentorship, teaching, or edtech. Exposure to Arduino, Raspberry Pi, or IoT for integrated AI/ML projects. Strong storytelling and documentation abilities to help scholars write compelling project reports and research summaries.
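
The few-shot, zero-shot, and chain-of-thought prompting skills listed above can be pictured with a short, self-contained Python sketch. The task, worked example, and helper names below are hypothetical; they only show how such a prompt might be assembled before it is sent to a model.

# Illustrative sketch: assembling a few-shot, chain-of-thought prompt.
# The task, examples, and final model call are hypothetical placeholders.
FEW_SHOT_EXAMPLES = [
    {
        "question": "A shop sells pens at 12 rupees each. How much do 5 pens cost?",
        "reasoning": "Each pen costs 12 rupees, so 5 pens cost 5 * 12 = 60 rupees.",
        "answer": "60 rupees",
    },
]

def build_prompt(question: str) -> str:
    """Builds a chain-of-thought prompt from worked examples plus a new question."""
    parts = ["Answer the question. Think step by step before giving the final answer.\n"]
    for ex in FEW_SHOT_EXAMPLES:
        parts.append(f"Question: {ex['question']}")
        parts.append(f"Reasoning: {ex['reasoning']}")
        parts.append(f"Answer: {ex['answer']}\n")
    parts.append(f"Question: {question}")
    parts.append("Reasoning:")
    return "\n".join(parts)

print(build_prompt("A notebook costs 45 rupees. How much do 3 notebooks cost?"))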

Posted 6 days ago

Apply

3.0 years

0 Lacs

India

Remote


About BeGig BeGig is the leading tech freelancing marketplace. We empower innovative, early-stage, non-tech founders to bring their visions to life by connecting them with top-tier freelance talent. By joining BeGig, you’re not just taking on one role—you’re signing up for a platform that will continuously match you with high-impact opportunities tailored to your expertise. Your Opportunity Join our network as a Data Scientist and help fast-growing startups transform data into actionable insights, predictive models, and intelligent decision-making tools. You’ll work on real-world data challenges across domains like marketing, finance, healthtech, and AI—with full flexibility to work remotely and choose the engagements that best fit your goals. Role Overview As a Data Scientist, you will: Extract Insights from Data: Analyze complex datasets to uncover trends, patterns, and opportunities. Build Predictive Models: Develop, validate, and deploy machine learning models that solve core business problems. Communicate Clearly: Work with cross-functional teams to present findings and deliver data-driven recommendations. What You’ll Do Analytics & Modeling: Explore, clean, and analyze structured and unstructured data using statistical and ML techniques. Build predictive and classification models using tools like scikit-learn, XGBoost, TensorFlow, or PyTorch. Conduct A/B testing, customer segmentation, forecasting, and anomaly detection. Data Storytelling & Collaboration: Present complex findings in a clear, actionable way using data visualizations (e.g., Tableau, Power BI, Matplotlib). Work with product, marketing, and engineering teams to integrate models into applications or workflows. Technical Requirements & Skills Experience: 3+ years in data science, analytics, or a related field. Programming: Proficient in Python (preferred), R, and SQL. ML Frameworks: Experience with scikit-learn, TensorFlow, PyTorch, or similar tools. Data Handling: Strong understanding of data preprocessing, feature engineering, and model evaluation. Visualization: Familiar with visualization tools like Matplotlib, Seaborn, Plotly, Tableau, or Power BI. Bonus: Experience working with large datasets, cloud platforms (AWS/GCP), or MLOps practices. What We’re Looking For A data-driven thinker who can go beyond numbers to tell meaningful stories. A freelancer who enjoys solving real business problems using machine learning and advanced analytics. A strong communicator with the ability to simplify complex models for stakeholders. Why Join Us? Immediate Impact: Work on projects that directly influence product, growth, and strategy. Remote & Flexible: Choose your working hours and project commitments. Future Opportunities: BeGig will continue matching you with data science roles aligned to your strengths. Dynamic Network: Collaborate with startups building data-first, insight-driven products. Ready to turn data into decisions? Apply now to become a key Data Scientist for our client and a valued member of the BeGig network! Show more Show less
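
As a rough illustration of the modelling workflow this listing describes (a scikit-learn pipeline, a train/test split, and a hold-out evaluation metric), here is a minimal sketch on synthetic data; the dataset and model choice are placeholders, not a prescribed approach.

# Minimal sketch of a classification workflow with scikit-learn.
# Synthetic data stands in for a real business dataset.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print(f"Hold-out ROC AUC: {roc_auc_score(y_test, probs):.3f}")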

Posted 6 days ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Function: Data Science Job: Machine Learning Engineer Position: Senior Immediate manager (N+1 Job title and name): AI Manager Additional reporting line to: Global VP Engineering Position location: Mumbai, Pune, Bangalore, Hyderabad, Noida. 1. Purpose of the Job – A simple statement to identify clearly the objective of the job. The Senior Machine Learning Engineer is responsible for designing, implementing, and deploying scalable and efficient machine learning algorithms to solve complex business problems. The Machine Learning Engineer is also responsible for the lifecycle of models once they are deployed in production environments, monitoring their performance and evolution. The position is highly technical and requires an ability to collaborate with multiple technical and non-technical profiles (data scientists, data engineers, data analysts, product owners, business experts), and actively take part in a large data science community. 2. Organization chart – Indicate schematically the position of the job within the organization. It is sufficient to indicate one hierarchical level above (including possible functional boss) and, if applicable, one below the position. In the horizontal direction, the other jobs reporting to the same superior should be indicated. A Machine Learning Engineer reports to the AI Manager, who reports to the Global VP Engineering. 3. Key Responsibilities and Expected Deliverables – This details what actually needs to be done; the duties and expected outcomes. Managing the lifecycle of machine learning models Develop and implement machine learning models to solve complex business problems. Ensure that models are accurate, efficient, reliable, and scalable. Deploy machine learning models to production environments, ensuring that models are integrated with software systems. Monitor machine learning models in production, ensuring that models are performing as expected and that any errors or performance issues are identified and resolved quickly. Maintain machine learning models over time. This includes updating models as new data becomes available, retraining models to improve performance, and retiring models that are no longer effective. Develop and implement policies and procedures for ensuring the ethical and responsible use of machine learning models. This includes addressing issues related to bias, fairness, transparency, and accountability. Development of data science assets Identify data science needs that cut across use cases and could be shared as a reusable piece of code. Design, contribute to, and participate in the implementation of Python libraries that answer transversal data science needs and can be reused in several projects. Maintain existing data science assets (timeseries forecasting asset, model monitoring asset). Create documentation and a knowledge base on data science assets to ensure users understand them well. Participate in asset demos to showcase new features to users. Be an active member of the Sodexo Data Science Community Participate in the definition and maintenance of engineering standards and good practices around machine learning. Participate in data science team meetings and regularly share knowledge, ask questions, and learn from others. Mentor and guide junior machine learning engineers and data scientists. Participate in relevant internal or external conferences and meetups.
Continuous Improvements Stay up to date with the latest developments in the field: read research papers, attend conferences, and participate in training to expand your knowledge and skills. Identify and evaluate new technologies and tools that can improve the efficiency and effectiveness of machine learning projects. Propose and implement optimizations for current machine learning workflows and systems. Proactively identify areas of improvement within the pipelines. Make sure that the code you create complies with our engineering standards. Collaboration with other data experts (Data Engineers, Platform Engineers, and Data Analysts) Participate in pull request reviews coming from other team members. Ask for review and comments when submitting your own work. Actively participate in the day-to-day life of the project (Agile rituals), the data science team (DS meetings) and the rest of the Global Engineering team 4. Education & Experience – Indicate the skills, knowledge and experience that the job holder requires to conduct the role effectively Engineering Master’s degree or PhD in Data Science, Statistics, Mathematics, or related fields 5+ years of experience in a Data Scientist / Machine Learning Engineer role in large corporate organizations Experience working with ML models in a cloud ecosystem Statistics & Machine Learning Statistics: Strong understanding of statistical analysis and modelling techniques (e.g., regression analysis, hypothesis testing, time series analysis) Classical ML: Very strong knowledge of classical ML algorithms for regression & classification, supervised and unsupervised machine learning, both theoretical and practical (e.g. using scikit-learn, xgboost) ML niche: Expertise in at least one of the following ML specialisations: Timeseries forecasting / Natural Language Processing / Computer Vision Deep Learning: Good knowledge of Deep Learning fundamentals (CNN, RNN, transformer architecture, attention mechanism, …) and one of the deep learning frameworks (pytorch, tensorflow, keras) Generative AI: Good understanding of Generative AI specificities; previous experience working with Large Language Models is a plus (e.g. with openai, langchain) MLOps Model strategy: Expertise in designing, implementing, and testing machine learning strategies. Model integration: Very strong skills in integrating a machine learning algorithm into a data science application in production. Model performance: Deep understanding of model performance evaluation metrics and existing libraries (e.g., scikit-learn, evidently) Model deployment: Experience in deploying and managing machine learning models in production using a specific cloud platform, model serving frameworks, or containerization. Model monitoring: Experience with model performance monitoring tools is a plus (Grafana, Prometheus) Software Engineering Python: Very strong coding skills in Python, including modularity, OOP, and data & config manipulation frameworks (e.g., pandas, pydantic). Python ecosystem: Strong knowledge of tooling in the Python ecosystem, such as dependency management tooling (venv, poetry), documentation frameworks (e.g. sphinx, mkdocs, jupyter-book), and testing frameworks (unittest, pytest) Software engineering practices: Experience in putting in place good software engineering practices such as design patterns, testing (unit, integration), clean code, and code formatting.
Debugging: Ability to troubleshoot and debug issues within machine learning pipelines Data Science Experimentation and Analytics Data Visualization: Knowledge of data visualization tools such as plotly, seaborn, matplotlib, etc. to visualise, interpret and communicate the results of machine learning models to stakeholders. Basic knowledge of PowerBI is a plus Data Cleaning: Experience with data cleaning and preprocessing techniques such as feature scaling, dimensionality reduction, and outlier detection (e.g. with pandas, scikit-learn). Data Science Experiments: Understanding of experimental design and A/B testing methodologies Data Processing: Databricks/Spark: Basic knowledge of PySpark for big data processing Databases: Basic knowledge of SQL to query data in internal systems Data Formats: Familiarity with different data storage formats such as Parquet and Delta DevOps Azure DevOps: Experience using a DevOps platform such as Azure DevOps for using Boards, Repositories, Pipelines Git: Experience working with code versioning (git), branch strategies, and collaborative work with pull requests. Proficient with the most basic git commands. CI/CD: Experience in implementing/maintaining pipelines for continuous integration (including execution of testing strategy) and continuous deployment is preferable. Cloud Platform: Azure Cloud: Previous experience with services like Azure Machine Learning Services and/or Azure Databricks on Azure is preferable. Soft skills Strong analytical and problem-solving skills, with attention to detail Excellent verbal and written communication and pedagogical skills with technical and non-technical teams Excellent teamwork and collaboration skills Adaptability and reactivity to new technologies, tools, and techniques Fluent in English 5. Competencies – Indicate which of the Sodexo core competencies and any professional competencies that the role requires Communication & Collaboration Adaptability & Agility Analytical & technical skills Innovation & Change Rigorous Problem Solving & Troubleshooting
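
The feature-scaling, dimensionality-reduction, cross-validation, and hyperparameter-tuning skills referenced in this posting can be sketched in a few lines of scikit-learn; the dataset and parameter grid below are illustrative only, not part of the role description.

# Sketch of a scikit-learn pipeline with preprocessing, cross-validation,
# and hyperparameter tuning. Dataset and grid values are illustrative.
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

pipeline = Pipeline([
    ("scale", StandardScaler()),       # feature scaling
    ("reduce", PCA(n_components=10)),  # dimensionality reduction
    ("model", RandomForestClassifier(random_state=0)),
])

param_grid = {"model__n_estimators": [100, 300], "model__max_depth": [None, 10]}
search = GridSearchCV(pipeline, param_grid, cv=5, scoring="f1")
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))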

Posted 1 week ago

Apply

16.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. What Primary Responsibilities: Business Knowledge: Capable of understanding the requirements for the entire project (not just own features) Capable of working closely with PMG during the design phase to drill down into detailed nuances of the requirements Has the ability and confidence to question the motivation behind certain requirements and work with PMG to refine them. Design: Can design and implement machine learning models and algorithms Can articulate and evaluate pros/cons of different AI/ML approaches Can generate cost estimates for model training and deployment Coding/Testing: Builds and optimizes machine learning pipelines Knows & brings in external ML frameworks and libraries Consistently avoids common pitfalls in model development and deployment How Quality: Solves cross-functional problems using data-driven approaches Identifies impacts/side effects of models outside of immediate scope of work Identifies cross-module issues related to data integration and model performance Identifies problems predictively using data analysis Productivity: Capable of working on multiple AI/ML projects simultaneously and context switching between them Process: Enforces process standards for model development and deployment. Independence: Acts independently to determine methods and procedures on new or special assignments Prioritizes large tasks and projects effectively Agility: Release Planning: Works with the PO to do high-level release commitment and estimation Works with PO on defining stories of appropriate size for model development Agile Maturity: Able to drive the team to achieve a high level of accomplishment on the committed stories for each iteration Shows Agile leadership qualities and leads by example WITH Team Work: Capable of working with development teams and identifying the right division of technical responsibility based on skill sets. Capable of working with external teams (e.g., Support, PO, etc.) that have significantly different technical skill sets and managing the discussions based on their needs Initiative: Capable of creating innovative AI/ML solutions that may include changes to requirements to create a better solution Capable of thinking outside-the-box to view the system as it should be rather than only how it is Proactively generates a continual stream of ideas and pushes to review and advance ideas if they make sense Takes initiative to learn how AI/ML technology is evolving outside the organization Takes initiative to learn how the system can be improved for the customers Should make problems open new doors for innovations Communication: Communicates complex AI/ML concepts internally with ease Accountability: Well versed in all areas of the AI/ML stack (data preprocessing, model training, evaluation, deployment, etc.) 
and aware of all components in play Leadership: Disagree without being disagreeable Use conflict as a way to drill deeper and arrive at better decisions Frequent mentorship Builds ad-hoc cross-department teams for specific projects or problems Can achieve broad scope 'buy in' across project teams and across departments Takes calculated risks Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications B.E/B.Tech/MCA/MSc/MTech (Minimum 16 years of formal education, Correspondence courses are not relevant) 5+ years of experience working on multiple layers of technology Experience deploying and maintaining ML models in production Experience in Agile teams Experience with one or more data-oriented workflow orchestration frameworks (Airflow, KubeFlow etc.) Working experience or good knowledge of cloud platforms (e.g., Azure, AWS, OCI) Ability to design, implement, and maintain CI/CD pipelines for MLOps and DevOps function Familiarity with traditional software monitoring, scaling, and quality management (QMS) Knowledge of model versioning and deployment using tools like MLflow, DVC, or similar platforms Familiarity with data versioning tools (Delta Lake, DVC, LakeFS, etc.) Demonstrate hands-on knowledge of OpenSource adoption and use cases Good understanding of Data/Information security Proficient in Data Structures, ML Algorithms, and ML lifecycle Product/Project/Program Related Tech Stack: Machine Learning Frameworks: Scikit-learn, TensorFlow, PyTorch Programming Languages: Python, R, Java Data Processing: Pandas, NumPy, Spark Visualization: Matplotlib, Seaborn, Plotly Familiarity with model versioning tools (MLFlow, etc.) Cloud Services: Azure ML, AWS SageMaker, Google Cloud AI GenAI: OpenAI, Langchain, RAG etc. Demonstrate good knowledge in Engineering Practices Demonstrates excellent problem-solving skills Proven excellent verbal, written, and interpersonal communication skills At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission. Show more Show less
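
MLflow, one of the model-versioning tools named in this posting, is typically used for the kind of experiment tracking the role requires. The following is a minimal sketch that assumes a default local tracking setup; the dataset, parameters, and run name are chosen only for illustration.

# Hedged sketch of experiment tracking and model versioning with MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="baseline-logreg"):
    params = {"C": 1.0, "max_iter": 500}
    model = LogisticRegression(**params).fit(X_train, y_train)

    mlflow.log_params(params)                                            # hyperparameters
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, artifact_path="model")               # versioned model artifact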

Posted 1 week ago

Apply

16.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. What Primary Responsibilities: Business Knowledge: Capable of understanding the requirements for the entire project (not just own features) Capable of working closely with PMG during the design phase to drill down into detailed nuances of the requirements Has the ability and confidence to question the motivation behind certain requirements and work with PMG to refine them. Design: Can design and implement machine learning models and algorithms Can articulate and evaluate pros/cons of different AI/ML approaches Can generate cost estimates for model training and deployment Coding/Testing: Builds and optimizes machine learning pipelines Knows & brings in external ML frameworks and libraries Consistently avoids common pitfalls in model development and deployment How Quality: Solves cross-functional problems using data-driven approaches Identifies impacts/side effects of models outside of immediate scope of work Identifies cross-module issues related to data integration and model performance Identifies problems predictively using data analysis Productivity: Capable of working on multiple AI/ML projects simultaneously and context switching between them Process: Enforces process standards for model development and deployment. Independence: Acts independently to determine methods and procedures on new or special assignments Prioritizes large tasks and projects effectively Agility: Release Planning: Works with the PO to do high-level release commitment and estimation Works with PO on defining stories of appropriate size for model development Agile Maturity: Able to drive the team to achieve a high level of accomplishment on the committed stories for each iteration Shows Agile leadership qualities and leads by example WITH Team Work: Capable of working with development teams and identifying the right division of technical responsibility based on skill sets. Capable of working with external teams (e.g., Support, PO, etc.) that have significantly different technical skill sets and managing the discussions based on their needs Initiative: Capable of creating innovative AI/ML solutions that may include changes to requirements to create a better solution Capable of thinking outside-the-box to view the system as it should be rather than only how it is Proactively generates a continual stream of ideas and pushes to review and advance ideas if they make sense Takes initiative to learn how AI/ML technology is evolving outside the organization Takes initiative to learn how the system can be improved for the customers Should make problems open new doors for innovations Communication: Communicates complex AI/ML concepts internally with ease Accountability: Well versed in all areas of the AI/ML stack (data preprocessing, model training, evaluation, deployment, etc.) 
and aware of all components in play Leadership: Disagree without being disagreeable Use conflict as a way to drill deeper and arrive at better decisions Frequent mentorship Builds ad-hoc cross-department teams for specific projects or problems Can achieve broad scope 'buy in' across project teams and across departments Takes calculated risks Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications B.E/B.Tech/MCA/MSc/MTech (Minimum 16 years of formal education, Correspondence courses are not relevant) 4+ years of experience working on multiple layers of technology Experience deploying and maintaining ML models in production Experience in Agile teams Experience with one or more data-oriented workflow orchestration frameworks (Airflow, KubeFlow etc.) Working experience or good knowledge of cloud platforms (e.g., Azure, AWS, OCI) Ability to design, implement, and maintain CI/CD pipelines for MLOps and DevOps function Familiarity with traditional software monitoring, scaling, and quality management (QMS) Knowledge of model versioning and deployment using tools like MLflow, DVC, or similar platforms Familiarity with data versioning tools (Delta Lake, DVC, LakeFS, etc.) Demonstrate hands-on knowledge of OpenSource adoption and use cases Good understanding of Data/Information security Proficient in Data Structures, ML Algorithms, and ML lifecycle Product/Project/Program Related Tech Stack: Machine Learning Frameworks: Scikit-learn, TensorFlow, PyTorch Programming Languages: Python, R, Java Data Processing: Pandas, NumPy, Spark Visualization: Matplotlib, Seaborn, Plotly Familiarity with model versioning tools (MLFlow, etc.) Cloud Services: Azure ML, AWS SageMaker, Google Cloud AI GenAI: OpenAI, Langchain, RAG etc. Demonstrate good knowledge in Engineering Practices Demonstrates excellent problem-solving skills Proven excellent verbal, written, and interpersonal communication skills At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission. Show more Show less
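
The data-oriented workflow-orchestration requirement in this posting (Airflow, Kubeflow, etc.) can be pictured with a minimal Airflow DAG sketch. The task logic and schedule are placeholders, and the schedule argument assumes Airflow 2.4 or later.

# Illustrative Airflow DAG for a simple retraining workflow; tasks are stubs.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def train():
    print("retrain and evaluate the model")

with DAG(
    dag_id="ml_retraining_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@weekly",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    train_task = PythonOperator(task_id="train", python_callable=train)
    extract_task >> train_task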

Posted 1 week ago

Apply

8.0 years

0 Lacs

Delhi, India

On-site


About Us Bain & Company is a global consultancy that helps the world’s most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network) with its nodes across various geographies. BCN is an integral and largest unit of (ECD) Expert Client Delivery. ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence or Bain intellectual property. The BCN comprises of Consulting Services, Knowledge Services and Shared Services. Who You Will Work With The Consumer Products Center of Expertise collaborates with Bain’s global Consumer Products Practice leadership, client-facing Bain leadership and teams, and with end clients on development and delivery of Bain’s proprietary CP products and solutions. These solutions aim to answer strategic questions of Bain’s CP clients relating to brand strategy (consumer needs, assortment, pricing, distribution), revenue growth management (pricing strategy, promotions, profit pools, trade terms), negotiation strategy with key retailers, optimization of COGS etc. You will work as part of the team in CP CoE comprising of a mix of Director, Managers, Projects Leads, Associates and Analysts working to implement cloud-based end-to-end advanced analytics solutions. Delivery models on projects vary from working as part of a CP Center of Expertise, broader global Bain case team within the CP ringfence, or within other industry CoEs such as FS / Retail / TMT / Energy / CME / etc with BCN on need basis What You’ll Do Lead AI Initiatives: Architect and implement AI-driven solutions for Consumer Products consulting and client delivery AI & Machine Learning: Develop and deploy Generative AI-based solutions, leveraging cloud computing platforms (AWS/ Azure/GCP) Technical Leadership: Guide the team in AI, machine learning, and data engineering, ensuring optimal architecture and deployment Software Development & Data Engineering: Hands-on experience with Python programming, microservices architecture, and cloud technologies (AWS) Database: Proven experience with database systems such as SQL Server, PostgreSQL, good to have knowledge of Snowflake, Oracle Frontend and Backend Technologies: Exposure to HTML, CSS, Javascript, ReactJS and FastAPI, Django Python Libraries: Exposure to data analysis and visualization libraries like Matplotlib, Plotly, Pandas, Numpy etc. 
Prompt Engineering & Generative AI: Strong understanding of LLMs, NLP, Chains, Agents and AI model fine-tuning Client Engagement: Work in client-facing roles, understand business needs, and translate them into AI-driven solutions Strategic AI Roadmap: Identify technical limitations, propose alternative approaches, and ensure the scalability of AI applications About You 8+ years of experience in Data Engineering, Software Engineering, and AI/ML Experience in CPG consulting and client delivery Expertise in Python, Generative AI, and cloud computing (AWS/Azure/GCP) Hands-on experience with microservices architecture and chatbot development Adept at Prompt Engineering and AI model optimization Proven track record of leading AI-driven projects with a strategic and solution-oriented mindset Experience in client-facing roles, managing interactions and delivering AI-powered business solutions Prior experience in AI-driven consulting for the Consumer Products industry. Exposure to LLM-based applications Ability to bridge the gap between business and technology, translating client needs into actionable AI solutions What Makes Us a Great Place To Work We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
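
The FastAPI and microservices exposure mentioned in this posting might look like the following minimal scoring endpoint; the request fields and the "model" logic are hypothetical stand-ins rather than a prescribed design.

# Minimal sketch of a FastAPI microservice endpoint; the scoring logic is a placeholder.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="demo-scoring-service")

class PriceRequest(BaseModel):
    brand: str
    units: int
    unit_price: float

@app.post("/score")
def score(req: PriceRequest) -> dict:
    # Placeholder "model": a revenue estimate instead of a trained predictor.
    return {"brand": req.brand, "estimated_revenue": req.units * req.unit_price}

# Run locally with: uvicorn app:app --reload  (assuming this file is saved as app.py)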

Posted 1 week ago

Apply

8.0 years

0 Lacs

Delhi

On-site

About us Bain & Company is a global consultancy that helps the world’s most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network) with its nodes across various geographies. BCN is an integral and largest unit of (ECD) Expert Client Delivery. ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence or Bain intellectual property. The BCN comprises of Consulting Services, Knowledge Services and Shared Services. Who you will work with The Consumer Products Center of Expertise collaborates with Bain’s global Consumer Products Practice leadership, client-facing Bain leadership and teams, and with end clients on development and delivery of Bain’s proprietary CP products and solutions. These solutions aim to answer strategic questions of Bain’s CP clients relating to brand strategy (consumer needs, assortment, pricing, distribution), revenue growth management (pricing strategy, promotions, profit pools, trade terms), negotiation strategy with key retailers, optimization of COGS etc. You will work as part of the team in CP CoE comprising of a mix of Director, Managers, Projects Leads, Associates and Analysts working to implement cloud-based end-to-end advanced analytics solutions. Delivery models on projects vary from working as part of a CP Center of Expertise, broader global Bain case team within the CP ringfence, or within other industry CoEs such as FS / Retail / TMT / Energy / CME / etc with BCN on need basis What you’ll do Lead AI Initiatives: Architect and implement AI-driven solutions for Consumer Products consulting and client delivery AI & Machine Learning: Develop and deploy Generative AI-based solutions, leveraging cloud computing platforms (AWS/Azure/GCP) Technical Leadership: Guide the team in AI, machine learning, and data engineering, ensuring optimal architecture and deployment Software Development & Data Engineering: Hands-on experience with Python programming, microservices architecture, and cloud technologies (AWS) Database: Proven experience with database systems such as SQL Server, PostgreSQL, good to have knowledge of Snowflake, Oracle Frontend and Backend Technologies: Exposure to HTML, CSS, Javascript, ReactJS and FastAPI, Django Python Libraries: Exposure to data analysis and visualization libraries like Matplotlib, Plotly, Pandas, Numpy etc.
Prompt Engineering & Generative AI: Strong understanding of LLMs, NLP, Chains, Agents and AI model fine-tuning Client Engagement: Work in client-facing roles, understand business needs, and translate them into AI-driven solutions Strategic AI Roadmap: Identify technical limitations, propose alternative approaches, and ensure the scalability of AI applications About you 8+ years of experience in Data Engineering, Software Engineering, and AI/ML Experience in CPG consulting and client delivery Expertise in Python, Generative AI, and cloud computing (AWS/Azure/GCP) Hands-on experience with microservices architecture and chatbot development Adept at Prompt Engineering and AI model optimization Proven track record of leading AI-driven projects with a strategic and solution-oriented mindset Experience in client-facing roles, managing interactions and delivering AI-powered business solutions Prior experience in AI-driven consulting for the Consumer Products industry. Exposure to LLM-based applications Ability to bridge the gap between business and technology, translating client needs into actionable AI solutions What makes us a great place to work We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
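
The pandas and Plotly exposure listed in this posting can be illustrated with a small, self-contained chart; the category data below is made up for demonstration.

# Small sketch of a pandas + Plotly visualization using made-up category data.
import pandas as pd
import plotly.express as px

df = pd.DataFrame({
    "category": ["Snacks", "Beverages", "Dairy", "Household"],
    "revenue": [120, 95, 60, 80],
})

fig = px.bar(df, x="category", y="revenue", title="Revenue by category (illustrative)")
fig.show()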

Posted 1 week ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About us Bain & Company is a global consultancy that helps the world’s most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network) with its nodes across various geographies. BCN is an integral and largest unit of (ECD) Expert Client Delivery. ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence or Bain intellectual property. The BCN comprises of Consulting Services, Knowledge Services and Shared Services. Who you will work with The Consumer Products Center of Expertise collaborates with Bain’s global Consumer Products Practice leadership, client-facing Bain leadership and teams, and with end clients on development and delivery of Bain’s proprietary CP products and solutions. These solutions aim to answer strategic questions of Bain’s CP clients relating to brand strategy (consumer needs, assortment, pricing, distribution), revenue growth management (pricing strategy, promotions, profit pools, trade terms), negotiation strategy with key retailers, optimization of COGS etc. You will work as part of the team in CP CoE comprising of a mix of Director, Managers, Projects Leads, Associates and Analysts working to implement cloud-based end-to-end advanced analytics solutions. Delivery models on projects vary from working as part of a CP Center of Expertise, broader global Bain case team within the CP ringfence, or within other industry CoEs such as FS / Retail / TMT / Energy / CME / etc with BCN on need basis What you’ll do Lead AI Initiatives: Architect and implement AI-driven solutions for Consumer Products consulting and client delivery AI & Machine Learning: Develop and deploy Generative AI-based solutions, leveraging cloud computing platforms (AWS/Azure/GCP) Technical Leadership: Guide the team in AI, machine learning, and data engineering, ensuring optimal architecture and deployment Software Development & Data Engineering: Hands-on experience with Python programming, microservices architecture, and cloud technologies (AWS) Database: Proven experience with database systems such as SQL Server, PostgreSQL, good to have knowledge of Snowflake, Oracle Frontend and Backend Technologies: Exposure to HTML, CSS, Javascript, ReactJS and FastAPI, Django Python Libraries: Exposure to data analysis and visualization libraries like Matplotlib, Plotly, Pandas, Numpy etc.
Prompt Engineering & Generative AI: Strong understanding of LLMs, NLP, Chains, Agents and AI model fine-tuning Client Engagement: Work in client-facing roles, understand business needs, and translate them into AI-driven solutions Strategic AI Roadmap: Identify technical limitations, propose alternative approaches, and ensure the scalability of AI applications About you 8+ years of experience in Data Engineering, Software Engineering, and AI/ML Experience in CPG consulting and client delivery Expertise in Python, Generative AI, and cloud computing (AWS/Azure/GCP) Hands-on experience with microservices architecture and chatbot development Adept at Prompt Engineering and AI model optimization Proven track record of leading AI-driven projects with a strategic and solution-oriented mindset Experience in client-facing roles, managing interactions and delivering AI-powered business solutions Prior experience in AI-driven consulting for the Consumer Products industry. Exposure to LLM-based applications Ability to bridge the gap between business and technology, translating client needs into actionable AI solutions What makes us a great place to work We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
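
The "Chains" item in this posting refers to chaining prompts so that one model call feeds the next. The toy sketch below shows the pattern with a hypothetical call_llm placeholder rather than a real LLM API or framework.

# Toy illustration of prompt chaining; call_llm is a hypothetical stand-in for a model call.
def call_llm(prompt: str) -> str:
    # Placeholder: a real implementation would call an LLM API here.
    return f"[model response to: {prompt[:60]}...]"

def summarize(document: str) -> str:
    return call_llm(f"Summarize the following document in three bullet points:\n{document}")

def draft_recommendations(summary: str) -> str:
    return call_llm(f"Given this summary, propose three next steps for the client:\n{summary}")

doc = "Quarterly sales fell 4% in the north region while promotions ran over budget..."
summary = summarize(doc)            # first link in the chain
print(draft_recommendations(summary))  # second link, built on the first output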

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Project Role: AI/ML Engineer Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline and production-ready quality. Be able to apply GenAI models as part of the solution. Could also include but not limited to deep learning, neural networks, chatbots, image processing. Must have skills: Machine Learning Good to have skills: NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification: 15 years of full-time education Summary: These roles have many overlapping skills with GenAI Engineers and architects. The description may scale up or down based on expected seniority. Roles & Responsibilities: -Implement generative AI models and identify insights that can be used to drive business decisions. Work closely with multi-functional teams to understand business problems, develop hypotheses, and test those hypotheses with data, collaborating with cross-functional teams to define AI project requirements and objectives, ensuring alignment with overall business goals. -Conduct research to stay up to date with the latest advancements in generative AI, machine learning, and deep learning techniques and identify opportunities to integrate them into our products and services. -Optimize existing generative AI models for improved performance, scalability, and efficiency. -Ensure data quality and accuracy -Lead the design and development of prompt engineering strategies and techniques to optimize the performance and output of our GenAI models. -Implement cutting-edge NLP techniques and prompt engineering methodologies to enhance the capabilities and efficiency of our GenAI models. -Determine the most effective prompt generation processes and approaches to drive innovation and excellence in the field of AI technology, collaborating with AI researchers and developers -Experience working with cloud-based platforms (example: AWS, Azure or related) -Strong problem-solving and analytical skills -Proficiency in handling various data formats and sources through Omni Channel for Speech and voice applications, part of conversational AI -Prior statistical modelling experience -Demonstrable experience with deep learning algorithms and neural networks -Develop clear and concise documentation, including technical specifications, user guides, and presentations, to communicate complex AI concepts to both technical and non-technical stakeholders. -Contribute to the establishment of best practices and standards for generative AI development within the organization. Professional & Technical Skills: -Must have solid experience developing and implementing generative AI models, with a strong understanding of deep learning techniques such as GPT, VAE, and GANs. -Must be proficient in Python and have experience with machine learning libraries and frameworks such as TensorFlow, PyTorch, or Keras. -Must have strong knowledge of data structures, algorithms, and software engineering principles. -Must be familiar with cloud-based platforms and services, such as AWS, GCP, or Azure. -Need to have experience with natural language processing (NLP) techniques and tools, such as SpaCy, NLTK, or Hugging Face. -Must be familiar with data visualization tools and libraries, such as Matplotlib, Seaborn, or Plotly. -Need to have knowledge of software development methodologies, such as Agile or Scrum.
-Possess excellent problem-solving skills, with the ability to think critically and creatively to develop innovative AI solutions. Additional Information: -Must have a degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field. A Ph.D. is highly desirable. -Strong communication skills, with the ability to effectively convey complex technical concepts to a diverse audience. -You possess a proactive mindset, with the ability to work independently and collaboratively in a fast-paced, dynamic environment.
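
The Hugging Face requirement in this posting can be illustrated with the transformers pipeline API; this minimal sketch assumes the default sentiment-analysis model can be downloaded, and the example sentences are made up.

# Hedged sketch of an NLP inference step with the Hugging Face transformers pipeline API.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default English model
results = classifier([
    "The new onboarding flow is much faster than before.",
    "The chatbot keeps misunderstanding my question.",
])
for r in results:
    print(r["label"], round(r["score"], 3))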

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies