
817 Matplotlib Jobs - Page 11

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

12 - 16 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office

Naukri logo

Responsibilities:
- Research and Problem-Solving: Identify and frame business problems, conduct exploratory data analysis, and propose innovative data science solutions tailored to business needs.
- Leadership & Communication: Serve as a technical referent for the research team, driving high-impact, high-visibility initiatives. Effectively communicate complex scientific concepts to senior stakeholders, ensuring insights are actionable for both technical and non-technical audiences. Mentor and develop scientists within the team, fostering growth and technical excellence.
- Algorithm Development: Design, optimize, and implement advanced machine learning algorithms, including neural networks, ensemble models (XGBoost, random forests), and clustering techniques.
- End-to-End Project Ownership: Lead the development, deployment, and monitoring of machine learning models and data pipelines for large-scale applications.
- Model Optimization and Scalability: Optimize algorithms for performance and scalability, ensuring robust, well-calibrated models suitable for real-time environments.
- A/B Testing and Validation: Design and execute experiments, including A/B tests, to validate model effectiveness and business impact.
- Big Data Handling: Leverage tools like BigQuery, advanced SQL, and cloud platforms (e.g., GCP) to process and analyze large datasets.
- Collaboration and Mentorship: Work closely with engineering, product, and campaign management teams, while mentoring junior data scientists in best practices and advanced techniques.
- Data Visualization: Create impactful visualizations using tools like Matplotlib, Seaborn, Looker, and Grafana to communicate insights effectively to stakeholders.

Required Experience/Skills:
- 5–8 years of hands-on experience in data science or machine learning roles.
- 2+ years leading data science projects in AdTech.
- Strong hands-on skills in advanced statistics, machine learning, and deep learning.
- Demonstrated ability to implement and optimize neural networks and other advanced ML models.
- Proficiency in Python for developing machine learning models, with a strong grasp of TensorFlow or PyTorch.
- Expertise handling large datasets using advanced SQL and big data tools like BigQuery.
- In-depth knowledge of MLOps pipelines, from data preprocessing to deployment and monitoring.
- Strong background in A/B testing, statistical analysis, and experimental design.
- Proven capability in clustering, segmentation, and unsupervised learning methods.
- Strong problem-solving and analytical skills with a focus on delivering business value.

Education: A Master's in Data Science, Computer Science, Mathematics, Statistics, or a related field is preferred. A Bachelor's degree with exceptional experience will also be considered.
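As an illustration of the A/B-testing skills this posting asks for, here is a minimal sketch of a two-proportion z-test comparing conversion rates between a control and a variant. The function name and the counts are hypothetical, not taken from any posting; it uses only the Python standard library.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 120/2400 conversions in control, 156/2400 in variant
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.3f}, p = {p:.4f}")
```

In practice a library such as SciPy or statsmodels would be used, but the arithmetic above is the core of what such a test computes.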

Posted 1 week ago

Apply

3.0 - 6.0 years

10 - 15 Lacs

Gurugram, Bengaluru

Work from Office

Naukri logo

- 3+ years of experience in data science roles, working with tabular data in large-scale projects.
- Experience in feature engineering and with methods such as XGBoost, LightGBM, factorization machines, and similar algorithms.
- Experience in the AdTech or FinTech industries is a plus. Familiarity with clickstream data, predictive modeling for user engagement, or bidding optimization is highly advantageous.
- MS or PhD in mathematics, computer science, physics, statistics, electrical engineering, or a related field.
- Proficiency in Python (3.9+), with experience in scientific computing and machine learning tools (e.g., NumPy, Pandas, SciPy, scikit-learn, Matplotlib). Familiarity with deep learning frameworks (such as TensorFlow or PyTorch) is a plus.
- Strong expertise in applied statistical methods, A/B testing frameworks, advanced experiment design, and interpreting complex experimental results.
- Experience querying and processing data using SQL and working with distributed data storage solutions (e.g., AWS Redshift, Snowflake, BigQuery, Athena, Presto, MinIO).
- Experience in budget allocation optimization, lookalike modeling, LTV prediction, or churn analysis is a plus.
- Ability to manage multiple projects, prioritize tasks effectively, and maintain a structured approach to complex problem-solving.
- Excellent communication and collaboration skills to work effectively with both technical and business teams.
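To make the "feature engineering on clickstream data" requirement concrete, here is a hedged, standard-library-only sketch: aggregating raw impression/click events into per-user features (CTR, session span) of the kind typically fed to gradient-boosted models such as XGBoost or LightGBM. The event rows, user IDs, and function names are all invented for illustration.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical raw clickstream rows: (user_id, event, ISO timestamp)
events = [
    ("u1", "impression", "2025-06-01T10:00:00"),
    ("u1", "click",      "2025-06-01T10:00:05"),
    ("u1", "impression", "2025-06-01T10:02:00"),
    ("u2", "impression", "2025-06-01T11:00:00"),
    ("u2", "impression", "2025-06-01T11:05:00"),
    ("u2", "click",      "2025-06-01T11:05:30"),
    ("u2", "click",      "2025-06-01T11:06:00"),
]

def user_features(rows):
    """Aggregate per-user CTR and activity span from raw events."""
    agg = defaultdict(lambda: {"impressions": 0, "clicks": 0,
                               "first": None, "last": None})
    for user, event, ts in rows:
        t = datetime.fromisoformat(ts)
        f = agg[user]
        f["impressions" if event == "impression" else "clicks"] += 1
        f["first"] = t if f["first"] is None else min(f["first"], t)
        f["last"] = t if f["last"] is None else max(f["last"], t)
    return {
        u: {
            "ctr": f["clicks"] / max(f["impressions"], 1),
            "span_seconds": (f["last"] - f["first"]).total_seconds(),
        }
        for u, f in agg.items()
    }

print(user_features(events))
```

At production scale the same aggregation would be expressed in SQL or a dataframe library, but the feature definitions carry over unchanged.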

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Linkedin logo

Job Title: Data Analysis Intern
Location: Remote
Job Type: Internship (Full-Time)
Duration: 1–3 Months
Stipend: ₹25,000/month
Department: Data & Analytics

Job Summary: We are seeking a detail-oriented and analytical Data Analysis Intern to join our remote data team. This internship is ideal for individuals looking to apply their skills in statistics, data handling, and business intelligence to real-world problems. You will gain hands-on experience with data tools and contribute to meaningful data-driven decision-making.

Key Responsibilities:
- Collect, clean, and preprocess data from various sources
- Perform exploratory data analysis (EDA) and identify trends, patterns, and insights
- Create visualizations and dashboards to present findings using tools like Excel, Power BI, or Tableau
- Assist in building reports and communicating insights to different teams
- Document analytical processes and ensure data accuracy and consistency
- Collaborate with cross-functional teams to support ongoing data initiatives

Qualifications:
- Bachelor's degree (or final-year student) in Data Science, Statistics, Computer Science, Economics, or a related field
- Strong skills in Excel, SQL, and Python or R
- Understanding of basic statistical concepts and data analysis techniques
- Familiarity with data visualization tools such as Power BI, Tableau, or Matplotlib
- Good problem-solving skills and attention to detail
- Ability to work independently in a remote environment

Preferred Skills (Nice to Have):
- Experience working with large datasets or real-world business data
- Knowledge of A/B testing, correlation analysis, or regression techniques
- Exposure to data cleaning and automation tools
- Familiarity with Jupyter Notebooks, Google Sheets, or cloud data tools

What We Offer:
- Monthly stipend of ₹25,000
- 100% remote internship
- Exposure to real-world business and product data
- Mentorship from experienced data analysts and domain experts
- Certificate of Completion
- Opportunity for full-time placement based on performance
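For a sense of the exploratory data analysis (EDA) this internship centres on, here is a small standard-library sketch: computing centre, spread, and a simple IQR-based outlier flag over a toy set of values. The data and function name are hypothetical; real work would usually use pandas.

```python
import statistics

# Hypothetical daily order values, with one obvious anomaly
orders = [120, 135, 128, 150, 2400, 142, 138, 131, 145, 127]

def eda_summary(values):
    """Basic EDA: centre, spread, and a 1.5*IQR outlier rule."""
    q1, q2, q3 = statistics.quantiles(values, n=4)  # quartiles
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return {
        "mean": statistics.fmean(values),
        "median": q2,
        "stdev": statistics.stdev(values),
        "outliers": [v for v in values if v < lo or v > hi],
    }

summary = eda_summary(orders)
print(summary)
```

Note how the single spike drags the mean far above the median; spotting and explaining that kind of discrepancy is exactly the "identify trends, patterns, and insights" part of the role.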

Posted 1 week ago

Apply

5.0 - 6.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Linkedin logo

Experience: 5–6 years

Key Responsibilities:
- Process, analyze, and interpret time-series data from MEMS sensors (e.g., accelerometers, gyroscopes, pressure sensors).
- Develop and apply statistical methods to identify trends, anomalies, and key performance metrics.
- Compute and optimize KPIs related to sensor performance, reliability, and drift analysis.
- Utilize MATLAB toolboxes (e.g., Data Cleaner, Ground Truth Labeler) or Python libraries for data validation, annotation, and anomaly detection.
- Clean, preprocess, and visualize large datasets to uncover actionable insights.
- Collaborate with hardware engineers, software developers, and product owners to support end-to-end data workflows.
- Convert and format data into standardized schemas for use in data pipelines and simulations.
- Generate automated reports and build dashboards using Power BI or Tableau.
- Document methodologies, processes, and findings in clear and concise technical reports.

Required Qualifications:
- Proficiency in Python or MATLAB for data analysis, visualization, and reporting.
- Strong foundation in time-series analysis, signal processing, and statistical modeling (e.g., autocorrelation, moving averages, seasonal decomposition).
- Experience working with MEMS sensors and sensor data acquisition systems.
- Hands-on experience with pandas, NumPy, SciPy, scikit-learn, and Matplotlib.
- Ability to develop automated KPI reports and interactive dashboards (Power BI or Tableau).

Preferred Qualifications:
- Prior experience with data from smartphones, hearables, or wearable devices.
- Advanced knowledge of MEMS sensor data wrangling techniques.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Exposure to real-time data streaming and processing frameworks/toolboxes.
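The anomaly-detection work on MEMS time series described above can be sketched very simply as a rolling z-score: flag a reading when it deviates from the trailing window by more than a threshold number of standard deviations. The sensor readings and the threshold below are invented for illustration; real pipelines would handle drift, sampling rate, and multi-axis data.

```python
import statistics

def rolling_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose reading deviates from the trailing-window
    mean by more than `threshold` standard deviations."""
    flagged = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mu = statistics.fmean(past)
        sigma = statistics.stdev(past)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Hypothetical accelerometer magnitudes with one spike at index 8
readings = [1.0, 1.02, 0.98, 1.01, 0.99, 1.0, 1.03, 0.97, 5.0, 1.01]
print(rolling_anomalies(readings))
```

A moving-average or seasonal-decomposition approach (both named in the qualifications) generalizes the same idea to trended or periodic signals.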

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Position: Azure Data Engineer
Location: Hyderabad
Mandatory Skills: Azure Databricks, PySpark
Experience: 5 to 9 years
Notice Period: 0 to 30 days / immediate joiner / serving notice period
Interview Date: 13 June 2025
Interview Mode: Virtual drive

Must-have experience:
- Strong design and data solutioning skills.
- Hands-on PySpark experience with complex transformations and large datasets.
- Good command of and hands-on experience in Python, including:
  - Object-oriented and functional programming
  - NumPy, Pandas, Matplotlib, requests, pytest
  - Jupyter, PyCharm, and IDLE
  - Conda and virtual environments
- Working experience with Hive, HBase, or similar.

Azure skills:
- Working experience in Azure Data Lake, Azure Data Factory, Azure Databricks, and Azure SQL Databases.
- Azure DevOps.
- Azure AD integration, service principals, pass-through login, etc.
- Networking: VNet, private links, service connections, etc.
- Integrations: Event Grid, Service Bus, etc.

Database skills:
- Experience with at least one of Oracle, Postgres, or SQL Server.
- Oracle PL/SQL or T-SQL experience.
- Data modelling.

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Linkedin logo

Position: Data Analyst Intern (Full-Time)
Company: Lead India
Location: Remote
Stipend: ₹25,000/month
Duration: 1–3 months (Full-Time Internship)

About Lead India: Lead India is a forward-thinking technology company that helps businesses make smarter decisions through data. We provide meaningful internship opportunities for emerging professionals to gain real-world experience in data analysis, reporting, and decision-making.

Role Overview: We are seeking a Data Analyst Intern to support our data and product teams in gathering, analyzing, and visualizing business data. This internship is ideal for individuals who enjoy working with numbers, identifying trends, and turning data into actionable insights.

Key Responsibilities:
- Analyze large datasets to uncover patterns, trends, and insights
- Create dashboards and reports using tools like Excel, Power BI, or Tableau
- Write and optimize SQL queries for data extraction and analysis
- Assist in data cleaning, preprocessing, and validation
- Collaborate with cross-functional teams to support data-driven decisions
- Document findings and present insights to stakeholders

Skills We're Looking For:
- Strong analytical and problem-solving skills
- Basic knowledge of SQL and data visualization tools (Power BI, Tableau, or Excel)
- Familiarity with Python for data analysis (pandas, Matplotlib) is a plus
- Good communication and presentation skills
- Detail-oriented with a willingness to learn and grow

What You'll Gain:
- ₹25,000/month stipend
- Real-world experience in data analysis and reporting
- Mentorship from experienced analysts and developers
- Remote-first, collaborative work environment
- Potential for a Pre-Placement Offer (PPO) based on performance
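The "write and optimize SQL queries for data extraction" responsibility can be practised entirely locally with Python's built-in sqlite3 module. The table, rows, and query below are a toy stand-in for a real analytics warehouse.

```python
import sqlite3

# In-memory toy table standing in for an analytics warehouse
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 100.0), ("North", 250.0), ("South", 80.0), ("South", 120.0)],
)

# A typical extraction query: revenue per region, highest first
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM sales GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)
```

The same GROUP BY / aggregate / ORDER BY pattern transfers directly to Power BI's or Tableau's underlying queries and to production databases.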

Posted 1 week ago

Apply


0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Job Title: AI/ML Intern
Location: Gurgaon, India
Employment Type: Internship (Paid)
Stipend: As per industry standards

About Aaizel Tech Labs
Aaizel Tech Labs is a pioneering tech startup at the intersection of cybersecurity, AI, geospatial solutions, and more. We are passionate about leveraging technology to develop high-performance products and cutting-edge solutions. As a growing startup, we seek dynamic individuals eager to work on transformative projects in AI and Machine Learning.

Role Overview
We are looking for a motivated AI/ML Intern to join our data science team. This internship offers hands-on experience with model development, data engineering, and deployment in real-world projects. You will collaborate with experienced professionals and contribute to initiatives spanning predictive analytics, computer vision, and more, helping to shape the future of technology at Aaizel Tech Labs.

Key Responsibilities
1. Model Development & Optimization
• ML Model Implementation: Assist in designing, implementing, and deploying machine learning models for applications like predictive analytics and anomaly detection.
• Deep Learning Exposure: Gain experience with deep learning frameworks by working with CNNs, RNNs, and exploring generative models (GANs) on guided projects.
• Experimentation: Help run experiments and tune models using basic hyperparameter optimization techniques (grid search, etc.).
2. Data Engineering & Preprocessing
• Data Preparation: Support the collection, cleaning, and preprocessing of datasets using libraries like Pandas and NumPy.
• ETL Assistance: Assist in developing simple ETL pipelines to process data from diverse sources such as IoT sensors or satellite imagery.
• Integration: Learn to integrate data from APIs and databases to build comprehensive datasets for analysis.
3. Research & Algorithm Development
• Innovation Exposure: Research state-of-the-art machine learning techniques (e.g., transfer learning, Transformer models) and assist in applying these to ongoing projects.
• Algorithm Exploration: Participate in team discussions to brainstorm new approaches for solving real-world problems in cybersecurity, climate monitoring, or geospatial data analysis.
4. Deployment & MLOps
• Deployment Support: Gain hands-on experience deploying models using container technologies like Docker and basic CI/CD pipelines.
• Cloud Platforms: Assist in experiments with cloud platforms (AWS, Azure, or GCP) for scalable model-serving solutions.
• Lifecycle Management: Learn best practices for model versioning, monitoring, and maintenance.
5. Performance Evaluation & Tuning
• Model Metrics: Help evaluate model performance using metrics such as F1 score, AUC-ROC, and other domain-relevant measures.
• Tuning Assistance: Support the process of model tuning through guided experiments and parameter adjustments.
6. Collaboration & Code Quality
• Team Integration: Collaborate with data engineers, cybersecurity experts, and geospatial analysts to integrate AI solutions into end-to-end products.
• Coding Standards: Contribute to maintaining high-quality codebases by following best practices and using version control (Git).
• Documentation: Assist in documenting your work, including model specifications, experiments, and deployment processes.
7. Monitoring & Maintenance
• Dashboard Support: Participate in the creation of monitoring dashboards (using tools like Grafana or Prometheus) to track model performance.
• Feedback Loops: Help develop feedback mechanisms to retrain models based on real-time data and evolving application needs.

Skills & Qualifications
Required Qualifications:
• Currently pursuing or recently completed a Bachelor's degree in Computer Science, Data Science, Machine Learning, or a related field.
• Proficiency in Python and familiarity with libraries such as Pandas, NumPy, and scikit-learn.
• Basic understanding of machine learning algorithms and experience (academic projects or internships) with model development.
• Exposure to one or more deep learning frameworks (e.g., TensorFlow, PyTorch) is a plus.
• Ability to work collaboratively in a team-oriented environment.
• Strong analytical and problem-solving skills, with attention to detail.
• Good written and verbal communication skills.
Preferred Qualifications:
• Familiarity with data visualization tools (e.g., Matplotlib, Seaborn) and basic dashboarding.
• Some experience with SQL and NoSQL databases.
• Interest in cloud platforms (AWS, Azure, or Google Cloud) and containerization (Docker).
• Knowledge of version control systems (Git) and basic CI/CD concepts.
• Prior internship or project experience in AI/ML is advantageous.

Learning Opportunities
• Practical Projects: Work on real-world AI/ML projects that contribute directly to our product development.
• Mentorship: Benefit from one-on-one guidance from experienced data scientists and machine learning engineers.
• Skill Development: Gain exposure to industry-standard tools, frameworks, and best practices in AI and ML.
• Cross-Disciplinary Exposure: Collaborate with experts in cybersecurity, geospatial analysis, and data engineering.
• Career Growth: Develop your professional network and acquire skills that could lead to a full-time opportunity.

Application Process
Please submit your resume and a cover letter outlining your relevant experience and how you can contribute to Aaizel Tech Labs' success. Send your application to hr@aaizeltech.com, bhavik@aaizeltech.com, or anju@aaizeltech.com. Join Aaizel Tech Labs and be part of a team that's shaping the future of Big Data & AI-driven applications!
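The F1 score named under "Performance Evaluation & Tuning" is simple enough to compute from scratch, which is a useful exercise for an intern. The labels below are a made-up toy example; in practice one would call scikit-learn's metrics.

```python
def f1_score(y_true, y_pred):
    """Binary F1: harmonic mean of precision and recall."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Toy labels: tp=3, fp=1, fn=1, so precision = recall = 0.75
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
print(f1_score(y_true, y_pred))  # 0.75
```

AUC-ROC, the other metric mentioned, additionally requires ranking predictions by score rather than thresholding them, which is why the two metrics can disagree on the same model.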

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Linkedin logo

Data Science Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Application Deadline: 10th June 2025
Opportunity: Full-time role based on performance + Internship Certificate

About Unified Mentor: Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers.

Responsibilities:
- Collect, preprocess, and analyze large datasets
- Develop predictive models and machine learning algorithms
- Perform exploratory data analysis (EDA) to extract insights
- Create data visualizations and dashboards for effective communication
- Collaborate with cross-functional teams to deliver data-driven solutions

Requirements:
- Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field
- Proficiency in Python or R for data analysis and modeling
- Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred)
- Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib
- Strong analytical and problem-solving skills
- Excellent communication and teamwork abilities

Stipend & Benefits:
- Stipend: ₹7,500 - ₹15,000 (performance-based, paid)
- Hands-on experience in data science projects
- Certificate of Internship & Letter of Recommendation
- Opportunity to build a strong portfolio of data science models and applications
- Potential for full-time employment based on performance

How to Apply: Submit your resume and a cover letter with the subject line "Data Science Intern Application."

Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.

Posted 1 week ago

Apply

25.0 years

0 Lacs

India

Remote

Linkedin logo

Opportunities:
- Full-time remote or work-from-home
- Day shift, AEST
- Health insurance
- Career growth

About the Role: We are looking for a passionate and motivated individual to join our team as an AI & Data Science Engineer. If you have a strong foundation in Python programming, SQL, and working with APIs, and are eager to learn and grow in the fields of Artificial Intelligence (AI), Natural Language Processing (NLP), and Machine Learning (ML), this role is perfect for you! As part of our team, you will have the opportunity to work on cutting-edge AI technologies, including generative AI models, and develop solutions that solve real-world problems.

Key Responsibilities:
- Learn and contribute to the design and development of AI and machine learning models.
- Work with structured and unstructured data to uncover insights and build predictive models.
- Assist in creating NLP solutions for tasks like text classification, sentiment analysis, and summarisation.
- Gain hands-on experience in deep learning for image processing, speech recognition, and generative AI.
- Write clean and efficient Python code for data analysis and model development.
- Work with SQL databases to retrieve and analyse data.
- Learn how to integrate APIs into AI workflows.
- Explore generative AI technologies (e.g., GPT, DALL·E) and contribute to innovative solutions.
- Collaborate with senior team members to develop impactful AI-powered applications.
- Document your findings and contribute to knowledge-sharing within the team.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
- Strong Python programming skills and familiarity with libraries like Pandas, NumPy, and Matplotlib.
- Basic knowledge of SQL for data manipulation and extraction.
- Understanding of machine learning concepts and algorithms.
- Interest in Natural Language Processing (NLP); familiarity with tools like spaCy, NLTK, or Hugging Face is a plus.
- Willingness to learn and work with deep learning frameworks such as TensorFlow or PyTorch.
- Problem-solving mindset with the ability to work independently and within a team.
- Good communication skills and enthusiasm for learning new technologies.

Technical Requirements:
- Operating system: Windows 11 or macOS 13+
- Storage: 256 GB minimum
- RAM: 16 GB minimum
- CPU: dual-core minimum
- Camera: HD webcam (720p)
- Headset: noise-cancelling (preferred)
- Internet speed: 50 Mbps minimum

Why Join Us?
- Opportunity to work on cutting-edge data science, machine learning, and AI projects.
- A collaborative and inclusive work environment that values continuous learning and innovation.
- Access to resources and mentorship to enhance your skills in NLP, ML, DL, and generative AI.
- Competitive compensation package and growth opportunities.

Note: Include your LinkedIn account in your resume.

About the Company: Freedom Property Investors is the largest and number one property investment company in Australia, with its main offices in the Sydney and Melbourne CBDs. We were awarded the 3rd fastest-growing business in Australia across all industries according to the Australian Financial Review. We are privileged to have 25+ years of combined experience between our two founders, who have served over 10,000 valued members, with over 300 full-time staff spread across Australia and growing. We pride ourselves on being the industry leaders. It is our mission to serve our valued members; earning over 2,054 positive Google reviews and a 4.8-star rating is unheard of in our industry. We need people who share the same values as we do. This opportunity is open to all driven individuals who are committed to helping people and earning life-changing income. Join Australia's largest and number 1 property investment team and contribute to our mission to help Australians achieve their goals of financial freedom every day. Apply now!
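The text-classification and sentiment-analysis tasks this role mentions can be illustrated with a tiny from-scratch naive Bayes classifier over bags of words. The training snippets and labels are invented; real NLP work would use the spaCy, NLTK, or Hugging Face tooling named above.

```python
from collections import Counter
from math import log

# Hypothetical labelled snippets for a toy sentiment task (1 = positive)
train = [
    ("great product love it", 1),
    ("excellent support very happy", 1),
    ("terrible experience waste of money", 0),
    ("bad service very disappointed", 0),
]

def train_counts(data):
    """Count word occurrences per class: the core of naive Bayes."""
    counts = {0: Counter(), 1: Counter()}
    for text, label in data:
        counts[label].update(text.split())
    return counts

def predict(counts, text):
    """Pick the class with the higher summed add-one-smoothed log-likelihood."""
    vocab = len(set(counts[0]) | set(counts[1]))
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        scores[label] = sum(
            log((c[w] + 1) / (total + vocab)) for w in text.split()
        )
    return max(scores, key=scores.get)

counts = train_counts(train)
print(predict(counts, "love the excellent service"))
print(predict(counts, "terrible waste very bad"))
```

Add-one (Laplace) smoothing is what keeps unseen words like "the" from zeroing out a class score.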

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.

Your Role and Responsibilities
As an Associate Data Scientist at IBM, you will work to solve business problems using leading-edge and open-source tools such as Python, R, and TensorFlow, combined with IBM tools and our AI application suites. You will prepare, analyze, and understand data to deliver insight, predict emerging trends, and provide recommendations to stakeholders.

In your role, you may be responsible for:
- Implementing and validating predictive and prescriptive models, and creating and maintaining statistical models with a focus on big data, incorporating machine learning techniques in your projects
- Writing programs to cleanse and integrate data in an efficient and reusable manner
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
- Communicating with internal and external clients to understand and define business needs and appropriate modelling techniques to provide analytical solutions
- Evaluating modelling results and communicating the results to technical and non-technical audiences

Preferred Education: Master's degree

Required Technical and Professional Expertise:
- Proof of Concept (POC) development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions.
- Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations.
- Help showcase the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/POCs.
- Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best-practice guides.

Preferred Technical and Professional Experience:
- Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras, or Hugging Face.
- Understanding of the usage of libraries such as scikit-learn, Pandas, Matplotlib, etc.
- Familiarity with cloud platforms.
- Working knowledge of COBOL and Java is preferred.

Posted 1 week ago

Apply

0 years

0 Lacs

Kolkata metropolitan area, West Bengal, India

On-site

Linkedin logo

Job description

About Reboot Robotics Academy: Reboot Robotics Academy is dedicated to empowering students with future-ready skills in Robotics, AI, IoT, and Coding. We are seeking an AI & IoT Trainer to join our team and inspire the next generation of tech innovators!

Key Responsibilities:
- Conduct hands-on training on AI, Machine Learning, and IoT.
- Teach fundamental to advanced Python concepts, including data structures, OOP, and automation.
- Design chatbots.
- Guide students through real-world AI projects using TensorFlow, OpenCV, and NLP.
- Introduce concepts of Deep Learning, Neural Networks, and AI Ethics.
- Provide training on IoT-based applications using ESP32, Arduino, and Raspberry Pi (preferred).
- Deliver drone programming and robotics workshops (if experienced).
- Assist in curriculum development, lesson planning, and creating study materials.
- Provide mentorship and guidance to students for projects and competitions.
- Stay updated with the latest trends in AI, ML, IoT, and automation technologies.

Required Skills & Qualifications:
- Strong proficiency in Python (OOP, NumPy, Pandas, Matplotlib).
- Hands-on experience with AI/ML frameworks (TensorFlow, Keras, scikit-learn).
- Knowledge of Deep Learning, NLP, and Computer Vision is a plus.
- Familiarity with IoT, Arduino, ESP32, Raspberry Pi, and sensor-based automation is preferred.
- Experience with drone programming is an added advantage.
- Prior experience in teaching, training, or mentoring is preferred.
- Excellent communication and presentation skills.
- Passion for education, technology, and innovation.

Preferred Qualifications:
- Bachelor's/Master's degree in Computer Science, AI, Data Science, IoT, or a related field.
- Experience in STEM, Robotics & IoT education is an advantage.
- Certifications in AI/ML, IoT, or Drone Technology are a plus.

Why Join Us?
- Work with a leading EdTech academy shaping the future of AI, IoT & Robotics.
- Opportunity to mentor young minds in cutting-edge technology.
- Engage in innovation-driven projects and research opportunities.
- Growth opportunities in AI, IoT, and drone automation training.

Job Types: Full-time, Fresher
Pay: ₹8,000.00 - ₹20,000.00 per month
Benefits: Leave encashment, paid sick time
Schedule: Day shift, fixed shift, weekend availability
Supplemental Pay: Yearly bonus
Language: English (Required)
Work Location: In person

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description About CloudxLab CloudxLab is a team of developers, engineers, and educators passionate about building innovative products to make learning fun, engaging, and for life. We are a highly motivated team who build fresh and lasting learning experiences for our users. Powered by our innovation processes, we provide a gamified environment where learning is fun and constructive. From creative design to intuitive apps, we create a seamless learning experience for our users. We upskill engineers in deep tech and make them employable & future-ready. CloudxLab is looking for Machine Learning Engineers who have a good understanding of Machine Learning using Python. The primary responsibilities of a Machine Learning Engineer at CloudxLab are: Review the machine learning and big data projects submitted by the learners. Build the test-case-driven assessments for machine learning, deep learning, Spark, and data analytics. Answer the queries of the learners. Contribute to new machine learning and big data projects as CloudxLab launches them. The candidate must be hands-on with the following: Linux; SQL; data analysis using NumPy, Pandas, and Matplotlib; machine learning with scikit-learn; deep learning with TensorFlow (1 or 2, either will do); Apache Spark. As a part of the job application you will have to complete an online assessment test. The assessment test checks for skills needed for the job and also requires you to submit a blog you have written and published online, preferably on LinkedIn. Link - https://forms.gle/55LbLnufpeK6kqgE8

Posted 1 week ago

Apply

1.0 - 3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Just Engineering is a leading industrial automation training provider delivering PLC/SCADA industrial automation and IT training from its state-of-the-art training centre, centrally located at JM Road, Deccan, Pune, since 2012. We are looking for an IT Trainer for our Data Analytics / Data Science course, covering Python, SQL, Data Analytics, Data Science, Machine Learning, Power BI, basic and advanced Excel, Tableau, R, etc., to help us build our learning and development programs on the latest technology. Job Description: 1. Training and Delivery: Deliver high-quality training on data analytics and data science technologies, covering programming, data visualization, graphical presentation, AI, machine learning, databases, and related advanced technologies. Use interactive teaching methods such as hands-on coding exercises, live projects, case studies, and assignments to engage learners. Develop and implement curricula for training programs, workshops, and bootcamps based on learner levels (beginner to advanced). 2. Design Training Material: Create and update course content, including presentations, lab exercises, and study materials. Ensure the course is up to date with the latest industry standards and technology trends. 3. Mentorship and Guidance: Provide mentorship to students and guide them through projects. Conduct regular assessments to track learner progress. Offer career guidance and job preparation, including building portfolios, interview practice, and coding challenges. 4. Data Analytics and Data Science Expertise: Teach core technologies including Python programming, Power BI, Tableau and R, basic and advanced Excel, and probability and statistics concepts. Teach various concepts from basics to advanced in data science and data analytics. Strong knowledge of libraries such as NumPy, Pandas, scikit-learn, and Matplotlib. 5. Student Support and Feedback: Provide timely feedback on assignments and projects.
Address learner queries and provide troubleshooting assistance during training. 6. Performance and Progress Evaluation: Conduct assessments, quizzes, and evaluations to gauge the effectiveness of training. Adapt teaching methods based on feedback and learning outcomes. Skills and Qualifications: Technical proficiency as applicable for the course: strong knowledge of Data Analytics, basic and advanced Excel, Python, SQL, Power BI, Tableau, R, and statistics and probability concepts. Expertise in databases (SQL/NoSQL) and RESTful API development. Strong knowledge of Data Science, Machine Learning, Deep Learning, and Power BI. Experience: 1 to 3 years of experience as a Data Analyst / Data Science Trainer or in related roles. Prior experience in teaching, mentoring, or training is a plus. Soft Skills: Strong communication skills and ability to explain technical concepts to non-technical individuals. Patience, adaptability, and ability to engage and motivate students. Excellent organizational, communication and time management skills. Educational Requirements: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience). Data Science / Data Analytics / Power BI / Advanced Excel certifications are required. Interested candidates share your resume or call on 7028953079. Job Type: Full-time Pay: ₹10,000.00 - ₹40,000.00 per month Schedule: Day shift Evening shift Morning shift Weekend availability Work Location: In person Application Deadline: 01/07/2025 Expected Start Date: 01/07/2025

Posted 1 week ago

Apply

4.0 years

0 Lacs

India

Remote


Job Title: Data Scientist Location: Remote Job Type: Full-Time | Permanent Experience Required: 4+ Years About the Role: We are looking for a highly motivated and analytical Data Scientist with 4 years of industry experience to join our data team. The ideal candidate will have a strong background in Python , SQL , and experience deploying machine learning models using AWS SageMaker . You will be responsible for solving complex business problems with data-driven solutions, developing models, and helping scale machine learning systems into production environments. Key Responsibilities: Model Development: Design, develop, and validate machine learning models for classification, regression, and clustering tasks. Work with structured and unstructured data to extract actionable insights and drive business outcomes. Deployment & MLOps: Deploy machine learning models using AWS SageMaker , including model training, tuning, hosting, and monitoring. Build reusable pipelines for model deployment, automation, and performance tracking. Data Exploration & Feature Engineering: Perform data wrangling, preprocessing, and feature engineering using Python and SQL . Conduct EDA (exploratory data analysis) to identify patterns and anomalies. Collaboration: Work closely with data engineers, product managers, and business stakeholders to define data problems and deliver scalable solutions. Present model results and insights to both technical and non-technical audiences. Continuous Improvement: Stay updated on the latest advancements in machine learning, AI, and cloud technologies. Suggest and implement best practices for experimentation, model governance, and documentation. Required Skills & Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, or related field. 4+ years of hands-on experience in data science, machine learning, or applied AI roles. Proficiency in Python for data analysis, model development, and scripting. 
Strong SQL skills for querying and manipulating large datasets. Hands-on experience with AWS SageMaker, including model training, deployment, and monitoring. Solid understanding of machine learning algorithms and techniques (supervised/unsupervised). Familiarity with libraries such as Pandas, NumPy, Scikit-learn, Matplotlib, and Seaborn.
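The model-development loop this listing describes (train, validate, report a metric) can be sketched minimally with scikit-learn. The dataset and model choice below are illustrative, not part of the role's actual stack.

```python
# Minimal sketch of a supervised model-development loop:
# split, fit, and score on a held-out set.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Validate on data the model never saw during training.
acc = accuracy_score(y_test, model.predict(X_test))
print(f"hold-out accuracy: {acc:.3f}")
```

In a SageMaker setting the same fit/score logic would run inside a training job, with the metric logged for monitoring.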

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote


Data Science Intern (Paid) Company: Unified Mentor Location: Remote Duration: 3 months Application Deadline: 9th June 2025 Opportunity: Full-time role based on performance + Internship Certificate About Unified Mentor Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers. Responsibilities Collect, preprocess, and analyze large datasets Develop predictive models and machine learning algorithms Perform exploratory data analysis (EDA) to extract insights Create data visualizations and dashboards for effective communication Collaborate with cross-functional teams to deliver data-driven solutions Requirements Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field Proficiency in Python or R for data analysis and modeling Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred) Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib Strong analytical and problem-solving skills Excellent communication and teamwork abilities Stipend & Benefits Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid) Hands-on experience in data science projects Certificate of Internship & Letter of Recommendation Opportunity to build a strong portfolio of data science models and applications Potential for full-time employment based on performance How to Apply Submit your resume and a cover letter with the subject line "Data Science Intern Application." Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.

Posted 1 week ago

Apply

200.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Description You are a strategic thinker passionate about driving solutions in valuation control. You have found the right team. As an Associate in our Valuation Control Group (VCG) team, you will spend each day defining, refining, and delivering set goals for our firm. Your primary responsibility will be to work on the automation and redesign of existing implementations using tools such as Python, Alteryx, Excel VBA, and BI tools like Tableau. The VCG is organized along business lines including Corporate & Investment Bank (Macro Products, Credit, Equities, Securitized Products, IB Risk), CIO, Treasury & Corporate (CTC), Asset Management, Consumer & Community Banking (CCB), and Commercial Banking (CB). You will collaborate closely with senior management, business heads, regulators, internal and external audit, as well as Traders, CFOs, Market Risk, and Middle Office to ensure a complete understanding of business issues and the accurate execution of valuation policy. Job Responsibilities Automate Excel tasks by developing Python scripts with openpyxl, pandas, and xlrd, focusing on data extraction, transformation, and generating reports with charts and pivot tables. Design and deploy interactive web applications using Streamlit, enabling real-time data interaction and integrating advanced analytics. Use Matplotlib and Seaborn to create charts and graphs, adding interactive features for dynamic data exploration tailored to specific business needs. Design intuitive user interfaces with PyQt or Flask, integrating data visualizations and ensuring secure access through authentication mechanisms. Perform data manipulation and exploratory analysis using Pandas and NumPy, and develop data pipelines to maintain data quality and support analytics. Write scripts to connect to external APIs, process data in JSON and XML formats, and ensure reliable data retrieval with robust error handling. 
Collaborate with cross-functional teams to gather requirements, provide technical guidance, and ensure alignment on project goals, fostering open communication. Should have excellent problem-solving skills and the ability to troubleshoot and resolve technical issues. Adhere to the control, governance, and development standards for intelligent solutions. Should have strong communication skills and the ability to work collaboratively with different teams. Required Qualifications, Capabilities, And Skills Bachelor's degree in Computer Science, Engineering, or a related field. Proven experience in Python programming and automation. Experience with Python libraries such as Pandas, NumPy, PyQt, Streamlit, Matplotlib, Seaborn, openpyxl, xlrd, Flask, PyPDF2, pdfplumber, and SQLite. Analytical, quantitative aptitude, and attention to detail. Strong verbal and written communication skills. ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law.
We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we’re setting our businesses, clients, customers and employees up for success. Global Finance & Business Management works to strategically manage capital, drive growth and efficiencies, maintain financial reporting and proactively manage risk. By providing information, analysis and recommendations to improve results and drive decisions, teams ensure the company can navigate all types of market conditions while protecting our fortress balance sheet.
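The Excel-automation duty in this listing (extract, transform, and report with pivot tables using pandas and openpyxl) reduces in its simplest form to building a pivoted summary in pandas and writing it out. The desk and P&L columns below are invented for illustration, not taken from the role.

```python
# Sketch: aggregate raw rows into a desk-by-month pivot, as a
# generated Excel report sheet would present it.
import pandas as pd

raw = pd.DataFrame({
    "desk":  ["Credit", "Credit", "Equities", "Equities"],
    "month": ["Jan", "Feb", "Jan", "Feb"],
    "pnl":   [1.2, -0.4, 2.5, 0.9],
})

# Pivot-table transformation: one row per desk, one column per month.
report = raw.pivot_table(index="desk", columns="month", values="pnl", aggfunc="sum")

# report.to_excel("valuation_report.xlsx")  # writing requires openpyxl installed
print(report)
```

The commented `to_excel` call is where openpyxl would take over in the automated pipeline the listing describes.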

Posted 1 week ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About Us All people need connectivity. The Rakuten Group is reinventing telecom by greatly reducing cost, rewarding big users rather than penalizing them, empowering more people and leading the human-centric AI future. The mission is to connect everybody and enable all to be. Rakuten. Telecom Invented. Job Description Job Title: Senior AI/ML Engineer (Generative AI & MLOps) Minimum 8+ years of experience in AI/ML development. Location: Bangalore Required Skills And Expertise AI/ML Strategy and Leadership: Define the AI/ML strategy and roadmap aligned with the product vision. Identify and prioritize AI/ML use cases, including classical ML, Generative AI, and Agentic AI, relevant to our product offerings. Build and lead a high-performing AI/ML team by mentoring and upskilling existing non-AI/ML team members. Stay updated with the latest advancements in AI/ML, generative AI, Agentic AI and MLOps, and apply them to solve business problems. AI/ML Development: Proficient in Python programming and libraries like NumPy, Pandas, Scikit-learn, and Matplotlib. Strong understanding of machine learning algorithms, deep learning architectures, and generative AI models. Design, develop, and deploy classical machine learning models, including supervised, unsupervised, and reinforcement learning techniques. Hands-on experience in AI/ML frameworks such as Scikit-learn, XGBoost, TensorFlow, PyTorch, Keras, Hugging Face, and OpenAI APIs (e.g., GPT models). Experience with feature engineering, model evaluation, and hyperparameter tuning. Experience with LangChain modules, including Chains, Memory, Tools, and Agents. Build and fine-tune generative AI models (e.g., GPT, DALL-E, Stable Diffusion) for specific use cases. Must have experience in at least one Agentic AI framework. Leverage LLMs (e.g., GPT, Claude, LLaMA) and multi-modal models to build intelligent agents that can interact with users and systems.
Design and develop autonomous AI agents capable of reasoning, planning, and executing tasks in dynamic environments. Implement prompt engineering and fine-tuning of LLMs to optimize agent behaviour for specific tasks. Optimize models for performance, scalability, and cost-efficiency. Job Requirement MLOps And DevOps For AI/ML Establish and maintain an end-to-end MLOps pipeline for model development, deployment, monitoring, and retraining. Automate model training, testing, and deployment workflows using CI/CD pipelines. Implement robust version control for datasets, models, and code. Monitor model performance in production and implement feedback loops for continuous improvement. Proficiency in MLOps tools and platforms such as MLflow, Kubeflow, TFX, and SageMaker. Familiarity with CI/CD tools like Jenkins, GitHub Actions, or GitLab CI for AI/ML workflows. Expertise in deploying models on cloud platforms (AWS, Azure, GCP)
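The agent responsibilities above (routing a task to a tool and executing it) can be illustrated with a framework-free dispatch loop in plain Python. This stands in for what LangChain's Tools and Agents abstractions provide; every name, the routing rule, and the tools themselves are invented for the sketch.

```python
# Minimal tool-dispatching "agent": pick a tool from the task prefix
# ("calc: 2+2") and execute it, falling back to echo otherwise.
def calculator(expr: str) -> str:
    # Deliberately restricted to arithmetic characters before eval.
    allowed = set("0123456789+-*/(). ")
    if not set(expr) <= allowed:
        raise ValueError("unsupported expression")
    return str(eval(expr))

def echo(text: str) -> str:
    return text

TOOLS = {"calc": calculator, "echo": echo}

def run_agent(task: str) -> str:
    """Route a task string to a registered tool and return its result."""
    name, sep, payload = task.partition(":")
    if sep and name.strip() in TOOLS:
        return TOOLS[name.strip()](payload.strip())
    return echo(task)

print(run_agent("calc: (2 + 3) * 4"))  # arithmetic routed to the calculator
```

Real agent frameworks add LLM-driven planning on top of this loop: the model, not a string prefix, decides which tool to call and with what arguments.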

Posted 1 week ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


At Cadence, we hire and develop leaders and innovators who want to make an impact on the world of technology. Performance Modeling Engineer Location – India (Pune) Summary We are looking for modeling engineers to help develop performance models, perform architectural tradeoff analysis, and enable data driven design decisions for our next generation DDR memory controller architectures that can meet today’s complex SoC and workload requirements. Hardware modelling experience (C++/SystemC/TLM/Python) and computer architecture foundation is desired. Responsibilities Develop cycle-level performance models in SystemC or C++ Correlate performance models to match RTL configurations and traffic conditions Work with Memory Architects to understand feature requirements, architectural specifications and implement in the model Analyze architectural trade-offs (throughput, hardware cost) across different scenarios and architectural choices Develop synthetic memory traffic/traces that are representative of real-world applications (CPU, GPU, DSP, NoC, etc) Develop scripts to automate generation of various performance metrics and statistics post RTL simulation that helps identify performance bottlenecks Required Skills BE/B.Tech ME/M.Tech in ECE, E&TC, CS or similar 8+ years of experience in hardware modeling, functional or performance Strong coding skills in C++, SystemC and Transaction Level Modeling (TLM) Basic understanding of performance principles, Queuing Theory, throughput/latency tradeoffs Additional Skills Understand RTL-Verilog, SV, UVM and experience analyzing waveforms Understand memory protocols and timing – DDR4, DDR5, LP4, LP5 Experience using performance simulators – Memory Controller, NoC, CPU models Coding in Python and familiarity with packages like Pandas, Matplotlib Experience working with performance benchmarks – SPEC, STREAM, etc Concepts related to Quality of Service (QoS) and how memory controller can tradeoff performance and latencies We’re doing work 
that matters. Help us solve what others can’t.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


A career in our Advisory Acceleration Centre is the natural extension of PwC’s leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process quality and delivery capability in support of client engagements. Years of Experience: 5 - 10 Years Essential Skills Strong expertise in Python Proficiency in ML frameworks such as TensorFlow, PyTorch, scikit-learn, Keras Proficiency in building various statistical and ML models Expertise in optimizing model performance through hyperparameter tuning and other techniques Expertise in visualization tools such as matplotlib, seaborn, etc. Job Duties Collaborate with data scientists to select and implement the most suitable machine learning algorithms for the problem at hand, ensuring alignment with project goals. Write efficient and optimized code to implement machine learning models using industry-standard frameworks, ensuring high performance and maintainability. Collect, clean, and preprocess data to be used for training models. Train and tune ML models; build mechanisms to measure model performance and to automate training and validation activities. Document model design, development processes, and performance metrics. Stay up to date with the latest developments in machine learning and related fields.
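A hedged sketch of the hyperparameter-tuning expertise this listing asks for, using scikit-learn's GridSearchCV on a bundled toy dataset; the grid values and model choice are arbitrary, not part of the role.

```python
# Hyperparameter tuning: scale features, then search over
# regularization strengths with 3-fold cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)

pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
grid = GridSearchCV(
    pipe,
    param_grid={"logisticregression__C": [0.01, 0.1, 1.0]},
    cv=3,
    scoring="accuracy",
)
grid.fit(X, y)
print("best params:", grid.best_params_, "cv accuracy:", round(grid.best_score_, 3))
```

Wrapping the scaler in the pipeline keeps the search honest: each CV fold fits the scaler only on its own training split.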

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote


Job Title: Data Analyst Trainee Location: Remote Job Type: Internship (Full-Time) Duration: 1–3 Months Stipend: ₹25,000/month Department: Data & Analytics Job Summary: We are seeking a motivated and analytical Data Analyst Trainee to join our remote analytics team. This internship is perfect for individuals eager to apply their data skills in real-world projects, generate insights, and support business decision-making through analysis, reporting, and visualization. Key Responsibilities: Collect, clean, and analyze large datasets from various sources Perform exploratory data analysis (EDA) and generate actionable insights Build interactive dashboards and reports using Excel, Power BI, or Tableau Write and optimize SQL queries for data extraction and manipulation Collaborate with cross-functional teams to understand data needs Document analytical methodologies, insights, and recommendations Qualifications: Bachelor’s degree (or final-year student) in Data Science, Statistics, Computer Science, Mathematics, or a related field Proficiency in Excel and SQL Working knowledge of Python (Pandas, NumPy, Matplotlib) or R Understanding of basic statistics and analytical methods Strong attention to detail and problem-solving ability Ability to work independently and communicate effectively in a remote setting Preferred Skills (Nice to Have): Experience with BI tools like Power BI, Tableau, or Google Data Studio Familiarity with cloud data platforms (e.g., BigQuery, AWS Redshift) Knowledge of data storytelling and KPI measurement Previous academic or personal projects in analytics What We Offer: Monthly stipend of ₹25,000 Fully remote internship Mentorship from experienced data analysts and domain experts Hands-on experience with real business data and live projects Certificate of Completion Opportunity for a full-time role based on performance

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote


Job Title: Data Analysis Intern Location: Remote Job Type: Internship (Full-Time) Duration: 1–3 Months Stipend: ₹25,000/month Department: Data & Analytics Job Summary: We are seeking a detail-oriented and analytical Data Analysis Intern to join our remote data team. This internship is ideal for individuals looking to apply their skills in statistics, data handling, and business intelligence to real-world problems. You will gain hands-on experience with data tools and contribute to meaningful data-driven decision-making. Key Responsibilities: Collect, clean, and preprocess data from various sources Perform exploratory data analysis (EDA) and identify trends, patterns, and insights Create visualizations and dashboards to present findings using tools like Excel, Power BI, or Tableau Assist in building reports and communicating insights to different teams Document analytical processes and ensure data accuracy and consistency Collaborate with cross-functional teams to support ongoing data initiatives Qualifications: Bachelor’s degree (or final year student) in Data Science, Statistics, Computer Science, Economics, or related field Strong skills in Excel, SQL, and Python or R Understanding of basic statistical concepts and data analysis techniques Familiarity with data visualization tools such as Power BI, Tableau, or Matplotlib Good problem-solving skills and attention to detail Ability to work independently in a remote environment Preferred Skills (Nice to Have): Experience working with large datasets or real-world business data Knowledge of A/B testing, correlation analysis, or regression techniques Exposure to data cleaning and automation tools Familiarity with Jupyter Notebooks, Google Sheets, or cloud data tools What We Offer: Monthly stipend of ₹25,000 100% remote internship Exposure to real-world business and product data Mentorship from experienced data analysts and domain experts Certificate of Completion Opportunity for full-time placement based on performance 
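The EDA responsibilities above (collect, clean, preprocess, and surface trends) can be sketched in a few lines of pandas; the dataset and column names are made up for illustration.

```python
# Clean a small dataset, then surface a per-group trend with groupby.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "region": ["N", "N", "S", "S", "S", None],
    "sales":  [100, 120, 90, np.nan, 110, 50],
})

# Cleaning: drop rows with a missing group key, then fill
# missing numeric values with the column median.
df = df.dropna(subset=["region"]).copy()
df["sales"] = df["sales"].fillna(df["sales"].median())

# Simple EDA: average sales and observation count per region.
summary = df.groupby("region")["sales"].agg(["mean", "count"])
print(summary)
```

The same pattern scales from toy frames like this to the large business datasets the internship describes.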

Posted 1 week ago

Apply

55.0 years

5 - 5 Lacs

Bengaluru

On-site

At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world’s most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities. Where you can make a difference. Where no two days are the same. Job Description Strong working experience in the Python-based Django and Flask frameworks. Experience in developing microservices-based design and architecture. Strong programming knowledge in JavaScript, HTML5, Python, RESTful APIs, gRPC APIs. Programming experience & object-oriented concepts in Python. Knowledge of Python libraries like NumPy, Pandas, Open3D, OpenCV, Matplotlib. Knowledge of MySQL/Postgres/MSSQL databases. Knowledge of 3D geometry. Knowledge of SSO/OpenID Connect/OAuth authentication protocols. Working experience with version control systems like GitHub/BitBucket/GitLab. Familiarity with continuous integration and continuous deployment (CI/CD) pipelines. Basic knowledge of image processing. Works in the area of Software Engineering, which encompasses the development, maintenance and optimization of software solutions/applications. 1. Applies scientific methods to analyse and solve software engineering problems. 2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development and maintenance. 3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers. 4. The software engineer builds skills and expertise of his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities. 5.
The software engineer collaborates and acts as team player with other software engineers and stakeholders. Job Description - Grade Specific Has more than a year of relevant work experience. Solid understanding of programming concepts, software design and software development principles. Consistently works to direction with minimal supervision, producing accurate and reliable results. Individuals are expected to be able to work on a range of tasks and problems, demonstrating their ability to apply their skills and knowledge. Organises own time to deliver against tasks set by others with a mid term horizon. Works co-operatively with others to achieve team goals and has a direct and positive impact on project performance and make decisions based on their understanding of the situation, not just the rules. Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Bengaluru

On-site

Your Job As a Data Analyst in Molex's Copper Solutions Business Unit software solution group, you will be responsible for extracting actionable insights from large and complex manufacturing datasets, identifying trends, optimizing production processes, improving operational efficiency, minimizing downtime, and enhancing overall product quality. You will collaborate closely with cross-functional teams to ensure the effective use of data in driving continuous improvement and achieving business objectives within the manufacturing environment. Our Team Molex's Copper Solutions Business Unit (CSBU) is a global team that works together to deliver exceptional products to worldwide telecommunication and data center customers. SSG under CSBU is one of the most technically advanced software solution groups within Molex. Our group leverages software expertise to enhance the concept, design, manufacturing, and support of high-speed electrical interconnects. What You Will Do 1. Collect, clean, and transform data from various sources to support analysis and decision-making processes. 2. Conduct thorough data analysis using Python to uncover trends, patterns, and insights. 3. Create & maintain reports based on business needs. 4. Prepare comprehensive reports that detail analytical processes and outcomes. 5. Develop and maintain visualizations/dashboards. 6. Collaborate with cross-functional teams to understand data needs and deliver actionable insights. 7. Perform ad hoc analysis to support business decisions. 8. Write efficient and optimized SQL queries to extract, manipulate, and analyze data from various databases. 9. Identify gaps and inefficiencies in current reporting processes and implement improvements and new solutions. 10. Ensure data quality and integrity across all reports and tools. Who You Are (Basic Qualifications) B.E./B.Tech Degree in Computer Science Engineering, Information Science, Data Science or related discipline.
3-5 years of progressive data analysis experience with Python (pandas, NumPy, matplotlib, OpenPyXL, SciPy, Statsmodels, Seaborn). What Will Put You Ahead • Experience with Power BI, Tableau, or similar tools for creating interactive dashboards and reports tailored for manufacturing operations. • Experience with predictive analytics, e.g. machine learning models (e.g., using Scikit-learn) to predict failures, optimize production, or forecast demand. • Experience with big data tools like Hadoop, Apache Kafka, or cloud platforms (e.g., AWS, Azure) for managing and analyzing large-scale data. • Knowledge of A/B testing & forecasting. • Familiarity with typical manufacturing data (e.g., machine performance metrics, production line data, quality control metrics). At Koch companies, we are entrepreneurs. This means we openly challenge the status quo, find new ways to create value and get rewarded for our individual contributions. Any compensation range provided for a role is an estimate determined by available market data. The actual amount may be higher or lower than the range provided considering each candidate's knowledge, skills, abilities, and geographic location. If you have questions, please speak to your recruiter about the flexibility and detail of our compensation philosophy. Who We Are At Koch, employees are empowered to do what they do best to make life better. Learn how our business philosophy helps employees unleash their potential while creating value for themselves and the company. Additionally, everyone has individual work and personal needs. We seek to enable the best work environment that helps you and the business work together to produce superior results.
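As a sketch of the trend analysis this role describes (Python over machine-performance metrics), a rolling mean over hypothetical temperature readings can flag drift toward a failure mode; the readings, window, and threshold below are all invented.

```python
# Trend detection on manufacturing telemetry: a rising rolling mean
# over hypothetical machine-temperature readings signals drift.
import pandas as pd

temps = pd.Series([70, 71, 70, 72, 75, 79, 84], name="temp_c")
rolling = temps.rolling(window=3).mean()

# Flag drift when the smoothed signal has climbed more than 5 degrees
# between the first and last complete windows (arbitrary threshold).
drifting = bool(rolling.iloc[-1] - rolling.dropna().iloc[0] > 5)
print("rolling mean tail:", rolling.iloc[-1], "drift flag:", drifting)
```

In practice the same computation would run over live production-line data pulled via SQL, feeding a dashboard or alert rather than a print statement.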

Posted 1 week ago

Apply