Jobs
Interviews

4556 NumPy Jobs - Page 29

Set up a job alert
JobPe aggregates listings so they are easy to find and compare, but you apply directly on the original job portal.

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Role Overview: We’re looking for freshers who are passionate about Python and Artificial Intelligence. You should be a fast learner, eager to experiment, and unafraid to fail fast and iterate. You’ll work on AI-driven projects, prototypes, and production systems that challenge the status quo. Key Responsibilities: · Write clean, efficient Python code for AI and data-driven applications · Experiment, prototype, and rapidly test AI models and solutions · Work closely with senior developers and AI engineers to build scalable systems · Research and learn new AI tools, frameworks, and technologies · Debug, improve, and optimize existing code and AI workflows · Participate in brainstorming and solution design sessions · Document your work clearly for knowledge sharing Must-Have Skills: · Strong Python programming skills · Solid understanding of AI/ML fundamentals (e.g. supervised learning, neural networks, NLP, computer vision, etc.) · Basic knowledge of popular AI libraries (e.g. TensorFlow, PyTorch, scikit-learn, OpenCV, etc.) · Strong problem-solving skills and logical thinking · Curiosity and willingness to explore new technologies quickly · Good communication skills Nice-to-Have: · Experience with AI model deployment (Flask/FastAPI, Docker, cloud services) · Understanding of data preprocessing and analysis (NumPy, Pandas, etc.) · Participation in AI hackathons, competitions, or personal projects · Familiarity with Git and version control workflows · Basic knowledge of frontend technologies (optional but helpful) What We Offer: · Opportunity to work on real-world AI projects from scratch · A culture that values learning, experimentation, and speed · Guidance and mentorship from experienced developers and AI experts · Flexible work environment · Clear growth paths and opportunities to level up quickly · Exposure to global clients and cutting-edge projects
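As a rough illustration of the baseline this fresher posting implies (Python plus AI/ML fundamentals with libraries such as scikit-learn), here is a minimal, hypothetical training-and-evaluation sketch on synthetic data; the dataset and model choice are arbitrary and are not taken from the employer.

```python
# Minimal sketch: train and evaluate a simple classifier with scikit-learn.
# Synthetic data only; real projects would start from a concrete dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1_000)
model.fit(X_train, y_train)
print(f"Test accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```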

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Data Analyst (Sales Background) Hybrid (Bangalore, India) Full Time What you will do: ● Performance Analysis & Reporting: Design, develop, and maintain robust dashboards, reports, and analytical models to track key sales metrics, identify trends, and provide deep insights into sales performance across various segments (e.g., merchant acquisition, retention, growth, regional performance). ● Strategic Insights & Recommendations: Proactively identify opportunities for sales process improvements, efficiency gains, and revenue growth by analyzing sales data, market trends, and operational workflows. Present clear, concise, and actionable recommendations to sales leadership. ● Forecasting & Planning Support: Contribute to sales forecasting, capacity planning, and target-setting processes by leveraging historical data, statistical models, and market intelligence to provide accurate projections. ● Data Infrastructure & Tools: Collaborate with data engineering and business intelligence teams to ensure data integrity, accessibility, and the development of scalable data solutions. Utilize SQL, Python, and advanced visualization tools (e.g., Tableau, Power BI) to extract, transform, and present data. ● Sales Optimization: Analyze sales funnel performance, conversion rates, and sales cycle efficiency to pinpoint bottlenecks and recommend solutions that streamline operations and enhance productivity. ● Ad-Hoc Analysis: Conduct deep-dive analyses on specific business questions or challenges, providing timely and accurate data-driven answers to support urgent strategic decisions. Preferred qualifications: ● Experience: ○ 5+ years of experience in Sales Operations, Business Intelligence, Data Analytics, or a similar analytical role, with a strong emphasis on sales performance. ○ Demonstrated experience in the Grocery, Retail, E-commerce, or Q-commerce industry is highly preferred. ○ Proven track record of translating complex data into clear, actionable insights and presenting them to senior leadership. ● Technical Skills: ○ Proficiency in SQL for data extraction and manipulation is required. ○ Proficiency with data visualization tools (e.g., Tableau, Power BI, Looker) for dashboard creation and reporting. ○ Analytical programming skills in Python (Pandas, NumPy) or R for advanced data analysis and modeling. ○ Experience with CRM systems (e.g., Salesforce) and understanding of sales data structures. ● Analytical & Problem-Solving Skills: ○ Exceptional analytical and quantitative skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. ○ Strong problem-solving abilities, capable of tackling complex business challenges with a data-driven approach. ● Communication & Interpersonal Skills: ○ Excellent written and verbal communication skills, with the ability to articulate complex analytical concepts to non-technical stakeholders. ○ Strong interpersonal skills, with the ability to build relationships and influence cross-functional teams. ● Education: ○ Bachelor's degree in Business, Economics, Finance, Statistics, Computer Science, or a related quantitative field. Master's degree is a plus. ○ MBA from a top-tier school.
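For candidates gauging the "analytical programming in Python (Pandas, NumPy)" requirement above, here is a small, hypothetical sketch of the kind of funnel-conversion calculation the role describes; the stage names and data are invented, not drawn from the posting.

```python
# Illustrative only: count distinct merchants per funnel stage and derive
# stage-to-top conversion rates with Pandas.
import pandas as pd

events = pd.DataFrame({
    "merchant_id": [1, 1, 2, 2, 2, 3],
    "stage": ["lead", "demo", "lead", "demo", "won", "lead"],
})

funnel = events.groupby("stage")["merchant_id"].nunique()
funnel = funnel.reindex(["lead", "demo", "won"]).fillna(0)   # keep funnel order
conversion = (funnel / funnel.iloc[0]).rename("conversion_rate")
print(pd.concat([funnel.rename("merchants"), conversion], axis=1))
```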

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 7 Lacs

Ahmedabad, Vadodara

Work from Office

AI/ML Engineer (2-3 positions) Job Summary: We are seeking a highly skilled and motivated AI/ML Engineer with a specialization in Computer Vision & Unsupervised Learning to join our growing team. You will be responsible for building, optimizing, and deploying advanced video analytics solutions for smart surveillance applications, including real-time detection, facial recognition, and activity analysis. This role combines the core competencies of AI/ML modelling with the practical skills required to deploy and scale models in real-world production environments, both in the cloud and on edge devices. Key Responsibilities: AI/ML Development & Computer Vision: Design, train, and evaluate models for face detection and recognition, object/person detection and tracking, intrusion and anomaly detection, and human activity or pose recognition/estimation. Work with models such as YOLOv8, DeepSORT, RetinaNet, Faster R-CNN, and InsightFace. Perform data preprocessing, augmentation, and annotation using tools like LabelImg, CVAT, or custom pipelines. Surveillance System Integration: Integrate computer vision models with live CCTV/RTSP streams for real-time analytics. Develop components for motion detection, zone-based event alerts, person re-identification, and multi-camera coordination. Optimize solutions for low-latency inference on edge devices (Jetson Nano, Xavier, Intel Movidius, Coral TPU). Model Optimization & Deployment: Convert and optimize trained models using ONNX, TensorRT, or OpenVINO for real-time inference. Build and deploy APIs using FastAPI, Flask, or TorchServe. Package applications using Docker and orchestrate deployments with Kubernetes. Automate model deployment workflows using CI/CD pipelines (GitHub Actions, Jenkins). Monitor model performance in production using Prometheus, Grafana, and log management tools. Manage model versioning, rollback strategies, and experiment tracking using MLflow or DVC. As an AI/ML Engineer, you should also be well versed in AI agent development and have fine-tuning experience. Collaboration & Documentation: Work closely with backend developers, hardware engineers, and DevOps teams. Maintain clear documentation of ML pipelines, training results, and deployment practices. Stay current with emerging research and innovations in AI vision and MLOps. Required Qualifications: Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field. 3-6 years of experience in AI/ML, with a strong portfolio in computer vision and machine learning. Hands-on experience with deep learning frameworks (PyTorch, TensorFlow), image/video processing (OpenCV, NumPy), and detection and tracking frameworks (YOLOv8, DeepSORT, RetinaNet). Solid understanding of deep learning architectures (CNNs, Transformers, Siamese Networks). Proven experience with real-time model deployment on cloud or edge environments. Strong Python programming skills and familiarity with Git, REST APIs, and DevOps tools. Preferred Qualifications: Experience with multi-camera synchronization and NVR/DVR systems. Familiarity with ONVIF protocols and camera SDKs. Experience deploying AI models on Jetson Nano/Xavier, Intel NCS2, or Coral Edge TPU. Background in face recognition systems (e.g., InsightFace, FaceNet, Dlib). Understanding of security protocols and compliance in surveillance systems.
Tools & Technologies:
Languages & AI: Python, PyTorch, TensorFlow, OpenCV, NumPy, Scikit-learn
Model Serving: FastAPI, Flask, TorchServe, TensorFlow Serving, REST/gRPC APIs
Model Optimization: ONNX, TensorRT, OpenVINO, Pruning, Quantization
Deployment: Docker, Kubernetes, Gunicorn, MLflow, DVC
CI/CD & DevOps: GitHub Actions, Jenkins, GitLab CI
Cloud & Edge: AWS SageMaker, Azure ML, GCP AI Platform, Jetson, Movidius, Coral TPU
Monitoring: Prometheus, Grafana, ELK Stack, Sentry
Annotation Tools: LabelImg, CVAT, Supervisely
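As a hedged sketch of the deployment side of this role (RTSP ingestion plus ONNX Runtime inference), the snippet below reads one frame from a stream and runs a generic ONNX model. The stream URL, model file, input size, and output decoding are placeholders; each exported detector (for example, a YOLOv8 export) has its own pre- and post-processing.

```python
# Sketch only: grab a frame from an RTSP stream with OpenCV and run an
# ONNX-exported model with ONNX Runtime. Paths and shapes are placeholders.
import cv2
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("detector.onnx")          # hypothetical export
input_name = session.get_inputs()[0].name

cap = cv2.VideoCapture("rtsp://camera.local/stream1")    # placeholder URL
ok, frame = cap.read()
if ok:
    blob = cv2.resize(frame, (640, 640)).astype(np.float32) / 255.0
    blob = np.transpose(blob, (2, 0, 1))[np.newaxis, ...]  # NCHW layout
    outputs = session.run(None, {input_name: blob})
    print("raw output shapes:", [o.shape for o in outputs])
cap.release()
```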

Posted 2 weeks ago

Apply

6.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements. Python Developer - Sr. Consultant The position is suited for individuals who have the ability to work in a constantly challenging environment and deliver effectively and efficiently. As a Data Engineer, you will be an integral member of our Data & Analytics team, responsible for design and development of pipelines using cutting-edge technologies. Work you’ll do Implementation of security and data protection Implementation of ETL pipelines for data from a wide variety of data sources using Python and SQL Delivering data and insights in real time Participate in architectural, design, and product sessions. Unit testing and debugging skills Collaborate with other developers, testers, and system engineers to ensure quality of deliverables and any product enhancements. Qualifications: 6-9 years of technology consulting experience Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA A minimum of 2 years of experience in unit testing and debugging Excellent knowledge of the Python programming language along with knowledge of at least one Python web framework (Django, Flask, FastAPI, Pyramid) Extensive experience in Pandas/NumPy dataframes, slicing, data wrangling, aggregations. Lambda functions, decorators. Vector operations on Pandas dataframes/series. Application of applymap, apply, and map functions. Understanding of which framework to use based on specific needs and requirements. Understanding of the threading limitations of Python and multi-process architecture Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3 Primary Skills: Python and data analysis libraries (Pandas, NumPy, SciPy).
Django DS/Algo SQL (Read & Write) CRUD Awareness of Microservices Preferred: Good understanding of fundamental design principles behind a scalable application Good understanding of accessibility and security compliance Familiarity with event-driven programming in Python Proficient understanding of code versioning tools (Git, Mercurial or SVN) Knowledge of PowerShell and SQL Server You are familiar with big data technologies like Spark or Flink and comfortable working with web-scale datasets You have an eye for detail, good data intuition, and a passion for data quality Good knowledge of user authentication and authorization between multiple systems, servers, and environments You appreciate the importance of great documentation and data debugging skills Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300058
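The Pandas/NumPy items in the Sr. Consultant listing above (vector operations on dataframes/series, apply/map/applymap) amount to idioms like the following. This is a generic illustration with made-up data, not material from the job post; note that in recent pandas releases DataFrame.applymap is deprecated in favour of DataFrame.map.

```python
# Generic sketch of vectorized arithmetic versus apply/map/applymap in Pandas.
import pandas as pd

df = pd.DataFrame({"qty": [2, 5, 3], "unit_price": [10.0, 4.5, 7.25]})

df["total"] = df["qty"] * df["unit_price"]                 # vectorized (preferred)
df["total_tax"] = df["total"].apply(lambda x: x * 1.18)    # Series.apply
df["qty_label"] = df["qty"].map({2: "low", 3: "low", 5: "high"})  # Series.map
rounded = df[["unit_price", "total"]].applymap(round)      # element-wise DataFrame op
print(df, rounded, sep="\n\n")
```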

Posted 2 weeks ago

Apply

6.0 - 9.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements. Python Developer - Sr. Consultant The position is suited for individuals who have the ability to work in a constantly challenging environment and deliver effectively and efficiently. As a Data Engineer, you will be an integral member of our Data & Analytics team, responsible for design and development of pipelines using cutting-edge technologies. Work you’ll do Implementation of security and data protection Implementation of ETL pipelines for data from a wide variety of data sources using Python and SQL Delivering data and insights in real time Participate in architectural, design, and product sessions. Unit testing and debugging skills Collaborate with other developers, testers, and system engineers to ensure quality of deliverables and any product enhancements. Qualifications: 6-9 years of technology consulting experience Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA A minimum of 2 years of experience in unit testing and debugging Excellent knowledge of the Python programming language along with knowledge of at least one Python web framework (Django, Flask, FastAPI, Pyramid) Extensive experience in Pandas/NumPy dataframes, slicing, data wrangling, aggregations. Lambda functions, decorators. Vector operations on Pandas dataframes/series. Application of applymap, apply, and map functions. Understanding of which framework to use based on specific needs and requirements. Understanding of the threading limitations of Python and multi-process architecture Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3 Primary Skills: Python and data analysis libraries (Pandas, NumPy, SciPy).
Django DS/Algo SQL (Read & Write) CRUD Awareness of Microservices Preferred: Good understanding of fundamental design principles behind a scalable application Good understanding of accessibility and security compliance Familiarity with event-driven programming in Python Proficient understanding of code versioning tools (Git, Mercurial or SVN) Knowledge of PowerShell and SQL Server You are familiar with big data technologies like Spark or Flink and comfortable working with web-scale datasets You have an eye for detail, good data intuition, and a passion for data quality Good knowledge of user authentication and authorization between multiple systems, servers, and environments You appreciate the importance of great documentation and data debugging skills Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300058

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Required Skills & Experience: 3–6 years of experience in backend development with Python. Strong hands-on experience with FastAPI for building RESTful APIs, and with NumPy and Polars for numerical and tabular data processing. Solid understanding of Generative AI concepts and experience with LangChain, LangGraph, and MCP Agents, and with building and deploying agentic RAG systems. Experience with MongoDB or other vector databases for semantic search and retrieval. Familiarity with cloud platforms (Azure) and containerization (Docker/Kubernetes) is a plus. Please note: we are looking for immediate joiners (or a maximum notice period of 15-20 days). Qualifications Bachelor’s/Master’s degree in Computer Science, Data Science, Mathematics, or a related field.
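A minimal, hypothetical example of the FastAPI plus NumPy combination this role names: a single endpoint that validates a JSON payload and returns summary statistics. The route name and payload schema are invented for illustration only.

```python
# Sketch: a tiny FastAPI service exposing a NumPy computation over JSON input.
from fastapi import FastAPI
from pydantic import BaseModel
import numpy as np

app = FastAPI()

class Series(BaseModel):
    values: list[float]

@app.post("/summary")
def summarize(payload: Series) -> dict:
    arr = np.asarray(payload.values, dtype=float)
    return {"mean": float(arr.mean()), "std": float(arr.std()), "n": int(arr.size)}

# Run locally with: uvicorn app:app --reload   (assuming this file is app.py)
```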

Posted 2 weeks ago

Apply

6.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements. Python Developer - Sr. Consultant The position is suited for individuals who have the ability to work in a constantly challenging environment and deliver effectively and efficiently. As a Data Engineer, you will be an integral member of our Data & Analytics team, responsible for design and development of pipelines using cutting-edge technologies. Work you’ll do Implementation of security and data protection Implementation of ETL pipelines for data from a wide variety of data sources using Python and SQL Delivering data and insights in real time Participate in architectural, design, and product sessions. Unit testing and debugging skills Collaborate with other developers, testers, and system engineers to ensure quality of deliverables and any product enhancements. Qualifications: 6-9 years of technology consulting experience Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA A minimum of 2 years of experience in unit testing and debugging Excellent knowledge of the Python programming language along with knowledge of at least one Python web framework (Django, Flask, FastAPI, Pyramid) Extensive experience in Pandas/NumPy dataframes, slicing, data wrangling, aggregations. Lambda functions, decorators. Vector operations on Pandas dataframes/series. Application of applymap, apply, and map functions. Understanding of which framework to use based on specific needs and requirements. Understanding of the threading limitations of Python and multi-process architecture Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3 Primary Skills: Python and data analysis libraries (Pandas, NumPy, SciPy).
Django DS/Algo SQL (Read & Write) CRUD Awareness of Microservices Preferred: Good understanding of fundamental design principles behind a scalable application Good understanding of accessibility and security compliance Familiarity with event-driven programming in Python Proficient understanding of code versioning tools (Git, Mercurial or SVN) Knowledge of PowerShell and SQL Server You are familiar with big data technologies like Spark or Flink and comfortable working with web-scale datasets You have an eye for detail, good data intuition, and a passion for data quality Good knowledge of user authentication and authorization between multiple systems, servers, and environments You appreciate the importance of great documentation and data debugging skills Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300058

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Chandigarh

On-site

Job Summary We are seeking an experienced and driven Senior AI Engineer to lead advanced AI initiatives and drive real-world impact through cutting-edge machine learning solutions. The ideal candidate will have 7+ years of hands-on experience in building and deploying models across NLP, computer vision, and deep learning, along with a proven track record of leading teams and managing technical deliverables. Key Responsibilities Design, develop, and deploy robust machine learning models for real-world applications Lead and mentor a team of junior engineers and researchers, ensuring timely delivery and code quality Train and fine-tune models using large, diverse, and complex datasets Apply AI techniques in natural language processing, computer vision, and deep learning Collaborate cross-functionally with product, data, and engineering teams to align models with business goals Utilize cloud-based AI platforms (e.g., AWS SageMaker, Azure ML, Google AI Platform) for scalable deployment Monitor model performance in production, implement improvements, and ensure robustness Report on project status, milestones, and team performance to senior stakeholders Required Skills & Qualifications 7+ years of experience in Python and machine learning frameworks such as TensorFlow or PyTorch Extensive experience with data science libraries including NumPy, Pandas, and Scikit-learn Solid understanding of supervised, unsupervised, and deep learning techniques Strong leadership skills with experience managing or mentoring technical teams Familiarity with cloud AI infrastructure and scalable model deployment Excellent problem-solving abilities and the capability to thrive in fast-paced environments Preferred Qualifications Experience in model performance tuning, monitoring, and retraining pipelines Exposure to MLOps practices and CI/CD workflows for ML model lifecycle Knowledge of model explainability, bias mitigation, and ethical AI frameworks Experience working in agile environments and reporting to cross-functional leadership Why Join Us Build with Purpose: Work on impactful, high-scale products that solve real problems using cutting-edge technologies. Tech-First Culture: Join a team where engineering is at the core — we prioritize clean code, scalability, automation, and continuous learning. Freedom to Innovate: You’ll have ownership from day one — with room to experiment, influence architecture, and bring your ideas to life. Collaborate with the Best: Work alongside passionate engineers, product thinkers, and designers who value clarity, speed, and technical excellence. Paladin Tech is an equal opportunity employer. We are committed to creating an inclusive and diverse workplace and welcome candidates of all backgrounds and identities. Job Types: Full-time, Permanent Benefits: Food provided Work Location: In person

Posted 2 weeks ago

Apply

2.0 years

2 - 3 Lacs

India

On-site

Key Responsibilities: Develop and deploy machine learning and deep learning models Work on NLP, computer vision, or recommendation systems Optimize models for performance and scalability Stay updated with the latest AI research and trends Skills We’re Looking For: Strong Python programming skills Experience with ML frameworks (TensorFlow, PyTorch, Scikit-learn) Solid understanding of data preprocessing, model evaluation, and MLOps Hands-on experience with tools like Pandas, NumPy, OpenCV, NLTK/spaCy Exposure to cloud platforms (AWS, GCP, Azure) is a plus Job Type: Full-time Pay: ₹20,000.00 - ₹30,000.00 per month Benefits: Provident Fund Ability to commute/relocate: Palarivattom, Kochi, Kerala: Reliably commute or planning to relocate before starting work (Preferred) Experience: AI: 2 years (Preferred) Work Location: In person Expected Start Date: 25/07/2025
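As one small, hedged example of the NLP tooling this posting mentions (spaCy), the snippet below extracts named entities from a sentence. It assumes the small English model has been downloaded separately with `python -m spacy download en_core_web_sm`; the input text is made up.

```python
# Tiny named-entity extraction example with spaCy.
import spacy

nlp = spacy.load("en_core_web_sm")   # requires the model to be installed
doc = nlp("Apple opened a new office in Kochi in July 2025.")
for ent in doc.ents:
    print(ent.text, ent.label_)
```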

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Gurgaon

Remote

Job description About this role Role Summary: We’re seeking a dynamic System Engineer to design and deliver intelligent, scalable, and reliable data systems. This hybrid role combines data engineering, AI/ML integration, system reliability, and DevOps to accelerate data collection, enable intelligent workflows, and drive business impact. You’ll collaborate across engineering, data analytics, and business teams to build reusable frameworks, reduce time-to-value, and uphold engineering excellence. Key Responsibilities: Data & AI Workflow Engineering: Accelerate data collection at scale from millions of sources using robust, scalable pipelines. Design, build, and deploy workflows that combine AI/ML models with human-in-the-loop systems. Operate as a full-stack data engineer, taking projects from problem formulation to production. Develop APIs and services to expose data and model outputs for downstream consumption. System Engineering, Reliability & DevOps: Build and maintain CI/CD pipelines for data and ML services using Azure DevOps or GitHub Actions. Implement observability (metrics, logs, traces) and reliability features such as retries, circuit breakers, and graceful degradation (a minimal retry sketch follows this listing). Optimize data workflows and infrastructure for performance, scalability, and fault tolerance. Contribute to infrastructure-as-code (IaC) for provisioning and managing cloud-native environments. Platform & Framework Development: Elevate development standards through reusable services, frameworks, templates, and documentation. Champion best practices in code quality, security, and automation across the engineering lifecycle. Collaborate with engineering teams across the business to improve time-to-value and share internal solutions. Collaboration & Business Impact: Translate business problems into data science/ML solutions with measurable outcomes. Propose pragmatic, diverse approaches to solving business challenges using data and AI. Present results and recommendations clearly to technical and non-technical audiences using compelling storytelling and visualizations. Required Skills and Qualifications: 5-8 years of experience in data engineering, machine learning, or system/platform engineering. Strong programming skills in Python, .NET, or Java; proficiency in SQL, DBT, and data orchestration tools (e.g., Airflow). Experience with containerization (Docker) and Kubernetes on Azure and/or AWS. Proficiency in CI/CD, Git, and cloud-native development. Familiarity with observability tools (Azure Monitor, Prometheus, Grafana) and data validation frameworks (e.g., Great Expectations). Familiarity with data science libraries (Pandas, NumPy, scikit-learn) and deploying ML models to production. Strong understanding of distributed systems, microservices, and API design. Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field. Our benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all.
Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law. Job Requisition # R255610
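The reliability features called out in the listing above (retries, graceful degradation) often reduce to small reusable utilities. Below is a generic retry-with-exponential-backoff decorator as an illustration; the attempt counts and delays are arbitrary defaults, and the decorated function is a stand-in rather than any real data-source call used by the employer.

```python
# Sketch of a retry decorator with exponential backoff and jitter.
import random
import time
from functools import wraps

def retry(attempts: int = 4, base_delay: float = 0.5, max_delay: float = 8.0):
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == attempts:
                        raise                       # give up after the last attempt
                    delay = min(max_delay, base_delay * 2 ** (attempt - 1))
                    time.sleep(delay + random.uniform(0, 0.1))  # add jitter
        return wrapper
    return decorator

@retry(attempts=3)
def fetch_source(url: str) -> str:
    # placeholder for a real HTTP or database call
    raise TimeoutError(f"simulated transient failure for {url}")

# fetch_source("https://example.com/data")  # would retry twice, then raise
```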

Posted 2 weeks ago

Apply

6.0 - 9.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements. Python Developer - Sr. Consultant The position is suited for individuals who have the ability to work in a constantly challenging environment and deliver effectively and efficiently. As a Data Engineer, you will be an integral member of our Data & Analytics team, responsible for design and development of pipelines using cutting-edge technologies. Work you’ll do Implementation of security and data protection Implementation of ETL pipelines for data from a wide variety of data sources using Python and SQL Delivering data and insights in real time Participate in architectural, design, and product sessions. Unit testing and debugging skills Collaborate with other developers, testers, and system engineers to ensure quality of deliverables and any product enhancements. Qualifications: 6-9 years of technology consulting experience Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA A minimum of 2 years of experience in unit testing and debugging Excellent knowledge of the Python programming language along with knowledge of at least one Python web framework (Django, Flask, FastAPI, Pyramid) Extensive experience in Pandas/NumPy dataframes, slicing, data wrangling, aggregations. Lambda functions, decorators. Vector operations on Pandas dataframes/series. Application of applymap, apply, and map functions. Understanding of which framework to use based on specific needs and requirements. Understanding of the threading limitations of Python and multi-process architecture Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3 Primary Skills: Python and data analysis libraries (Pandas, NumPy, SciPy).
Django DS/Algo SQL (Read & Write) CRUD Awareness of Microservices Preferred: Good understanding of fundamental design principles behind a scalable application Good understanding of accessibility and security compliance Familiarity with event-driven programming in Python Proficient understanding of code versioning tools (Git, Mercurial or SVN) Knowledge of PowerShell and SQL Server You are familiar with big data technologies like Spark or Flink and comfortable working with web-scale datasets You have an eye for detail, good data intuition, and a passion for data quality Good knowledge of user authentication and authorization between multiple systems, servers, and environments You appreciate the importance of great documentation and data debugging skills Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300058

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Overview We are looking for a savvy Data Engineer to manage in-progress and upcoming data infrastructure projects. The candidate will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder using Python and a data wrangler who enjoys optimizing data systems and building them from the ground up. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. Responsibilities for Data Engineer Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements using Python and SQL / AWS / Snowflake. Identify, design, and implement internal process improvements by automating manual processes using Python, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL / AWS / Snowflake technologies. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs. Keep our data separated and secure across national boundaries through multiple data centers and AWS regions. Work with data and analytics experts to strive for greater functionality in our data systems. Qualifications for Data Engineer Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Strong analytic skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. A successful history of manipulating, processing, and extracting value from large disconnected datasets. Desired Skillset: 3+ years of experience in a Python scripting and data-specific role, with a Bachelor's degree. Experience with data processing and cleaning libraries (e.g., Pandas, NumPy), web scraping/web crawling for process automation, and APIs and how they work. Ability to debug code when it fails and find the solution. Basic knowledge of SQL Server job activity monitoring and of Snowflake. Experience with relational SQL and NoSQL databases, including PostgreSQL and Cassandra. Experience with most or all of the following cloud services: AWS, Azure, Snowflake, Google Cloud. Strong project management and organizational skills. Experience supporting and working with cross-functional teams in a dynamic environment.
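To make the "data processing and cleaning libraries (e.g., Pandas, NumPy)" requirement concrete, here is a tiny, hypothetical cleaning step of the sort such pipelines contain before data is loaded into a warehouse like Snowflake; the column names and values are invented.

```python
# Illustrative cleaning step: deduplicate, coerce types, drop unusable rows.
import pandas as pd

raw = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "amount": ["10.5", "10.5", "7.0", None],
    "country": ["IN", "IN", "US", "IN"],
})

clean = (
    raw.drop_duplicates(subset="order_id")
       .assign(amount=lambda d: pd.to_numeric(d["amount"], errors="coerce"))
       .dropna(subset=["amount"])
)
print(clean)
```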

Posted 2 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Mohali district, India

Remote

Job Description: SDE-II – Python Developer Job Title SDE-II – Python Developer Department Operations Location In-Office Employment Type Full-Time Job Summary We are looking for an experienced Python Developer to join our dynamic development team. The ideal candidate will have 2 to 5 years of experience in building scalable backend applications and APIs using modern Python frameworks. This role requires a strong foundation in object-oriented programming, web technologies, and collaborative software development. You will work closely with the design, frontend, and DevOps teams to deliver robust and high-performance solutions. Key Responsibilities • Develop, test, and maintain backend applications using Django, Flask, or FastAPI. • Build RESTful APIs and integrate third-party services to enhance platform capabilities. • Utilize data handling libraries like Pandas and NumPy for efficient data processing. • Write clean, maintainable, and well-documented code that adheres to industry best practices. • Participate in code reviews and mentor junior developers. • Collaborate in Agile teams using Scrum or Kanban workflows. • Troubleshoot and debug production issues with a proactive and analytical approach. Required Qualifications • 2 to 5 years of experience in backend development with Python. • Proficiency in core and advanced Python concepts, including OOP and asynchronous programming. • Strong command over at least one Python framework (Django, Flask, or FastAPI). • Experience with data libraries like Pandas and NumPy. • Understanding of authentication/authorization mechanisms, middleware, and dependency injection. • Familiarity with version control systems like Git. • Comfortable working in Linux environments. Must-Have Skills • Expertise in backend Python development and web frameworks. • Strong debugging, problem-solving, and optimization skills. • Experience with API development and microservices architecture. • Deep understanding of software design principles and security best practices. Good-to-Have Skills • Experience with Generative AI frameworks (e.g., LangChain, Transformers, OpenAI APIs). • Exposure to Machine Learning libraries (e.g., Scikit-learn, TensorFlow, PyTorch). • Knowledge of containerization tools (Docker, Kubernetes). • Familiarity with web servers (e.g., Apache, Nginx) and deployment architectures. • Understanding of asynchronous programming and task queues (e.g., Celery, AsyncIO). • Familiarity with Agile practices and tools like Jira or Trello. • Exposure to CI/CD pipelines and cloud platforms (AWS, GCP, Azure). Company Overview We specialize in delivering cutting-edge solutions in custom software, web, and AI development. Our work culture is a unique blend of in-office and remote collaboration, prioritizing our employees above everything else. At our company, you’ll find an environment where continuous learning, leadership opportunities, and mutual respect thrive. We are proud to foster a culture where individuals are valued, encouraged to evolve, and supported in achieving their fullest potential. Benefits and Perks • Competitive Salary: Earn up to ₹6 –10 LPA based on skills and experience. • Generous Time Off: Benefit from 18 annual holidays to maintain a healthy work-life balance. • Continuous Learning: Access extensive learning opportunities while working on cutting-edge projects. • Client Exposure: Gain valuable experience in client-facing roles to enhance your professional growth.
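For the "asynchronous programming and task queues (e.g., Celery, AsyncIO)" item in this posting, here is a minimal asyncio sketch of concurrent I/O-bound work; the fetch coroutine is a placeholder for a real network or database call, not code from the employer.

```python
# Minimal asyncio fan-out example: run several awaitable tasks concurrently.
import asyncio

async def fetch(item: int) -> int:
    await asyncio.sleep(0.1)      # stand-in for an awaited network/database call
    return item * 2

async def main() -> None:
    results = await asyncio.gather(*(fetch(i) for i in range(5)))
    print(results)

asyncio.run(main())
```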

Posted 2 weeks ago

Apply

0 years

0 Lacs

Mohali district, India

On-site

We are seeking a highly skilled AI/ML Web Scraping Specialist to join our data engineering and analytics team. The ideal candidate will have hands-on experience in building robust web scraping pipelines focused on Instagram and other social media platforms (e.g., Facebook, TikTok, YouTube, X/Twitter). The role involves designing scalable scraping architectures, solving anti-bot challenges, and applying machine learning to classify, enrich, or analyze social media content. Key Responsibilities Design and implement advanced web scraping tools, scripts, and pipelines using Python (BeautifulSoup, Scrapy, Selenium, Playwright, etc.). Build robust scrapers for social media platforms like Instagram, TikTok, X (Twitter), Facebook, and LinkedIn, bypassing rate limits and anti-bot mechanisms (Cloudflare, reCAPTCHA, etc.). Leverage APIs (when available) and reverse-engineer web/mobile requests to extract structured data. Develop and train ML models for tasks such as content categorization, influencer classification, sentiment analysis, engagement prediction, etc. Automate scraping workflows, schedule jobs (using Airflow/Cron), and store data in NoSQL or relational databases. Maintain and optimize scraping performance, and handle edge cases or UI changes in target platforms. Work with large-scale data pipelines, and ensure clean, deduplicated, and enriched datasets. Collaborate with product, marketing, and data science teams to provide actionable insights from social media data. Required Skills and Qualifications Strong programming skills in Python with proven experience using Scrapy, Selenium, Playwright, or Puppeteer. Deep knowledge of HTTP, HTML DOM traversal, JavaScript rendering, proxies, user agents, and browser automation. Solid understanding of Instagram’s data structures, public endpoints, GraphQL queries, and security challenges. Familiarity with anti-bot bypass techniques: rotating proxies, CAPTCHA solving (2Captcha, AntiCaptcha), session management. Hands-on experience in training and deploying ML models (NLP, classification, clustering) using scikit-learn, TensorFlow, or PyTorch. Experience with MongoDB, PostgreSQL, or Elasticsearch for data storage and retrieval. Good understanding of data privacy, legal considerations, and ethical scraping practices. Preferred Skills Experience with cloud platforms (AWS, GCP, Azure) and containerization (Docker/Kubernetes). Knowledge of Instagram Business APIs, Facebook Graph API, and TikTok’s unofficial endpoints. Prior work on influencer discovery, brand monitoring, or social listening tools. Experience in building data dashboards using tools like Streamlit, Power BI, or Tableau. Contributions to open-source scraping libraries or ML projects. Tools & Technologies You Might Use Python, Scrapy, Selenium, Playwright, Puppeteer Pandas, NumPy, scikit-learn, OpenAI APIs PostgreSQL, MongoDB, Redis, Elasticsearch AWS Lambda, EC2, S3, Cloud Functions Git, Docker, CI/CD pipelines
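A hedged sketch of the most basic building block this role describes: fetching a page with a rotated User-Agent header and parsing it with BeautifulSoup. The target URL and header strings are placeholders, and production social-media scraping additionally needs proxy rotation, session handling, rate limiting, and legal review.

```python
# Basic fetch-and-parse step with requests + BeautifulSoup and a random User-Agent.
import random
import requests
from bs4 import BeautifulSoup

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
]

resp = requests.get(
    "https://example.com/profile",               # placeholder target
    headers={"User-Agent": random.choice(USER_AGENTS)},
    timeout=10,
)
soup = BeautifulSoup(resp.text, "html.parser")
titles = [h.get_text(strip=True) for h in soup.select("h2")]
print(resp.status_code, titles[:5])
```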

Posted 2 weeks ago

Apply

0 years

6 Lacs

Mohali

On-site

Key Responsibilities: Develop, fine-tune, and deploy LLM-based NLP models for tasks such as text classification, summarization, entity recognition, and question answering. Implement deep learning architectures using TensorFlow and/or PyTorch. Integrate OpenAI APIs and other foundation models into internal applications and tools. Design and build computer vision models for image detection, recognition, classification, and segmentation. Collaborate with cross-functional teams including product managers, data engineers, and UI/UX designers. Stay updated with the latest research trends in AI/ML and apply them to enhance system capabilities. Optimize model performance for scalability, accuracy, and speed. Build pipelines for data preprocessing, model training, validation, and deployment. Document models, experiments, and system behavior for team knowledge sharing. Key Skills & Qualifications: Bachelor's or Master’s degree in Computer Science, Artificial Intelligence, Data Science, or a related field. Strong hands-on experience with TensorFlow (or PyTorch). Deep understanding of NLP techniques, transformers, BERT, GPT, T5, etc. Experience working with OpenAI, Hugging Face, or other LLM frameworks. Solid foundation in Computer Vision concepts and frameworks like OpenCV, YOLO, CNNs, Detectron, etc. Proficient in Python and relevant libraries (e.g., NumPy, Pandas, Scikit-learn). Experience with REST APIs and Flask/FastAPI for deploying ML models is a plus. Excellent problem-solving and analytical skills. Preferred: Prior experience in building AI-based SaaS products or intelligent automation solutions. Knowledge of MLOps, model versioning, and cloud platforms (AWS/GCP/Azure). Familiarity with Reinforcement Learning and Generative AI is a bonus. Job Type: Full-time Pay: Up to ₹50,000.00 per month Schedule: Morning shift Work Location: In person
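As a small illustration of the Hugging Face workflow referenced above, the snippet below runs a text-classification pipeline; the default checkpoint is downloaded on first use and would normally be swapped for a fine-tuned model in production. The example text is made up.

```python
# Minimal Hugging Face transformers example: sentiment classification.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")   # downloads a default checkpoint
print(classifier("The deployment finished without any errors."))
```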

Posted 2 weeks ago

Apply

6.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Summary

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
· Implement large-scale data ecosystems including data management, governance, and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms
· Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
· Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements

Python Developer - Sr. Consultant

The position is suited for individuals who can work in a constantly challenging environment and deliver effectively and efficiently. As a Data Engineer, you will be an integral member of our Data & Analytics team, responsible for the design and development of pipelines using cutting-edge technologies.

Work you'll do
· Implementation of security and data protection
· Implementation of ETL pipelines for data from a wide variety of data sources using Python and SQL
· Delivering data and insights in real time
· Participate in architectural, design, and product sessions
· Unit testing and debugging
· Collaborate with other developers, testers, and system engineers to ensure the quality of deliverables and any product enhancements

Qualifications:
· 6-9 years of technology consulting experience
· Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA
· A minimum of 2 years of experience in unit testing and debugging
· Excellent knowledge of the Python programming language along with at least one Python web framework (Django, Flask, FastAPI, Pyramid)
· Extensive experience with Pandas/NumPy dataframes: slicing, data wrangling, aggregations; lambda functions and decorators; vector operations on Pandas dataframes/series; application of applymap, apply, and map functions (a short example follows this posting)
· Understanding of when to use a given framework based on specific needs and requirements
· Understanding of the threading limitations of Python and of multi-process architecture
· Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3

Primary Skills:
· Python and data analysis libraries (Pandas, NumPy, SciPy)
· Django
· DS/Algo
· SQL (read & write), CRUD
· Awareness of microservices

Preferred:
· Good understanding of the fundamental design principles behind a scalable application
· Good understanding of accessibility and security compliance
· Familiarity with event-driven programming in Python
· Proficient understanding of code versioning tools (Git, Mercurial, or SVN)
· Knowledge of PowerShell and SQL Server
· Familiarity with big data technologies like Spark or Flink and comfort working with web-scale datasets
· An eye for detail, good data intuition, and a passion for data quality
· Good knowledge of user authentication and authorization between multiple systems, servers, and environments
· Appreciation for the importance of great documentation and data debugging skills

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300058
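As a rough illustration of the Pandas/NumPy skills this posting calls out (slicing, aggregations, lambda functions, and the map/apply family), here is a minimal sketch. The column names and values are invented for the example.

```python
# Minimal sketch of common Pandas/NumPy operations: slicing, vectorized
# arithmetic, lambda functions, map/apply, and groupby aggregation.
import pandas as pd

orders = pd.DataFrame({
    "region": ["North", "South", "North", "East"],
    "units":  [12, 7, 3, 9],
    "price":  [250.0, 410.5, 99.9, 180.0],
})

# Slicing / boolean filtering
big_orders = orders[orders["units"] > 5]

# Vectorized column arithmetic (no explicit loop)
orders["revenue"] = orders["units"] * orders["price"]

# Series.map with a lambda
orders["region_code"] = orders["region"].map(lambda r: r[:1].upper())

# Row-wise apply
orders["label"] = orders.apply(
    lambda row: f'{row["region"]}-{row["units"]}u', axis=1
)

# Aggregation with groupby
summary = orders.groupby("region")["revenue"].agg(["sum", "mean"])
print(big_orders, summary, sep="\n\n")
```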

Posted 2 weeks ago

Apply

1.0 years

3 - 8 Lacs

Bhubaneshwar

On-site

Job Summary
We are seeking a qualified AI/ML Trainer to join our institution, primarily focused on teaching B.Tech, MCA, and M.Sc-IT courses. This position is based in Bhubaneshwar. The trainer will deliver comprehensive lectures on Artificial Intelligence, Machine Learning, Deep Learning, and Data Science, equipping students with the theoretical foundation and practical skills necessary for industry readiness.

Responsibilities

Teaching and Lecturing:
· Deliver engaging and structured lectures on AI, ML, and related areas to B.Tech, MCA, and M.Sc-IT students
· Teach topics such as Machine Learning Algorithms, Supervised/Unsupervised Learning, Deep Learning, Neural Networks, Computer Vision, Natural Language Processing (NLP), and AI model deployment
· Ensure students gain both conceptual clarity and hands-on experience using tools like Python, TensorFlow, Keras, and Scikit-learn
· Continuously update teaching content to reflect industry trends and technological advancements
· Develop course materials including lecture notes, practical assignments, mini-projects, and assessments
· Create a collaborative and project-based learning environment to enhance student understanding

Research and Innovation:
· Conduct applied research in AI/ML and contribute to academic or industry publications
· Stay updated with the latest in AI research, tools, and open-source frameworks
· Support and guide students in AI/ML-based research and innovation projects

Mentorship:
· Provide mentorship to students working on AI/ML internships, final-year projects, or competitions (e.g., Kaggle, hackathons)
· Help students apply algorithms to real-world datasets and problem domains

Collaboration:
· Collaborate with academic peers and industry experts to strengthen the AI/ML curriculum and delivery
· Establish connections with AI/ML startups or companies for student internships, guest lectures, and collaborative projects

Professional Development:
· Participate in AI/ML workshops, webinars, and conferences
· Continuously upgrade your technical skills to match the evolving AI landscape

Qualifications
· Master's degree (M.Tech/M.Sc/MCA, with 6 months to 1 year of experience) in Artificial Intelligence, Computer Science, Data Science, or a related field
· Strong background in Machine Learning, Deep Learning, and Python-based AI tools
· Teaching or corporate training experience is highly desirable

Skills and Competencies
· Expertise in AI/ML frameworks such as TensorFlow, Keras, PyTorch, Scikit-learn
· Strong command of Python, NumPy, Pandas, and Jupyter Notebooks
· Ability to explain complex algorithms in a simple, practical manner
· Excellent communication and presentation skills
· Analytical thinking and problem-solving abilities
· A passion for teaching and mentoring future AI professionals

Location
Bhubaneshwar

Benefits
· Competitive salary and benefits package
· A dynamic and innovative educational environment
· The opportunity to shape future AI/ML professionals
· Involvement in real-world problem solving and research-driven teaching

Posted 2 weeks ago

Apply

1.0 years

2 - 4 Lacs

Bhubaneshwar

On-site

Job Summary
We are seeking a qualified Python Trainer to join our institution, primarily focused on teaching B.Tech, MCA, and M.Sc-IT courses. This position is available at our Bhubaneshwar location. The trainer will deliver comprehensive lectures on Python and related technologies, including data handling, object-oriented programming, web frameworks, and emerging Python applications in automation and data science.

Responsibilities

Teaching and Lecturing:
· Deliver engaging and informative lectures on Python and related subjects to B.Tech, MCA, and M.Sc-IT students
· Teach topics such as Core Python, Object-Oriented Programming, Data Structures using Python, Web Development with Django/Flask, Python for Data Analysis, and Scripting for Automation
· Ensure students develop both theoretical knowledge and hands-on programming skills
· Adapt teaching methodologies to suit diverse learning needs and industry trends
· Design and develop course materials, including syllabi, lecture notes, and assessments
· Foster a dynamic and interactive learning environment to promote student engagement

Research and Innovation:
· Conduct research in the area of Python and its applications; contribute to academic publications
· Stay updated with the latest Python libraries, frameworks, and industry applications
· Encourage and guide students in research projects related to Python and open-source technologies

Mentorship:
· Mentor students in developing Python-based projects, preparing for internships, or solving real-world problems
· Support students in applying theoretical knowledge through practical exercises and capstone projects

Collaboration:
· Work collaboratively with fellow faculty members and industry professionals to enhance curriculum delivery
· Build partnerships with industry experts for guest lectures, hackathons, internships, and project guidance

Professional Development:
· Continuously enhance your knowledge by attending workshops, webinars, and technical events focused on Python and open-source ecosystems
· Engage in self-learning to stay ahead of emerging Python trends such as AI, ML, and data engineering

Qualifications
· Master's degree (M.Tech/M.Sc/MCA, with 6 months to 1 year of experience) in Information Technology, Computer Science, or a related field
· Demonstrated expertise and practical experience with Python and its ecosystem
· Prior experience in teaching or mentoring in Python programming is desirable

Skills and Competencies
· Strong command of Core and Advanced Python programming
· Hands-on experience with Python frameworks such as Django, Flask, or FastAPI
· Familiarity with tools like Pandas, NumPy, Matplotlib, and automation libraries
· Strong communication and presentation skills
· Ability to motivate and inspire students toward coding excellence
· Problem-solving mindset and practical approach to training

Location
Bhubaneshwar

Benefits
· Competitive salary and benefits package
· Opportunity to work in a dynamic and supportive educational environment
· Be a part of shaping the next generation of Python developers
· Engage in impactful projects and contribute to real-world problem solving through teaching

Posted 2 weeks ago

Apply

6.0 years

5 - 15 Lacs

India

On-site

Role: Lead Python/AI Developer
Experience: 6/6+ years
Location: Ahmedabad (Gujarat)

Roles and Responsibilities:
· Help the Python/AI team build Python/AI solution architectures leveraging source technologies
· Drive technical discussions with clients along with Project Managers
· Create effort-estimation matrices of solutions/deliverables for the delivery team
· Implement AI solutions and architectures, including data pre-processing, feature engineering, model deployment, compatibility with downstream tasks, and edge/error handling
· Collaborate with cross-functional teams, such as machine learning engineers, software engineers, and product managers, to identify business needs and provide technical guidance
· Mentor and coach junior Python/AI/ML engineers
· Share knowledge through technical presentations
· Implement new Python/AI features with high-quality coding standards

Must-Have:
· B.Tech/B.E. in Computer Science, IT, Data Science, ML, or a related field
· Strong proficiency in the Python programming language
· Strong verbal and written communication skills, with analytical and problem-solving ability
· Proficiency in debugging and exception handling
· Professional experience in developing and operating AI systems in production
· Hands-on, strong programming skills in Python, in particular with modern ML and NLP frameworks (scikit-learn, PyTorch, TensorFlow, Hugging Face, spaCy, Facebook AI XLM/mBERT, etc.)
· Hands-on experience with AWS services such as EC2, S3, Lambda, and AWS SageMaker
· Experience with collaborative development workflows: version control (we use GitHub), code reviews, DevOps (incl. automated testing), CI/CD
· Comfort with essential tools and libraries: Git, Docker, GitHub, Postman, NumPy, SciPy, Matplotlib, Seaborn or Plotly, Pandas
· Prior experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB)
· Experience working in Agile methodology

Good-to-Have:
· A Master's degree or Ph.D. in Computer Science, Machine Learning, or a related quantitative field
· Python frameworks (Django/Flask/FastAPI) and API integration
· AI/ML/DL/MLOps certification from AWS
· Experience with the OpenAI API
· Proficiency in the Japanese language

Job Types: Full-time, Permanent
Pay: ₹500,000.00 - ₹1,500,000.00 per year
Benefits: Provident Fund
Work Location: In person
Expected Start Date: 14/08/2025

Posted 2 weeks ago

Apply

2.0 years

1 - 7 Lacs

Ahmedabad

On-site

What You'll Do:
· Design and deliver engaging training sessions on AI/ML concepts, tools, and frameworks (e.g., TensorFlow, PyTorch, Scikit-learn)
· Mentor and guide learners through hands-on projects, coding exercises, and real-world machine learning applications
· Continuously update and improve training materials to reflect current industry trends, tools, and best practices
· Provide technical support, troubleshoot learner issues, and ensure a high-quality learning experience during training sessions

What We're Looking For:
· Strong hands-on experience in Python programming, with the ability to write clean, efficient, and modular code
· Solid understanding of statistics and probability relevant to machine learning (e.g., hypothesis testing, distributions, regression)
· Proficiency with AI/ML frameworks such as TensorFlow, PyTorch, and Scikit-learn, and relevant libraries like NumPy and Pandas
· Prior experience delivering technical training or mentoring, preferably in a structured or online learning environment
· Excellent communication and presentation skills, with the ability to explain complex concepts clearly
· Flexibility and availability to deliver sessions in US time zones

Job Type: Full-time
Pay: ₹15,000.00 - ₹60,000.00 per month
Benefits: Paid time off
Schedule: Night shift, US shift
Ability to commute/relocate: Ahmedabad, Gujarat: Reliably commute or planning to relocate before starting work (Preferred)
Experience:
· Python: 2 years (Preferred)
· Statistical analysis: 2 years (Preferred)
Shift availability: Night Shift (Preferred)
Work Location: In person
Expected Start Date: 23/07/2025

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At Skillsoft, we propel organizations and people to grow together through transformative learning experiences. We believe every team member has the potential to be AMAZING. Join us in our quest to transform learning and help individuals unleash their edge.

Job Overview
Are you a knowledgeable ML/AI professional or instructor with a creative flair and a passion for content design? Join us to create impactful, learner-centered content that blends real-world expertise with engaging storytelling. As a Content Contributor, you will work with the curriculum team to create engaging, instructionally sound learning experiences in the Machine Learning and AI domain. Working with subject matter experts (SMEs), you'll translate complex frameworks into clear, outcome-focused content across digital formats. This role demands instructional design expertise, a deep understanding of learner needs, and the ability to creatively script and plan high-impact learning assets—from video courses to assessments.

Job Responsibilities
· Design creative and effective learning experiences grounded in instructional design principles, addressing diverse learner personas and real-world scenarios
· Author and script engaging digital content, including on-demand videos, interactive walkthroughs/lessons, assessments, and job aids
· Collaborate with visual designers, editors, and technical experts to bring content to life in a compelling and accessible format
· Utilize Generative AI tools to accelerate and enhance content ideation, scripting, and personalization
· Ensure instructional consistency, voice, and quality across all course deliverables and formats

Skills Required
· Excellent scripting, writing, and communication skills; able to distil complex concepts into concise, engaging narratives
· Strong creativity and storytelling ability, with an understanding of how to structure content for different learning styles
· Fluency with and experience in programming languages such as Python and SQL
· Fluency and experience with AI/ML libraries such as NumPy, Pandas, scikit-learn, Hugging Face, and LangChain
· Experience working with AI/ML technology and topics such as agents, LLMs, OpenAI, Claude, Gemini, Copilot, and deep learning

Preferred/Additional Skills
· Relevant certifications in AI
· Familiarity with Generative AI tools like ChatGPT, Claude, or similar for content creation and enhancement
· Understanding of instructional design models such as ADDIE, SAM, or Bloom's Taxonomy

More About Skillsoft
Skillsoft delivers online learning, training, and talent solutions to help organizations unleash their edge. Leveraging immersive, engaging content, Skillsoft enables organizations to unlock the potential in their best assets – their people – and build teams with the skills they need for success. Empowering 36 million learners and counting, Skillsoft democratizes learning through an intelligent learning experience and a customized, learner-centric approach to skills development, with resources for Leadership Development, Business Skills, Technology & Development, Digital Transformation, and Compliance.

Skillsoft is partner to thousands of leading global organizations, including many Fortune 500 companies. The company features award-winning offerings that support learning, performance, and success, including Skillsoft learning content and the Percipio intelligent learning experience platform, which offers measurable impact across the entire employee lifecycle. Learn more at www.skillsoft.com.

Thank you for taking the time to learn more about us. If this opportunity intrigues you, we would love for you to apply!

NOTE TO EMPLOYMENT AGENCIES: We value the partnerships we have built with our preferred vendors. Skillsoft does not accept unsolicited resumes from employment agencies. All resumes submitted by employment agencies directly to any Skillsoft employee or hiring manager in any form without a signed Skillsoft Employment Agency Agreement on file and search engagement for that position will be deemed unsolicited in nature. No fee will be paid in the event the candidate is subsequently hired as a result of the referral or through other means.

Skillsoft is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information, and other legally protected categories.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We use our specialization in analytics, digital interventions, and operations management, alongside deep industry expertise, to deliver solutions that help you outperform the competition. For more information, visit www.exlservice.com.

Job Summary:
We are looking for a results-driven Data Scientist with a strong foundation in SQL and Python, coupled with hands-on experience in AI/ML and exposure to Generative AI (GenAI) technologies. The ideal candidate will work on developing predictive models, building data pipelines, and experimenting with GenAI tools to solve business challenges and enhance decision-making.

Key Responsibilities:
· Design and implement AI/ML models for a variety of business use cases, including prediction, classification, clustering, and personalization
· Extract, clean, and manipulate large datasets using SQL and Python for model development and analysis (see the sketch below)
· Apply and experiment with Generative AI techniques (e.g., LLMs, text generation, summarization, prompt engineering) for emerging business needs
· Build scalable data pipelines and automate data workflows
· Work collaboratively with cross-functional teams to define project goals and deliver actionable insights
· Communicate analytical results and technical concepts to both technical and non-technical stakeholders
· Stay current with advancements in AI/ML and GenAI technologies and frameworks

Required Skills:
· Strong programming skills in Python, with experience in libraries like Pandas, NumPy, scikit-learn, TensorFlow, or PyTorch
· Proficiency in SQL for data extraction, joins, and optimization
· Solid understanding of machine learning algorithms and data science best practices
· Experience handling structured and unstructured data
· Exposure to Generative AI tools and frameworks (e.g., OpenAI APIs, Hugging Face, LangChain)
· Familiarity with data visualization tools and cloud platforms (AWS, GCP, Azure) is a plus
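A minimal sketch of the SQL-plus-Python workflow described above follows. It uses an in-memory SQLite table with made-up column names so the example stays self-contained; a real project would query a production warehouse instead.

```python
# Minimal sketch: pull a table with SQL, then fit a simple scikit-learn
# classifier. The database, table, and columns are invented for illustration.
import sqlite3
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# An in-memory SQLite database stands in for a real warehouse connection.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (tenure INTEGER, monthly_spend REAL, churned INTEGER);
    INSERT INTO customers VALUES
        (1, 70.0, 1), (24, 45.5, 0), (3, 99.0, 1), (36, 30.0, 0),
        (12, 60.0, 0), (2, 85.0, 1), (48, 25.0, 0), (6, 75.0, 1);
""")

# SQL extraction into a DataFrame
df = pd.read_sql("SELECT tenure, monthly_spend, churned FROM customers", conn)

X_train, X_test, y_train, y_test = train_test_split(
    df[["tenure", "monthly_spend"]], df["churned"], test_size=0.25, random_state=0
)

model = LogisticRegression().fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```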

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 - 20 Lacs

Bengaluru

Work from Office

Senior Data Scientist

Req number: R5797
Employment type: Full time
Worksite flexibility: Hybrid

Who we are
CAI is a global technology services firm with over 8,500 associates worldwide and a yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right—whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise.

Job Summary
We're searching for an experienced Senior Data Scientist who excels at statistical analysis, feature engineering, and end-to-end machine learning operations. Your primary mission will be to build and productionize demand-forecasting models across thousands of SKUs, while owning the full model lifecycle—from data discovery through automated re-training and performance monitoring. This is a full-time, hybrid position.

Job Description

What You'll Do

Advanced ML Algorithms:
· Design, train, and evaluate supervised and unsupervised models (regression, classification, clustering, uplift)
· Apply automated hyperparameter optimization (Optuna, HyperOpt) and interpretability techniques (SHAP, LIME)

Data Analysis & Feature Engineering:
· Perform deep exploratory data analysis (EDA) to uncover patterns and anomalies
· Engineer predictive features from structured, semi-structured, and unstructured data; manage feature stores (Feast)
· Ensure data quality through rigorous validation and automated checks

Time-Series Forecasting (Demand):
· Build hierarchical, intermittent, and multi-seasonal forecasts for thousands of SKUs
· Implement traditional (ARIMA, ETS, Prophet) and deep-learning (RNN/LSTM, Temporal Fusion Transformer) approaches
· Reconcile forecasts across product/category hierarchies; quantify accuracy (MAPE, WAPE) and bias (see the sketch below)

MLOps & Model Lifecycle:
· Establish model tracking and registry (MLflow, SageMaker Model Registry)
· Develop CI/CD pipelines for automated retraining, validation, and deployment (Airflow, Kubeflow, GitHub Actions)
· Monitor data and concept drift; trigger retuning or rollback as needed

Statistical Analysis & Experimentation:
· Design and analyze A/B tests, causal inference studies, and Bayesian experiments
· Provide statistically grounded insights and recommendations to stakeholders

Collaboration & Leadership:
· Translate business objectives into data-driven solutions; present findings to executive and non-technical audiences
· Mentor junior data scientists, review code/notebooks, and champion best practices

What You'll Need
· M.S. in Statistics (preferred) or a related field such as Applied Mathematics, Computer Science, or Data Science
· 5+ years building and deploying ML models in production
· Expert-level proficiency in Python (Pandas, NumPy, SciPy, scikit-learn), SQL, and Git
· Demonstrated success delivering large-scale demand-forecasting or time-series solutions
· Hands-on experience with MLOps tools (MLflow, Kubeflow, SageMaker, Airflow) for model tracking and automated retraining
· Solid grounding in statistical inference, hypothesis testing, and experimental design
· Experience in supply-chain, retail, or manufacturing domains with high-granularity SKU data
· Familiarity with distributed data frameworks (Spark, Dask) and cloud data warehouses (BigQuery, Snowflake)
· Knowledge of deep-learning libraries (PyTorch, TensorFlow) and probabilistic programming (PyMC, Stan)
· Strong data-visualization skills (Plotly, Dash, Tableau) for storytelling and insight communication

Physical Demands
· This role involves mostly sedentary work, with occasional movement around the office to attend meetings, etc.
· Ability to perform repetitive tasks on a computer, using a mouse, keyboard, and monitor

Reasonable accommodation statement
If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to application.accommodations@cai.io or (888) 824-8111.
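Since this posting measures forecast quality in MAPE and WAPE, here is a minimal NumPy sketch of both metrics on toy SKU-level data; the numbers are illustrative only.

```python
# Minimal sketch of the forecast-accuracy metrics named above (MAPE and
# WAPE) for a batch of SKU-level demand forecasts.
import numpy as np

actual   = np.array([120.0,  80.0,  0.0,  45.0, 200.0])   # observed demand per SKU
forecast = np.array([110.0,  95.0,  5.0,  40.0, 185.0])   # model predictions

abs_err = np.abs(actual - forecast)

# WAPE: total absolute error divided by total actual demand.
# Robust to SKUs with zero or near-zero demand, which is common in
# intermittent-demand settings.
wape = abs_err.sum() / np.abs(actual).sum()

# MAPE: mean of per-SKU percentage errors; undefined where actual == 0,
# so zero-demand rows are excluded here.
nonzero = actual != 0
mape = np.mean(abs_err[nonzero] / np.abs(actual[nonzero]))

print(f"WAPE: {wape:.1%}   MAPE (non-zero SKUs): {mape:.1%}")
```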

Posted 2 weeks ago

Apply

0.0 - 2.0 years

0 Lacs

India

On-site

Job Title: ML Engineer/Data Scientist
Duration: 12 months
Location: PAN India
Timings: Full time (as per company timings)
Notice Period: Within 15 days or immediate joiner
Experience: 0-2 years

About The Job
We are seeking a highly motivated and experienced ML Engineer/Data Scientist to join our growing ML/GenAI team. You will play a key role in designing and developing ML applications by evaluating, training, and/or fine-tuning models. You will also play a crucial role in developing GenAI-based solutions for our customers. As a senior member of the team, you will take ownership of projects, collaborating with engineers and stakeholders to ensure successful project delivery.

What We're Looking For
· At least 1-2 years of experience designing and building AI applications for customers and deploying them into production
· At least 1-2 years of software engineering experience building secure, scalable, and performant applications for customers
· Experience with document extraction using AI, conversational AI, vision AI, NLP, or GenAI
· Design, develop, and operationalize existing ML models by fine-tuning and personalizing them
· Evaluate machine learning models and perform the necessary tuning
· Develop prompts that instruct LLMs to generate relevant and accurate responses
· Collaborate with data scientists and engineers to analyze and preprocess datasets for prompt development, including data cleaning, transformation, and augmentation
· Conduct thorough analysis to evaluate LLM responses; iteratively modify prompts to improve LLM performance
· Hands-on customer experience with RAG solutions or fine-tuning of LLM models
· Build and deploy scalable machine learning pipelines on GCP or an equivalent cloud platform, involving data warehouses, machine learning platforms, dashboards, or CRM tools
· Experience working with the end-to-end steps involving, but not limited to, data cleaning, exploratory data analysis, dealing with outliers, handling imbalances, analyzing data distributions (univariate, bivariate, multivariate), transforming numerical and categorical data into features, feature selection, model selection, model training, and deployment (see the sketch after this posting)
· Proven experience building and deploying machine learning models in production environments for real-life applications
· Good understanding of natural language processing, computer vision, or other deep learning techniques
· Expertise in Python, NumPy, Pandas, and various ML libraries (e.g., XGBoost, TensorFlow, PyTorch, Scikit-learn, LangChain)
· Familiarity with Google Cloud or any other cloud platform and its machine learning services
· Excellent communication, collaboration, and problem-solving skills

Good to Have
· Google Cloud Certified Professional Machine Learning or TensorFlow Certified Developer certifications, or equivalent
· Experience working with one or more public cloud platforms - namely GCP, AWS, or Azure
· Experience with Amazon Lex, Google Dialogflow CX, or Microsoft Copilot Studio for CCAI agent workflows
· Experience with AutoML and vision techniques
· Master's degree in statistics, machine learning, or related fields
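The feature-transformation step mentioned above can be illustrated with a small scikit-learn pipeline. The sketch below uses invented column names and toy data, and a random forest is chosen only as a placeholder model.

```python
# Minimal sketch: turn numerical and categorical columns into features,
# then train a model, all in a single scikit-learn pipeline.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "age":     [25, 40, 31, 58, 22, 46, 35, 29],
    "income":  [30_000, 85_000, 52_000, 120_000, 28_000, 95_000, 61_000, 40_000],
    "segment": ["basic", "premium", "basic", "premium", "basic", "premium", "basic", "basic"],
    "churned": [1, 0, 0, 0, 1, 0, 1, 1],
})

X, y = df.drop(columns="churned"), df["churned"]

preprocess = ColumnTransformer([
    ("numeric", StandardScaler(), ["age", "income"]),                       # scale numeric features
    ("categorical", OneHotEncoder(handle_unknown="ignore"), ["segment"]),   # encode categories
])

pipeline = Pipeline([
    ("preprocess", preprocess),
    ("model", RandomForestClassifier(n_estimators=100, random_state=0)),
])

pipeline.fit(X, y)
print("training accuracy:", pipeline.score(X, y))
```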

Posted 2 weeks ago

Apply

6.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Hybrid

· Python libraries and frameworks such as FastAPI, Pandas, NumPy, Pydicom (see the sketch below)
· Front-end development skills with proficiency in JavaScript, React, and TypeScript
· Strong experience in Python, Flask, and HTML
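A minimal sketch of a FastAPI service in the spirit of this stack appears below; the endpoint, payload fields, and statistics returned are assumptions for illustration, not part of the role description.

```python
# Minimal FastAPI sketch: one POST endpoint returning NumPy statistics
# for a posted list of numbers. Names are illustrative only.
from fastapi import FastAPI
from pydantic import BaseModel
import numpy as np

app = FastAPI()

class Series(BaseModel):
    values: list[float]

@app.post("/summary")
def summarize(series: Series) -> dict:
    """Return basic NumPy statistics for the posted values."""
    arr = np.asarray(series.values, dtype=float)
    return {"mean": float(arr.mean()), "std": float(arr.std()), "n": int(arr.size)}

# Run locally (assumes uvicorn is installed):
#   uvicorn main:app --reload
```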

Posted 2 weeks ago

Apply