Jobs
Interviews

26,630 ML Jobs - Page 4

Set up a Job Alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0.0 - 2.0 years

1 - 5 Lacs

Chennai

Remote

Junior Machine Learning Engineer Role

Key Responsibilities:
Machine Learning Development Support: Assist in designing, developing and deploying ML models and algorithms, under the guidance of senior engineers, to tackle client challenges across banking, legal and related sectors.
Cloud & MLOps Support: Help implement ML solutions on AWS (with emphasis on Amazon SageMaker). Contribute to building and maintaining CI/CD pipelines using infrastructure-as-code tools such as CloudFormation and Terraform to automate model training and deployment.
Algorithm Implementation & Testing: Write clean, efficient Python code to implement ML algorithms and data pipelines. Conduct experiments, evaluate model performance (e.g., accuracy, precision, recall) and document results.
Collaboration & Communication: Work closely with data scientists, ML engineers and DevOps teams to integrate models into production. Participate in sprint meetings and client calls, conveying technical updates in clear, concise terms.
Quality, Documentation & Compliance: Maintain thorough documentation of data preprocessing steps, model parameters and deployment workflows. Follow data security best practices and ensure compliance with confidentiality requirements for financial and legal data.

Required Qualifications & Experience:
Education: Bachelor's degree in Computer Science, Engineering, Data Science or a closely related discipline.
Experience: 0–2 years of practical exposure to machine learning or software development; this may include internships, academic projects or early professional roles.
Programming & ML Skills: Proficiency in Python (including pandas, NumPy, scikit-learn). Basic understanding of ML concepts and model evaluation techniques.
Cloud & DevOps Familiarity: Hands-on coursework or project experience with AWS (preferably SageMaker). Awareness of CI/CD principles and infrastructure-as-code tools (CloudFormation, Terraform).
Hybrid Work Skills: Comfortable operating in a hybrid environment; able to collaborate effectively onsite in Chennai and maintain productivity when working remotely.
Soft Skills: Strong analytical thinking, problem-solving aptitude and clear written/verbal communication. Demonstrated ability to learn quickly and work in a client-focused setting.
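
As an illustration of the evaluation work this posting describes (running experiments and reporting accuracy, precision and recall), here is a minimal scikit-learn sketch; the dataset and model choice are assumptions for demonstration only and are not part of the listing.

```python
# Minimal sketch: train a classifier and report accuracy, precision and recall.
# The dataset and model are illustrative assumptions, not part of the job posting.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_score, recall_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)
preds = model.predict(X_test)

print(f"accuracy:  {accuracy_score(y_test, preds):.3f}")
print(f"precision: {precision_score(y_test, preds):.3f}")
print(f"recall:    {recall_score(y_test, preds):.3f}")
```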

Posted 12 hours ago

Apply

0 years

2 - 4 Lacs

Chennai

Remote

DESCRIPTION
This team enables automation at Amazon Robotics fulfillment centers. It serves Amazon Internal Fulfillment Technologies & Robotics teams by enabling automation, which includes real-time and offline (image/video) data auditing services. One of the team's key contributions is supporting the fulfillment centers in maintaining inventory accuracy.

An Associate in this role is required to watch video of the stowing action at a fulfillment center, understand it thoroughly, and use human judgement in combination with the available tools and resources to indicate the activity captured in the video. Associates are expected to verify or mark the location of a product through a tool while maintaining the highest level of accuracy. This process helps maintain the fulfillment center's stow quality. This is an operational role: under general supervision, the Associate performs precise and thorough video/image audits with a high degree of accuracy and speed, thus aiding defect reduction.

Key job responsibilities
The Associate has to watch several hundred videos in a shift and provide responses while meeting goals on accuracy (quality), speed (productivity) and right/acceptable practices. Associates are required to take breaks at the pre-defined slots and ensure 6.8 to 7 hours per day are spent answering the videos. Associates who are hired to work from home should maintain (1) a dedicated workspace, i.e., table, chair and sufficient lighting, and (2) a workspace where work-related data cannot be accessed by anyone other than the employee.

The candidate is expected to demonstrate:
Willingness to work in a non-tech role for a contract duration of 6 months
Ability to audit image-, video- and text-based jobs
Ability to identify details from blurry, less sharp videos and provide the correct response; requires a high level of attention and focus on screen
Willingness to work on incremental targets/goals on quality and productivity
A fast pace of implementation and consistent performance
Ability to work in rotational shifts (including night shifts) and with remote teams, and to be an exceptionally good team player
Readiness to come to the office for a few days when required (applicable to associates working from home)
Willingness to switch on the laptop camera during virtual meetings

A day in the life
Associates work in a 24x7 environment with rotational shifts. Associates work a 9-hour shift, including pre-scheduled breaks. The shift timings are subject to change every 3-4 months or as per business requirements. If an associate is working a night shift, a night shift allowance will be provided as per Amazon's applicable work policy. Weekly offs: a rotational two-consecutive-day off (it is a 5-day working week with 2 consecutive days off, not necessarily Saturday and Sunday), or as per business discretion.

About the team
The Data Auditing Operations team provides human support to Amazon fulfillment facilities with the goal of enabling hands-free active stowing through visual audits of videos/images. Videos of brief duration (typically between 15 and 20 seconds) are sent to the Operations team for humans to audit with information on the products being stowed at fulfillment centers. For business use, these videos must be thoroughly reviewed and audited using best human judgement. The effectiveness of the automated process is increased by using videos that Associates have audited. This process helps maintain stow quality at the fulfillment center, and the Associate will be further evaluated for performance improvements/coaching.

BASIC QUALIFICATIONS
Bachelor's degree

PREFERRED QUALIFICATIONS
Ability to work a flexible schedule/shift/work area, including weekends, nights, and/or holidays

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 12 hours ago

Apply

10.0 years

3 - 7 Lacs

Chennai

On-site

Vice President, Production Services Application Support I

At BNY, our culture empowers you to grow and succeed. As a leading global financial services company at the center of the world's financial system, we touch nearly 20% of the world's investible assets. Every day around the globe, our 50,000+ employees bring the power of their perspective to the table to create solutions with our clients that benefit businesses, communities and people everywhere. We continue to be a leader in the industry, awarded as a top home for innovators and for creating an inclusive workplace. Through our unique ideas and talents, together we help make money work for the world. This is what #LifeAtBNY is all about.

We're seeking a future team member for the role of Vice President, Production Services Application Support I to join our AI Hub team. This role is located in Chennai, TN (Hybrid).

In this role, you'll make an impact in the following ways:
Resolve and triage any issues related to critical applications, servers, networks, and the overall health of the application.
Maintain the operational stability and integrity of BNY's AI Hub platform, which is leveraged by other LOBs.
Build and maintain standard procedures to troubleshoot and resolve events.
Work with AI Hub engineers to reduce mean time to resolution and enhance delivery of the service. Maintain communication with vendors and the team to ensure they remain updated on any issues in their domains.
Regularly interact with internal customers and support teams. Treat stakeholders with diplomacy and politeness, handling both the non-technical and technical requirements of users.

To be successful in this role, we're seeking the following:
Bachelor's degree in information technology or computer science, and/or equivalent work experience in similar fields.
10+ years of experience in technology support areas, including Java, Python, networking, Windows and Unix, performing hands-on IT infrastructure and application troubleshooting.
Proficiency in SQL and Splunk queries to understand logs and build dashboards.
Understanding of AI and ML, agents, models and the core technology behind how AI systems learn and make decisions.
Understanding of Microsoft Azure and GCP (Google Cloud Platform).
Background and experience working in an enterprise environment with ITIL Service Management disciplines, including Request, Incident, Problem and Change processes.
Self-motivated, with key strengths in initiative, dependability, and teamwork.

At BNY, our culture speaks for itself. Here are a few of our awards:
America's Most Innovative Companies, Fortune, 2024
World's Most Admired Companies, Fortune, 2024
Human Rights Campaign Foundation, Corporate Equality Index, 100% score, 2023-2024
Best Places to Work for Disability Inclusion, Disability: IN, 100% score, 2023-2024
"Most Just Companies", Just Capital and CNBC, 2024
Dow Jones Sustainability Indices, top-performing company for sustainability, 2024
Bloomberg Gender-Equality Index (GEI), 2023

Our Benefits and Rewards: BNY offers highly competitive compensation, benefits, and wellbeing programs rooted in a strong culture of excellence and our pay-for-performance philosophy. We provide access to flexible global resources and tools for your life's journey. Focus on your health, foster your personal resilience, and reach your financial goals as a valued member of our team, along with generous paid leaves, including paid volunteer time, that can support you and your family through moments that matter.
BNY is an Equal Employment Opportunity/Affirmative Action Employer - Underrepresented racial and ethnic groups/Females/Individuals with Disabilities/Protected Veterans.

Posted 12 hours ago

Apply

0 years

0 Lacs

Chennai

On-site

Vice President, Data Scientist I

At BNY, our culture empowers you to grow and succeed. As a leading global financial services company at the center of the world's financial system, we touch nearly 20% of the world's investible assets. Every day around the globe, our 50,000+ employees bring the power of their perspective to the table to create solutions with our clients that benefit businesses, communities and people everywhere. We continue to be a leader in the industry, awarded as a top home for innovators and for creating an inclusive workplace. Through our unique ideas and talents, together we help make money work for the world. This is what #LifeAtBNY is all about.

We're seeking a future team member for the role of Vice President, Data Scientist I to join our BNY team. This role is in Chennai, TN (Hybrid).

In this role, you'll make an impact in the following ways:
Mining or extracting useful data from valuable data sources.
Using machine learning tools to select features and to create and optimize classifiers.
Preprocessing structured and unstructured data.
Enhancing data collection procedures to include relevant information for analytic systems.
Processing, cleansing, and verifying data quality for analysis.
Working with colleagues in other departments to improve business outcomes.
Creating and applying custom data models and algorithms to data sets.
Keeping up with and contributing to the latest AI research, applying new findings to ongoing projects.
Ensuring ethical AI development practices, focusing on fairness, transparency, and privacy.
Leading the design, development, and implementation of LLMs and ML models for NLP applications.
Working with cross-functional teams to define requirements, conduct experiments, and evaluate models for performance and scalability.
Staying on top of the latest developments in LLM, ML, and NLP technologies and applying them to enhance our products and services.
Mentoring junior engineers and providing technical guidance on LLM, ML, and NLP best practices.

To be successful in this role, we're seeking the following:
Bachelor's degree in Computer Science, Engineering, or a related field. Knowledge of the financial domain and investment banking is a plus.
Knows data well and has working knowledge of various Big Data technologies and their use in AI/ML.
Expertise in statistics and probability, and the evaluation metrics relevant to different algorithms.
Has built machine learning models, including data preparation techniques, feature engineering techniques, and related ML algorithms, to solve real-world problems.
Proficient in Python and common machine learning and AI frameworks and packages, such as TensorFlow, PyTorch, Keras, and scikit-learn.
Experience with the data science development lifecycle and project implementation methodologies such as Agile, and stays current with AI research areas and trends.
Expertise in handling large and complex datasets, with a keen ability to uncover insights that drive strategic decisions.
Advanced proficiency in data visualization and the creation of dashboards and prototypes that communicate insights clearly.

Our Benefits and Rewards: BNY offers highly competitive compensation, benefits, and wellbeing programs rooted in a strong culture of excellence and our pay-for-performance philosophy. We provide access to flexible global resources and tools for your life's journey.
Focus on your health, foster your personal resilience, and reach your financial goals as a valued member of our team, along with generous paid leaves, including paid volunteer time, that can support you and your family through moments that matter. BNY is an Equal Employment Opportunity/Affirmative Action Employer - Underrepresented racial and ethnic groups/Females/Individuals with Disabilities/Protected Veterans.
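
To illustrate the classifier-building and optimization duties listed in this posting, here is a small sketch: feature scaling plus a grid search over hyperparameters with scikit-learn. The dataset and parameter grid are assumptions for demonstration only, not BNY's actual stack.

```python
# Illustrative sketch: feature scaling plus grid search over classifier
# hyperparameters. Dataset and parameter grid are demonstration assumptions.
from sklearn.datasets import load_wine
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([("scale", StandardScaler()), ("clf", SVC())])
grid = GridSearchCV(pipe, {"clf__C": [0.1, 1, 10], "clf__kernel": ["linear", "rbf"]}, cv=5)
grid.fit(X_train, y_train)

print("best params:", grid.best_params_)
print("test accuracy:", grid.score(X_test, y_test))
```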

Posted 12 hours ago

Apply

4.0 years

3 - 5 Lacs

Vadodara

On-site

Role & Responsibilities
4+ years of experience applying AI to practical uses.
Develop and train computer vision models for tasks like: object detection and tracking (YOLO, Faster R-CNN, etc.); image classification, segmentation, and OCR (e.g., PaddleOCR, Tesseract); face recognition/blurring, anomaly detection, etc.
Optimize models for performance on edge devices (e.g., NVIDIA Jetson, OpenVINO, TensorRT).
Process and annotate image/video datasets; apply data augmentation techniques.
Proficiency in Large Language Models.
Strong understanding of statistical analysis and machine learning algorithms.
Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
Understanding of image processing concepts (thresholding, contour detection, transformations, etc.).
Experience in model optimization, quantization, or deploying to the edge (Jetson Nano/Xavier, Coral, etc.).
Strong programming skills in Python (or C++), with expertise in OpenCV, NumPy, PyTorch/TensorFlow, and computer vision models like YOLOv5/v8, Mask R-CNN, and DeepSORT.
Implement and optimize machine learning pipelines and workflows for seamless integration into production systems.
Hands-on experience with at least one real-time CV application (e.g., surveillance, retail analytics, industrial inspection, AR/VR).
Engage with multiple teams and contribute to key decisions; provide solutions to problems that apply across multiple teams.
Lead the implementation of large language models in AI applications.
Research and apply cutting-edge AI techniques to enhance system performance.
Contribute to the development and deployment of AI solutions across various domains.

Requirements
Design, develop, and deploy ML models for: OCR-based text extraction from scanned documents (PDFs, images); table and line-item detection in invoices, receipts, and forms; named entity recognition (NER) and information classification.
Evaluate and integrate third-party OCR tools (e.g., Tesseract, Google Vision API, AWS Textract, Azure OCR, PaddleOCR, EasyOCR).
Develop pre-processing and post-processing pipelines for noisy image/text data (see the OCR sketch after this listing).
Familiarity with video analytics platforms (e.g., DeepStream, Streamlit-based dashboards).
Experience with MLOps tools (MLflow, ONNX, Triton Inference Server).
Background in academic CV research or published papers.
Knowledge of GPU acceleration, CUDA, or hardware integration (cameras, sensors).
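
As a rough illustration of the OCR pre-processing work referenced above, here is a minimal OpenCV + pytesseract sketch. The input file name is hypothetical, and a real pipeline would add deskewing, layout analysis and post-processing.

```python
# Illustrative OCR pre-processing sketch using OpenCV and Tesseract (pytesseract).
# "scanned_invoice.png" is a hypothetical input file, not taken from the posting.
import cv2
import pytesseract

img = cv2.imread("scanned_invoice.png")               # hypothetical scanned document
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)          # drop colour information
gray = cv2.medianBlur(gray, 3)                        # reduce scanner noise
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # binarise

text = pytesseract.image_to_string(binary)            # extract raw text
print(text)
```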

Posted 12 hours ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Company Description
AppTestify is a leading provider of On-Demand Testing and Digital Engineering Services, delivering scalable solutions for businesses of all sizes. As a renowned software engineering and QA services provider, AppTestify enables faster deployment of superior software. With over 100 engineers, the company has served more than 120 clients globally. Their expertise includes DevOps, automation testing, API testing, functional testing, mobile testing, Salesforce application testing, and comprehensive digital engineering solutions.

Job Title: Machine Learning Engineer
Experience: 5 to 8 Years
Location: Pune

Job Description: We are looking for highly motivated and experienced Machine Learning Engineers to join our advanced analytics and AI team. The ideal candidates will have strong proficiency in building, training, and deploying machine learning models at scale using modern ML tools and frameworks. Experience with LLMs (Large Language Models) such as OpenAI and Hugging Face Transformers is highly desirable.

Key Responsibilities:
Design, develop, and deploy machine learning models for real-world applications.
Implement and optimize end-to-end ML pipelines using PySpark and MLflow.
Work with structured and unstructured data using Pandas, NumPy, and other data processing libraries.
Train and fine-tune models using scikit-learn, TensorFlow, or PyTorch.
Integrate and experiment with Large Language Models (LLMs) such as OpenAI GPT, Hugging Face Transformers, etc.
Collaborate with cross-functional teams including data engineers, product managers, and software developers.
Monitor model performance and continuously improve model accuracy and reliability.
Maintain proper versioning and reproducibility of ML experiments using MLflow (see the tracking sketch after this listing).

Required Skills:
Strong programming experience in Python.
Solid understanding of machine learning algorithms, model development, and evaluation techniques.
Experience with PySpark for large-scale data processing.
Proficient with MLflow for tracking experiments and model lifecycle management.
Hands-on experience with Pandas, NumPy, and Scikit-learn.
Familiarity or hands-on experience with LLMs (e.g., OpenAI, Hugging Face Transformers).
Understanding of MLOps principles and deployment best practices.

Preferred Qualifications:
Bachelor's or Master's degree in Computer Science, AI/ML, Data Science, or a related field.
Experience in cloud ML platforms (AWS SageMaker, Azure ML, or GCP Vertex AI) is a plus.
Strong analytical and problem-solving abilities.
Excellent communication and teamwork skills.

Apply directly - https://hrms.apptestify.com/apply/688c7ed095f9ce582a3a224a
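
For context on the MLflow experiment-tracking responsibility mentioned above, a minimal sketch of logging parameters, metrics and a model for a single run is shown below; the dataset, model and parameter values are illustrative assumptions, not AppTestify's actual pipeline.

```python
# Illustrative MLflow tracking sketch: log parameters, metrics and a model for one run.
# Dataset, model and parameter values are demonstration assumptions.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="rf-baseline"):
    n_estimators = 200
    model = RandomForestClassifier(n_estimators=n_estimators, random_state=0)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    mlflow.log_param("n_estimators", n_estimators)      # hyperparameter for this run
    mlflow.log_metric("test_accuracy", acc)              # evaluation result
    mlflow.sklearn.log_model(model, "model")              # versioned model artifact
```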

Posted 12 hours ago

Apply

3.0 - 8.0 years

5 - 8 Lacs

Noida

On-site

Years of Experience: 3-8 Years
Location: Noida (Work from Office)

Job Description
The role involves managing and organizing large-scale video datasets, building automation tools, and ensuring accurate performance evaluations of models. The ideal candidate should be proactive, hands-on, and capable of handling a small team.

Key Responsibilities
Data Cataloging & Management: Maintain structured catalogs of video data with consistent labeling and metadata. Organize datasets for efficient access, versioning, and reuse across model development cycles.
Tool Development & Automation: Build or assist in developing internal tools to automate data handling, quality checks, and reporting. Streamline data pipelines to support rapid model development and testing.
Accuracy Computation & Reporting: Implement evaluation pipelines to compute model metrics such as accuracy, precision, recall, etc. Generate regular performance reports to support model tuning and validation efforts (see the reporting sketch after this listing).
Team Collaboration & Coordination: Lead a small team (up to 3 members) in daily data-related activities, ensuring quality and timely delivery. Coordinate with ML engineers, QA teams, and product stakeholders for end-to-end data lifecycle management.

Qualifications & Required Skills
B.Tech. Experience in data analysis, preferably in video/image-based domains.
Desirable knowledge of data-handling tools like Python (pandas, NumPy), SQL, and Excel.
Familiarity with video annotation workflows, dataset versioning, and evaluation techniques.
Experience in building or using automation tools for data processing.
Ability to manage tasks and mentor junior team members effectively.
Good communication and documentation skills.
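
As an illustration of the accuracy-computation and reporting pipeline described above, a small pandas/scikit-learn sketch follows. The column names and sample values are hypothetical.

```python
# Illustrative evaluation-report sketch: compute accuracy/precision/recall per dataset
# version from a labelled results table. Column names and values are hypothetical.
import pandas as pd
from sklearn.metrics import accuracy_score, precision_score, recall_score

results = pd.DataFrame({
    "dataset_version": ["v1", "v1", "v1", "v2", "v2", "v2"],
    "ground_truth":    [1, 0, 1, 1, 1, 0],
    "prediction":      [1, 0, 0, 1, 1, 1],
})

def summarise(group: pd.DataFrame) -> pd.Series:
    # One row of the report per dataset version.
    return pd.Series({
        "accuracy":  accuracy_score(group["ground_truth"], group["prediction"]),
        "precision": precision_score(group["ground_truth"], group["prediction"], zero_division=0),
        "recall":    recall_score(group["ground_truth"], group["prediction"], zero_division=0),
    })

report = results.groupby("dataset_version").apply(summarise)
print(report)
```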

Posted 12 hours ago

Apply

0 years

3 - 4 Lacs

Noida

On-site

Company Description
The Smart Cube, a WNS company, is a trusted partner for high-performing intelligence that answers critical business questions. And we work with our clients to figure out how to implement the answers, faster.

Job Description
Design, develop, and deploy ML/DL models (including Generative AI) using Python. Apply NLP techniques for text analysis and model building. Conduct data preprocessing, feature engineering, and model evaluation. Collaborate with teams to understand needs and deliver AI solutions. Stay updated with the latest advancements in relevant AI fields. Deploy and monitor ML models in production environments. Good knowledge of AWS/Azure cloud is a plus.

Required Skills and Qualifications:
Proficiency in Python and relevant ML/DL libraries (e.g., scikit-learn, TensorFlow, Keras, PyTorch).
Strong understanding and practical application of Machine Learning and Deep Learning algorithms.
Experience with Natural Language Processing (NLP) techniques and libraries.
Familiarity with Generative AI models and their applications.
Experience in data preprocessing, feature engineering, and model evaluation.
Ability to deploy and monitor machine learning models.
Excellent analytical and problem-solving skills.
Strong communication skills to present data insights to stakeholders.
Ability to work independently and as part of a team.

Qualifications
Graduate/Post Graduate
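
To illustrate the NLP model-building work this posting refers to, here is a minimal text-classification sketch (TF-IDF features plus a linear classifier). The tiny corpus and labels are assumptions for demonstration only.

```python
# Illustrative NLP sketch: TF-IDF features plus a linear classifier for text
# categorisation. The in-line corpus and labels are demonstration assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

texts = [
    "The quarterly revenue exceeded analyst expectations",
    "New vaccine trial shows promising immune response",
    "Central bank raises interest rates to curb inflation",
    "Researchers publish study on protein folding",
]
labels = ["finance", "health", "finance", "health"]

clf = Pipeline([("tfidf", TfidfVectorizer()), ("model", LogisticRegression())])
clf.fit(texts, labels)

print(clf.predict(["Stock markets rally after policy announcement"]))
```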

Posted 12 hours ago

Apply

3.0 years

8 - 9 Lacs

Noida

On-site

Job Summary: We are seeking a skilled and innovative AI Application Developer with a strong background in software development and practical experience in implementing AI-based solutions. The ideal candidate will have hands-on expertise in Natural Language Processing (NLP), Computer Vision, or other machine learning technologies, and should be capable of designing and integrating AI models into production-level web or mobile applications.

Key Responsibilities:
Design, develop, and implement AI-driven applications using technologies such as NLP, Computer Vision, or predictive analytics.
Collaborate with product managers, data scientists, and UI/UX teams to gather requirements and translate them into AI solutions.
Integrate AI/ML models with frontend and backend systems (web/mobile); a minimal serving sketch follows this listing.
Optimize performance of AI models for scalability, speed, and efficiency.
Maintain clear documentation of design decisions, code, and models.
Stay up to date with emerging AI tools, frameworks, and industry trends.
Conduct testing, validation, and tuning of AI models for real-world deployment.
Support debugging and enhancement of existing AI-integrated applications.

Required Qualifications & Experience:
Education: Minimum B.E./B.Tech/MCA in Computer Science, IT, or a related field.
Experience: 3–5 years of experience in IT/software development, with a minimum of 2 years of hands-on experience in AI/ML technologies including NLP, Computer Vision, or similar domains. Strong experience integrating AI models into production environments (web or mobile apps).

Technical Skills:
Proficiency in programming languages: Python, Java, or C++.
AI/ML frameworks: TensorFlow, PyTorch, Keras, or OpenCV.
Experience with NLP libraries (e.g., SpaCy, NLTK, BERT).
Experience with Computer Vision libraries (e.g., OpenCV, YOLO, Detectron2).
Familiarity with RESTful APIs and cloud-based services (AWS, Azure, GCP).
Knowledge of CI/CD pipelines, Git, and containerization (Docker).

Job Type: Contractual / Temporary
Contract length: 12 months
Pay: ₹70,000.00 - ₹80,000.00 per month
Benefits: Provident Fund
Work Location: In person
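
As a sketch of what integrating an ML model with a backend system can look like, the example below exposes a trained model behind a REST endpoint with Flask. The model, feature format and route are assumptions, not the employer's actual architecture.

```python
# Illustrative sketch: serve a trained model behind a REST endpoint with Flask.
# The model, feature names and route are hypothetical, for demonstration only.
from flask import Flask, request, jsonify
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

app = Flask(__name__)

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)   # stand-in for a real trained model

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]         # expects a list of 4 numbers
    prediction = int(model.predict([features])[0])
    return jsonify({"prediction": prediction})

if __name__ == "__main__":
    app.run(port=5000)
```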

Posted 12 hours ago

Apply

10.0 years

7 - 9 Lacs

Noida

On-site

Job Description: AI Driven Marketing Strategist Location: Noida, Uttar Pradesh, IND Our mission is to unlock human potential. We welcome you for who you are, the background you bring, and we embrace individuals who get excited about learning. Bring your experiences, your perspectives, and your passion; it’s in our differences that we empower the way the world learns. About the Role: Ready to revolutionize marketing operations and deliver measurable business impact? Join Wiley as our AI-Driven Marketing Strategist and lead the transformation from traditional marketing to cutting-edge AI-powered strategies. What You'll Do: Lead the strategic transformation of marketing from basic practices to sophisticated AI-driven technologies Develop and execute comprehensive AI-driven marketing strategies to optimise lead generation, customer engagement, and drive revenue growth Implement AI-powered tools to enhance campaign performance Lead AI-based personalisation across content marketing, email campaigns, and paid media for maximum impact Analyse market trends and customer behaviour using advanced AI tools to deliver actionable insights that drive strategic decisions Manage and mentor marketing colleagues while collaborating with cross-functional teams to integrate AI solutions Monitor, evaluate and continuously optimize AI-driven marketing initiatives Transform our marketing into an AI-first powerhouse What We're Looking For: 10+ years marketing experience with 3+ years hands-on AI/ML integration Proven track record with AI-powered marketing tools Leadership experience driving digital transformation Results-oriented mindset with measurable business impact Why This Role Matters: You'll be the architect of our marketing future, building sophisticated AI-driven methodologies that deliver superior ROI while establishing Wiley India as a leader in AI-powered marketing. Ready to take the next step in your career? Apply today! About Wiley: Wiley is a trusted leader in research and learning, our pioneering solutions and services are paving the way for knowledge seekers as they work to solve the world's most important challenges. We are advocates of advancement, empowering knowledge-seekers to transform today's biggest obstacles into tomorrow's brightest opportunities. With over 200 years of experience in publishing, we continue to evolve knowledge seekers' steps into strides, illuminating their path forward to personal, educational, and professional success at every stage. Around the globe, we break down barriers for innovators, empowering them to advance discoveries in their fields, adapt their workforces, and shape minds. Wiley is an equal opportunity/affirmative action employer. We evaluate all qualified applicants and treat all qualified applicants and employees without regard to race, color, religion, sex, sexual orientation, gender identity or expression, national origin, disability, protected veteran status, genetic information, or based on any individual's status in any group or class protected by applicable federal, state or local laws. Wiley is also committed to providing reasonable accommodation to applicants and employees with disabilities. Applicants who require accommodation to participate in the job application process may contact tasupport@wiley.com for assistance. We are proud that our workplace promotes continual learning and internal mobility. 
Our values support courageous teammates, needle movers, and learning champions all while striving to support the health and well-being of all employees. We offer meeting-free Friday afternoons allowing more time for heads down work and professional development, and through a robust body of employee programing we facilitate a wide range of opportunities to foster community, learn, and grow. We are committed to fair, transparent pay, and we strive to provide competitive compensation in addition to a comprehensive benefits package. It is anticipated that most qualified candidates will fall within the range, however the ultimate salary offered for this role may be higher or lower and will be set based on a variety of non-discriminatory factors, including but not limited to, geographic location, skills, and competencies. Wiley proactively displays target base pay range for United Kingdom, Canada and USA based roles. When applying, please attach your resume/CV to be considered. #LI-RB1

Posted 12 hours ago

Apply

3.0 years

0 Lacs

Uttar Pradesh

On-site

DESCRIPTION
We are seeking an exceptional Business Analyst to drive analysis and insights that help our teams make meaningful, data-backed business decisions. The successful candidate will possess a strong fervor for analytics, performance evaluation, setting high standards, accuracy, and staying ahead of a dynamic and fast-evolving business. You should have excellent communication skills so you can work with business leaders and share succinct performance insights. Above all, you should be passionate about people, comfortable using AI tools, and have a deep understanding of Amazon performance evaluation principles.

About the team
Global Operations-Artificial Intelligence (GO-AI) is part of Amazon Robotics (AR), an org in Fulfillment Technologies & Robotics (FTR). GO-AI enables Computer Vision (CV) and ML-based automation by delivering high-quality data to improve the AI and Machine Learning (ML) product lifecycle through 'near real-time human-in-the-loop' (NRT HITL) and offline annotations.

BASIC QUALIFICATIONS
Bachelor's degree or equivalent
3+ years of experience in a business analyst, data analyst or similar role
5+ years of experience with Excel (including VBA, pivot tables, array functions, power pivots, etc.) and data visualization tools such as Tableau
Demonstrated proficiency analyzing data and creating dashboards and business reports using SQL
3+ years of program management experience, leading projects involving multiple stakeholders
Proven experience writing scripts using SQL, extracting and analyzing data to provide precise reports to stakeholders (a small SQL-to-report sketch follows this listing)
Demonstrated ability to maintain a high level of integrity and discretion when handling confidential information
Demonstrated written and verbal communication skills and the ability to influence without authority
Proven ability to present complex information in a clear and concise manner to executives
Experience defining requirements and using data and metrics to draw business insights and make business recommendations

PREFERRED QUALIFICATIONS
Advanced SQL proficiency: ability to write complex SQL statements and manipulate massive amounts of data; working knowledge of Python; experience creating dashboards/on-demand reports using QuickSight
Experience using AI tools
Experience creating scorecards or similar performance evaluation reports
Familiarity with defining configuration specifications and business analysis requirements

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
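
To illustrate the SQL-to-report workflow mentioned in the qualifications, here is a small sketch that runs a SQL aggregation and loads the result into pandas. The in-memory SQLite table and its sample rows are hypothetical.

```python
# Illustrative reporting sketch: run a SQL aggregation and load the result into pandas
# for a quick summary table. The in-memory SQLite data is hypothetical sample data.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE audits (site TEXT, auditor TEXT, correct INTEGER);
    INSERT INTO audits VALUES
        ('DEL1', 'A', 1), ('DEL1', 'A', 0), ('DEL1', 'B', 1),
        ('BLR2', 'A', 1), ('BLR2', 'B', 1), ('BLR2', 'B', 0);
""")

query = """
    SELECT site, auditor, COUNT(*) AS audits, AVG(correct) AS accuracy
    FROM audits
    GROUP BY site, auditor
    ORDER BY site, auditor
"""
report = pd.read_sql_query(query, conn)   # result feeds a dashboard or scorecard
print(report)
```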

Posted 12 hours ago

Apply

6.0 years

0 Lacs

India

Remote

Experience : 6.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: Airflow, LLMs, MLOps, Generative AI, Python Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization and more. We work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems and provide our customers with accurate, real-time, fault tolerant solutions to their ever-growing data needs. We support various OLTP and analytics environments, including our Advanced Analytics and Digital Experience Management products. We are looking for skilled engineers experienced with building and optimizing cloud-scale distributed systems to develop our next-generation ingestion, processing and storage solutions. You will work closely with other engineers and the product team to build highly scalable systems that tackle real-world data problems. Our customers depend on us to provide accurate, real-time and fault tolerant solutions to their ever growing data needs. This is a hands-on, impactful role that will help lead development, validation, publishing and maintenance of logical and physical data models that support various OLTP and analytics environments. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What You Will Be Doing Lead the design, development, and deployment of AI/ML models for threat detection, anomaly detection, and predictive analytics in cloud and network security. Architect and implement scalable data pipelines for processing large-scale datasets from logs, network traffic, and cloud environments. Apply MLOps best practices to deploy and monitor machine learning models in production. Collaborate with cloud architects and security analysts to develop cloud-native security solutions leveraging platforms like AWS, Azure, or GCP. Build and optimize Retrieval-Augmented Generation (RAG) systems by integrating large language models (LLMs) with vector databases for real-time, context-aware applications. Analyze network traffic, log data, and other telemetry to identify and mitigate cybersecurity threats. Ensure data quality, integrity, and compliance with GDPR, HIPAA, or SOC 2 standards. Drive innovation by integrating the latest AI/ML techniques into security products and services. Mentor junior engineers and provide technical leadership across projects. Required Skills And Experience AI/ML Expertise Proficiency in advanced machine learning techniques, including neural networks (e.g., CNNs, Transformers) and anomaly detection. Experience with AI frameworks like TensorFlow, PyTorch, and Scikit-learn. Strong understanding of MLOps practices and tools (e.g., MLflow, Kubeflow). 
Experience building and deploying Retrieval-Augmented Generation (RAG) systems, including integration with LLMs and vector databases. Data Engineering Expertise designing and optimizing ETL/ELT pipelines for large-scale data processing. Hands-on experience with big data technologies (e.g., Apache Spark, Kafka, Flink). Proficiency in working with relational and non-relational databases, including ClickHouse and BigQuery. Familiarity with vector databases such as Pinecone and PGVector and their application in RAG systems. Experience with cloud-native data tools like AWS Glue, BigQuery, or Snowflake. Cloud and Security Knowledge Strong understanding of cloud platforms (AWS, Azure, GCP) and their services. Experience with network security concepts, extended detection and response, and threat modeling. Software Engineering Proficiency in Python, Java, or Scala for data and ML solution development. Expertise in scalable system design and performance optimization for high-throughput applications. Leadership and Collaboration Proven ability to lead cross-functional teams and mentor engineers. Strong communication skills to present complex technical concepts to stakeholders. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
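
As a rough sketch of the Retrieval-Augmented Generation (RAG) work described in this posting, the example below ranks documents by embedding similarity and assembles a prompt. The embed() function is a hypothetical stand-in for a real embedding model, and the final LLM call is deliberately omitted.

```python
# Illustrative RAG retrieval sketch: rank documents by cosine similarity of embeddings
# and build a prompt for an LLM. embed() is a hypothetical stand-in for a real
# embedding model; no actual vector database or LLM is called here.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical embedding: hash words into a fixed-size vector (demo only)."""
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)

documents = [
    "Anomalous login volume detected from a single ASN.",
    "Routine certificate rotation completed for the API gateway.",
    "Outbound traffic spike to an unrecognised domain flagged by DLP.",
]
doc_vectors = np.stack([embed(d) for d in documents])

query = "possible data exfiltration over the network"
scores = doc_vectors @ embed(query)                 # cosine similarity (vectors normalised)
top_docs = [documents[i] for i in np.argsort(scores)[::-1][:2]]

prompt = "Context:\n" + "\n".join(top_docs) + f"\n\nQuestion: {query}\nAnswer:"
print(prompt)                                       # would be passed to an LLM in a real system
```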

Posted 12 hours ago

Apply

50.0 years

0 Lacs

Noida

On-site

Who we are: Irdeto is the world leader in digital platform cybersecurity, empowering businesses to innovate for a secure, connected future. Building on over 50 years of expertise in security, Irdeto's services and solutions protect revenue, enable growth and fight cybercrime in video entertainment, video games, and connected industries including transport, health and infrastructure. Irdeto is the security partner dedicated to empowering a secure world where people can connect with confidence. With teams and offices around the world, Irdeto's greatest asset is its people - our diversity is celebrated through an inclusive workplace, where everyone has an equal opportunity to drive innovation and contribute to Irdeto's success.

The Role: As a Software Engineer you will be joining our Video Entertainment team and will play a pivotal role in developing and enhancing our solutions and products. You'll work as part of a dynamic and cross-functional team to ensure the seamless delivery of high-quality deliverables. You will work on the latest technologies in the streaming industry, and your expertise will contribute to the innovation and enhancement of our solutions, ensuring our global customers have the best possible experience.

Your mission at Irdeto:
Develop and maintain software applications and services for our OTT platform, ensuring high performance, scalability, and reliability.
Debug, troubleshoot, and resolve software defects and performance issues, ensuring a seamless user experience.
Write clean, efficient, and maintainable code, following coding standards and software development processes.
Stay up to date with industry trends and best practices and contribute to the continuous improvement of our software development processes.

How you can add value to the team:
Bachelor's degree in Computer Science, Software Engineering, or a related field.
3+ years of experience in backend development with modern frameworks (Node.js, Go, TypeScript, or Java preferred).
Deep understanding of REST APIs, microservices, asynchronous processing, and scalable architectures.
Experience with cloud platforms (AWS, GCP, or Azure) and container orchestration (Docker, Kubernetes).
Familiarity with AI/ML pipelines, either integrating ML models into the backend or building services to serve AI functionality.
Hands-on experience with databases (SQL and NoSQL), caching, and pub/sub messaging systems (Kafka, RabbitMQ).
Strong grasp of security, performance, and reliability considerations in streaming systems.
Excellent communication skills and a passion for collaborative problem-solving.

What you can expect from us: We invest in our talented employees and promote collaboration, creativity, and innovation while supporting health and well-being across our global workforce. In addition to competitive remuneration, we offer:
A multicultural and international environment where diversity is celebrated
Professional education opportunities and training programs
Innovation sabbaticals
Volunteer Day
State-of-the-art office spaces
Additional perks tailored to local offices (e.g., on-site gyms, fresh fruit, parking, yoga rooms, etc.)

Equal Opportunity at Irdeto
Irdeto is proud to be an equal opportunity employer. All decisions are based on qualifications and business needs, and we do not tolerate discrimination or harassment. We welcome applications from individuals with diverse abilities and provide accommodation during the hiring process upon request.
If you’re excited about this role but don’t meet every qualification, we encourage you to apply. We believe diverse perspectives and experiences make our teams stronger. Welcome to Irdeto!

Posted 12 hours ago

Apply

0 years

0 Lacs

Uttar Pradesh

Remote

DESCRIPTION
This team enables automation at Amazon Robotics fulfillment centers. It serves Amazon Internal Fulfillment Technologies & Robotics teams by enabling automation, which includes real-time and offline (image/video) data auditing services. One of the team's key contributions is supporting the fulfillment centers in maintaining inventory accuracy.

An Associate in this role is required to watch video of the stowing action at a fulfillment center, understand it thoroughly, and use human judgement in combination with the available tools and resources to indicate the activity captured in the video. Associates are expected to verify or mark the location of a product through a tool while maintaining the highest level of accuracy. This process helps maintain the fulfillment center's stow quality. This is an operational role: under general supervision, the Associate performs precise and thorough video/image audits with a high degree of accuracy and speed, thus aiding defect reduction.

Key job responsibilities
The Associate has to watch several hundred videos in a shift and provide responses while meeting goals on accuracy (quality), speed (productivity) and right/acceptable practices. Associates are required to take breaks at the pre-defined slots and ensure 6.8 to 7 hours per day are spent answering the videos. Associates who are hired to work from home should maintain (1) a dedicated workspace, i.e., table, chair and sufficient lighting, and (2) a workspace where work-related data cannot be accessed by anyone other than the employee.

The candidate is expected to demonstrate:
Willingness to work in a non-tech role for a contract duration of 6 months
Ability to audit image-, video- and text-based jobs
Ability to identify details from blurry, less sharp videos and provide the correct response; requires a high level of attention and focus on screen
Willingness to work on incremental targets/goals on quality and productivity
A fast pace of implementation and consistent performance
Ability to work in rotational shifts (including night shifts) and with remote teams, and to be an exceptionally good team player
Readiness to come to the office for a few days when required (applicable to associates working from home)
Willingness to switch on the laptop camera during virtual meetings

A day in the life
Associates work in a 24x7 environment with rotational shifts. Associates work a 9-hour shift, including pre-scheduled breaks. The shift timings are subject to change every 3-4 months or as per business requirements. If an associate is working a night shift, a night shift allowance will be provided as per Amazon's applicable work policy. Weekly offs: a rotational two-consecutive-day off (it is a 5-day working week with 2 consecutive days off, not necessarily Saturday and Sunday), or as per business discretion.

About the team
The Data Auditing Operations team provides human support to Amazon fulfillment facilities with the goal of enabling hands-free active stowing through visual audits of videos/images. Videos of brief duration (typically between 15 and 20 seconds) are sent to the Operations team for humans to audit with information on the products being stowed at fulfillment centers. For business use, these videos must be thoroughly reviewed and audited using best human judgement. The effectiveness of the automated process is increased by using videos that Associates have audited. This process helps maintain stow quality at the fulfillment center, and the Associate will be further evaluated for performance improvements/coaching.

BASIC QUALIFICATIONS
Bachelor's degree

PREFERRED QUALIFICATIONS
Ability to work a flexible schedule/shift/work area, including weekends, nights, and/or holidays

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 12 hours ago

Apply

2.0 years

1 - 1 Lacs

India

On-site

We are seeking a highly analytical and technically skilled Data Analyst with hands-on experience in Machine Learning to join our team. The ideal candidate will be responsible for analyzing large datasets, generating actionable insights, and building ML models to drive business solutions and innovation.

Key Responsibilities:
Collect, clean, and analyze structured and unstructured data from multiple sources.
Develop dashboards, visualizations, and reports to communicate trends and insights to stakeholders.
Identify business challenges and apply machine learning algorithms to solve them.
Build, evaluate, and deploy predictive and classification models using tools like Python, R, Scikit-learn, TensorFlow, etc.
Collaborate with cross-functional teams including product, marketing, and engineering to implement data-driven strategies.
Optimize models for performance, accuracy, and scalability.
Automate data processing and reporting workflows using scripting and cloud-based tools.
Stay updated with the latest industry trends in data analytics and machine learning.

Required Skills & Qualifications:
Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Mathematics, or a related field.
2+ years of experience in data analytics and machine learning.
Strong proficiency in SQL, Python (Pandas, NumPy, Scikit-learn), and data visualization tools like Tableau, Power BI, or Matplotlib/Seaborn.
Experience with machine learning techniques such as regression, classification, clustering, NLP, and recommendation systems (a small clustering sketch follows this listing).
Solid understanding of statistics, probability, and data mining concepts.
Familiarity with cloud platforms like AWS, GCP, or Azure is a plus.
Excellent problem-solving and communication skills.

Job Types: Full-time, Permanent
Pay: ₹10,000.00 - ₹15,000.00 per month
Ability to commute/relocate: Kalighat, Kolkata, West Bengal: Reliably commute or planning to relocate before starting work (Preferred)
Language: English (Preferred)
Work Location: In person
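
To illustrate one of the machine learning techniques listed above (clustering), here is a brief scikit-learn sketch on synthetic data; the feature names and values are assumptions for demonstration.

```python
# Illustrative clustering sketch: standardise features and group records with k-means,
# then inspect cluster sizes. The synthetic data is a demonstration assumption.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "monthly_spend": np.concatenate([rng.normal(500, 50, 100), rng.normal(2000, 200, 100)]),
    "visits":        np.concatenate([rng.normal(4, 1, 100), rng.normal(15, 3, 100)]),
})

scaled = StandardScaler().fit_transform(df)
df["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)

print(df.groupby("cluster").agg(size=("visits", "size"),
                                avg_spend=("monthly_spend", "mean")))
```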

Posted 12 hours ago

Apply

3.0 - 5.0 years

0 Lacs

India

Remote

The Computer Vision Consultant will play a pivotal role in developing and optimizing algorithms for image processing and machine learning applications. This role demands a strong background in programming, image processing techniques, and deep learning methodologies. The ideal candidate will have a demonstrated track record of utilizing various tools and libraries to enhance computer vision projects.

Main Responsibilities: As a Computer Vision Consultant, your core duties will include:
Developing and deploying computer vision algorithms using Python.
Implementing graphical user interfaces with PyQt.
Utilizing CUDA and GPUs for advanced algorithm training.
Conducting image processing tasks including edge detection and camera calibration (see the edge-detection sketch after this listing).
Applying deep learning frameworks for object detection and segmentation.

Key Requirements:
3 to 5 years of experience in Python programming.
Proficient in OOP concepts and threading.
Hands-on experience with the NumPy, SciPy, and OpenCV libraries.
Familiarity with 2D/3D LIDAR data analysis.
Experience with ML libraries such as TensorFlow, PyTorch, and Keras.
Knowledge of object detection methods (YOLO, SSD, Faster R-CNN, R-CNN).
Well-versed in semantic segmentation algorithms.

Nice to Have: Experience with image transformation and stitching techniques.

Other Details: This position is available for remote work and is expected to be a long-term engagement in the field of computer vision consulting, engaging with a diverse range of projects in various industries.
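
As an illustration of the edge-detection duties mentioned above, a short OpenCV sketch follows; it uses a synthetic image so it runs without an input file, which is a simplification compared with real camera data.

```python
# Illustrative image-processing sketch: Canny edge detection and contour extraction
# with OpenCV on a synthetic image, so the example runs without an input file.
import cv2
import numpy as np

img = np.zeros((200, 200), dtype=np.uint8)
cv2.rectangle(img, (40, 40), (160, 160), 255, -1)     # synthetic white square

edges = cv2.Canny(img, 100, 200)                      # detect edges
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

print(f"found {len(contours)} contour(s)")
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    print(f"bounding box: x={x}, y={y}, w={w}, h={h}")
```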

Posted 12 hours ago

Apply

0 years

2 - 4 Lacs

Calcutta

Remote

DESCRIPTION
This team enables automation at Amazon Robotics fulfillment centers. It serves Amazon Internal Fulfillment Technologies & Robotics teams by enabling automation, which includes real-time and offline (image/video) data auditing services. One of the team's key contributions is supporting the fulfillment centers in maintaining inventory accuracy.

An Associate in this role is required to watch video of the stowing action at a fulfillment center, understand it thoroughly, and use human judgement in combination with the available tools and resources to indicate the activity captured in the video. Associates are expected to verify or mark the location of a product through a tool while maintaining the highest level of accuracy. This process helps maintain the fulfillment center's stow quality. This is an operational role: under general supervision, the Associate performs precise and thorough video/image audits with a high degree of accuracy and speed, thus aiding defect reduction.

Key job responsibilities
The Associate has to watch several hundred videos in a shift and provide responses while meeting goals on accuracy (quality), speed (productivity) and right/acceptable practices. Associates are required to take breaks at the pre-defined slots and ensure 6.8 to 7 hours per day are spent answering the videos. Associates who are hired to work from home should maintain (1) a dedicated workspace, i.e., table, chair and sufficient lighting, and (2) a workspace where work-related data cannot be accessed by anyone other than the employee.

The candidate is expected to demonstrate:
Willingness to work in a non-tech role for a contract duration of 6 months
Ability to audit image-, video- and text-based jobs
Ability to identify details from blurry, less sharp videos and provide the correct response; requires a high level of attention and focus on screen
Willingness to work on incremental targets/goals on quality and productivity
A fast pace of implementation and consistent performance
Ability to work in rotational shifts (including night shifts) and with remote teams, and to be an exceptionally good team player
Readiness to come to the office for a few days when required (applicable to associates working from home)
Willingness to switch on the laptop camera during virtual meetings

A day in the life
Associates work in a 24x7 environment with rotational shifts. Associates work a 9-hour shift, including pre-scheduled breaks. The shift timings are subject to change every 3-4 months or as per business requirements. If an associate is working a night shift, a night shift allowance will be provided as per Amazon's applicable work policy. Weekly offs: a rotational two-consecutive-day off (it is a 5-day working week with 2 consecutive days off, not necessarily Saturday and Sunday), or as per business discretion.

About the team
The Data Auditing Operations team provides human support to Amazon fulfillment facilities with the goal of enabling hands-free active stowing through visual audits of videos/images. Videos of brief duration (typically between 15 and 20 seconds) are sent to the Operations team for humans to audit with information on the products being stowed at fulfillment centers. For business use, these videos must be thoroughly reviewed and audited using best human judgement. The effectiveness of the automated process is increased by using videos that Associates have audited. This process helps maintain stow quality at the fulfillment center, and the Associate will be further evaluated for performance improvements/coaching.

BASIC QUALIFICATIONS
Bachelor's degree

PREFERRED QUALIFICATIONS
Ability to work a flexible schedule/shift/work area, including weekends, nights, and/or holidays

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 12 hours ago

Apply

6.0 - 8.0 years

22 - 23 Lacs

Pune, Maharashtra, India

On-site

Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable, turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics and AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: ML Engineer
Experience: 6-8 years
Location: Pune/Indore office
Work Mode: Onsite
Notice Period: Immediate joiner to 15 days

Job Summary
We are looking for highly motivated and experienced Machine Learning Engineers to join our advanced analytics and AI team. The ideal candidates will have strong proficiency in building, training, and deploying machine learning models at scale using modern ML tools and frameworks. Experience with LLMs (Large Language Models) such as OpenAI and Hugging Face Transformers is highly desirable.

Key Responsibilities
Design, develop, and deploy machine learning models for real-world applications.
Implement and optimize end-to-end ML pipelines using PySpark and MLflow (a minimal PySpark pipeline sketch follows this listing).
Work with structured and unstructured data using Pandas, NumPy, and other data processing libraries.
Train and fine-tune models using scikit-learn, TensorFlow, or PyTorch.
Integrate and experiment with Large Language Models (LLMs) such as OpenAI GPT, Hugging Face Transformers, etc.
Collaborate with cross-functional teams including data engineers, product managers, and software developers.
Monitor model performance and continuously improve model accuracy and reliability.
Maintain proper versioning and reproducibility of ML experiments using MLflow.

Required Skills
Strong programming experience in Python.
Solid understanding of machine learning algorithms, model development, and evaluation techniques.
Experience with PySpark for large-scale data processing.
Proficient with MLflow for tracking experiments and model lifecycle management.
Hands-on experience with Pandas, NumPy, and Scikit-learn.
Familiarity or hands-on experience with LLMs (e.g., OpenAI, Hugging Face Transformers).
Understanding of MLOps principles and deployment best practices.

Preferred Qualifications
Bachelor's or Master's degree in Computer Science, AI/ML, Data Science, or a related field.
Experience in cloud ML platforms (AWS SageMaker, Azure ML, or GCP Vertex AI) is a plus.
Strong analytical and problem-solving abilities.
Excellent communication and teamwork skills.

Skills: pandas, MLflow, large language models, Python, MLOps, PyTorch, scikit-learn, TensorFlow, PySpark, NumPy, LLMs, machine learning
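
To illustrate the PySpark-based ML pipeline work this posting describes, here is a minimal sketch that assembles features and fits a logistic regression on a tiny in-memory DataFrame; the column names and data are hypothetical.

```python
# Illustrative PySpark ML pipeline sketch: assemble features and fit a logistic
# regression on a tiny in-memory DataFrame. Column names and data are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

data = spark.createDataFrame(
    [(1.0, 0.2, 0), (0.9, 0.1, 0), (0.2, 0.8, 1), (0.1, 0.9, 1)],
    ["f1", "f2", "label"],
)

assembler = VectorAssembler(inputCols=["f1", "f2"], outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="label")
model = Pipeline(stages=[assembler, lr]).fit(data)

model.transform(data).select("label", "prediction").show()
spark.stop()
```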

Posted 12 hours ago

Apply

0 years

12 - 15 Lacs

Bhopal

On-site

Job Summary
We’re looking for a Lead Full Stack Engineer (MERN) to drive the development of scalable web applications and lead projects involving AI/ML integration. You’ll architect solutions, mentor developers, and collaborate across teams to deliver high-impact products. This role combines hands-on coding with leadership in both traditional and emerging tech domains.
Roles & Responsibilities
Lead the development of scalable, high-performance web applications using the MERN stack (MongoDB, Express.js, React.js, Node.js).
Architect robust system designs and ensure optimal application performance.
Guide and mentor junior developers; conduct regular code reviews to maintain code quality.
Integrate third-party APIs and services seamlessly into applications.
Collaborate closely with cross-functional teams including DevOps, Design, and QA.
Participate in project planning, effort estimation, and client interactions.
Contribute to projects involving AI/ML integration and data-driven features.
Preferred: Experience with Python for scripting, automation, or AI/ML-related tasks.
Required Skills & Tools:
· Strong expertise in MongoDB, Express, React, Node.js
· State management with Redux, Context API
· TypeScript experience is a plus
· CI/CD tools: Jenkins, GitHub Actions, Docker
· Unit testing tools: Jest, Mocha, Chai
· Cloud services: AWS, GCP, or Azure
· Familiar with Agile/Scrum methodology
· Proficient with Git, JIRA, Postman, Swagger
Requires 5+ years of experience.
Bachelor’s degree in Computer Science, Engineering, or a related field required.
Job Type: Full-time
Pay: ₹1,200,000.00 - ₹1,500,000.00 per year
Work Location: In person

Posted 12 hours ago

Apply

1.0 - 3.0 years

2 - 4 Lacs

India

On-site

Job Summary:
We are seeking a Junior Faculty and Researcher - Data Science to support our ongoing research and consulting projects in AI, Data Science, and related fields. This is an excellent opportunity for recent graduates or early-career professionals who want to build a strong foundation in applied research and data-driven development.
Key Responsibilities:
The primary role is broken down into the following three activities:
Consulting: 30-40%
Academics: 40-60%
Research: 10-20%
As a Consultant (30%-40%), you'll collaborate with business groups, managing and mentoring data scientists to create cutting-edge data-driven systems. Your role involves constructing advanced analytics solutions, including prediction and recommendation systems, knowledge mining, and business automation. Effective communication of analytical results to various business disciplines is crucial.
As a Researcher (10%-20%), you'll identify problems for the betterment of society, leading data scientists in broad research directions using AI/ML and constructing primary data collection strategies aligned with fields such as Business Applications of AI, Environment, Public Good, Economics of Data Science and AI, and Ethics and Law in AI and Data Science.
As an Academician (40%-60%), you'll teach structured classes, contribute to cutting-edge academic curriculum development, and play a central role in academic operations and learning activities.
Qualities We Value:
We seek candidates with a strong intuition for data, solid Data Science fundamentals, and an ability to engage with the external ecosystem. Problem-solving skills, adaptability, self-learning ability, and innovative thinking are essential. Excellent analytical skills, creativity, proactiveness, and effective communication are highly valued. You should be highly driven, flexible, resourceful, and a team player with strong influencing skills.
Must-Have Requirements:
Education in Data Science, Machine Learning, and Artificial Intelligence (degree or certification).
Minimum education: Bachelor’s degree in STEM. Preferred education: Master’s degree.
Applied Data Science experience: 1 to 3 years.
Hands-on experience in creating analytics solutions, including EDA and dashboarding (a minimal EDA sketch follows this listing).
Experience in building data analytics models using various technologies and/or frameworks (Python, R, H2O, Keras, TensorFlow, Spark ML).
Good-to-Have Requirements:
Understanding of Computer Vision and Natural Language Processing techniques.
Experience in public speaking, teaching, or coaching in Data Science/Technology.
Project management experience.
Recommendation system experience.
Knowledge of reinforcement learning system design.
Job Type: Full-time
Pay: ₹240,000.00 - ₹420,000.00 per year
Benefits:
Paid sick time
Paid time off
Ability to commute/relocate: Vijay Nagar, Indore, Madhya Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Data science: 1 year (Required)
Language: English (Required), Hindi (Required)
Work Location: In person
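As a loose illustration of the EDA requirement above (not part of the original listing), the sketch below shows a typical first pass over a tabular dataset with pandas; the file path and column names are hypothetical.

# Minimal EDA sketch with pandas; the CSV path and column names are hypothetical.
import pandas as pd

df = pd.read_csv("enrollments.csv")  # hypothetical dataset

# Shape, schema, and missing values: the usual first checks.
print(df.shape)
print(df.dtypes)
print(df.isna().sum())

# Summary statistics for numeric columns.
print(df.describe())

# Example aggregation: average score per cohort (hypothetical columns).
print(df.groupby("cohort")["score"].agg(["mean", "count"]))

From here, the same aggregations would typically feed a dashboarding layer, whatever tool the project uses.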

Posted 12 hours ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

Position Title: Software Engineer I
Function/Group: Digital & Technology
Location: Mumbai
Shift Timing: Regular, 10 AM – 7 PM
Role Reports to: Manager
Remote/Hybrid/In-Office: Hybrid
About General Mills
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we’ve been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. For more details check out http://www.generalmills.com
General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). For more details check out https://www.generalmills.co.in
We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.
Job Overview
Function Overview
The Digital and Technology team at General Mills stands as the largest and foremost unit, dedicated to exploring the latest trends and innovations in technology while leading the adoption of cutting-edge technologies across the organization. Collaborating closely with global business teams, the focus is on understanding business models and identifying opportunities to leverage technology for increased efficiency and disruption. The team's expertise spans a wide range of areas, including AI/ML, Data Science, IoT, NLP, Cloud, Infrastructure, RPA and Automation, Digital Transformation, Cyber Security, Blockchain, SAP S4 HANA and Enterprise Architecture. The MillsWorks initiative embodies an agile@scale delivery model, where business and technology teams operate cohesively in pods with a unified mission to deliver value for the company. Employees working on significant technology projects are recognized as Digital Transformation change agents. The team places a strong emphasis on service partnerships and employee engagement with a commitment to advancing equity and supporting communities. In fostering an inclusive culture, the team values individuals passionate about learning and growing with technology, exemplified by the "Work with Heart" philosophy, emphasizing results over facetime. Those intrigued by the prospect of contributing to the digital transformation journey of a Fortune 500 company are encouraged to explore more details about the function through the following Link
Purpose of the role
Technology at General Mills accelerates process transformation and business growth around the globe. Our Global Business Solutions team uses leading edge technology, innovative thinking and agile processes.
This role demands mastery of the language's grammar and deep familiarity with structuring, designing, implementing, and testing projects based on one or more languages. In this role, the developer will work with global teams to develop business solutions. The individual should be an expert in multiple technology stacks, hands-on, and able to design and write scalable applications, work independently on large projects, grow and inspire the team's technical skills, and keep up with technological paradigm shifts.
Key Accountabilities
Develop and maintain applications, taking ownership of complex technical designs and leading their implementation.
Take ownership of critical production issues and drive their resolution within specified SLAs. This requires strong problem-solving skills and the ability to work under pressure.
Collaborate with a cross-functional team to implement new software features.
Lead and drive technical initiatives within the team or organization, such as adopting new technologies or improving development processes.
Drive continuous improvement within the team and seek opportunities for innovation.
Champion code quality and ensure adherence to coding standards and best practices within the team. This may involve establishing coding guidelines and conducting code reviews.
Take full ownership of the technical solutions delivered, ensuring their quality, performance, and maintainability.
Time Allocation – 10%: Collaboration on software design and architecture, working with cross-functional teams, learning and working on POCs for trending technologies, and participating in product/tool evaluations.
Time Allocation – 75%: Translating application storyboards into functional applications, ensuring code quality and adherence to standards, writing unit and integration tests, developing automation tools, ensuring application performance and responsiveness, troubleshooting and debugging applications, leveraging DevOps tools for CI/CD, building working relationships, and mentoring less experienced team members.
Time Allocation – 15%: Staying aware of organizational strategy, early adoption of trending technologies, proactive communication, challenging ideas to avoid pitfalls, and leading/participating in knowledge-sharing initiatives.
Minimum Qualifications
Education – Full-time graduation from an accredited university (Mandatory. Note: This is the minimum education criterion and cannot be altered.)
Technical Expertise: Proficiency in C#/VB, ASP.NET, ASP.NET MVC, RESTful APIs, JavaScript frameworks (React/Angular/Next), DevOps practices (GitHub Actions), database systems (MSSQL or NoSQL), SSIS and unit testing frameworks.
Soft Skills: Strong communication skills with the ability to explain complex technical concepts to stakeholders and support strategic decisions.
Agile/Digital Experience: Strong understanding of Agile methodologies and experience with task/sprint estimation.
Mindset & Behaviors: Enthusiasm for emerging technologies, a willingness to learn new ideas, and the ability to create a positive and supportive work environment.
Preferred Qualifications
Education – Full-time graduation from an accredited university in computer science.
Technical Expertise: Experience with other technologies such as cloud platforms (GCP/Azure), .NET Blazor, GitHub, GitHub Actions & Copilot, containers, Python, NextJS and Tableau is beneficial.

Posted 12 hours ago

Apply

6.0 years

0 Lacs

Ghaziabad, Uttar Pradesh, India

Remote

Experience: 6.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(*Note: This is a requirement for one of Uplers' clients - Netskope)
What do you need for this opportunity?
Must-have skills required: Airflow, LLMs, MLOps, Generative AI, Python
Netskope is looking for:
About The Role
Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization and more. We work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems and provide our customers with accurate, real-time, fault-tolerant solutions to their ever-growing data needs. We support various OLTP and analytics environments, including our Advanced Analytics and Digital Experience Management products. We are looking for skilled engineers experienced with building and optimizing cloud-scale distributed systems to develop our next-generation ingestion, processing and storage solutions. You will work closely with other engineers and the product team to build highly scalable systems that tackle real-world data problems. Our customers depend on us to provide accurate, real-time and fault-tolerant solutions to their ever-growing data needs. This is a hands-on, impactful role that will help lead development, validation, publishing and maintenance of logical and physical data models that support various OLTP and analytics environments.
What's In It For You
You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics.
Your contributions will have a major impact on our global customer base and across the industry through our market-leading products.
You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills.
What You Will Be Doing
Lead the design, development, and deployment of AI/ML models for threat detection, anomaly detection, and predictive analytics in cloud and network security.
Architect and implement scalable data pipelines for processing large-scale datasets from logs, network traffic, and cloud environments.
Apply MLOps best practices to deploy and monitor machine learning models in production.
Collaborate with cloud architects and security analysts to develop cloud-native security solutions leveraging platforms like AWS, Azure, or GCP.
Build and optimize Retrieval-Augmented Generation (RAG) systems by integrating large language models (LLMs) with vector databases for real-time, context-aware applications (a minimal RAG sketch follows this listing).
Analyze network traffic, log data, and other telemetry to identify and mitigate cybersecurity threats.
Ensure data quality, integrity, and compliance with GDPR, HIPAA, or SOC 2 standards.
Drive innovation by integrating the latest AI/ML techniques into security products and services.
Mentor junior engineers and provide technical leadership across projects.
Required Skills And Experience
AI/ML Expertise
Proficiency in advanced machine learning techniques, including neural networks (e.g., CNNs, Transformers) and anomaly detection.
Experience with AI frameworks like TensorFlow, PyTorch, and scikit-learn.
Strong understanding of MLOps practices and tools (e.g., MLflow, Kubeflow).
Experience building and deploying Retrieval-Augmented Generation (RAG) systems, including integration with LLMs and vector databases.
Data Engineering
Expertise in designing and optimizing ETL/ELT pipelines for large-scale data processing.
Hands-on experience with big data technologies (e.g., Apache Spark, Kafka, Flink).
Proficiency in working with relational and non-relational databases, including ClickHouse and BigQuery.
Familiarity with vector databases such as Pinecone and PGVector and their application in RAG systems.
Experience with cloud-native data tools like AWS Glue, BigQuery, or Snowflake.
Cloud and Security Knowledge
Strong understanding of cloud platforms (AWS, Azure, GCP) and their services.
Experience with network security concepts, extended detection and response, and threat modeling.
Software Engineering
Proficiency in Python, Java, or Scala for data and ML solution development.
Expertise in scalable system design and performance optimization for high-throughput applications.
Leadership and Collaboration
Proven ability to lead cross-functional teams and mentor engineers.
Strong communication skills to present complex technical concepts to stakeholders.
Education
BSCS or equivalent required; MSCS or equivalent strongly preferred.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances to get shortlisted and meet the client for the interview!
About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
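As a minimal illustration of the Retrieval-Augmented Generation work described above (not Netskope's implementation), the sketch below embeds a few documents, retrieves the most similar ones for a query by cosine similarity, and assembles a prompt for an LLM. It assumes the sentence-transformers package for embeddings, uses an in-memory NumPy index in place of a real vector database such as Pinecone or PGVector, and leaves the actual LLM call as a placeholder.

# Minimal RAG sketch: embed documents, retrieve by cosine similarity, build a prompt.
# Assumes `pip install sentence-transformers numpy`; in production the in-memory
# index would be replaced by a vector database (e.g. Pinecone or PGVector).
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Unusual login volume from a single ASN can indicate credential stuffing.",
    "Data exfiltration often shows up as sustained outbound transfer spikes.",
    "Impossible-travel alerts compare consecutive login geolocations and timestamps.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = model.encode(documents, normalize_embeddings=True)  # shape: (n_docs, dim)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query by cosine similarity."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q  # vectors are normalized, so dot product == cosine similarity
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

query = "Why did we get an impossible-travel alert for this user?"
context = "\n".join(retrieve(query))
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"

# The prompt would then be sent to an LLM of choice; the call is omitted here
# because the provider and model are deployment-specific.
print(prompt)

In production, the embedding and retrieval steps would be backed by a managed vector store and the prompt sent to the chosen LLM; the core retrieve-then-generate flow stays the same.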

Posted 12 hours ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Description
Buffercode is a technology-focused company that specializes in new and emerging technologies such as AI & ML, cloud computing, and information security. Our team acts as advisors, engineers, and designers, providing innovative solutions to help organizations upgrade themselves in various aspects, including security and data analytics.
Role Description
This is a full-time on-site role for an Intern - BCOM Graduate at Buffercode, located in Noida. The intern will assist the team with day-to-day tasks related to risk management, cash flow management, audit, and preparation of the balance sheet and P&L.
Qualifications
Strong analytical and problem-solving skills
Ability to work collaboratively in a team environment
Excellent communication and interpersonal skills
Attention to detail and ability to learn quickly
Bachelor's degree in BCOM
Proficiency in Excel

Posted 12 hours ago

Apply