
112 AWS SageMaker Jobs - Page 3

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

4.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Job Posting Title: Business Intelligence Analyst I
Band/Level: 5-4-S
Education Experience: Bachelor's Degree (High School + 4 years)
Employment Experience: Less than 1 year

At TE, you will unleash your potential working with people from diverse backgrounds and industries to create a safer, sustainable and more connected world.

Job Overview
TE Connectivity's Business Intelligence teams are responsible for the processing, mining and delivery of data to their customer community through repositories, tools and services.

Tasks & Responsibilities
- Assist in the development and deployment of Digital Factory solutions and Machine Learning models across Manufacturing, Quality, and Supply Chain functions.
- Support data collection, cleaning, preparation, and transformation from multiple sources, ensuring data consistency and readiness.
- Contribute to the creation of dashboards and reports using tools such as Power BI or Tableau.
- Work on basic analytics and visualization tasks to derive insights and identify improvement areas.
- Assist in maintaining existing ML models, including data monitoring and model retraining processes.
- Participate in small-scale proofs of concept (PoCs) and pilot projects with senior team members.
- Document use cases, write clean code with guidance, and contribute to knowledge-sharing sessions.
- Support integration of models into production environments and perform basic testing.

Desired Candidate
- Proficiency in Python and/or R for data analysis, along with libraries like Pandas, NumPy, Matplotlib, Seaborn.
- Basic understanding of statistical concepts such as distributions, correlation, regression, and hypothesis testing.
- Familiarity with SQL or other database querying tools, e.g. pyodbc, sqlite3, PostgreSQL.
- Exposure to ML algorithms like linear/logistic regression, decision trees, k-NN, or SVM.
- Basic knowledge of Jupyter Notebooks and version control using Git/GitHub.
- Good communication skills in English (written and verbal); able to explain technical topics simply.
- Collaborative, eager to learn, and adaptable in a fast-paced and multicultural environment.
- Exposure to or interest in manufacturing technologies (e.g., stamping, molding, assembly).
- Exposure to cloud platforms (AWS/Azure) or services like S3, SageMaker, Redshift is an advantage.
- Hands-on experience in image data preprocessing (resizing, Gaussian blur, PCA) or computer vision projects; see the sketch below.
- Interest in AutoML tools and transfer learning techniques.

About TE Connectivity
TE Connectivity plc (NYSE: TEL) is a global industrial technology leader creating a safer, sustainable, productive, and connected future. Our broad range of connectivity and sensor solutions enable the distribution of power, signal and data to advance next-generation transportation, energy networks, automated factories, data centers, medical technology and more. With more than 85,000 employees, including 9,000 engineers, working alongside customers in approximately 130 countries, TE ensures that EVERY CONNECTION COUNTS. Learn more at www.te.com and on LinkedIn, Facebook, WeChat, Instagram and X (formerly Twitter).

What TE Connectivity Offers
We are pleased to offer you an exciting total package that can also be flexibly adapted to changing life situations - the well-being of our employees is our top priority!
- Competitive Salary Package
- Performance-Based Bonus Plans
- Health and Wellness Incentives
- Employee Stock Purchase Program
- Community Outreach Programs / Charity Events
Across our global sites and business units, we put together packages of benefits that are either supported by TE itself or provided by external service providers. In principle, the benefits offered can vary from site to site.

Important Notice Regarding Recruitment Fraud
TE Connectivity has become aware of fraudulent recruitment activities being conducted by individuals or organizations falsely claiming to represent TE Connectivity. Please be advised that TE Connectivity never requests payment or fees from job applicants at any stage of the recruitment process. All legitimate job openings are posted exclusively on our official careers website at te.com/careers, and all email communications from our recruitment team will come only from email addresses ending in @te.com. If you receive any suspicious communications, we strongly advise you not to engage or provide any personal information, and to report the incident to your local authorities.
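
For context on the image-preprocessing skills listed above (resizing, Gaussian blur, PCA), here is a minimal illustrative sketch, assuming OpenCV and scikit-learn are available; the file names and dimensions are hypothetical placeholders, not part of the posting.

```python
# Illustrative only: resize, blur, and reduce images with PCA.
import cv2
import numpy as np
from sklearn.decomposition import PCA

def preprocess(path, size=(128, 128)):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)   # load as grayscale
    img = cv2.resize(img, size)                    # standardize dimensions
    img = cv2.GaussianBlur(img, (5, 5), 0)         # smooth out sensor noise
    return img.flatten() / 255.0                   # flatten and scale to [0, 1]

# Stack preprocessed images into a matrix and reduce dimensionality with PCA.
paths = ["part_001.png", "part_002.png", "part_003.png"]  # hypothetical files
X = np.stack([preprocess(p) for p in paths])
pca = PCA(n_components=2)          # keep the top principal components
features = pca.fit_transform(X)
print(features.shape)              # (3, 2)
```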

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Chennai

Work from Office

Job Information
Job Opening ID: ZR_2200_JOB
Date Opened: 15/04/2024
Industry: Technology
Work Experience: 5-8 years
Job Title: AI/ML Engineer
City: Chennai
Province: Tamil Nadu
Country: India
Postal Code: 600004
Number of Positions: 4

Requirements:
- Experience in CI/CD pipelines and scripting languages, with a deep understanding of version control systems (e.g. Git), containerization (e.g. Docker), and continuous integration/deployment tools (e.g. Jenkins); third-party integration is a plus, as is experience with cloud computing platforms (e.g. AWS, GCP, Azure), Kubernetes and Kafka.
- 4+ years of experience building production-grade ML pipelines.
- Proficient in Python and frameworks like TensorFlow, Keras, or PyTorch.
- Experience with cloud build, deployment, and orchestration tools.
- Experience with MLOps tools such as MLflow, Kubeflow, Weights & Biases, AWS SageMaker, Vertex AI, DVC, Airflow, Prefect, etc. (see the sketch below).
- Experience in statistical modeling, machine learning, data mining, and unstructured data analytics.
- Understanding of the ML lifecycle and MLOps, with hands-on experience productionizing ML models.
- Detail-oriented, with the ability to work both independently and collaboratively.
- Ability to work successfully with multi-functional teams, principals, and architects, across organizational boundaries and geographies.
- Equal comfort driving low-level technical implementation and high-level architecture evolution.
- Experience working with data engineering pipelines.
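
For context on the MLOps tooling named above, here is a minimal MLflow experiment-tracking sketch, assuming MLflow and scikit-learn are installed; the experiment name, model, and parameters are hypothetical.

```python
# Illustrative only: track a small training run with MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-ml-pipeline")           # hypothetical experiment name
with mlflow.start_run():
    model = LogisticRegression(C=1.0, max_iter=200)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("C", 1.0)                      # record hyperparameters
    mlflow.log_metric("accuracy", acc)              # record evaluation metrics
    mlflow.sklearn.log_model(model, "model")        # version the trained artifact
```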

Posted 2 weeks ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Pune, Bengaluru

Work from Office

Gen AI + AWS
Skills: Gen AI, AWS, AI Platform, Python, TensorFlow, LangChain, LLM models, ML development, AWS Lambda, AWS Bedrock, data extraction
Experience: 3-9 years in Gen AI / AWS
Package: up to 25 LPA
Location: Bengaluru, Pune
Notice period: Immediate to 30 days
Contact: Ritika - 8587970773, ritikab.imaginators@gmail.com
Required candidate profile: Gen AI + AWS mandatory skills - Gen AI, AWS, AI Platform, Python, TensorFlow, LangChain, LLM models, ML development, AWS Lambda, AWS Bedrock, AWS AI, data extraction, Gen AI solutions, RPA tools

Posted 2 weeks ago

Apply

4.0 - 9.0 years

12 - 22 Lacs

Pune, Bengaluru

Work from Office

Gen AI + AWS
Skills: Gen AI, AWS, AI Platform, Python, TensorFlow, LangChain, LLM models, ML development, AWS Lambda, AWS Bedrock, data extraction
Experience: 4-12 years in Gen AI / AWS
Package: up to 25 LPA
Location: Bengaluru, Pune
Notice period: Immediate to 30 days
Contact: Ritika - 8587970773, ritikab.imaginators@gmail.com
Required candidate profile: Gen AI + AWS mandatory skills - Gen AI, AWS, AI Platform, Python, TensorFlow, LangChain, LLM models, ML development, AWS Lambda, AWS Bedrock, AWS AI, data extraction, Gen AI solutions, RPA tools

Posted 2 weeks ago

Apply

7.0 - 12.0 years

12 - 22 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Job Title: Data Science Lead
Location State: Karnataka, Maharashtra
Location City: Bengaluru, Mumbai
Experience Required: 5 to 12 year(s)
CTC Range: 12 to 21 LPA
Shift: Rotational
Work Mode: Onsite
Position Type: Permanent
Openings: 10
Company Name: VARITE INDIA PRIVATE LIMITED

About The Client: The client is a leading global professional services company that helps the world's leading businesses, governments and other organizations build their digital core, optimize their operations, accelerate revenue growth and enhance citizen services.

About The Job: You will be a core member of the client's Operations global Data & AI group, an energetic, strategic, high-visibility and high-impact team, working to innovate and transform the Accenture Operations business using machine learning and advanced analytics to support data-driven decisioning.

Essential Job Functions:
- Lead a team of data scientists to build and deploy data science models that uncover deeper insights, predict future outcomes, and optimize business processes for clients.
- Refine and improve data science models based on feedback, new data, and evolving business needs.
- Analyze available data to identify opportunities for enhancing brand equity, improving retail margins, achieving profitable growth, and expanding market share for clients.
- Data scientists in Operations follow multiple approaches to project execution, from adapting existing assets to Operations use cases, to exploring third-party and open-source solutions for speed of execution and for specific use cases, to engaging in fundamental research to develop novel solutions.
- Collaborate with other data scientists, subject matter experts, sales, and delivery teams from Accenture locations around the globe to deliver strategic advanced machine learning / data-AI solutions from design to deployment.

Qualifications:
- Extensive experience in leading Data Science and Advanced Analytics delivery teams.
- Strong statistical programming experience in Python; working knowledge of cloud-native platforms such as AWS SageMaker (preferred), Azure, or GCP.
- Experience working with large data sets and big data tools such as AWS, SQL, PySpark, etc.
- Solid knowledge of at least two of the following: supervised and unsupervised learning, classification, regression, clustering, neural networks, ensemble modelling (random forest, boosted trees, etc.).
- Experience working with pricing models is a plus.
- Experience in at least one of these business domains: Energy, CPG, Retail, Marketing Analytics, Customer Analytics, Digital Marketing, eCommerce, Health, Supply Chain.
- Extensive experience in client engagement and business development.
- Ability to work in a global, collaborative team environment.
- Quick learner, able to deliver results independently.
- Master's / Ph.D. in Computer Science, Engineering, Statistics, Mathematics, Economics or related disciplines.

How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.

Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.

Unlock Rewards: Refer Candidates and Earn.
If you're not available or interested in this opportunity, please pass this along to anyone in your network who might be a good fit and interested in our open positions. VARITE offers a Candidate Referral program, where you'll receive a one-time referral bonus based on the following scale if the referred candidate completes a three-month assignment with VARITE.
Experience Required - Referral Bonus
0 - 2 Yrs. - INR 5,000
2 - 6 Yrs. - INR 7,500
6+ Yrs. - INR 10,000

About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of Networking, Cloud Infrastructure, Hardware and Software, Digital Marketing and Media Solutions, Clinical Diagnostics, Utilities, Gaming and Entertainment, and Financial Services.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

9 - 13 Lacs

Mumbai

Work from Office

Skill required: Delivery - Advanced Analytics Designation: I&F Decision Sci Practitioner Specialist Qualifications: Master of Engineering/Masters in Business Economics Years of Experience: 7 to 11 years What would you do? Data & AI You will be a core member of Accenture Operations global Data & AI group, an energetic, strategic, high-visibility and high-impact team, to innovate and transform the Accenture Operations business using machine learning, advanced analytics to support data-driven decisioning. What are we looking for? Extensive experience in leading Data Science and Advanced Analytics delivery teams Strong statistical programming experience – Python or working knowledge on cloud native platforms like AWS Sagemaker is preferred Azure/ GCP Experience working with large data sets and big data tools like AWS, SQL, PySpark, etc. Solid knowledge in at least more than two of the following – Supervised and Unsupervised Learning, Classification, Regression, Clustering, Neural Networks, Ensemble Modelling (random forest, boosted tree, etc) Experience in working with Pricing models is a plus Experience in atleast one of these business domains:Energy, CPG, Retail, Marketing Analytics, Customer Analytics, Digital Marketing, eCommerce, Health, Supply Chain Extensive experience in client engagement and business development Ability to work in a global collaborative team environment Quick Learner and Independently deliver results. Qualifications:Masters / Ph.D. Computer science, Engineering, Statistics, Mathematics, Economics or related disciplines. Roles and Responsibilities: Building data science models to uncover deeper insights, predict future outcomes, and optimize business processes for clients. Utilizing advanced statistical and machine learning techniques to develop models that can assist in decision-making and strategic planning. Refining and improving data science models based on feedback, new data, and evolving business needs. Data Scientists in Operations follow multiple approaches for project execution from adapting existing assets to Operations use cases, exploring third-party and open-source solutions for speed to execution and for specific use cases to engaging in fundamental research to develop novel solutions. Data Scientists are expected to collaborate with other data scientists, subject matter experts, sales, and delivery teams from Accenture locations around the globe to deliver strategic advanced machine learning / data-AI solutions from design to deployment. Qualifications Master of Engineering,Masters in Business Economics

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Mumbai

Work from Office

Skill required: Delivery - Advanced Analytics Designation: I&F Decision Sci Practitioner Sr Analyst Qualifications: Master of Engineering/Masters in Business Economics Years of Experience: 5 to 8 years What would you do? Data & AI You will be a core member of Accenture Operations global Data & AI group, an energetic, strategic, high-visibility and high-impact team, to innovate and transform the Accenture Operations business using machine learning, advanced analytics to support data-driven decisioning. What are we looking for? Extensive experience in leading Data Science and Advanced Analytics delivery teams Strong statistical programming experience – Python or working knowledge on cloud native platforms like AWS Sagemaker is preferred Azure/ GCP Experience working with large data sets and big data tools like AWS, SQL, PySpark, etc. Solid knowledge in at least more than two of the following – Supervised and Unsupervised Learning, Classification, Regression, Clustering, Neural Networks, Ensemble Modelling (random forest, boosted tree, etc) Experience in working with Pricing models is a plus Experience in atleast one of these business domains:Energy, CPG, Retail, Marketing Analytics, Customer Analytics, Digital Marketing, eCommerce, Health, Supply Chain Extensive experience in client engagement and business development Ability to work in a global collaborative team environment Quick Learner and Independently deliver results Qualifications:Masters / Ph.D. Computer science, Engineering, Statistics, Mathematics, Economics or related disciplines. Roles and Responsibilities: Building data science models to uncover deeper insights, predict future outcomes, and optimize business processes for clients. Utilizing advanced statistical and machine learning techniques to develop models that can assist in decision-making and strategic planning. Refining and improving data science models based on feedback, new data, and evolving business needs. Data Scientists in Operations follow multiple approaches for project execution from adapting existing assets to Operations use cases, exploring third-party and open-source solutions for speed to execution and for specific use cases to engaging in fundamental research to develop novel solutions. Data Scientists are expected to collaborate with other data scientists, subject matter experts, sales, and delivery teams from Accenture locations around the globe to deliver strategic advanced machine learning / data-AI solutions from design to deployment. Qualifications Master of Engineering,Masters in Business Economics

Posted 2 weeks ago

Apply

1.0 - 3.0 years

11 - 16 Lacs

Pune

Work from Office

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems - the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

ZS's Platform Development team designs, implements, tests and supports ZS's ZAIDYN Platform, which helps drive superior customer experiences and revenue outcomes through integrated products & analytics. Whether writing distributed optimization algorithms or advanced mapping and visualization interfaces, you will have an opportunity to solve challenging problems, make an immediate impact and contribute to better health outcomes.

What you'll do
- As part of our full-stack product engineering team, build multi-tenant cloud-based software products/platforms and internal assets that leverage cutting-edge technologies on the Amazon AWS cloud platform.
- Pair program, write unit tests, lead code reviews, and collaborate with QA analysts to ensure you develop the highest quality multi-tenant software that can be productized.
- Work with junior developers to implement large features that are on the cutting edge of Big Data.
- Be a technical leader to your team, and help them improve their technical skills.
- Stand up for engineering practices that ensure quality products: automated testing, unit testing, agile development, continuous integration, code reviews, and technical design.
- Work with product managers and architects to design product architecture and to work on POCs.
- Take immediate responsibility for project deliverables.
- Understand client business issues and design features that meet client needs.
- Undergo on-the-job and formal trainings and certifications, and constantly advance your knowledge and problem-solving skills.

What you'll bring
- 1-3 years of experience in developing software, ideally building SaaS products and services.
- Bachelor's degree in CS, IT, or a related discipline.
- Strong analytic, problem-solving, and programming ability.
- Good hands-on experience with AWS services (EC2, EMR, S3, serverless stack, RDS, SageMaker, IAM, EKS, etc.); a short sketch follows this posting.
- Experience coding in an object-oriented language such as Python, Java, C#, etc.
- Hands-on experience with Apache Spark, EMR, Hadoop, HDFS, or other big data technologies.
- Experience with development on the AWS (Amazon Web Services) platform is preferable.
- Experience in Linux shell or PowerShell scripting is preferable.
- Experience in HTML5, JavaScript, and JavaScript libraries is preferable.
- Good to have: Pharma domain understanding.
- Initiative and drive to contribute.
- Excellent organizational and task management skills.
- Strong communication skills.
- Ability to work in global cross-office teams; ZS is a global firm, so fluency in English is required.

Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member.

We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel
Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Find out more at www.zs.com
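
For context on the AWS services named above, here is a minimal boto3 sketch for listing and uploading S3 objects; the bucket and key names are hypothetical, and AWS credentials are assumed to be configured in the environment.

```python
# Illustrative only: basic S3 access with boto3.
import boto3

s3 = boto3.client("s3")

# List the first objects under a prefix (e.g., raw data landed by a pipeline).
response = s3.list_objects_v2(Bucket="example-data-bucket", Prefix="raw/", MaxKeys=10)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Upload a locally produced artifact back to S3.
s3.upload_file("report.csv", "example-data-bucket", "processed/report.csv")
```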

Posted 3 weeks ago

Apply

3.0 - 7.0 years

14 - 18 Lacs

Bengaluru

Work from Office

An AI Data Scientist at IBM is not just a job title - it's a mindset. You'll leverage the watsonx, AWS SageMaker, and Azure OpenAI platforms to co-create AI value with clients, focusing on technology patterns to enhance repeatability and delight clients. We are seeking an experienced and innovative AI Data Scientist specialized in foundation models and large language models. In this role, you will be responsible for architecting and delivering AI solutions using cutting-edge technologies, with a strong focus on foundation models and large language models. You will work closely with customers, product managers, and development teams to understand business requirements and design custom AI solutions that address complex challenges. Experience with tools like GitHub Copilot, Amazon CodeWhisperer, etc. is desirable. Success is our passion, and your accomplishments will reflect this, driving your career forward, propelling your team to success, and helping our clients to thrive.

Day-to-Day Duties:
- Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations. Help showcase the ability of Gen AI code assistants to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/POCs.
- Documentation and Knowledge Sharing: Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best practice guides. Contribute to internal knowledge-sharing initiatives and mentor new team members.
- Industry Trends and Innovation: Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- New watsonx Assistant with phone integration
- watsonx Assistant extensions
- Requirement analysis, solution design, and CDD creation (nice to have)
- Experience and working knowledge in COBOL and Java would be preferred
- Experience in code generation, code matching and code translation leveraging LLM capabilities would be a big plus (e.g. Amazon CodeWhisperer, GitHub Copilot, etc.)
- Soft skills: excellent interpersonal and communication skills; engage with stakeholders for analysis and implementation; commitment to continuous learning and staying updated with advancements in the field of AI; a growth mindset to understand clients' business processes and challenges
- Experience in Python and PySpark will be an added advantage

Preferred technical and professional experience:
- Proven experience in designing and delivering AI solutions, with a focus on foundation models, large language models, exposure to open source, or similar technologies
- Experience in natural language processing (NLP) and text analytics is highly desirable
- Understanding of machine learning and deep learning algorithms
- Strong track record in scientific publications or open-source communities
- Experience in the full AI project lifecycle, from research and prototyping to deployment in production environments

Posted 3 weeks ago

Apply

7.0 - 10.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Experience: 7 years. Experienced on at least one AI project from requirements definition through deployment and maintenance in production. Has deployed end-to-end ML pipelines and used PyTorch and TensorFlow to build and deploy models. Proven GenAI experience.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

18 - 25 Lacs

Chennai, Bengaluru

Work from Office

Data Scientist with NLP, AI/ML and testing
Experience: 4-8 years
Location: Chennai / Bengaluru
Role: Senior Engineer

Key Responsibilities
- Develop, test, and maintain Python-based applications for AI/ML and NLP use cases
- Design, build, and train models using ML/NLP techniques and GenAI frameworks
- Implement unit, integration, and functional tests for model pipelines and services (see the test sketch below)
- Fine-tune large language models (LLMs) or integrate pre-trained APIs (e.g., OpenAI, Hugging Face)
- Collaborate with data scientists and MLOps teams to deploy models in production
- Write reusable, testable, and efficient code with proper documentation
- Design data pipelines and preprocessing functions for unstructured textual data
- Participate in code reviews and follow CI/CD best practices
- Perform exploratory data analysis (EDA) and model validation

Must-Have Technical Skills (by category)
- Languages: Python (advanced), shell scripting (basic)
- Testing: PyTest, unittest, integration testing, test coverage tools (e.g., coverage.py), mocking (pytest-mock)
- GenAI platforms: OpenAI API, LangChain, Hugging Face Transformers, LlamaIndex
- ML frameworks: scikit-learn, TensorFlow, PyTorch, XGBoost
- NLP libraries: spaCy, NLTK, Hugging Face, Gensim, fastText
- Data manipulation: Pandas, NumPy, Dask
- Visualization: Matplotlib, Seaborn, Plotly
- DevOps / CI-CD: Git, Docker, Jenkins, GitHub Actions
- MLOps / model serving: MLflow, FastAPI, Streamlit, Flask, ONNX, TorchServe
- Cloud platforms: AWS (S3, SageMaker, Lambda), Azure ML, GCP (Vertex AI)
- Databases: PostgreSQL, MongoDB, Redis, SQLite

Nice-to-Have Skills
- Experience working with vector databases (e.g., FAISS, Pinecone, Weaviate)
- Prompt engineering for LLMs
- Familiarity with LLMOps tools (e.g., LangSmith, PromptLayer, BentoML)
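
For context on the testing stack above (PyTest, mocking, coverage), here is a minimal sketch of unit tests for an NLP preprocessing step and a small scikit-learn text pipeline; the function and test names are hypothetical, and the file would be run with `pytest`.

```python
# Illustrative only: PyTest-style tests for a text-cleaning step and a tiny pipeline.
import re

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def clean_text(text: str) -> str:
    """Lowercase and strip non-alphanumeric characters."""
    return re.sub(r"[^a-z0-9 ]", "", text.lower()).strip()

def test_clean_text_removes_punctuation():
    assert clean_text("Hello, World!") == "hello world"

def test_pipeline_predicts_a_valid_label():
    texts = ["great product", "terrible service", "really great", "awful, terrible"]
    labels = [1, 0, 1, 0]
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit([clean_text(t) for t in texts], labels)
    pred = model.predict([clean_text("great")])[0]
    assert pred in (0, 1)   # prediction must be one of the known labels
```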

Posted 3 weeks ago

Apply

8.0 - 11.0 years

27 - 42 Lacs

Bengaluru

Work from Office

Role: MLOps Engineer
Location: Coimbatore
Mode of Interview: In Person

Keyword Skillset
- AWS SageMaker, Azure ML Studio, GCP Vertex AI
- PySpark, Azure Databricks
- MLflow, Kubeflow, Airflow, GitHub Actions, AWS CodePipeline
- Kubernetes, AKS, Terraform, FastAPI

Responsibilities
- Model deployment, model monitoring, model retraining
- Deployment pipeline, inference pipeline, monitoring pipeline, retraining pipeline
- Drift detection: data drift and model drift (a simple drift check is sketched below)
- Experiment tracking
- MLOps architecture
- REST API publishing

Job Responsibilities
- Research and implement MLOps tools, frameworks and platforms for our Data Science projects.
- Work on a backlog of activities to raise MLOps maturity in the organization.
- Proactively introduce a modern, agile and automated approach to Data Science.
- Conduct internal training and presentations about MLOps tools' benefits and usage.

Required experience and qualifications
- Wide experience with Kubernetes.
- Experience in operationalization of Data Science projects (MLOps) using at least one of the popular frameworks or platforms (e.g. Kubeflow, AWS SageMaker, Google AI Platform, Azure Machine Learning, DataRobot, DKube).
- Good understanding of ML and AI concepts.
- Hands-on experience in ML model development.
- Proficiency in Python, used both for ML and automation tasks.
- Good knowledge of Bash and the Unix command-line toolkit.
- Experience in CI/CD/CT pipeline implementation.
- Experience with cloud platforms - preferably AWS - would be an advantage.
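
For context on the drift-detection responsibility above, here is one simple illustrative approach: a two-sample Kolmogorov-Smirnov test with SciPy. The threshold and data are hypothetical, and production systems typically combine several richer checks.

```python
# Illustrative only: flag data drift on a single feature with a KS test.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
baseline = rng.normal(loc=0.0, scale=1.0, size=1000)   # feature at training time
live = rng.normal(loc=0.4, scale=1.0, size=1000)       # same feature in production

result = ks_2samp(baseline, live)
if result.pvalue < 0.05:   # hypothetical significance threshold
    print(f"Data drift detected (KS={result.statistic:.3f}, p={result.pvalue:.4f}); trigger retraining")
else:
    print("No significant drift; keep serving the current model")
```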

Posted 3 weeks ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Mumbai, Bengaluru

Work from Office

Skills: Data Science, Machine Learning, Python, SQL, Marketing Analytics domain (preferred), AWS SageMaker (preferred).
Collect, clean, and preprocess large datasets from various sources to ensure quality and consistency.
Required candidate profile: Perform exploratory data analysis (EDA) to uncover patterns, trends, and insights using statistical techniques and data visualization tools (a brief EDA sketch follows). Build and deploy predictive models using machine learning.
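
For context on the EDA step above, here is a minimal pandas sketch; the file and column names are hypothetical placeholders.

```python
# Illustrative only: quick exploratory data analysis with pandas.
import pandas as pd

df = pd.read_csv("campaign_data.csv")        # hypothetical marketing dataset

print(df.shape)                              # rows x columns
print(df.isna().mean().sort_values(ascending=False).head())   # worst missing-value rates
print(df.describe(include="all").T.head())   # summary statistics per column

# Simple pattern check: conversion rate by channel (assumes these columns exist).
print(df.groupby("channel")["converted"].mean().sort_values(ascending=False))

# Correlations among numeric features, often the first look before modelling.
print(df.select_dtypes("number").corr())
```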

Posted 3 weeks ago

Apply

5.0 - 10.0 years

16 - 31 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid

Role: Data Science
Experience: minimum 5+ years in Data Science and Machine Learning
Hands-on experience in Python and SQL; Marketing Analytics domain (preferred), AWS SageMaker (preferred)
Shift: General
Mode: Hybrid
Location: Bangalore / Mumbai
Immediate joiners preferred
Call Anumeha @ 6376649769 or send resume to anumeha@manningconsulting.in

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; may also include (but is not limited to) deep learning, neural networks, chatbots, and image processing.
Must-have skills: NoSQL
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. You will apply GenAI models as part of the solution, including deep learning, neural networks, chatbots, and image processing.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the implementation of AI/ML models.
- Conduct research on emerging AI technologies.
- Optimize AI algorithms for performance and scalability.

Professional & Technical Skills:
- Must-have: proficiency in NoSQL.
- Strong understanding of AI and ML concepts.
- Experience with cloud AI services like AWS SageMaker or Google AI Platform.
- Knowledge of deep learning frameworks such as TensorFlow or PyTorch.
- Hands-on experience in developing AI applications.
- Familiarity with chatbot development and image processing.

Additional Information:
- The candidate should have a minimum of 5 years of experience in NoSQL.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Responsibilities:
- Must-have skills: AWS SageMaker, Python, SQL and AWS tools.
- Optimize cloud infrastructure with AWS SageMaker.

Posted 3 weeks ago

Apply

0.0 - 2.0 years

3 - 5 Lacs

Hyderabad

Work from Office

What you will do Let’s do this. Let’s change the world. In this vital role you areresponsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives and, visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing Be a key team member that assists in design and development of the data pipeline Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency Implement data security and privacy measures to protect sensitive data Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions Collaborate and communicate effectively with product teams Identify and resolve complex data-related challenges Adhere to best practices for coding, testing, and designing reusable code/component Explore new tools and technologies that will help to improve ETL platform performance Participate in sprint planning meetings and provide estimations on technical implementation What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Bachelor’s degree and 0 to 3 years of Computer Science, IT or related field experience OR Diploma and 4 to 7 years of Computer Science, IT or related field experience Preferred Qualifications: Functional Skills: Must-Have Skills : Hands on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), AWS, Redshift, Snowflake, workflow orchestration, performance tuning on big data processing Proficiency in data analysis tools (eg. SQL) and experience with data visualization tools. Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores. Experience with ETL tools such as Apache Spark, and various Python packages related to data processing, machine learning model development Good-to-Have Skills: Experience with data modeling, performance tuning on relational and graph databases ( e.g. Marklogic, Allegrograph, Stardog, RDF Triplestore). 
Understanding of data modeling, data warehousing, and data integration concepts Knowledge of Python/R, Databricks, SageMaker, cloud data platform Experience with Software engineering best-practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing Professional Certifications : AWS Certified Data Engineer preferred Databricks Certificate preferred Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements. Equal opportunity statement Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination Objects in your future are closer than they appear. Join us. careers.amgen.com
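
For context on the ETL and Spark work described above, here is a minimal PySpark sketch (read, clean, aggregate, write); the S3 paths, column names, and transformation rules are hypothetical placeholders.

```python
# Illustrative only: a small extract-transform-load job with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw CSV data with an inferred schema.
raw = spark.read.csv("s3://example-bucket/raw/events/", header=True, inferSchema=True)

# Transform: basic cleaning and a simple daily aggregation.
clean = (
    raw.dropna(subset=["event_id"])                      # drop records missing a key
       .withColumn("event_date", F.to_date("event_ts"))  # derive a partition column
)
daily_counts = clean.groupBy("event_date", "event_type").count()

# Load: write the result as partitioned Parquet for downstream consumers.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/processed/daily_counts/"
)
```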

Posted 3 weeks ago

Apply

6.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office

What you will do
Let's do this. Let's change the world. In this vital role you are responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure it is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Lead and be hands-on for the technical design, development, testing, implementation, and support of data pipelines that load the data domains in the Enterprise Data Fabric and associated data services.
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks.
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Translate data models (ontology, relational) into physical designs that are performant, maintainable, and easy to use.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Identify and resolve complex data-related challenges.
- Adhere to best practices for coding, testing, and designing reusable code/components.
- Explore new tools and technologies that will help to improve ETL platform performance.
- Participate in sprint planning meetings and provide estimations on technical implementation.
- Collaborate with RunOps engineers to continuously increase our ability to push changes into production with as little manual overhead and as much speed as possible.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 4 to 6 years of Computer Science, IT or related field experience; OR
- Bachelor's degree and 6 to 8 years of Computer Science, IT or related field experience; OR
- Diploma and 10 to 12 years of Computer Science, IT or related field experience.

Preferred Qualifications:
Must-Have Skills:
- Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, Spark SQL), workflow orchestration, and performance tuning on big data processing.
- Proficient in SQL for extracting, transforming, and analyzing complex datasets from both relational and graph data stores (e.g. MarkLogic, AllegroGraph, Stardog, RDF triplestores).
- Experience with ETL tools such as Apache Spark and Prophecy, and various Python packages related to data processing and machine learning model development.
- Strong understanding of data modeling, data warehousing, and data integration concepts.
- Able to take user requirements and develop data models for data analytics use cases.
Good-to-Have Skills: Knowledge of Python/R, Databricks, SageMaker, cloud data platforms Experience using graph databases such as Stardog , Marklogic , Neo4J , Allegrograph, etc. and writing SPARQL queries. Experience working with agile development methodologies such as Scaled Agile. Professional Certifications AWS Certified Data Engineer preferred Certified Data Engineer / Data Analyst (preferred on Databricks or cloud environments) Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements. Equal opportunity statement Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination Objects in your future are closer than they appear. Join us. careers.amgen.com

Posted 3 weeks ago

Apply

5.0 - 6.0 years

10 - 15 Lacs

Chennai, Bengaluru

Work from Office

AI/ML and AWS-based solutions: Amazon SageMaker, Python and ML libraries, data engineering on AWS, AI/ML algorithms and model deployment strategies (a minimal deployment sketch follows); CI/CD and infrastructure-as-code tooling (CloudFormation, Terraform). AWS Certified Machine Learning certification. Generative AI, real-time inference and edge deployment.
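
For context, here is a minimal SageMaker Python SDK sketch that trains a scikit-learn script and deploys it to a real-time endpoint; the role ARN, S3 path, entry-point script, and framework version are hypothetical placeholders, not values from the posting.

```python
# Illustrative only: train and deploy a scikit-learn model with the SageMaker SDK.
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/ExampleSageMakerRole"   # hypothetical role ARN

estimator = SKLearn(
    entry_point="train.py",          # your training script
    role=role,
    instance_type="ml.m5.large",
    instance_count=1,
    framework_version="1.2-1",       # assumed available container version
    sagemaker_session=session,
)
estimator.fit({"train": "s3://example-bucket/training-data/"})

# Deploy the trained model behind a managed HTTPS endpoint for real-time inference.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
# ... predictor.predict(payload) ...
# predictor.delete_endpoint()  # clean up when done to avoid ongoing charges
```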

Posted 3 weeks ago

Apply

2.0 - 7.0 years

9 - 14 Lacs

Bengaluru

Work from Office

The Data Scientist organization within the Data and Analytics division is responsible for designing and implementing a unified data strategy that enables the efficient, secure, and governed use of data across the organization. We aim to create a trusted and customer-centric data ecosystem, built on a foundation of data quality, security, and openness, and guided by the Thomson Reuters Trust Principles. Our team is dedicated to developing innovative data solutions that drive business value while upholding the highest standards of data management and ethics. About the role: Work with low to minimum supervision to solve business problems using data and analytics. Work in multiple business domain areas including Customer Experience and Service, Operations, Finance, Sales and Marketing. Work with various business stakeholders, to understand and document requirements. Design an analytical framework to provide insights into a business problem. Explore and visualize multiple data sets to understand data available for problem solving. Build end to end data pipelines to handle and process data at scale. Build machine learning models and/or statistical solutions. Build predictive models. Use Natural Language Processing to extract insight from text. Design database models (if a data mart or operational data store is required to aggregate data for modeling). Design visualizations and build dashboards in Tableau and/or PowerBI Extract business insights from the data and models. Present results to stakeholders (and tell stories using data) using power point and/or dashboards. Work collaboratively with other team members. About you: Overall 6+ years experience in technology roles. Must have a minimum of 2 years of experience working in the data science domain. Has used frameworks/libraries such as Scikit-learn, PyTorch, Keras, NLTK. Highly proficient in Python. Highly proficient in SQL. Experience with Tableau and/or PowerBI. Has worked with Amazon Web Services and Sagemaker. Ability to build data pipelines for data movement using tools such as Alteryx, GLUE, Informatica. Proficient in machine learning, statistical modelling, and data science techniques. Experience with one or more of the following types of business analytics applications: Predictive analytics for customer retention, cross sales and new customer acquisition. Pricing optimization models. Segmentation. Recommendation engines. Experience in one or more of the following business domains Customer Experience and Service. Finance. Operations. Good presentation skills and the ability to tell stories using data and PowerPoint/Dashboard Visualizations. Excellent organizational, analytical and problem-solving skills. Ability to communicate complex results in a simple and concise manner at all levels within the organization. Ability to excel in a fast-paced, startup-like environment. #LI-SS5 What’s in it For You Hybrid Work Model We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. 
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our valuesObsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound excitingJoin us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

AI Opportunities with Soul AI's Expert Community!
Are you an MLOps Engineer ready to take your expertise to the next level? Soul AI (by Deccan AI) is building an elite network of AI professionals, connecting top-tier talent with cutting-edge projects.

Why Join
- Above market-standard compensation
- Contract-based or freelance opportunities (2-12 months)
- Work with industry leaders solving real AI challenges
- Flexible work locations: Remote | Onsite | Hyderabad/Bangalore

Your Role:
- Architect and optimize ML infrastructure with Kubeflow, MLflow, SageMaker Pipelines
- Build CI/CD pipelines (GitHub Actions, Jenkins, GitLab CI/CD)
- Automate ML workflows (feature engineering, retraining, deployment)
- Scale ML models with Docker, Kubernetes, Airflow
- Ensure model observability, security, and cost optimization in the cloud (AWS/GCP/Azure)

Must-Have Skills:
- Proficiency in Python, TensorFlow, PyTorch, CI/CD pipelines
- Hands-on experience with cloud ML platforms (AWS SageMaker, GCP Vertex AI, Azure ML)
- Expertise in monitoring tools (MLflow, Prometheus, Grafana)
- Knowledge of distributed data processing (Spark, Kafka)
- Bonus: experience in A/B testing, canary deployments, serverless ML

Next Steps:
1. Register on Soul AI's website
2. Get shortlisted and complete the screening rounds
3. Join our Expert Community and get matched with top AI projects

Don't just find a job. Build your future in AI with Soul AI!

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Position Overview: The Provider Technology Shared Services Engineering team is seeking a Software Engineer Lead Analyst for a Band 3 Contributor Career Track position. The Software Engineer Lead Analyst will play a critical role in system development within the broader Provider Technology Solutions and Engineering organization, significantly influencing Operations and Technology Product Management. This position provides expertise in the engineering, design, installation, and startup of automated systems, including a self-service onboarding kit that enables users to begin using the solution within minutes. The solutions developed will be accessible to individuals with minimal technical skills, will require no additional coding, and are designed to need no ongoing maintenance.
As a member of our team, you will operate within a high-performance, high-frequency enterprise technology environment. This role entails collaborating closely with IT management and staff to identify automated solutions that leverage existing resources with tailored configurations for each use case. The objective is to minimize redundancy in solutions while promoting an enterprise mindset focused on reusability and high standards, ultimately ensuring minimal future maintenance requirements.
The Software Engineer Lead Analyst demonstrates significant creativity, foresight, and sound judgment in the conception, planning, and execution of initiatives. This role requires extensive professional knowledge and expertise to effectively advise functional leaders. Additionally, the Lead Analyst stays informed about the latest advancements in technology, including AI and machine learning, to enhance both existing and new automation solutions. These solutions are designed to optimize production costs while facilitating the addition or updating of features that improve the overall software development lifecycle experience.
Responsibilities:
Provide comprehensive consultation to business unit and IT management, as well as personnel, regarding all facets of application development, testing, and automation solutions across diverse development, financial, operational, and computing environments.
Offer leadership and strategic vision in architectural design and AI/ML guidance for the team.
Perform comprehensive research to identify and recommend the most efficient, cost-effective, and scalable AI/ML automation solutions applicable throughout the Software Development Life Cycle (SDLC), including test data generation, code generation, test case generation, test script generation, root cause analysis, and predictive analysis, with the aim of enhancing the overall SDLC from development to production support.
Ensure that engineering solutions are aligned with the overall Technology strategy while addressing all application requirements.
Demonstrate industry-leading technical abilities that enhance product quality and optimize day-to-day operations.
Understand how changes impact work upstream and downstream, including various back-end and front-end architectural modules.
Enhance personnel effectiveness by using heat matrices to prioritize Quality and Development Engineering resources on high-impact interfaces while identifying areas of lesser focus.
Proactively monitor and manage the design of supported automation solutions, ensuring scalability, stability, flexibility, simplicity, performance, availability, security, and capacity.
Develop and implement automation solutions to improve engineering and operational efficiency.
Troubleshoot and optimize automated solutions and related artifacts to ensure seamless execution in CI/CD pipelines and on local machines, minimizing software and package dependencies or conflicts to reduce cycle time.
Execute on a strategy to hand over the automation solutions to every Agile team for adoption and use within their areas of focus, requiring zero maintenance and minimal effort for enhancements without delving into coding.
Encourage and build automated processes wherever possible.
Be recognized internally as a subject matter expert.
Required Skills:
Foundations in Machine Learning and Deep Learning: understanding of algorithms, neural networks, supervised and unsupervised learning, and deep learning frameworks such as TensorFlow, PyTorch, and Keras.
Generative Models: knowledge of generative models such as GANs (Generative Adversarial Networks), VAEs (Variational Autoencoders), and Transformers.
Natural Language Processing (NLP): knowledge of NLP techniques and libraries (e.g., spaCy, NLTK, Hugging Face Transformers) for text generation tasks.
Model Deployment: experience deploying models using services such as TensorFlow Serving, TorchServe, or cloud-based solutions (e.g., AWS SageMaker, Google AI Platform); a minimal illustrative sketch follows this posting.
Basic understanding of implementing prompt engineering, fine-tuning, and RAG (retrieval-augmented generation).
Strong foundation and practical experience in programming languages, especially Python, within AI/ML workflows; essential for moving from traditional software development processes to optimized, innovative solutions that improve market agility.
Containerization and Orchestration: experience with Docker and Kubernetes / OpenShift for containerization and orchestration of applications.
CI/CD Pipelines: knowledge of continuous integration and continuous deployment tools and practices.
Security Best Practices: understanding of security principles and best practices for protecting data and systems, including IAM, encryption, and network security.
Cloud Services: familiarity with cloud platforms such as AWS, Google Cloud, or Azure for deploying and managing applications and AI models.
Required Experience & Education:
A Bachelor's degree in Computer Science or a related field is required.
A minimum of 5 years of experience in software development, including 3 years of professional experience in AI and Machine Learning engineering.
At least 3 years of experience with Agile methodologies is required.
Familiarity with an onshore/offshore operational model is essential.
Demonstrated experience in the architecture, design, and development of large-scale enterprise application solutions is required.
Desired Experience:
Proficient in AI/ML practices and automation techniques.
Experienced in programming languages and scripting, including Python, Shell, Bash, Groovy, Ansible, and Docker.
Experience providing coaching and guidance to team members.
Location & Hours of Work: Full-time position, working 40 hours per week, with overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days work from office and 2 days work from home).
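As a rough illustration of the cloud-based model-deployment skill referenced above, the following minimal sketch deploys an already-trained scikit-learn model as a real-time endpoint using the AWS SageMaker Python SDK. It is not taken from the posting: the S3 artifact path, IAM role ARN, inference script name, and instance type are hypothetical placeholders, and a packaged model artifact plus an appropriate execution role are assumed to exist.

# Minimal sketch: deploying a pre-trained scikit-learn model to a real-time
# SageMaker endpoint. All resource names below are illustrative placeholders.
import sagemaker
from sagemaker.sklearn.model import SKLearnModel

session = sagemaker.Session()

model = SKLearnModel(
    model_data="s3://example-bucket/models/model.tar.gz",        # placeholder: packaged model artifact
    role="arn:aws:iam::123456789012:role/ExampleSageMakerRole",  # placeholder execution role
    entry_point="inference.py",   # hypothetical script defining model_fn / predict_fn
    framework_version="1.2-1",    # scikit-learn container version; adjust to your environment
    sagemaker_session=session,
)

# Provision a real-time HTTPS endpoint backed by one ml.m5.large instance.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
)

# Send a sample payload, then tear the endpoint down to avoid idle cost.
print(predictor.predict([[0.1, 0.2, 0.3]]))
predictor.delete_endpoint()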

Posted 3 weeks ago

Apply

7.0 - 12.0 years

18 - 20 Lacs

Hyderabad

Work from Office

Naukri logo

We are hiring a Senior Python with Machine Learning Engineer (Level 3) for a US-based IT company in Hyderabad. Candidates with a minimum of 7 years of experience in Python and machine learning can apply.
Job Title: Senior Python with Machine Learning Engineer (Level 3)
Location: Hyderabad
Experience: 7+ years
CTC: 28 LPA - 30 LPA
Working shift: Day shift
Job Description: We are seeking a highly skilled and experienced Python developer with a strong background in Machine Learning (ML) to join our advanced analytics team. In this Level 3 role, you will be responsible for designing, building, and deploying robust ML pipelines and solutions across real-time, batch, event-driven, and edge computing environments. The ideal candidate will have extensive hands-on experience in developing and deploying ML workflows using AWS SageMaker, building scalable APIs, and integrating ML models into production systems. This role also requires a strong grasp of the complete ML lifecycle and of DevOps practices specific to ML projects.
Key Responsibilities:
Develop and deploy end-to-end ML pipelines for real-time, batch, event-triggered, and edge environments using Python.
Utilize AWS SageMaker to build, train, deploy, and monitor ML models using SageMaker Pipelines, MLflow, and Feature Store.
Build and maintain RESTful APIs for ML model serving using FastAPI, Flask, or Django (see the sketch after this listing).
Work with popular ML frameworks and tools such as scikit-learn, PyTorch, XGBoost, LightGBM, and MLflow.
Ensure best practices across the ML lifecycle: data preprocessing, model training, validation, deployment, and monitoring.
Implement CI/CD pipelines tailored for ML workflows using tools such as Bitbucket, Jenkins, Nexus, and AUTOSYS.
Design and maintain ETL workflows for ML pipelines using PySpark, Kafka, AWS EMR, and serverless architectures.
Collaborate with cross-functional teams to align ML solutions with business objectives and deliver impactful results.
Required Skills & Experience:
5+ years of hands-on experience with Python for scripting and ML workflow development.
4+ years of experience with AWS SageMaker for deploying ML models and pipelines.
3+ years of API development experience using FastAPI, Flask, or Django.
3+ years of experience with ML tools such as scikit-learn, PyTorch, XGBoost, LightGBM, and MLflow.
Strong understanding of the complete ML lifecycle, from model development to production monitoring.
Experience implementing CI/CD for ML using Bitbucket, Jenkins, Nexus, and AUTOSYS.
Proficient in building ETL processes for ML workflows using PySpark, Kafka, and AWS EMR.
Nice to Have:
Experience with H2O.ai for advanced machine learning capabilities.
Familiarity with containerization using Docker and orchestration using Kubernetes.
For further assistance, contact/WhatsApp 9354909517 or write to hema@gist.org.in
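As a hedged illustration of the "RESTful APIs for ML model serving using FastAPI" requirement in this listing, the sketch below wraps a pre-trained scikit-learn classifier in a small prediction endpoint. The model path, feature names, and service title are hypothetical and not taken from the posting; any binary classifier exposing predict_proba would fit the same pattern.

# Minimal sketch: serving a pre-trained scikit-learn classifier with FastAPI.
# The model file and feature names are hypothetical placeholders.
import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="example-ml-serving")               # hypothetical service name
model = joblib.load("artifacts/example_model.joblib")   # placeholder path to a trained classifier

class Features(BaseModel):
    tenure_months: float
    monthly_charges: float
    support_tickets: int

@app.post("/predict")
def predict(features: Features) -> dict:
    # Assemble a single-row feature matrix in the order the model was trained on.
    x = np.array([[features.tenure_months, features.monthly_charges, features.support_tickets]])
    probability = float(model.predict_proba(x)[0, 1])    # probability of the positive class
    return {"positive_class_probability": probability}

# Run locally (assuming this file is app.py) with: uvicorn app:app --reload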

Posted 3 weeks ago

Apply

7.0 - 12.0 years

20 - 32 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Naukri logo

Skills Required - Data Science, Machine Learning, Python, SQL, Marketing Analytics domain (preferred), AWS SageMaker (preferred)
Manager: 7+ years of relevant experience, up to 31.50 LPA
Required Candidate Profile: Work from office, with both-side cabs; Mumbai and Bangalore locations. WhatsApp your resume to Sunny at 8219742465 (do not call) and mention "DATA SCIENCE - Manager".

Posted 3 weeks ago

Apply

4.0 - 8.0 years

14 - 22 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Naukri logo

Skills Required - Data Science, Machine Learning, Python, SQL, Marketing Analytics domain (preferred), AWS SageMaker (preferred)
TL: 4+ years of relevant experience, up to 16.30 LPA
AM: 5+ years of relevant experience, up to 22.30 LPA
Required Candidate Profile: Work from office, with both-side cabs; Mumbai and Bangalore locations. WhatsApp your resume to Sunny at 8219742465 (do not call) and mention "DATA SCIENCE - TL / AM".

Posted 3 weeks ago
