1.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Es Magico
Es Magico is an AI-first enterprise transformation organisation that goes beyond consulting: we deliver scalable execution across sectors such as BFSI, Healthcare, Entertainment, and Education. With offices in Mumbai and Bengaluru, our mission is to augment the human workforce by deploying bespoke AI employees across business functions, innovating swiftly and executing with trust. We also partner with early-stage startups as a venture builder, transforming 0-to-1 ideas into AI-native, scalable products.

Role: MLOps Engineer
Location: Bengaluru (Hybrid)
Experience: 1-4 years
Joining: Immediate

Key Responsibilities
- Design, develop, and maintain scalable ML pipelines for training, testing, and deployment.
- Automate model deployment, monitoring, and version control across dev/staging/prod environments.
- Integrate CI/CD pipelines for ML models using tools like MLflow, Kubeflow, Airflow, etc.
- Manage containerized workloads using Docker and orchestrate with Kubernetes or GKE.
- Collaborate closely with data scientists and product teams to optimize the ML model lifecycle.
- Monitor the performance and reliability of deployed models and troubleshoot issues as needed.

Technical Skills
- Experience with MLOps frameworks: MLflow, TFX, Kubeflow, or SageMaker Pipelines.
- Proficient in Python and common ML libraries (scikit-learn, pandas, etc.).
- Solid understanding of CI/CD practices and tools (e.g., GitHub Actions, Jenkins, Cloud Build).
- Familiar with Docker, Kubernetes, and Google Cloud Platform (GCP).
- Comfortable with data pipeline tools like Airflow, Prefect, or equivalent.

Preferred Qualifications
- 1-4 years of experience in MLOps, ML engineering, or DevOps with ML workflows.
- Prior experience with model monitoring, drift detection, and automated retraining.
- Exposure to data versioning tools like DVC or Delta Lake is a plus.
- GCP certifications or working knowledge of Vertex AI is a strong advantage.

How to Apply
Send your resume to [HIDDEN TEXT] with the subject line: Application - MLOps Engineer.
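For context on the kind of pipeline work this role describes, here is a minimal, illustrative sketch of experiment tracking and model versioning with MLflow, one of the tools named above. The dataset, hyperparameters, and run name are placeholders, not anything specific to Es Magico's stack.

```python
# Minimal MLflow tracking sketch (illustrative only; assumes mlflow and scikit-learn are installed).
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):          # one tracked experiment run
    params = {"n_estimators": 200, "max_depth": 6}
    model = RandomForestRegressor(**params).fit(X_train, y_train)

    mlflow.log_params(params)                            # record hyperparameters
    mae = mean_absolute_error(y_test, model.predict(X_test))
    mlflow.log_metric("mae", mae)                        # record an evaluation metric
    mlflow.sklearn.log_model(model, "model")             # version the trained model artifact
```

In a CI/CD setup, a run like this would typically be triggered by the pipeline and the logged model promoted through a registry; those steps are omitted here.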
Posted 2 days ago
4.0 - 9.0 years
0 - 1 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Hi, please find the JD below and send me your updated resume.

Requirements
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
- 3+ years of experience in MLOps, DevOps, or ML Engineering roles.
- Strong experience with containerization (Docker) and orchestration (Kubernetes).
- Proficiency in Python and experience working with ML libraries like TensorFlow, PyTorch, or scikit-learn.
- Familiarity with ML pipeline tools such as MLflow, Kubeflow, TFX, Airflow, or SageMaker Pipelines.
- Hands-on experience with cloud platforms (AWS, GCP, Azure) and infrastructure-as-code tools (Terraform, CloudFormation).
- Solid understanding of CI/CD principles, especially as applied to machine learning workflows.

Nice-to-Have
- Experience with feature stores, model registries, and metadata tracking.
- Familiarity with data versioning tools like DVC or LakeFS.
- Exposure to data observability and monitoring tools.
- Knowledge of responsible AI practices including fairness, bias detection, and explainability.
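As an illustration of the "ML pipeline tools" familiarity asked for above, the sketch below shows a simple train-then-evaluate DAG in Airflow. It is a generic example under stated assumptions; the task bodies and DAG id are hypothetical placeholders, not this employer's pipeline.

```python
# Illustrative Airflow DAG for a train-then-evaluate ML pipeline.
# Assumes apache-airflow >= 2.4 (which uses `schedule`; older 2.x releases use `schedule_interval`).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def train_model(**context):
    # Placeholder: load features, fit a model, persist the artifact.
    print("training model...")


def evaluate_model(**context):
    # Placeholder: score the held-out set and push metrics to monitoring.
    print("evaluating model...")


with DAG(
    dag_id="ml_training_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # retrain once a day
    catchup=False,
) as dag:
    train = PythonOperator(task_id="train_model", python_callable=train_model)
    evaluate = PythonOperator(task_id="evaluate_model", python_callable=evaluate_model)

    train >> evaluate                # evaluation runs only after training succeeds
```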
Posted 3 weeks ago
4.0 - 10.0 years
3 - 6 Lacs
Thiruvananthapuram, Kerala, India
On-site
The Lead is responsible for overseeing the migration and modernization of the customer's platform and database services from Google Cloud Platform (GCP) to Amazon Web Services (AWS), and for providing technical guidance and mentorship to the data engineering team.

Responsibilities
- Design a robust AWS architecture for the database services, utilizing services such as Amazon RDS, Amazon Redshift, Amazon Aurora, Amazon Athena, and Amazon DynamoDB.
- Implement migration strategies and modernization processes, ensuring compatibility, performance, and adherence to best practices.
- Collaborate closely with application and infrastructure teams to facilitate seamless integration of database services.
- Identify and implement opportunities for optimization and performance improvements during the migration.
- Ensure compliance with data security, governance, and regulatory requirements during the migration.

Must-have skills:
- Proven experience in cloud migration and modernization projects, with a strong understanding of AWS and GCP services.
- Strong understanding of cloud architecture principles and best practices.
- Experience with data migration tools and strategies for AWS.
- Understanding of AWS services such as S3, Redshift, RDS, and Data Pipeline.
- Skills in designing data models for both GCP and AWS environments.
- Proficiency in ETL (Extract, Transform, Load) processes and tools, including AWS Glue and Apache Beam.
- Knowledge of both SQL and NoSQL databases.
- Proficiency in scripting and automation using Python, Bash, or similar languages to automate data transfer and streamline migration processes.
- Strong ability to work effectively with cross-functional teams.
- Proficiency in documenting migration processes and providing training/support to stakeholders.
- Experience in project planning, including defining timelines, milestones, and deliverables.
- Skilled in scoping, planning, and executing data-related projects while adhering to timelines and budgets.
- Ability to convey data requirements and migration benefits to non-technical stakeholders.
- Strong documentation skills for creating architectural documents and communicating complex concepts to non-technical stakeholders.

Certifications:
- Lead Architects: at least 2 Professional- or Specialty-level AWS certifications.
- Associate Architects: at least 1 Professional- or Specialty-level AWS certification.
- Senior Data Engineers: at least 1 Associate-level AWS certification.

Good-to-have skills:
- Relevant AWS certifications (e.g., AWS Certified Solutions Architect) are advantageous.
- Knowledge of big data technologies like Hadoop, Spark, and Kafka for handling large datasets.
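As a small illustration of the Python scripting for data transfer mentioned above, the following sketch copies objects exported to a GCS bucket into an S3 landing bucket. It assumes google-cloud-storage and boto3 are installed and that credentials for both clouds are configured in the environment; the bucket names and prefix are hypothetical.

```python
# Illustrative GCS-to-S3 transfer helper (not a production migration tool).
import boto3
from google.cloud import storage

GCS_BUCKET = "legacy-gcp-exports"      # hypothetical source bucket
S3_BUCKET = "aws-landing-zone"         # hypothetical destination bucket
PREFIX = "exports/2024/"               # hypothetical object prefix


def copy_gcs_prefix_to_s3(prefix: str) -> int:
    """Copy every object under `prefix` from GCS to S3; return the object count."""
    gcs_client = storage.Client()
    s3_client = boto3.client("s3")
    copied = 0
    for blob in gcs_client.bucket(GCS_BUCKET).list_blobs(prefix=prefix):
        data = blob.download_as_bytes()                                   # pull object from GCS
        s3_client.put_object(Bucket=S3_BUCKET, Key=blob.name, Body=data)  # push to S3 under the same key
        copied += 1
    return copied


if __name__ == "__main__":
    print(f"copied {copy_gcs_prefix_to_s3(PREFIX)} objects")
```

For large objects, a streaming or managed transfer service would be preferable to in-memory copies; the sketch only shows the automation pattern.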
Posted 3 weeks ago
7.0 - 12.0 years
3 - 6 Lacs
Bengaluru, Karnataka, India
On-site
As a Senior Client Solutions Partner you will be part of the core sales and GTM team of Quantiphi, responsible for executing end-to-end sales processes in a B2B environment.

- Prepare and deliver technical presentations explaining products or services to customers and prospective customers.
- Manage customer communication and relationships.
- Engage in and drive end-to-end pre-sales activities for business development in Data Analytics.
- Identify and prospect the full range of opportunities; develop business collateral based on the latest developments in data modernization to showcase the potential of data for the enterprise.
- Handle, or be hands-on with, data modernization projects.
- Work in conjunction with the Solution Architects and Data Engineering teams to analyze business problems to be solved using large volumes of quantitative and qualitative data, and develop a point of view to build a solution for those problems.
- Apply the right analysis frameworks to develop creative solutions to complex business problems.
- Plan and execute both short-term and long-range projects; manage teamwork and client expectations.
- Challenge and inspire customers and peers to solve difficult problems with ambitious and novel solutions.
- Work with the team to identify and qualify business opportunities.
- Identify key customer technical objections and develop a strategy to resolve technical blockers.
- Work hands-on with customers to demonstrate and prototype Google Cloud product integrations in customer/partner environments and manage the technical relationship with Google's customers.
- Recommend integration strategies, enterprise architectures, platforms, and application infrastructure required to successfully implement a complete solution using best practices on Google Cloud. This includes understanding, analyzing, and prospecting complex business problems to be solved using data solutions and AI applications across a variety of industries, including Healthcare, Media, BFSI, CPG, Retail, and many others.
- Travel to customer sites, conferences, and other related events as required.

Responsibilities:
- You will be involved in the development of new business opportunities and value-added services, which requires a high level of creativity, learning potential, and deep quantitative subject matter expertise; self-driven individuals willing to learn on the go are preferred.
- Strong team player.
- Degree in Business (MBA) or Computer Science Engineering.
- Good communication, abstraction, analytical, and presentation skills.
- Experience in B2B sales, customer communication, and relationship management.
- Experience and knowledge of the critical phases of the sales process, including requirement gathering, sales planning, solution scoping, proposal writing, and presentation.
- Data-driven mindset: plans and actions backed not just by gut feeling but also by customer, industry, and market research.
- Knowledge of, and willingness to learn and apply, emerging trends in business research, data engineering, and cloud.
- Excellent aptitude in business analysis and awareness of quantitative analysis techniques.
- Excellent communication (both written and verbal) and articulation skills (mandatory).
- Ability to collaborate with a cross-functional team.
- Experience with sales reporting.
- Self-driven, with a strong aptitude to work in an entrepreneurial, fast-paced environment with minimal supervision and a passion for developing new value-added, data-based solutions for clients across a variety of industries.
Posted 3 weeks ago
7.0 - 12.0 years
18 - 20 Lacs
Hyderabad
Work from Office
We are hiring a Senior Python with Machine Learning Engineer (Level 3) for a US-based IT company in Hyderabad. Candidates with a minimum of 7 years of experience in Python and machine learning can apply.

Job Title: Senior Python with Machine Learning Engineer - Level 3
Location: Hyderabad
Experience: 7+ years
CTC: 28 LPA - 30 LPA
Working shift: Day shift

Job Description:
We are seeking a highly skilled and experienced Python Developer with a strong background in Machine Learning (ML) to join our advanced analytics team. In this Level 3 role, you will be responsible for designing, building, and deploying robust ML pipelines and solutions across real-time, batch, event-driven, and edge computing environments. The ideal candidate will have extensive hands-on experience in developing and deploying ML workflows using AWS SageMaker, building scalable APIs, and integrating ML models into production systems. This role also requires a strong grasp of the complete ML lifecycle and DevOps practices specific to ML projects.

Key Responsibilities:
- Develop and deploy end-to-end ML pipelines for real-time, batch, event-triggered, and edge environments using Python.
- Utilize AWS SageMaker to build, train, deploy, and monitor ML models using SageMaker Pipelines, MLflow, and Feature Store.
- Build and maintain RESTful APIs for ML model serving using FastAPI, Flask, or Django.
- Work with popular ML frameworks and tools such as scikit-learn, PyTorch, XGBoost, LightGBM, and MLflow.
- Ensure best practices across the ML lifecycle: data preprocessing, model training, validation, deployment, and monitoring.
- Implement CI/CD pipelines tailored for ML workflows using tools like Bitbucket, Jenkins, Nexus, and AUTOSYS.
- Design and maintain ETL workflows for ML pipelines using PySpark, Kafka, AWS EMR, and serverless architectures.
- Collaborate with cross-functional teams to align ML solutions with business objectives and deliver impactful results.

Required Skills & Experience:
- 5+ years of hands-on experience with Python for scripting and ML workflow development.
- 4+ years of experience with AWS SageMaker for deploying ML models and pipelines.
- 3+ years of API development experience using FastAPI, Flask, or Django.
- 3+ years of experience with ML tools such as scikit-learn, PyTorch, XGBoost, LightGBM, and MLflow.
- Strong understanding of the complete ML lifecycle, from model development to production monitoring.
- Experience implementing CI/CD for ML using Bitbucket, Jenkins, Nexus, and AUTOSYS.
- Proficient in building ETL processes for ML workflows using PySpark, Kafka, and AWS EMR.

Nice to Have:
- Experience with H2O.ai for advanced machine learning capabilities.
- Familiarity with containerization using Docker and orchestration using Kubernetes.

For further assistance, contact/WhatsApp: 9354909517 or write to hema@gist.org.in
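To illustrate the "RESTful APIs for ML model serving using FastAPI" responsibility above, here is a minimal serving sketch. It assumes fastapi, uvicorn, joblib, and scikit-learn are installed; the model file path, endpoint, and feature schema are hypothetical placeholders.

```python
# Minimal FastAPI model-serving sketch (illustrative only).
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="model-serving-demo")
model = joblib.load("model.joblib")   # hypothetical pre-trained scikit-learn model


class PredictRequest(BaseModel):
    features: list[float]             # flat feature vector for a single example


@app.post("/predict")
def predict(req: PredictRequest) -> dict:
    # Wrap the vector as a single-row batch and return the model's prediction.
    prediction = model.predict([req.features])[0]
    return {"prediction": float(prediction)}

# Run locally with: uvicorn app:app --reload   (assuming this file is saved as app.py)
```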
Posted 2 months ago
20.0 - 25.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Position: Senior AI Architect - AI Factory (MLOps, GenOps)

Experience: 20+ years of total IT experience with a minimum of 10 years in AI/ML. Proven experience in building scalable AI platforms or "AI Factories" for productionizing machine learning and generative AI workflows, including strong hands-on expertise in MLOps and emerging GenOps practices.

Location: Bangalore / Pune on a case-to-case basis.

Role Summary:
We are looking for a Senior AI Architect to lead the design and implementation of a next-generation AI Factory platform that streamlines the development, deployment, monitoring, and reuse of AI/ML and GenAI assets. This role will be instrumental in establishing scalable MLOps and GenOps practices, building reusable components, standardizing pipelines, and enabling cross-industry solutioning for pre-sales and delivery. The candidate will work closely with the AI Practice Head, contributing to both business enablement and technical strategy while supporting customer engagements, RFP/RFI responses, PoCs, and accelerator development.

Key Responsibilities:
- Architect and build the AI Factory: a central repository of reusable AI/ML models, GenAI prompts, agents, pipelines, APIs, and accelerators.
- Define and implement MLOps workflows for versioning, model training, deployment, CI/CD, monitoring, and governance.
- Design and integrate GenOps pipelines for prompt engineering, LLM orchestration, evaluation, and optimization.
- Create blueprints and templates for standardized AI solution delivery across cloud platforms (Azure, AWS, GCP).
- Build accelerators and reusable modules to speed up AI solutioning for common use cases (e.g., chatbots, summarization, document intelligence).
- Enable pre-sales and solution teams with reusable assets for demos, PoCs, and customer presentations.
- Contribute to RFP/RFI responses with scalable, production-ready AI Factory strategies and architectural documentation.
- Collaborate with data engineering, DevOps, cloud, and security teams to ensure robust and enterprise-compliant AI solution delivery.

Required Skills:
- Deep experience in MLOps tools like MLflow, Kubeflow, SageMaker Pipelines, Azure ML Pipelines, or Vertex AI Pipelines.
- Understanding of GenOps frameworks, including prompt flow management, LLM evaluation (e.g., TruLens, Ragas), and orchestration (LangChain, LlamaIndex, Semantic Kernel).
- Strong command of Python, YAML/JSON, and API integration for scalable AI component development.
- Experience with CI/CD pipelines (GitHub Actions, Jenkins, Azure DevOps), containerization (Docker, Kubernetes), and model registries.
- Familiarity with model observability, drift detection, automated retraining, and model versioning.
- Ability to create clean, reusable architecture artifacts and professional PowerPoint decks for customer and internal presentations.

Preferred Qualifications:
- Experience in building and managing an enterprise-wide AI marketplace or model catalog.
- Familiarity with LLMOps platforms (e.g., Weights & Biases, PromptLayer, Arize AI).
- Exposure to multi-cloud GenAI architectures and hybrid deployment models.
- Cloud certifications in AI/ML from any major provider (AWS, Azure, GCP).

Soft Skills:
- Strong leadership and mentoring capabilities.
- Effective communication and storytelling skills for technical and non-technical audiences.
- Innovation mindset with a passion for automation and efficiency.
- Comfortable working in a fast-paced, cross-functional environment with shifting priorities.
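As a small illustration of the drift detection named in the required skills, the sketch below flags distribution shift in a single feature with a two-sample Kolmogorov-Smirnov test. It is a generic technique, not the platform this posting refers to; the threshold, sample sizes, and distributions are hypothetical choices, and it assumes numpy and scipy are installed.

```python
# Illustrative single-feature data-drift check via a two-sample KS test.
import numpy as np
from scipy.stats import ks_2samp


def feature_drifted(reference: np.ndarray, live: np.ndarray, alpha: float = 0.05) -> bool:
    """Return True when the live feature distribution differs from the reference."""
    statistic, p_value = ks_2samp(reference, live)
    return p_value < alpha          # small p-value -> distributions likely differ


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.normal(loc=0.0, scale=1.0, size=5_000)   # training-time distribution
    live = rng.normal(loc=0.4, scale=1.0, size=5_000)        # shifted serving-time data
    print("drift detected:", feature_drifted(reference, live))
```

In practice a signal like this might gate an automated retraining job; the threshold, window size, and per-feature handling would be decisions for the platform itself.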
Posted 2 months ago