1.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Es Magico
Es Magico is an AI-first enterprise transformation organisation that goes beyond consulting: we deliver scalable execution across sectors such as BFSI, Healthcare, Entertainment, and Education. With offices in Mumbai and Bengaluru, our mission is to augment the human workforce by deploying bespoke AI employees across business functions, innovating swiftly and executing with trust. We also partner with early-stage startups as a venture builder, transforming 0-to-1 ideas into AI-native, scalable products.

Role: MLOps Engineer
Location: Bengaluru (Hybrid)
Experience: 1-4 years
Joining: Immediate

Key Responsibilities
- Design, develop, and maintain scalable ML pipelines for training, testing, and deployment.
- Automate model deployment, monitoring, and version control across dev/staging/prod environments.
- Integrate CI/CD pipelines for ML models using tools like MLflow, Kubeflow, Airflow, etc.
- Manage containerized workloads using Docker and orchestrate with Kubernetes or GKE.
- Collaborate closely with data scientists and product teams to optimize the ML model lifecycle.
- Monitor the performance and reliability of deployed models and troubleshoot issues as needed.

Technical Skills
- Experience with MLOps frameworks: MLflow, TFX, Kubeflow, or SageMaker Pipelines.
- Proficient in Python and common ML libraries (scikit-learn, pandas, etc.).
- Solid understanding of CI/CD practices and tools (e.g., GitHub Actions, Jenkins, Cloud Build).
- Familiar with Docker, Kubernetes, and Google Cloud Platform (GCP).
- Comfortable with data pipeline tools like Airflow, Prefect, or equivalent.

Preferred Qualifications
- 1-4 years of experience in MLOps, ML engineering, or DevOps with ML workflows.
- Prior experience with model monitoring, drift detection, and automated retraining.
- Exposure to data versioning tools like DVC or Delta Lake is a plus.
- GCP certifications or working knowledge of Vertex AI is a strong advantage.

How to Apply
Send your resume to [HIDDEN TEXT] with the subject line: Application - MLOps Engineer.
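For context on the MLflow-based deployment automation this role describes, here is a minimal, hedged sketch; the dataset, model name, and quality gate are invented for illustration and are not Es Magico's actual stack:

```python
# A minimal sketch (not Es Magico's stack) of logging and registering a
# model with MLflow so a CI/CD job can promote it between dev/staging/prod.
# The model name and accuracy threshold below are illustrative assumptions.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_metric("accuracy", acc)
    # Register only if the model clears a quality gate; a CI job would
    # then transition the new version from staging to production.
    if acc > 0.9:
        mlflow.sklearn.log_model(
            model, "model", registered_model_name="demo-classifier"
        )
```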
Posted 3 days ago
5.0 - 7.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Join our dynamic and high-impact Data team as a Data Engineer, where you'll be responsible for safely receiving and storing trading-related data for the India teams, as well as operating and improving our shared data access and data processing systems. This is a critical role in the organisation, as the data platform drives a huge range of trader analysis, simulation, reporting, and insights. The ideal candidate should have work experience in systems engineering, preferably with prior exposure to financial markets, and proven working knowledge of Linux administration, orchestration and automation tools, systems hardware architecture, and storage and data protection technologies.

Your Core Responsibilities:
- Manage and monitor all distributed systems, storage infrastructure, and data processing platforms, including HDFS, Kubernetes, Dremio, and in-house data pipelines
- Drive a heavy focus on systems automation and CI/CD to enable rapid deployment of hardware and software solutions
- Collaborate closely with systems and network engineers, traders, and developers to support and troubleshoot their queries
- Stay up to date with the latest technology trends in the industry; propose, evaluate, and implement innovative solutions

Your Skills and Experience:
- 5-7 years of experience managing large-scale, multi-petabyte data infrastructure in a similar role
- Advanced knowledge of Linux system administration and internals, with proven ability to troubleshoot issues in Linux environments
- Deep expertise in at least one of the following technologies: Kafka, Spark, Cassandra/Scylla, or HDFS
- Strong working knowledge of Docker, Kubernetes, and Helm
- Experience with data access technologies such as Dremio and Presto
- Familiarity with workflow orchestration tools like Airflow and Prefect
- Exposure to cloud platforms such as AWS, GCP, or Azure
- Proficiency with CI/CD pipelines and version control systems like Git
- Understanding of best practices in data security and compliance
- Demonstrated ability to solve problems proactively and creatively with a results-oriented mindset
- Quick learner with excellent troubleshooting skills
- High degree of flexibility and adaptability

About Us
IMC is a global trading firm powered by a cutting-edge research environment and a world-class technology backbone. Since 1989, we've been a stabilizing force in financial markets, providing essential liquidity upon which market participants depend. Across our offices in the US, Europe, Asia Pacific, and India, our talented quant researchers, engineers, traders, and business operations professionals are united by our uniquely collaborative, high-performance culture and our commitment to giving back. From entering dynamic new markets to embracing disruptive technologies, and from developing an innovative research environment to diversifying our trading strategies, we dare to continuously innovate and collaborate to succeed.
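As a hedged illustration of the Airflow-style orchestration this posting mentions, a minimal Airflow 2.x DAG might look like the following; the DAG id, schedule, and task bodies are invented, not IMC's pipelines:

```python
# A minimal Airflow 2.x TaskFlow sketch of a monitored ingestion pipeline.
# Everything here (names, schedule, payload) is an illustrative assumption.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def market_data_ingest():
    @task
    def extract() -> list[dict]:
        # Pull a batch of trading-related records from an upstream feed.
        return [{"symbol": "XYZ", "price": 101.5}]

    @task
    def load(records: list[dict]) -> None:
        # A real pipeline would persist to distributed storage (e.g. HDFS).
        print(f"stored {len(records)} records")

    load(extract())


market_data_ingest()
```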
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Join us in building a modern workflow management system using Python (FastAPI), React/Next.js, GraphQL, and SQL Server on Azure. You'll work across a well-architected backend and a clean, Jamstack-style frontend with a strong focus on quality, performance, and automation.

You will be responsible for building REST and GraphQL APIs with FastAPI, developing modern, responsive UIs using React + Next.js, orchestrating workflows using Celery/RQ and Prefect/Airflow, integrating the SQLAlchemy ORM with MS SQL Server, and contributing to testing (pytest, mocking, coverage), CI/CD (GitHub Actions, Docker), and documentation. Additionally, you will collaborate with onshore teams and provide production support when needed.

We are looking for individuals with at least 5 years of Python development experience and 3 years of React/Next.js experience. You should be strong in async programming and API design, comfortable with GraphQL, SQL, and workflow engines, have experience with testing, CI/CD, and code quality tools, and possess excellent communication skills. Flexibility to support shifts if needed is also required.

If you are ready to build something impactful and meet the above qualifications, let's connect! Feel free to drop your profile or send a direct message if you're interested. Referrals are welcome too.
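As a hedged sketch of the async FastAPI style this posting describes (the Workflow model and routes below are hypothetical, not the team's real schema):

```python
# A self-contained sketch of an async FastAPI service with an in-memory
# store; a real system would back this with SQLAlchemy + MS SQL Server.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()


class Workflow(BaseModel):
    name: str
    status: str = "pending"


_store: dict[int, Workflow] = {}  # stand-in for the database layer


@app.post("/workflows/{workflow_id}")
async def create_workflow(workflow_id: int, wf: Workflow) -> Workflow:
    # Async handlers free the event loop during I/O waits (DB calls,
    # Celery/RQ task submission, etc.).
    _store[workflow_id] = wf
    return wf


@app.get("/workflows/{workflow_id}")
async def read_workflow(workflow_id: int) -> Workflow:
    if workflow_id not in _store:
        raise HTTPException(status_code=404, detail="workflow not found")
    return _store[workflow_id]
```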
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
At PwC, our managed services team focuses on providing outsourced solutions and support to clients across various functions. We help organizations streamline operations, reduce costs, and enhance efficiency by managing key processes and functions on their behalf. Our team is skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC are responsible for transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services.

As an Associate at PwC, you will work as part of a team of problem solvers, assisting in solving complex business issues from strategy to execution. Professional skills and responsibilities at this level include using feedback and reflection to develop self-awareness, demonstrating critical thinking, and bringing order to unstructured problems. You will be involved in ticket quality review, status reporting for projects, adherence to SLAs, incident management, change management, and problem management. Additionally, you will seek opportunities for exposure to different situations, environments, and perspectives, uphold the firm's code of ethics, demonstrate leadership capabilities, and work in a team environment that includes client interactions and cross-team collaboration.

Required Skills:
- AWS Cloud Engineer
- Minimum 2 years of hands-on experience in building advanced data warehousing solutions on leading cloud platforms
- Minimum 1-3 years of Operate/Managed Services/Production Support experience
- Extensive experience in developing scalable, repeatable, and secure data structures and pipelines
- Designing and implementing data pipelines for data ingestion, processing, and transformation in AWS
- Building efficient ETL/ELT processes using industry-leading tools like AWS, PySpark, SQL, Python, etc.
- Implementing data validation and cleansing procedures
- Monitoring and troubleshooting data pipelines
- Implementing and maintaining data security and privacy measures
- Strong communication, problem-solving, quantitative, and analytical abilities

Nice to Have:
- AWS certification

In our Managed Services platform, we deliver integrated services and solutions grounded in deep industry experience and powered by talent. Our team provides scalable solutions that add value to our clients' enterprise through technology and human-enabled experiences. We focus on empowering clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. As a member of our Data, Analytics & Insights Managed Service team, you will work on critical offerings, help desk support, enhancement and optimization work, and strategic roadmap and advisory-level work. Your contribution will be crucial in supporting customer engagements both technically and relationally.
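A minimal PySpark sketch of the validation-and-cleansing step listed above; the S3 paths, column names, and failure threshold are placeholders, not PwC's implementation:

```python
# A hedged sketch of data validation and cleansing in PySpark, with a
# simple quality gate. Paths and columns are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cleanse-demo").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")  # placeholder path

cleansed = (
    raw.dropDuplicates(["event_id"])                 # remove duplicate events
       .filter(F.col("event_ts").isNotNull())        # drop rows missing a timestamp
       .withColumn("amount", F.col("amount").cast("double"))
)

# Fail loudly if more than 10% of rows were rejected, so the pipeline
# alerts instead of silently shipping a gutted dataset.
if cleansed.count() < 0.9 * raw.count():
    raise ValueError("more than 10% of rows failed validation")

cleansed.write.mode("overwrite").parquet("s3://example-bucket/clean/events/")
```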
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As an AI and Machine Learning Engineer at Dailoqa, you will play a crucial role in shaping the future of Financial Services clients and the company as a whole. Working directly with the founding team, you will have the opportunity to apply the latest AI techniques to address real-world problems encountered by Financial Services clients. Your responsibilities will include designing, constructing, and enhancing datasets to assess and continually improve our solutions, as well as engaging in strategy and product ideation sessions to influence our product and solution roadmap.

Key Responsibilities:
- Agentic AI Development: Build scalable multi-modal Large Language Model (LLM) based AI agents using frameworks such as LangGraph, Microsoft Autogen, or CrewAI.
- AI Research and Innovation: Research and develop innovative solutions for relevant AI challenges such as Retrieval-Augmented Generation (RAG), semantic search, knowledge representation, tool usage, fine-tuning, and reasoning in LLMs.
- Technical Expertise: Demonstrate proficiency in a technology stack comprising Python, LlamaIndex / LangChain, PyTorch, HuggingFace, FastAPI, Postgres, SQLAlchemy, Alembic, OpenAI, Docker, Azure, TypeScript, and React.
- LLM and NLP Experience: Hands-on experience with LLMs, RAG architectures, Natural Language Processing (NLP), or applying Machine Learning to solve real-world problems.
- Dataset Development: Establish a strong track record of constructing datasets for training and/or evaluating machine learning models.
- Customer Focus: Dive deep into the domain, comprehend the problem, and concentrate on delivering value to the customer.
- Adaptability: Thrive in a fast-paced environment and demonstrate enthusiasm for joining an early-stage venture.
- Model Deployment and Management: Automate model deployment, monitoring, and retraining processes.
- Collaboration and Optimization: Collaborate with data scientists to review, refactor, and optimize machine learning code.
- Version Control and Governance: Implement version control and governance for models and data.

Required Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- 4-8 years of experience in MLOps, DevOps, or similar roles.
- Strong programming experience and familiarity with Python-based deep learning frameworks like PyTorch, JAX, TensorFlow.
- Proficiency with cloud platforms (AWS, Azure, or GCP) and infrastructure-as-code tools like Terraform.

Desired Skills:
- Experience with experiment tracking and model versioning tools.
- Proficiency with the technology stack: Python, LlamaIndex / LangChain, PyTorch, HuggingFace, FastAPI, Postgres, SQLAlchemy, Alembic, OpenAI, Docker, Azure, TypeScript, React.
- Knowledge of data pipeline orchestration tools like Apache Airflow or Prefect.
- Familiarity with software testing and test automation practices.
- Understanding of ethical considerations in machine learning deployments.
- Strong problem-solving skills and ability to work in a fast-paced environment.
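Since the role centers on RAG, a framework-free sketch of the retrieval step may help: embed a query, rank stored chunks by cosine similarity, and prepend the best match to the LLM prompt. The toy embed() below is a stand-in for a real embedding model, and the chunks are invented:

```python
# A deliberately simple illustration of RAG retrieval. The letter-count
# "embedding" is a placeholder; real systems use a learned encoder.
import math


def embed(text: str) -> list[float]:
    # Stand-in for a real embedding model (e.g. an OpenAI or HF encoder).
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))


chunks = ["KYC rules for retail accounts", "Loan pricing model notes"]
index = [(c, embed(c)) for c in chunks]

query = "how are retail accounts verified?"
qv = embed(query)
best_chunk, _ = max(index, key=lambda pair: cosine(qv, pair[1]))
# The retrieved context is prepended to the question before the LLM call.
prompt = f"Context: {best_chunk}\n\nQuestion: {query}"
print(prompt)
```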
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be joining an innovative company that is revolutionizing retail checkout experiences by using cutting-edge Computer Vision technology to replace traditional barcodes. Our platform aims to create seamless, faster, and smarter checkout processes, enhancing the shopping experience for both retailers and consumers. As we are growing rapidly, we are seeking an experienced Senior Data Engineer to join our team and help shape the future of retail technology.

As a Senior Data Engineer, you will be an integral part of our expanding data team. Your primary responsibilities will involve building and optimizing data infrastructure, pipelines, and tooling to support analytics, machine learning, and product development. This role requires a strong background in cloud-native data engineering, a passion for scalable systems, and the ability to work independently with minimal supervision.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines and ETL/ELT workflows using tools such as Kestra or Prefect.
- Architect and manage cloud-based data infrastructure utilizing platforms like Snowflake, MySQL, and LanceDB.
- Implement and uphold data quality, lineage, and governance best practices.
- Collaborate with analytics, BI, and product teams to establish data models for reporting, experimentation, and operational use cases.
- Optimize query performance, storage costs, and data reliability across platforms.
- Oversee data ingestion from internal and external systems through APIs, CDC, or streaming technologies like Kafka and MQTT.
- Develop automated data validation, testing, and monitoring frameworks to ensure data integrity.
- Contribute to infrastructure-as-code and deployment processes using CI/CD pipelines and version control systems like Git.
- Work independently and drive projects forward with minimal supervision.

Skills and Qualifications:
- 5+ years of experience as a data engineer or software engineer in large-scale data systems.
- Proficiency in SQL, Python, and modern data transformation frameworks.
- Hands-on experience building and maintaining production-level ETL/ELT pipelines.
- Familiarity with cloud data warehouses like Snowflake and RedPanda Cloud.
- Expertise in workflow orchestration tools such as Airflow, Kestra, or Prefect.
- Understanding of data modeling techniques like dimensional modeling and normalization.
- Experience with cloud platforms such as AWS and Azure for data infrastructure and services.
- Ability to work independently and lead projects with minimal guidance.

Nice to Have:
- Experience with streaming data technologies, specifically RedPanda.
- Knowledge of data security, privacy, and compliance practices, including GDPR and HIPAA.
- Background in DevOps for data, encompassing containerization and observability tools.
- Previous involvement in a retail or e-commerce data environment.

Software Qualifications:
- Languages: Python, SQL, Rust
- Data Warehousing: Snowflake, MySQL
- ETL/ELT Orchestration Tools: Kestra, Prefect
- Version Control & CI/CD: Git, GitHub Actions
- Orchestration & Infrastructure: Docker, Kubernetes, Redpanda, Cloudflare
- Monitoring: OpenobserveAI, Keep

Why Join Us:
- Become part of a forward-thinking company shaping the future of retail technology.
- Collaborate with a dynamic and innovative team that values creativity.
- Opportunity to contribute to cutting-edge projects and enhance your skills.
- Competitive salary and benefits package.
- Enjoy a flexible work environment with opportunities for career growth.
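As a hedged illustration of the Kafka-based streaming ingestion this posting mentions, using the kafka-python client; the topic, broker address, and event fields are placeholders:

```python
# A minimal streaming-ingestion sketch with kafka-python. Topic name,
# broker, and the event schema are invented for illustration.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "checkout-events",                   # hypothetical topic
    bootstrap_servers="localhost:9092",  # placeholder broker address
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real pipeline each batch would be validated and landed in the
    # warehouse (e.g. Snowflake) for downstream analytics and ML.
    print(event.get("sku"), event.get("store_id"))
```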
Posted 2 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Key Skills : PostgreSQL, Cron Jobs, Databricks, Azure, SSIS, Prefect, Data Pipelines, Cloud Data Migration, MSSQL. Roles and Responsibilities: Design and implement data models in PostgreSQL database on cloud environments. Build and manage transformation pipelines using Databricks for data migration from MSSQL to PostgreSQL. Schedule and manage automation using Cron jobs. Mentor and guide junior team members. Work in Azure or any cloud-based environment. Ensure successful and optimized data migration from MSSQL to PostgreSQL. Experience Requirement: 5-10 years of experience in database engineering and data migration. Hands-on experience in PostgreSQL, Cron jobs, Databricks, and Azure. Experience with data pipelines using SSIS or Prefect is preferred. Education: B.E., B.Tech.
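As a hedged sketch of the MSSQL-to-PostgreSQL batch migration and cron automation this role describes (the DSN, database, table, and columns are placeholders), a cron entry such as `0 2 * * * python migrate.py` could schedule a script like:

```python
# A minimal batch-copy sketch from MSSQL (via pyodbc) to PostgreSQL
# (via psycopg2). Connection strings, table, and columns are invented.
import psycopg2
import pyodbc

src = pyodbc.connect("DSN=mssql_source")    # hypothetical ODBC DSN
dst = psycopg2.connect("dbname=target_pg")  # hypothetical database

src_cur = src.cursor()
dst_cur = dst.cursor()

src_cur.execute("SELECT id, name, amount FROM dbo.orders")
while True:
    rows = src_cur.fetchmany(1000)  # stream in batches to bound memory
    if not rows:
        break
    dst_cur.executemany(
        "INSERT INTO orders (id, name, amount) VALUES (%s, %s, %s)",
        [tuple(r) for r in rows],
    )
dst.commit()
```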
Posted 2 weeks ago
3.0 - 8.0 years
9 - 19 Lacs
Hyderabad
Work from Office
We, Advantum Health Pvt. Ltd., a US Healthcare MNC, are looking for a Senior AI/ML Engineer. Advantum Health Private Limited is a leading RCM and Medical Coding company, operating since 2013. Our Head Office is located in Hyderabad, with branch operations in Chennai and Noida. We are proud to be a Great Place to Work certified organization and a recipient of the Telangana Best Employer Award. Our office spans 35,000 sq. ft. in Cyber Gateway, Hitech City, Hyderabad.

Job Title: Senior AI/ML Engineer
Location: Hitech City, Hyderabad, India (work from office)
Ph: 9177078628, 7382307530, 9059683624
Address: Advantum Health Private Limited, Cyber Gateway, Block C, 4th Floor, Hitech City, Hyderabad.
Map: https://www.google.com/maps/place/Advantum+Health+India/@17.4469674,78.3747158,289m/data=!3m2!1e3!5s0x3bcb93e01f1bbe71:0x694a7f60f2062a1!4m6!3m5!1s0x3bcb930059ea66d1:0x5f2dcd85862cf8be!8m2!3d17.4467126!4d78.3767566!16s%2Fg%2F11whflplxg?entry=ttu&g_ep=EgoyMDI1MDMxNi4wIKXMDSoASAFQAw%3D%3D

Job Summary: We are seeking a highly skilled and motivated Data Engineer to join our growing data team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support analytics, machine learning, and business intelligence initiatives. You will work closely with data analysts, scientists, and engineers to ensure data availability, reliability, and quality across the organization.

Key Responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines for ingesting and transforming large volumes of structured and unstructured data
- Build and optimize data infrastructure for scalability, performance, and reliability
- Collaborate with cross-functional teams to understand data needs and translate them into technical solutions
- Implement data quality checks, monitoring, and alerting mechanisms
- Manage and optimize data storage solutions (data warehouses, data lakes, databases)
- Ensure data security, compliance, and governance across all platforms
- Automate data workflows and optimize data delivery for real-time and batch processing
- Participate in code reviews and contribute to best practices for data engineering

Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field
- 3+ years of experience in data engineering or related roles
- Strong programming skills in Python, Java, or Scala
- Proficiency with SQL and relational databases (e.g., PostgreSQL, MySQL)
- Experience with data pipeline and workflow orchestration tools (e.g., Airflow, Prefect, Luigi)
- Hands-on experience with cloud platforms (AWS, GCP, or Azure) and cloud data services (e.g., Redshift, BigQuery, Snowflake)
- Familiarity with distributed data processing tools (e.g., Spark, Kafka, Hadoop)
- Solid understanding of data modeling, warehousing concepts, and data governance

Preferred Qualifications:
- Experience with CI/CD and DevOps practices for data engineering
- Knowledge of data privacy regulations such as GDPR, HIPAA, etc.
- Experience with version control systems like Git
- Familiarity with containerization (Docker, Kubernetes)

Follow us on LinkedIn, Facebook, Instagram, YouTube, and Threads for all updates:
- LinkedIn: https://www.linkedin.com/showcase/advantum-health-india/
- Facebook: https://www.facebook.com/profile.php?id=61564435551477
- Instagram: https://www.instagram.com/reel/DCXISlIO2os/?igsh=dHd3czVtc3Fyb2hk
- YouTube: https://youtube.com/@advantumhealthindia-rcmandcodi?si=265M1T2IF0gF-oF1
- Threads: https://www.threads.net/@advantum.health.india

HR Dept, Advantum Health Pvt Ltd
Cyber Gateway, Block C, Hitech City, Hyderabad
Ph: 9177078628, 7382307530, 9059683624
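The responsibilities above call for automated data-quality checks; as a hedged illustration (the claims schema is invented, not Advantum's), a small pandas validator might look like:

```python
# A minimal data-quality check in pandas. The expected columns and rules
# are made-up examples of the kind of validation the role describes.
import pandas as pd

EXPECTED_COLUMNS = {"claim_id", "patient_id", "amount"}


def validate(df: pd.DataFrame) -> list[str]:
    """Return human-readable quality failures; an empty list means pass."""
    problems = []
    missing = EXPECTED_COLUMNS - set(df.columns)
    if missing:
        problems.append(f"missing columns: {sorted(missing)}")
    if "claim_id" in df.columns and df["claim_id"].duplicated().any():
        problems.append("duplicate claim_id values")
    if "amount" in df.columns and (df["amount"] < 0).any():
        problems.append("negative amounts")
    return problems


df = pd.DataFrame(
    {"claim_id": [1, 2, 2], "patient_id": [9, 9, 8], "amount": [100.0, -5.0, 20.0]}
)
print(validate(df))  # flags the duplicate claim_id and the negative amount
```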
Posted 4 weeks ago
5.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Delhi / NCR
Hybrid
- 8+ years of experience in data engineering or a related field.
- Strong expertise in Snowflake, including schema design, performance tuning, and security.
- Proficiency in Python for data manipulation and automation.
- Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.).
- Experience with DBT for data transformation and documentation.
- Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect).
- Strong SQL skills and experience with large-scale data sets.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
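As a hedged sketch of the Snowflake work listed above, using the snowflake-connector-python package; the account, credentials, warehouse, and table are placeholders, not a real deployment:

```python
# A minimal Snowflake connectivity sketch. All identifiers below are
# illustrative; credentials would come from a secret manager in practice.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",  # placeholder account identifier
    user="example_user",        # placeholder user
    password="...",             # injected from a secret manager, not hardcoded
    warehouse="ANALYTICS_WH",   # placeholder warehouse
)

cur = conn.cursor()
# A typical sanity check before/after a DBT run: row counts on a model.
cur.execute("SELECT COUNT(*) FROM analytics.public.orders")
print(cur.fetchone()[0])
conn.close()
```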
Posted 2 months ago
5.0 - 9.0 years
7 - 11 Lacs
Bengaluru
Work from Office
We are looking for a Senior Data Engineer who will design, build, and maintain scalable data pipelines and ingestion frameworks. The ideal candidate must have experience with DBT, orchestration tools like Airflow or Prefect, and cloud platforms such as AWS. Responsibilities include developing ELT pipelines, optimizing queries, implementing CI/CD, and integrating with AWS services. Strong SQL, Python, and data modeling skills are essential. The role also involves working with real-time and batch processing, ensuring high performance and data integrity.
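As a hedged illustration of the Prefect-orchestrated ELT pipelines this role involves (flow name, retry policy, and task bodies are invented):

```python
# A minimal Prefect 2.x ELT sketch. In ELT, raw data is loaded first and
# transformed in the warehouse afterward (e.g. by DBT models).
from prefect import flow, task


@task(retries=2)
def extract() -> list[dict]:
    # A real task would pull from an API or an S3 landing zone.
    return [{"id": 1, "value": "42"}]


@task
def load(rows: list[dict]) -> None:
    # A real task would write to the warehouse; transformation happens
    # post-load, which is the "T" in ELT.
    print(f"loaded {len(rows)} rows")


@flow
def elt_pipeline():
    load(extract())


if __name__ == "__main__":
    elt_pipeline()
```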
Posted 2 months ago