10 Job openings at Inferenz
Sr MLOps Engineer

Pune, Maharashtra, India

5 years

Not disclosed

On-site

Full Time

Position: Sr. MLOps Engineer
Location: Ahmedabad, Pune
Required Experience: 5+ years
Preferred: Immediate joiners

Job Overview
Building machine learning production infrastructure (MLOps) is the biggest challenge most large companies currently face in making the transition to becoming an AI-driven organization. We are looking for a highly skilled MLOps Engineer to join our team. As an MLOps Engineer, you will be responsible for designing, implementing, and maintaining the infrastructure that supports the deployment, monitoring, and scaling of machine learning models in production. You will work closely with data scientists, software engineers, and DevOps teams to ensure seamless integration of machine learning models into our production systems.

This job is NOT for you if:
· You don't want to build a career in AI/ML. Becoming an expert in this technology and staying current will require significant self-motivation.
· You like the comfort and predictability of working on the same problem or code base for years. The tools, best practices, architectures, and problems are all going through rapid change; you will be expected to learn new skills quickly and adapt.

Key Responsibilities:
· Model Deployment: Design and implement scalable, reliable, and secure pipelines for deploying machine learning models to production.
· Infrastructure Management: Develop and maintain infrastructure as code (IaC) for managing cloud resources, compute environments, and data storage.
· Monitoring and Optimization: Implement monitoring tools to track the performance of models in production, identify issues, and optimize performance.
· Collaboration: Work closely with data scientists to understand model requirements and ensure models are production-ready.
· Automation: Automate the end-to-end process of training, testing, deploying, and monitoring models.
· Continuous Integration/Continuous Deployment (CI/CD): Develop and maintain CI/CD pipelines for machine learning projects.
· Version Control: Implement model versioning to manage different iterations of machine learning models.
· Security and Governance: Ensure that deployed models and data pipelines are secure and comply with industry regulations.
· Documentation: Create and maintain detailed documentation of all processes, tools, and infrastructure.

Qualifications:
· 5+ years of experience in a similar role (DevOps, DataOps, MLOps, etc.)
· Bachelor's or Master's degree in Computer Science, Engineering, or a related field
· Experience with cloud platforms (AWS, GCP, Azure) and containerization (Docker, Kubernetes)
· Strong understanding of the machine learning lifecycle, data pipelines, and model serving
· Proficiency in programming languages such as Python and shell scripting, and familiarity with ML frameworks (TensorFlow, PyTorch, etc.)
· Exposure to deep learning approaches and modeling frameworks (PyTorch, TensorFlow, Keras, etc.)
· Experience with CI/CD tools like Jenkins, GitLab CI, or similar
· Experience building end-to-end systems as a Platform Engineer, ML DevOps Engineer, or Data Engineer (or equivalent)
· Strong software engineering skills in complex, multi-language systems
· Comfort with Linux administration
· Experience working with cloud computing and database systems
· Experience building custom integrations between cloud-based systems using APIs
· Experience developing and maintaining ML systems built with open-source tools
· Experience developing with containers and Kubernetes in cloud computing environments
· Familiarity with one or more data-oriented workflow orchestration frameworks (MLflow, Kubeflow, Airflow, Argo, etc.)
· Ability to translate business needs into technical requirements
· Strong understanding of software testing, benchmarking, and continuous integration
· Exposure to machine learning methodology and best practices
· Understanding of regulatory requirements for data privacy and model governance
Preferred Skills:
· Excellent problem-solving skills and the ability to troubleshoot complex production issues
· Strong communication skills and the ability to collaborate with cross-functional teams
· Familiarity with monitoring and logging tools (e.g., Prometheus, Grafana, ELK Stack)
· Knowledge of database systems (SQL, NoSQL)
· Experience with Generative AI frameworks
· A cloud or MLOps/DevOps certification (AWS, GCP, or Azure) is preferred
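To make the "model versioning" responsibility above concrete, here is a minimal, framework-agnostic sketch of an in-memory model registry. It is an illustration only, not part of any specific tool; all names (`ModelRegistry`, `register`, `latest`) are invented for this example:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class ModelRegistry:
    """Toy in-memory registry: each register() call creates a new immutable version."""
    _versions: list = field(default_factory=list)

    def register(self, name: str, artifact: bytes, metrics: dict) -> int:
        version = len(self._versions) + 1
        self._versions.append({
            "name": name,
            "version": version,
            # A content hash lets deployments verify they serve exactly the bytes registered.
            "sha256": hashlib.sha256(artifact).hexdigest(),
            "metrics": metrics,
        })
        return version

    def latest(self, name: str) -> dict:
        # Scan newest-first for the most recent version of this model.
        return next(v for v in reversed(self._versions) if v["name"] == name)

registry = ModelRegistry()
registry.register("churn-model", b"weights-v1", {"auc": 0.81})
v2 = registry.register("churn-model", b"weights-v2", {"auc": 0.84})
print(v2, registry.latest("churn-model")["metrics"]["auc"])  # 2 0.84
```

Production tools such as MLflow follow the same basic shape (immutable versions plus queryable metadata), with persistence, stage transitions, and access control layered on top.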

Lead Data Engineer (Databricks)

Ahmedabad, Gujarat, India

7 years

Not disclosed

On-site

Full Time

Position: Lead Data Engineer (Databricks)
Location: Ahmedabad, Pune
Required Experience: 7 to 10 years
Preferred: Immediate joiners

We are looking for an accomplished Lead Data Engineer with expertise in Databricks to join our dynamic team. This role is crucial for enhancing our data engineering capabilities, and it offers the chance to work with advanced technologies, including Generative AI.

Key Responsibilities:
- Lead the design, development, and optimization of data solutions using Databricks, ensuring they are scalable, efficient, and secure.
- Collaborate with cross-functional teams to gather and analyse data requirements, translating them into robust data architectures and solutions.
- Develop and maintain ETL pipelines, leveraging Databricks and integrating with Azure Data Factory as needed.
- Implement machine learning models and advanced analytics solutions, incorporating Generative AI to drive innovation.
- Ensure data quality, governance, and security practices are adhered to, maintaining the integrity and reliability of data solutions.
- Provide technical leadership and mentorship to junior engineers, fostering an environment of learning and growth.
- Stay updated on the latest trends and advancements in data engineering, Databricks, Generative AI, and Azure Data Factory to continually enhance team capabilities.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 7 to 10 years of experience in data engineering, with a focus on Databricks.
- Proven expertise in building and optimizing data solutions using Databricks and integrating with Azure Data Factory/AWS Glue.
- Proficiency in SQL and programming languages such as Python or Scala.
- Strong understanding of data modelling, ETL processes, and Data Warehousing/Data Lakehouse concepts.
- Familiarity with cloud platforms, particularly Azure, and containerization technologies such as Docker.
- Excellent analytical, problem-solving, and communication skills.
- Demonstrated leadership ability with experience mentoring and guiding junior team members.

Preferred Qualifications:
- Experience with Generative AI technologies and their applications.
- Familiarity with other cloud platforms, such as AWS or GCP.
- Knowledge of data governance frameworks and tools.

Perks: Flexible Timings, 5 Days Working, Healthy Environment, Celebration, Learn and Grow, Build the Community, Medical Insurance Benefit

Senior Data Engineer (Snowflake+DBT+Airflow)

Ahmedabad

5 years

INR 3.0 - 7.75 Lacs P.A.

On-site

Part Time

Location: Ahmedabad / Pune
Required Experience: 5+ years
Preferred: Immediate joiners

We are looking for a highly skilled Senior Data Engineer (Snowflake) to join our team. The ideal candidate will have extensive experience with Snowflake and cloud platforms, along with a strong understanding of ETL processes, data warehousing concepts, and programming languages. If you have a passion for working with large datasets, designing scalable database schemas, and solving complex data problems, we would love to hear from you.

Key Responsibilities:
- Design, implement, and optimize data pipelines and workflows using Apache Airflow.
- Develop incremental and full-load strategies with monitoring, retries, and logging.
- Build scalable data models and transformations in dbt, ensuring modularity, documentation, and test coverage.
- Develop and maintain data warehouses in Snowflake.
- Ensure data quality, integrity, and reliability through validation frameworks and automated testing.
- Tune performance through clustering keys, warehouse scaling, materialized views, and query optimization.
- Monitor job performance and resolve data pipeline issues proactively.
- Build and maintain data quality frameworks (null checks, type checks, threshold alerts).
- Partner with data analysts, scientists, and business stakeholders to translate reporting and analytics requirements into technical specifications.
Required Skills & Qualifications:
- Snowflake (data modeling, performance tuning, access control, external tables, streams & tasks)
- Apache Airflow (DAG design, task dependencies, dynamic tasks, error handling)
- dbt (Data Build Tool): modular SQL development, Jinja templating, testing, documentation
- Proficiency in SQL, Spark, and Python
- Experience building data pipelines on cloud platforms like AWS, GCP, or Azure
- Strong knowledge of data warehousing concepts and ELT best practices
- Familiarity with version control systems (e.g., Git) and CI/CD practices
- Familiarity with infrastructure-as-code tools like Terraform for provisioning Snowflake or Airflow environments
- Excellent problem-solving skills and the ability to work independently

Perks: Flexible Timings, 5 Days Working, Healthy Environment, Celebration, Learn and Grow, Build the Community, Medical Insurance Benefit
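The "incremental and full-load strategies with monitoring, retries, and logging" responsibility can be sketched in a few lines of plain Python. This is an illustration under invented assumptions (in-memory row dicts standing in for Snowflake tables, a numeric `updated_at` watermark); in practice the retry behaviour would come from Airflow's task-level `retries`/`retry_delay` settings rather than hand-rolled code:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("incremental_load")

def run_with_retries(task, max_retries=3, delay_seconds=0):
    """Retry wrapper mirroring what Airflow's per-task retry settings provide."""
    for attempt in range(1, max_retries + 1):
        try:
            return task()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, max_retries, exc)
            if attempt == max_retries:
                raise
            time.sleep(delay_seconds)

def incremental_load(source_rows, target, watermark_key="updated_at"):
    """Copy only rows newer than the target's high-water mark (a typical warehouse pattern)."""
    high_water = max((r[watermark_key] for r in target), default=0)
    new_rows = [r for r in source_rows if r[watermark_key] > high_water]
    target.extend(new_rows)
    log.info("loaded %d new rows (watermark > %s)", len(new_rows), high_water)
    return len(new_rows)

source = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 20}]
target = [{"id": 1, "updated_at": 10}]
loaded = run_with_retries(lambda: incremental_load(source, target))
print(loaded)  # 1: only the row past the current watermark is loaded
```

Re-running the same load is a no-op because the watermark has advanced, which is what makes incremental loads safe to retry.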

Senior Full Stack Developer (GenAI Applications)

Ahmedabad

5 years

INR 5.5 - 7.4 Lacs P.A.

On-site

Part Time

Location: Ahmedabad, Pune
Required Experience: 5+ years
Preferred: Immediate joiners

Inferenz is a pioneering AI and Data Native company dedicated to transforming how organizations leverage data and AI to drive innovation and efficiency. As industry leaders, we specialize in delivering cutting-edge AI and data solutions that empower businesses to harness the full potential of their data assets.

We are seeking an experienced Senior Full Stack Developer to join our team building innovative Generative AI (GenAI) based applications. The ideal candidate will have a strong background in developing scalable RESTful APIs using Python, as well as modern frontend applications with Node.js or React. Experience with cloud platforms (Azure or AWS), Kubernetes, microservices architecture, and version control (Git or Azure Repos) is essential. Familiarity with AI/ML/GenAI technologies and agentic AI is highly valued.

Key Responsibilities:
- Full-Stack Development: Design, build, and maintain scalable RESTful APIs using Python and frontend applications using Node.js or React.
- GenAI Integration: Develop and optimize agentic AI components using Python, ensuring seamless integration with backend services.
- Cloud Deployment: Manage application deployment, scaling, and monitoring on Azure or AWS using Kubernetes and microservices architecture.
- Collaboration: Work with cross-functional teams using Jira and Confluence for project tracking and documentation.
- Performance Optimization: Ensure high availability, security, and efficiency of applications through robust coding practices and infrastructure management.

Required Skills & Experience:
- Backend: Strong expertise in Python and REST API development.
- Frontend: Proficiency in Node.js, React, and modern JavaScript frameworks.
- Cloud & DevOps: Hands-on experience with Azure or AWS, Kubernetes, microservices, and Git or Azure Repos for version control.
- Tools: Familiarity with Jira, Confluence, and CI/CD pipelines.
- Experience: 5+ years in full-stack development with production-grade applications.

Preferred Skills:
- AI/ML Knowledge: Understanding of GenAI tools (e.g., LangChain, LLMs, RAG/GraphRAG/MCP architectures) and agentic AI development.
- Cloud AI Services: Experience with cloud-based AI platforms (e.g., AWS Bedrock, Azure AI).
- Architecture: Proficiency in designing scalable systems and troubleshooting distributed environments.

What We Offer:
- Competitive salary and comprehensive benefits package.
- Flexible work hours and a hybrid working model to support work-life balance.
- Opportunities for professional growth and development in a cutting-edge technology environment.
- Exposure to Generative AI and other advanced technologies.

Inferenz is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Perks: Flexible Timings, 5 Days Working, Healthy Environment, Celebration, Learn and Grow, Build the Community, Medical Insurance Benefit

Lead Full Stack Developer (GenAI Applications)

Ahmedabad

7 years

INR 5.5 - 7.4 Lacs P.A.

On-site

Part Time

Location: Ahmedabad, Pune
Required Experience: 7+ years
Preferred: Immediate joiners

Inferenz is a pioneering AI and Data Native company dedicated to transforming how organizations leverage data and AI to drive innovation and efficiency. As industry leaders, we specialize in delivering cutting-edge AI and data solutions that empower businesses to harness the full potential of their data assets.

We are seeking an experienced Lead Full Stack Developer to join our team building innovative Generative AI (GenAI) based applications. The ideal candidate will have a strong background in developing scalable RESTful APIs using Python, as well as modern frontend applications with Node.js or React. Experience with cloud platforms (Azure or AWS), Kubernetes, microservices architecture, and version control (Git or Azure Repos) is essential. Familiarity with AI/ML/GenAI technologies and agentic AI is highly valued.

Key Responsibilities:
- Full-Stack Development: Design, build, and maintain scalable RESTful APIs using Python and frontend applications using Node.js or React.
- GenAI Integration: Develop and optimize agentic AI components using Python, ensuring seamless integration with backend services.
- Cloud Deployment: Manage application deployment, scaling, and monitoring on Azure or AWS using Kubernetes and microservices architecture.
- Collaboration: Work with cross-functional teams using Jira and Confluence for project tracking and documentation.
- Performance Optimization: Ensure high availability, security, and efficiency of applications through robust coding practices and infrastructure management.

Required Skills & Experience:
- Backend: Strong expertise in Python and REST API development.
- Frontend: Proficiency in Node.js, React, and modern JavaScript frameworks.
- Cloud & DevOps: Hands-on experience with Azure or AWS, Kubernetes, microservices, and Git or Azure Repos for version control.
- Tools: Familiarity with Jira, Confluence, and CI/CD pipelines.
- Experience: 7+ years in full-stack development with production-grade applications.

Preferred Skills:
- AI/ML Knowledge: Understanding of GenAI tools (e.g., LangChain, LLMs, RAG/GraphRAG/MCP architectures) and agentic AI development.
- Cloud AI Services: Experience with cloud-based AI platforms (e.g., AWS Bedrock, Azure AI).
- Architecture: Proficiency in designing scalable systems and troubleshooting distributed environments.

What We Offer:
- Competitive salary and comprehensive benefits package.
- Flexible work hours and a hybrid working model to support work-life balance.
- Opportunities for professional growth and development in a cutting-edge technology environment.
- Exposure to Generative AI and other advanced technologies.

Inferenz is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Perks: Flexible Timings, 5 Days Working, Healthy Environment, Celebration, Learn and Grow, Build the Community, Medical Insurance Benefit

Lead Data Engineer (Snowflake)

Ahmedabad, Gujarat

7 - 11 years

INR Not disclosed

On-site

Full Time

You are a highly skilled Lead Data Engineer (Snowflake) with 7 to 10 years of experience, seeking to join a dynamic team in Ahmedabad or Pune. Your expertise includes extensive knowledge of Snowflake, cloud platforms, ETL processes, data warehousing concepts, and various programming languages. If you are passionate about working with large datasets, designing scalable database schemas, and solving complex data problems, we are excited to welcome you aboard!

Your responsibilities will involve designing, developing, and optimizing data pipelines using Snowflake and ELT/ETL tools. You will be tasked with architecting, implementing, and maintaining data warehouse solutions, ensuring high performance and scalability. Additionally, you will design efficient database schemas and data models to support business needs, write and optimize complex SQL queries, and develop data transformation scripts using Python, C#, or Java for process automation. Ensuring data integrity, security, and governance throughout the data lifecycle will be paramount, as you analyze, troubleshoot, and resolve data-related issues at various strategic levels. Collaboration with cross-functional teams to understand business requirements and deliver data-driven solutions will be a key aspect of your role.

**Qualifications (Must Have):**
- Strong experience with Snowflake.
- Deep understanding of transactional databases, OLAP, and data warehousing concepts.
- Experience in designing database schemas and data models.
- Proficiency in one programming language (Python, C#, or Java).
- Strong problem-solving and analytical skills.

**Good to Have:**
- SnowPro Core or SnowPro Advanced certification.
- Experience with cost/performance optimization.
- Client-facing experience with the ability to understand business needs.
- Ability to work collaboratively in a team environment.
**Perks:**
- Flexible Timings
- 5 Days Working
- Healthy Environment
- Celebration
- Learn and Grow
- Build the Community
- Medical Insurance Benefit

Lead Full Stack Developer

Ahmedabad, Gujarat

5 - 9 years

INR Not disclosed

On-site

Full Time

As a Lead Full Stack Developer at Inferenz, a pioneering AI and Data Native company based in Ahmedabad and Pune, you will play a crucial role in developing innovative Generative AI (GenAI) based applications. With over 7 years of experience, you will be responsible for designing, building, and maintaining scalable RESTful APIs using Python, as well as modern frontend applications with Node.js or React.

Your key responsibilities will include integrating GenAI components using Python, ensuring seamless backend services integration, managing application deployment on Azure or AWS using Kubernetes and microservices architecture, and collaborating with cross-functional teams using Jira and Confluence for project tracking and documentation. Additionally, you will focus on performance optimization to ensure high availability, security, and efficiency of applications through robust coding practices and infrastructure management.

To excel in this role, you should have strong expertise in Python and REST API development for the backend; proficiency in Node.js, React, and modern JavaScript frameworks for the frontend; hands-on experience with Azure or AWS, Kubernetes, microservices, and Git or Azure Repos for cloud and DevOps; familiarity with Jira, Confluence, and CI/CD pipelines; and at least 5 years of experience in full-stack development with production-grade applications. Preferred skills include an understanding of GenAI tools such as LangChain, LLMs, and RAG/GraphRAG/MCP architectures; experience with cloud-based AI platforms like AWS Bedrock and Azure AI; and proficiency in designing scalable systems and troubleshooting distributed environments.

In return, Inferenz offers a competitive salary, comprehensive benefits package, flexible work hours, and a hybrid working model to support work-life balance.
You will have opportunities for professional growth and development in a cutting-edge technology environment, with exposure to Generative AI and other advanced technologies. Inferenz is an equal opportunity employer that celebrates diversity and is committed to creating an inclusive environment for all employees. Join Inferenz to enjoy perks like flexible timings, a 5-day work week, a healthy work environment, celebrations, learning and growth opportunities, community building, and medical insurance benefits.

Solution Architect - Data & AI

Ahmedabad, Gujarat

8 - 12 years

INR Not disclosed

On-site

Full Time

You are a highly skilled and experienced Solution Architect specializing in Data & AI, with over 8 years of experience. In this role, you will lead and drive the data-driven transformation within the organization. Your main responsibility is to design and implement cutting-edge AI and data solutions that align with business objectives. Collaborating closely with cross-functional teams, you will create scalable, high-performance architectures utilizing modern technologies in data engineering, machine learning, and cloud computing.

Your key responsibilities include architecting and designing end-to-end data and AI solutions to address business challenges and optimize decision-making. You will define and implement best practices for data architecture, data governance, and AI model deployment. Collaborating with data engineers, data scientists, and business stakeholders, you will deliver scalable and high-impact AI-driven applications. Additionally, you will lead the integration of AI models with enterprise applications, ensuring seamless deployment and operational efficiency. It is also part of your role to evaluate and recommend the latest technologies in data platforms, AI frameworks, and cloud-based analytics solutions while ensuring data security, compliance, and ethical AI implementation. Guiding teams in adopting advanced analytics, AI, and machine learning models for predictive insights and automation is also a crucial aspect. Your role requires driving innovation by identifying new opportunities for AI and data-driven improvements within the organization.

To excel in this position, you must possess over 8 years of experience in designing and implementing data and AI solutions. Strong expertise in cloud platforms such as AWS, Azure, or Google Cloud is essential. Hands-on experience with big data technologies like Spark, Databricks, and Snowflake is required. Proficiency in ML frameworks such as TensorFlow, PyTorch, and Scikit-learn is a must.
A deep understanding of data modeling, ETL processes, and data governance frameworks is necessary, along with experience in MLOps, model deployment, and automation. Proficiency in Generative AI frameworks and strong programming skills in Python, SQL, and Java/Scala (preferred) are essential. Familiarity with containerization and orchestration (Docker, Kubernetes) is a plus. Excellent problem-solving skills and the ability to work in a fast-paced environment are crucial, as are strong communication and leadership skills with the ability to drive technical conversations.

Preferred qualifications include certifications in cloud architecture, data engineering, or AI/ML; experience with generative AI; a background in developing AI-driven analytics solutions for enterprises; experience with Graph RAG, building AI agents, and multi-agent systems; and additional certifications in AI/GenAI. Proven leadership skills are expected in this position.

This role offers perks such as flexible timings, a 5-day work week, a healthy environment, celebrations, opportunities for learning and growth, community building, and medical insurance benefits.

Lead Cloud Engineer

Ahmedabad

6 years

INR Not disclosed

On-site

Part Time

Location: Ahmedabad, Pune
Required Experience: 6+ years
Preferred: Immediate joiners

We are seeking a highly skilled and motivated Lead Cloud Engineer with over 6 years of experience in designing, implementing, and managing scalable, secure, and highly available cloud solutions. The ideal candidate will have deep knowledge of cloud platforms (AWS, Azure, or GCP), strong leadership capabilities, and a hands-on approach to infrastructure automation and DevOps practices.

Key Responsibilities:
- Lead the architecture, design, and implementation of enterprise-grade cloud infrastructure solutions.
- Collaborate with DevOps, Security, and Development teams to ensure robust CI/CD pipelines and cloud-native application deployment.
- Evaluate and select appropriate cloud services and tools based on business and technical requirements.
- Drive automation across infrastructure provisioning, configuration management, and application deployment.
- Monitor system performance, ensure high availability, and proactively resolve any issues.
- Implement cloud cost optimization strategies and maintain operational budgets.
- Enforce security best practices and compliance standards across cloud environments.
- Mentor junior engineers and provide technical leadership across cross-functional teams.
- Maintain documentation and architectural diagrams for cloud environments and processes.
- Stay current with emerging technologies and propose innovations that improve business outcomes.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in cloud engineering, including at least 3 years in a leadership or senior technical role.
- Expertise in at least one major cloud platform: AWS, Azure, or Google Cloud Platform (GCP).
- Proficiency in IaC tools such as Terraform, CloudFormation, or Pulumi.
- Strong scripting skills using Python, Bash, or PowerShell.
- Experience with CI/CD tools like Jenkins, GitLab CI/CD, GitHub Actions, or Azure DevOps.
- Deep understanding of networking, security, and identity & access management in the cloud.
- Familiarity with containerization and orchestration technologies such as Docker and Kubernetes.
- Strong analytical, problem-solving, and troubleshooting skills.
- Excellent verbal and written communication skills.

Preferred Qualifications (Nice to Have):
- Cloud certifications (e.g., AWS Certified Solutions Architect, Azure Solutions Architect, GCP Professional Cloud Architect).
- Experience with serverless architectures and event-driven systems.
- Familiarity with monitoring tools like Prometheus, Grafana, Datadog, or CloudWatch.
- Experience leading cloud migration or modernization projects.

Perks: Flexible Timings, 5 Days Working, Healthy Environment, Celebration, Learn and Grow, Build the Community, Medical Insurance Benefit

Lead Data Engineer

Ahmedabad, Gujarat

7 - 11 years

INR Not disclosed

On-site

Full Time

You are an accomplished Lead Data Engineer with 7 to 10 years of experience in data engineering, joining our dynamic team in either Ahmedabad or Pune. Your expertise in Databricks will play a crucial role in enhancing our data engineering capabilities and working with advanced technologies, including Generative AI.

Your key responsibilities will include leading the design, development, and optimization of data solutions using Databricks to ensure scalability, efficiency, and security. You will collaborate with cross-functional teams to gather and analyze data requirements, translating them into robust data architectures and solutions. Developing and maintaining ETL pipelines, leveraging Databricks, and integrating with Azure Data Factory when necessary will also be part of your role. Furthermore, you will implement machine learning models and advanced analytics solutions, incorporating Generative AI to drive innovation. Ensuring that data quality, governance, and security practices are adhered to will be essential to maintain the integrity and reliability of data solutions. Providing technical leadership and mentorship to junior engineers to foster an environment of learning and growth will also be a key aspect of your role. It is crucial to stay updated on the latest trends and advancements in data engineering, Databricks, Generative AI, and Azure Data Factory to continually enhance team capabilities.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven expertise in building and optimizing data solutions using Databricks, integrating with Azure Data Factory/AWS Glue, proficiency in SQL, and programming languages like Python or Scala are essential. A strong understanding of data modeling, ETL processes, Data Warehousing/Data Lakehouse concepts, cloud platforms (particularly Azure), and containerization technologies such as Docker is required.
Excellent analytical, problem-solving, and communication skills are a must, along with demonstrated leadership ability and experience mentoring junior team members. Preferred qualifications include experience with Generative AI technologies and their applications, familiarity with other cloud platforms like AWS or GCP, and knowledge of data governance frameworks and tools.

In return, we offer flexible timings, a 5-day work week, a healthy environment, celebrations, opportunities to learn and grow, community building, and medical insurance benefits. Join us and be part of a team that values innovation, collaboration, and professional development.
