
Rhohyve AI Labs

Rhohyve AI Labs specializes in artificial intelligence solutions, focusing on innovative AI tooling that enhances business processes and decision-making.

8 job openings at Rhohyve AI Labs
Regional Sales Manager | Mumbai Suburban, Navi Mumbai, Mumbai (All Areas) | 7 - 12 years | INR 15.0 - 18.0 Lacs P.A. | Work from Office | Full Time

Who we're looking for: A sales leader with a solid network among doctors in Mumbai, passionate about eldercare, with a proven record of driving revenue and partnerships in the healthcare/homecare sector.

Key Responsibilities:
- Own P&L and revenue growth for the region.
- Lead and mentor a team of BDMs and Sr. Sales Managers.
- Drive doctor/hospital partnerships for lead generation and conversion.
- Develop and execute territory-specific sales strategies.
- Analyze sales data and optimize performance using MIS.

You'll excel if you have:
- 10+ years in field sales, 7+ in leadership with P&L ownership.
- Deep experience in healthcare sales or partnerships.
- A strong Mumbai-based network with doctors and hospitals.
- MBA (preferred).

Location: Mumbai
Sales Target: 50 Lakhs/month (minimum)

Snowflake Developer | Bengaluru | 3 - 8 years | INR 0.5 - 3.0 Lacs P.A. | Remote | Full Time

If you are passionate about Snowflake, data warehousing, and cloud-based analytics, we'd love to hear from you! Apply now to be a part of our growing team. Interested candidates can apply directly through the link below and complete the first round of technical discussion: https://app.hyrgpt.com/candidate-job-details?jobId=67ecc88dda1154001cc8b88f

Job Summary: We are looking for a skilled Snowflake Engineer with 3-10 years of experience in designing and implementing cloud-based data warehousing solutions. The ideal candidate will have hands-on expertise in Snowflake architecture, SQL, ETL pipeline development, and performance optimization. This role requires proficiency in handling structured and semi-structured data, data modeling, and query optimization to support business intelligence and analytics initiatives. The candidate will work on a project for one of our key Big4 consulting customers and will have immense learning opportunities.

Key Responsibilities:
- Design, develop, and manage high-performance data pipelines for ingestion, transformation, and storage in Snowflake.
- Optimize Snowflake workloads, ensuring efficient query execution and cost management.
- Develop and maintain ETL processes using SQL, Python, and orchestration tools.
- Implement data governance, security, and access control best practices within Snowflake.
- Work with structured and semi-structured data formats such as JSON, Parquet, Avro, and XML (see the sketch after this listing).
- Design and maintain fact and dimension tables, ensuring efficient data warehousing and reporting.
- Collaborate with data analysts and business teams to support reporting, analytics, and business intelligence needs.
- Troubleshoot and resolve data pipeline issues, ensuring high availability and reliability.
- Monitor and optimize Snowflake storage and compute usage to improve efficiency and performance.

Required Skills & Qualifications:
- 3-10 years of experience in Snowflake, SQL, and data engineering.
- Strong hands-on expertise in Snowflake development, including data sharing, cloning, and time travel.
- Proficiency in SQL scripting for query optimization and performance tuning.
- Experience with ETL tools and frameworks (e.g., DBT, Airflow, Matillion, Talend).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and their integration with Snowflake.
- Strong understanding of data warehousing concepts, including fact and dimension modeling.
- Ability to work with semi-structured data formats like JSON, Avro, Parquet, and XML.
- Knowledge of data security, governance, and access control within Snowflake.
- Excellent problem-solving and troubleshooting skills.

Preferred Qualifications:
- Experience in Python for data engineering tasks.
- Familiarity with CI/CD pipelines for Snowflake development and deployment.
- Exposure to streaming data ingestion and real-time processing.
- Experience with BI tools such as Tableau, Looker, or Power BI.
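
To illustrate the kind of semi-structured ingestion work this role describes, here is a minimal sketch using the snowflake-connector-python package; the connection parameters, table, and JSON fields are hypothetical placeholders, not project specifics.

    import snowflake.connector  # pip install snowflake-connector-python

    # Hypothetical connection parameters -- replace with your account details.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        warehouse="ANALYTICS_WH", database="ANALYTICS", schema="RAW",
    )

    cur = conn.cursor()
    # Land raw JSON in a VARIANT column, then expose typed fields through a view.
    cur.execute("CREATE TABLE IF NOT EXISTS raw_events (payload VARIANT)")
    cur.execute("""
        CREATE OR REPLACE VIEW events AS
        SELECT payload:event_id::STRING  AS event_id,
               payload:ts::TIMESTAMP_NTZ AS event_ts,
               payload:user.name::STRING AS user_name
        FROM raw_events
    """)
    cur.close()
    conn.close()

In practice the raw JSON would arrive via a stage and COPY INTO or Snowpipe; the view simply shows how VARIANT path expressions flatten semi-structured payloads for reporting.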

Gen AI Developer | New Delhi, Hyderabad, Bengaluru | 3 - 8 years | INR 4.0 - 9.0 Lacs P.A. | Hybrid | Full Time

About the Role: We are seeking a talented and experienced GenAI Developer with expertise in Python, Large Language Models (LLMs), Natural Language Processing (NLP), and AI agent development to join our dynamic team. The ideal candidate will work on the development and integration of cutting-edge AI solutions in cloud environments (AWS, GCP, Azure), integrating with CRM systems and optimizing AI model performance. You will be responsible for developing AI agents, fine-tuning models using Hugging Face Transformers and OpenAI technologies, and optimizing systems to meet business needs in a scalable and ethical manner. The role involves close collaboration with cross-functional teams, and you'll be working on AI-driven features, APIs, and cloud integrations.

Key Responsibilities:
- AI Model Development: Design, develop, and fine-tune AI models using OpenAI LLMs, Hugging Face Transformers, and advanced NLP techniques (e.g., prompt engineering, reinforcement learning from human feedback (RLHF)). A short illustrative sketch follows at the end of this listing.
- Cloud Integration: Develop, deploy, and optimize AI solutions on cloud platforms (AWS, GCP, Azure), ensuring scalability, reliability, and performance.
- API Development & Integration: Build and maintain APIs to integrate AI solutions with CRM platforms and third-party applications. Ensure seamless integration of AI agents into customer-facing systems.
- Model Interpretability & Transparency: Apply model interpretability tools to ensure transparency and ethical AI practices, and help explain AI model decisions to both technical and non-technical stakeholders.
- Continuous Learning & Optimization: Stay updated on the latest advancements in LLMs, AI agents, and Generative AI (GenAI) technologies. Continuously optimize the performance of AI models, focusing on accuracy, efficiency, and business needs.
- Collaboration & Innovation: Work closely with cross-functional teams to design innovative AI-driven features. Contribute to architecture decisions and AI system enhancements. Apply Agile development practices to ensure timely delivery.

Candidate Profile:
- Education: A Bachelor's degree in Computer Science, Artificial Intelligence, Information Technology, or a related technical field.
- Experience: 2 to 4 years of professional experience in AI development, with a strong focus on LLMs, NLP, Python, and cloud platforms. Proven experience with model fine-tuning, API development, and cloud integration is essential.

Skills Required:
- Proficiency in Python with experience in Django or other web frameworks.
- Strong understanding of NLP techniques, including prompt engineering and reinforcement learning.
- Hands-on experience with Hugging Face Transformers and OpenAI models (GPT, etc.).
- Strong experience with cloud platforms (AWS, GCP, Azure) for deploying AI solutions.
- Familiarity with version control systems (GitHub, Git).
- Experience developing and consuming RESTful APIs for seamless third-party integrations.
- Familiarity with database technologies (e.g., PostgreSQL, MySQL, MongoDB); GraphQL is a plus.

Preferred Skills:
- Knowledge of Generative AI (GenAI) frameworks (e.g., Azure OpenAI, DALL·E).
- Experience with RAG (retrieval-augmented generation) techniques for improving model performance.
- Familiarity with tools like Jupyter Notebooks, Streamlit, and Flask for rapid prototyping.
- Experience with ethical AI practices and ensuring model transparency.
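
As a minimal illustration of the Hugging Face Transformers workflow this role mentions (a sketch only; the model name and prompt are assumptions, not the team's actual stack):

    from transformers import pipeline  # pip install transformers

    # Small open model chosen only for local experimentation (assumed, not prescribed).
    generator = pipeline("text-generation", model="distilgpt2")

    prompt = "Summarize the benefits of retrieval-augmented generation in one sentence:"
    result = generator(prompt, max_new_tokens=60, do_sample=False)
    print(result[0]["generated_text"])

Fine-tuning or RLHF on top of such a model would add a training loop and feedback data, but the same library surface applies.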

Data Scientist | Pune, Chennai | 3 - 8 years | INR 3.0 - 8.0 Lacs P.A. | Hybrid | Full Time

Required Skills:
- Strong experience in RAG (Retrieval-Augmented Generation) or Machine Learning, including implementation, debugging, and optimization (a minimal RAG sketch follows this listing).
- Hands-on knowledge of Agentic AI frameworks (practical experience preferred over theoretical).
- Advanced Python proficiency, with emphasis on writing clean, maintainable code, debugging complex issues, and troubleshooting pipelines.
- Competitive coding experience (e.g., LeetCode, HackerRank) is a big advantage.
- A passionate, continuous learner who keeps pace with the latest AI advancements.

What You'll Own:
- Full-cycle development of RAG solutions and Agentic AI components.
- Debugging and optimizing AI/ML pipelines in production.
- Close collaboration with cross-functional teams to deliver practical, scalable solutions.
- Creating technical documentation and actively sharing insights to uplift team knowledge.
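
For orientation, a minimal retrieval-augmented generation sketch using TF-IDF retrieval from scikit-learn; the corpus and query are illustrative assumptions, and the final LLM call is intentionally left abstract.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Tiny illustrative corpus; a real system would use dense embeddings and a vector store.
    docs = [
        "Snowflake supports semi-structured data via the VARIANT type.",
        "Delta Lake provides ACID transactions on data lakes.",
        "RAG augments an LLM prompt with retrieved context passages.",
    ]

    vectorizer = TfidfVectorizer()
    doc_vectors = vectorizer.fit_transform(docs)

    def retrieve(query, k=2):
        """Return the k documents most similar to the query."""
        sims = cosine_similarity(vectorizer.transform([query]), doc_vectors)[0]
        return [docs[i] for i in sims.argsort()[::-1][:k]]

    query = "What is retrieval-augmented generation?"
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    # The prompt would then be sent to the LLM of choice (call omitted here).
    print(prompt)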

Full Stack Engineer / Technical Lead | Chennai | 3 - 8 years | INR 8.0 - 18.0 Lacs P.A. | Hybrid | Full Time

Job Title: Full Stack Engineer / Tech Lead (Multiple Positions)
Experience: 3 to 8 years
Location: Chennai (Hybrid)
Shift Timing: 2:00 PM to 11:00 PM
Notice Period: 30 days
Experience: 3 to 10 years

Job Summary: We are looking for an experienced Full Stack Engineer with strong expertise in both frontend and backend technologies, along with hands-on experience in DevOps fundamentals. The ideal candidate should have prior experience in the Commercial P&C (Property & Casualty) domain and be able to work in a fast-paced environment.

Key Responsibilities:
- Develop, enhance, and maintain web and mobile applications using modern frameworks and tools.
- Design and implement scalable, high-performance backend services and APIs.
- Collaborate with cross-functional teams to define, design, and deliver innovative solutions.
- Implement best practices for code quality, security, and performance.
- Work with DevOps tools to manage CI/CD pipelines and ensure smooth deployments.
- Optimize applications for maximum speed and scalability.

Technical Skills Required:
- Frontend: React or Angular; JavaScript/TypeScript; Flutter for browser and mobile
- Backend: .NET Core Web API, REST
- Database: SQL Server, PostgreSQL
- DevOps: Basic knowledge of CI/CD pipelines (Azure DevOps, GitHub Actions)
- Domain Expertise: Commercial P&C (Property & Casualty) insurance experience is mandatory.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3–8 years of relevant experience as a Full Stack Engineer.

Additional Details:
- Shift: 2:00 PM – 11:00 PM IST
- Notice Period: Immediate to 30 days

Full-Stack Engineer Lead (.NET & Azure) | Pune, Chennai | 6 - 11 years | INR 25.0 - 40.0 Lacs P.A. | Work from Office | Full Time

Title: Full-Stack Engineer Lead (.NET & Azure)
Location: Chennai / Pune
Employment Type: Full-Time / Contractual
Experience Level: Mid to Senior (6+ Years)

We are looking for a talented and self-driven Full-Stack Engineer with strong expertise in .NET technologies and Microsoft Azure, and a solid background in SaaS product development. This is a hands-on engineering role with opportunities to influence architecture, design, and product strategy.

Responsibilities:
- Design and develop full-stack web applications using ASP.NET Core, C#, and Azure services
- Build intuitive and responsive user interfaces using React or Angular
- Develop and maintain RESTful APIs and microservices for SaaS platforms
- Build serverless or containerized applications
- Implement CI/CD pipelines using Azure DevOps, GitHub Actions, or similar tools
- Optimize performance, scalability, and reliability of SaaS applications in cloud environments [optional]
- Collaborate with product managers, designers, and QA teams to deliver customer-centric features [exposure to Agile delivery practices]
- Participate in agile ceremonies, code reviews, and continuous improvement initiatives

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 4+ years of experience in full-stack development
- Strong proficiency in C#, ASP.NET Core, and Entity Framework
- Hands-on experience with Azure services such as App Services, Functions, Blob Storage, Key Vault, and Azure SQL
- Proven experience in SaaS product development, including multi-tenancy, subscription models, and feature rollout strategies [good to have]
- Solid understanding of JavaScript, TypeScript, and modern frontend frameworks (React or Angular)
- Experience with SQL Server and/or NoSQL databases
- Familiarity with containerization (Docker) and orchestration (Kubernetes, Azure Container Apps)
- Knowledge of DevOps practices, CI/CD pipelines, and infrastructure as code (ARM, Bicep, or Terraform)
- Familiarity with GitHub Copilot or other AI-assisted development tools
- Excellent problem-solving and communication skills

Preferred Qualifications:
- Azure certifications (e.g., AZ-204, AZ-400) - good to have
- Experience with GraphQL, SignalR, or gRPC
- Understanding of GDPR, SOC 2, and other compliance frameworks relevant to SaaS

- Frontend: Build user interfaces using HTML, CSS, JavaScript, and frameworks like React, Angular, or Vue
- Backend: Develop server-side logic using languages like C#, Java, Python, or Node.js
- Databases: Work with SQL (e.g., PostgreSQL, MySQL) or NoSQL (e.g., MongoDB) to store and retrieve data
- DevOps & Cloud: Deploy applications using platforms like Azure, AWS, or Docker; manage CI/CD pipelines
- API Integration: Create and consume RESTful or GraphQL APIs to connect different parts of the system

Workday Functional Consultant - HR Processes | Pune, Chennai, Bengaluru | 5 - 10 years | INR 7.0 - 17.0 Lacs P.A. | Hybrid | Full Time

Job Title: Workday Functional Consultant
Location: Mumbai, Pune, Chennai, Hyderabad, Bangalore, Gurgaon, Kolkata
Notice Period: Immediate joiner to 15 days
Experience: 5 to 10 years

Job Description: We are looking for an experienced Workday Functional Consultant with strong expertise across multiple HR modules. The ideal candidate should have hands-on experience in Workday implementation, HR processes, and a proven track record of delivering successful projects.

Key Responsibilities:
- Lead and support Workday implementation projects.
- Gather requirements, design solutions, perform data conversions, and ensure successful Workday module integration.
- Work closely with stakeholders to understand HR processes and interfaces with other modules.
- Provide expertise in multiple Workday modules, including Core HCM, Payroll, Benefits, Talent Management, Time & Absence, and Compensation.
- Drive solutions in areas such as Talent & Performance Management, Recruiting, Reporting, Prism Analytics, Payroll, Learning, Absence Management, and Time Tracking.
- Ensure alignment of HR processes with organizational goals.
- Provide functional consulting and support across multiple projects.

Required Skills & Experience:
- Minimum 5+ years of experience in Workday Core HCM, Security, and Core & Advanced Compensation.
- Strong experience in HR process consulting and functional solution delivery.
- Hands-on expertise with at least two full Workday implementation projects.
- Ability to design, configure, and optimize Workday modules.
- Strong communication skills with the ability to engage stakeholders effectively.
- Exposure to multiple industries is an added advantage.

Databricks Lead Engineer / Databricks Architect | Hyderabad, Chennai, Bengaluru | 6 - 9 years | INR 16.0 - 22.5 Lacs P.A. | Hybrid | Full Time

Required Skills & Experience:
- 6-9 years of experience in Consulting, Data, and Analytics, with a strong background in Databricks solutions.
- Hands-on experience in designing and implementing big data solutions, including data pipelines for large and complex datasets (a minimal PySpark/Delta sketch follows this listing).
- Expertise in Python, PySpark, and orchestration frameworks such as Airflow, Oozie, Luigi, etc.
- Strong understanding of the Databricks workspace and its components (workspace, compute, clusters, jobs, Unity Catalog, UC permissions).
- Ability to configure, schedule, and manage Databricks clusters and jobs, including performance optimization and monitoring.
- Experience with data modeling, SQL queries, joins, stored procedures, relational schemas, and schema design.
- Proficiency in working with structured, semi-structured, and unstructured data (CSV, Parquet, Delta Lake, Delta Tables, Delta Live Tables).
- Strong knowledge of Spark architecture and Spark Streaming, with experience analyzing jobs using the Spark UI for monitoring and troubleshooting.
- Knowledge of using REST API endpoints for data consumption and integration with external sources.
- Proven experience in ETL / data warehouse transformation processes and building streaming/real-time data processing solutions.
- Experience with cloud-based data platforms such as AWS (preferred), GCP, or Azure, including managing data sources like AWS S3.
- Ability to design and implement end-to-end data ingestion pipelines, ensuring data quality and consistency.
- Experience with CI/CD pipeline configuration in Databricks.
- Strong logical structuring, problem-solving, verbal, written, and presentation skills.
- Ability to work as an individual contributor as well as lead a team in an Agile setup.
- Capable of delivering and presenting Proofs of Concept (POCs) to stakeholders.

Optional / Good to Have:
- Experience in Databricks administration (user access, Spark tuning, DataFrame manipulation, etc.).
- Exposure to real-time data movement solutions with security and encryption protocols.
- Experience in application development or data warehousing across enterprise technologies.
- Life Sciences industry domain knowledge.
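
As a rough sketch of the kind of ingestion pipeline described above (the storage paths and column names are assumptions; Delta Lake support is assumed to be available, as it is on a Databricks runtime):

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # on Databricks a session already exists

    # Hypothetical paths -- replace with your cloud storage locations.
    raw_path = "s3://example-bucket/raw/orders/*.csv"
    delta_path = "s3://example-bucket/curated/orders_delta"

    # Ingest raw CSV, apply a light transformation, and persist as a Delta table.
    orders = (
        spark.read.option("header", True).csv(raw_path)
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .filter(F.col("amount").isNotNull())
    )
    orders.write.format("delta").mode("overwrite").save(delta_path)

In a production setting the same logic would typically be wrapped in a scheduled Databricks job or a Delta Live Tables pipeline, with monitoring through the Spark UI.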