
HYRGPT

13 Job openings at HYRGPT
Regional Sales Manager Mumbai, Maharashtra, India 0 years Not disclosed On-site Full Time

Who we’re looking for: A sales leader with a solid network among doctors in Mumbai, passionate about eldercare, and proven in driving revenue and partnerships in the healthcare/homecare sectors.

Key Responsibilities: Own P&L and revenue growth for the region. Lead and mentor a team of BDMs and Sr. Sales Managers. Drive doctor/hospital partnerships for lead generation and conversion. Develop and execute territory-specific sales strategies. Analyze sales data and optimize performance using MIS.

You’ll excel if you have: 10+ years in field sales, 7+ in leadership with P&L ownership. Deep experience in healthcare sales or partnerships. A strong Mumbai-based network of doctors and hospitals. MBA (preferred).

Location: Mumbai

Sales Development Representative New Delhi, Delhi, India 1 - 3 years Not disclosed On-site Full Time

Join Our Team as a Sales Development Representative (SDR) - UK and Europe Market. Are you a skilled sales professional with a proven track record in the SaaS industry?

Job Description: As an SDR focused on the UK market, you will be instrumental in driving revenue growth through proactive outreach: generating and qualifying leads, nurturing relationships, and building a robust sales pipeline. The current role is for the AI product Zoi AI.

Key Responsibilities: Conduct outbound prospecting to identify and qualify potential customers in overseas markets (UK preferred). Engage prospects via email, social media (LinkedIn), phone, and other outbound campaigns to understand their needs and present our SaaS solutions effectively. Write compelling, personalised emails and LinkedIn messages to engage and nurture leads. Conduct product demos for prospects and clearly articulate value propositions. Collaborate closely with the sales, marketing, and IT teams to understand the solution proposition, execute targeted campaigns, and optimize lead engagement strategies. Build and maintain a consistent pipeline of qualified opportunities. Use CRM tools to manage lead interactions, track progress, and report on sales activities. Achieve and exceed quarterly and annual sales targets.

Qualifications: 1-3 years of experience in sales development or inside sales within the UK market, preferably in the SaaS industry. Full-time graduation in any stream is mandatory. Proven ability to generate leads and convert them into qualified opportunities. Strong communication and interpersonal skills for product demonstrations, with the ability to articulate value propositions clearly. Self-motivated, goal-oriented, and capable of working independently. Proficiency with CRM software (e.g., HubSpot) and sales prospecting tools.

Artificial Intelligence Engineer - LLM Models Delhi, Delhi, India 0 years Not disclosed On-site Full Time

Role: Artificial Intelligence Engineer. Experience: 2-5 years. Location: Delhi.

Skills: We're looking for sharp AI developers with hands-on experience in AI development, ideally with a strong foundation in Python (7-8/10 skill level) and practical exposure to AI/ML concepts and OpenAI agent development. Candidates should possess strong logical thinking, prompt engineering skills, and clear communication abilities. This role offers the opportunity to work on real-world AI agent applications in a fast-growing SaaS platform focused on sales and productivity tools for consulting and financial services. Full-time. B.Tech/BE in Computer Science or a related field.

What You'll Do: Build intelligent AI agents using OpenAI and LLMs. Design APIs and backend logic to integrate AI into SaaS workflows. Create AI-driven features in Python (Django/Flask). Work on CRM and third-party tool integrations. Continuously improve agent capabilities using prompt engineering and user feedback.

You Should Have: Strong Python coding skills (7-8/10 level). Hands-on experience with OpenAI APIs and LLMs. Understanding of AI/ML models and architectures. Prompt-writing capability with logical clarity. Familiarity with REST APIs, GitHub, and agile workflows. Bonus: exposure to React, Mongo/PostgreSQL, GraphQL, or CRM systems. (ref:hirist.tech)
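
For illustration only, a minimal sketch of the kind of OpenAI-based agent code this role describes; the model name, prompt, and lead-scoring task are assumptions, not details from the posting.

    # Hypothetical example of an OpenAI-backed helper for a sales/productivity SaaS.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def qualify_lead(notes: str) -> str:
        """Ask the model to summarise and score a sales lead from free-text notes."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model; any chat-capable model works
            messages=[
                {"role": "system",
                 "content": "You are a sales assistant. Summarise the lead and score it 1-10."},
                {"role": "user", "content": notes},
            ],
        )
        return response.choices[0].message.content

    if __name__ == "__main__":
        print(qualify_lead("CFO of a mid-size consultancy, asked for pricing twice."))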

InterSystems IRIS Developer - FHIR/Caché ObjectScript Greater Kolkata Area 3 - 5 years Not disclosed On-site Full Time

Job Description: We are seeking talented and motivated InterSystems IRIS developers to join our growing team at our USI locations. The ideal candidates will have strong hands-on experience with the InterSystems IRIS Data Platform and be proficient in designing, developing, and optimizing healthcare or enterprise-grade solutions.

Key Responsibilities: Design and develop applications using InterSystems IRIS, ObjectScript, and SQL. Build and integrate RESTful and SOAP-based web services. Work with FHIR standards for healthcare data integration and interoperability. Conduct data modeling, schema design, and performance tuning. Deploy and manage IRIS environments on Linux/Windows platforms. Collaborate with cross-functional teams to deliver high-quality solutions. Provide technical guidance and troubleshooting support.

Required Skills: 3-5 years of hands-on experience with the InterSystems IRIS Data Platform. Strong knowledge of ObjectScript and SQL. Proficiency in building and integrating REST/SOAP APIs. FHIR implementation experience is a must. Experience in data modeling and performance tuning. Familiarity with deploying solutions on Linux/Windows.

Nice-to-Have Skills: Working knowledge of Python, Java, or .NET. Experience with Docker containerization. (ref:hirist.tech)
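
For illustration, a minimal sketch of the FHIR interoperability work named above, shown in Python over plain REST rather than ObjectScript; the IRIS endpoint URL and patient ID are placeholders, not details from the posting.

    # Query a Patient resource from a hypothetical IRIS-hosted FHIR R4 endpoint.
    import requests

    FHIR_BASE = "http://iris-host:52773/csp/healthshare/fhirserver/fhir/r4"  # assumed URL

    def get_patient(patient_id: str) -> dict:
        """Fetch a single FHIR R4 Patient resource as JSON."""
        resp = requests.get(
            f"{FHIR_BASE}/Patient/{patient_id}",
            headers={"Accept": "application/fhir+json"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    patient = get_patient("123")  # placeholder ID
    print(patient.get("name", []))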

Data Engineer - Snowflake DB Chennai, Tamil Nadu, India 3 - 5 years Not disclosed On-site Full Time

Job: Snowflake Data Engineer. Experience: 3-5 years. Location: Hyderabad, Bangalore, Mumbai, Gurgaon, Pune, Chennai, Kolkata. Employment Type: Full-Time.

Job Summary: We are seeking a skilled Snowflake Data Engineer with 3-5 years of hands-on experience in designing, developing, and optimizing cloud-based data warehousing solutions. This position offers a compelling opportunity to work on a flagship data initiative for one of our premier Big 4 consulting clients, providing ample scope for technical innovation, learning, and career growth.

Key Responsibilities:
Design & Develop High-Performance Data Pipelines: Build scalable and efficient Snowflake pipelines for data ingestion, transformation, and storage, with special focus on external tables, semi-structured data handling, and transformation logic.
Optimize Snowflake Workloads: Ensure optimal query execution and cost-effective utilization of compute and storage resources. Tune performance across large-scale datasets and implement workload management strategies.
ETL/ELT Development: Develop robust ETL processes using SQL, Python, and orchestration tools like DBT, Apache Airflow, Matillion, or Talend, with a focus on automation, data transformation, and pipeline reliability.
Integrate with AWS Glue: Utilize AWS Glue capabilities such as crawlers, jobs, and external tables for seamless integration with Snowflake. Ensure consistent and automated data ingestion and cataloging.
Data Governance & Security: Enforce data governance, role-based access control, and compliance protocols within Snowflake. Ensure secure handling of sensitive data and privacy adherence.
Handle Diverse Data Formats: Ingest and transform structured and semi-structured formats such as JSON, Parquet, Avro, and XML, enabling flexibility in data consumption across reporting and analytics.
Data Modeling: Design dimensional models, including fact and dimension tables, optimized for Snowflake architecture to enable efficient querying and integration with BI tools.
Cross-Functional Collaboration: Partner with business stakeholders, data analysts, and BI developers to align technical solutions with business needs. Translate business requirements into scalable data solutions.
Pipeline Monitoring & Issue Resolution: Monitor end-to-end data workflows, ensure system reliability, and proactively troubleshoot failures and performance bottlenecks.

Key Skills & Qualifications: Hands-on experience with Snowflake development and architecture. Proficiency in SQL, Python, and cloud-native ETL/ELT tools. Experience with AWS Glue, S3, and Snowflake integration. Strong knowledge of data modeling, performance tuning, and cost optimization. Familiarity with handling semi-structured data. Good understanding of data governance, access control, and security best practices. Excellent problem-solving and communication skills.

Nice To Have: Experience working with Big 4 consulting clients or large enterprise environments. Exposure to DevOps practices, CI/CD pipelines, and data quality frameworks. (ref:hirist.tech)
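
For illustration, a hedged sketch of the external-table and transformation work described above, using the snowflake-connector-python client; the account, stage, and table names are invented for the example.

    # Define an external table over staged JSON files, then materialise a typed table.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="...",  # placeholders
        warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
    )
    cur = conn.cursor()

    # External table over JSON files in an (assumed) existing external stage.
    cur.execute("""
        CREATE EXTERNAL TABLE IF NOT EXISTS raw_events
          WITH LOCATION = @events_stage
          FILE_FORMAT = (TYPE = JSON)
    """)

    # Transform the semi-structured VALUE column into typed, queryable columns.
    cur.execute("""
        CREATE OR REPLACE TABLE events AS
        SELECT value:event_id::STRING  AS event_id,
               value:ts::TIMESTAMP_NTZ AS event_ts,
               value:payload           AS payload
        FROM raw_events
    """)
    conn.close()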

AWS Data Engineer - ETL/Python Greater Kolkata Area 0 years Not disclosed On-site Full Time

AWS Data Engineer

Responsibilities: Design and optimize scalable, cloud-native data pipelines on AWS. Build and manage ETL workflows using SQL, Python, and modern orchestration tools. Work with both structured and semi-structured data formats, including JSON, Parquet, and Avro. Collaborate with BI and analytics teams to deliver data models that power dashboards and reporting tools. Ensure pipeline performance, cost-efficiency, and data security within a governed AWS environment. Support query performance tuning, data transformations, and automation. Participate in real-time data streaming and event-driven architectures using AWS-native services.

What You'll Use: Strong knowledge of AWS Glue, Athena, Lambda, and Step Functions. Knowledge of Redshift, S3, EC2, EMR, and Kinesis. Hands-on experience in SQL (query tuning and scripting). Hands-on experience in Python, DBT, and Airflow.

Add-ons: CI/CD for data pipelines using CodePipeline/CodeBuild. Experience with real-time/streaming architectures. Strong problem-solving and cloud architecture skills. (ref:hirist.tech)
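
For illustration, a small boto3 sketch of the kind of Glue orchestration this role involves; the job name and region are assumptions, not details from the posting.

    # Start an AWS Glue job and poll it until it reaches a terminal state.
    import time
    import boto3

    glue = boto3.client("glue", region_name="ap-south-1")  # assumed region

    run = glue.start_job_run(JobName="daily-ingest")       # hypothetical Glue job
    run_id = run["JobRunId"]

    while True:
        state = glue.get_job_run(JobName="daily-ingest", RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
            print("Glue job finished:", state)
            break
        time.sleep(30)  # avoid hammering the API while the job runs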

Oracle Data Integrator Lead - ETL Pipeline Greater Kolkata Area 6 years Not disclosed On-site Full Time

Oracle Data Integrator (ODI) ETL Lead

Key Responsibilities: Lead the design, development, and deployment of ETL workflows using Oracle Data Integrator (ODI). Build and optimize complex ODI mappings, load plans, and custom Knowledge Modules (KMs). Contribute to multiple full-cycle data warehouse implementations, ensuring timely and quality delivery. Implement advanced ETL solutions including delta extracts, parameterization, data auditing, and error handling. Manage ODI repositories, agents, and topologies; monitor and optimize ETL performance. Perform data migrations leveraging SQL*Loader and ODI export/import utilities. Ensure ETL pipelines are automated, scalable, and reliable to support enterprise data needs. Collaborate with cross-functional teams to integrate ODI with diverse source and target systems.

Qualifications & Skills: 6+ years of ETL development experience, with 3-4 years focused on Oracle Data Integrator (ODI). Strong expertise in Oracle PL/SQL, ETL design principles, and ODI component management. Hands-on experience with ODI Master and Work repositories, custom Knowledge Module development, and performance tuning. Proven ability to integrate ODI across various data sources and target systems. Excellent problem-solving skills and ability to troubleshoot complex ETL workflows. Strong communication and collaboration skills.

Preferred Skills: Exposure to cloud-based data platforms (e.g., AWS, Azure, GCP). Familiarity with DevOps practices applied to ETL development. Experience with real-time data processing and streaming architectures. (ref:hirist.tech)
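
For illustration, the "delta extract with parameterization" pattern named above, sketched with python-oracledb rather than ODI itself (in ODI this logic would live in a mapping or Knowledge Module); the connection details, watermark table, and column names are all placeholders.

    # Incremental (delta) extract driven by a stored high-water mark.
    import oracledb

    conn = oracledb.connect(user="etl", password="...", dsn="dbhost/ORCLPDB1")
    cur = conn.cursor()

    # Read the watermark recorded by the previous load (hypothetical table).
    cur.execute("SELECT MAX(loaded_until) FROM etl_watermarks WHERE job = :job",
                job="orders_delta")
    watermark = cur.fetchone()[0]

    # Pull only rows changed since the watermark -- the delta.
    cur.execute(
        "SELECT order_id, status, updated_at FROM orders WHERE updated_at > :wm",
        wm=watermark,
    )
    for row in cur.fetchmany(1000):
        print(row)  # in ODI, these rows would feed a mapping into the target table
    conn.close()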

Snowflake Engineer - ETL/AWS Glue Greater Kolkata Area 5 years Not disclosed On-site Full Time

Role: Snowflake Engineer. Experience: 5-10 years. Location: Anywhere in India.

Responsibilities: Design and develop scalable, high-performance data pipelines in Snowflake. Lead ETL/ELT development using tools such as SQL, Python, DBT, Airflow, Matillion, or Talend. Migrate complex T-SQL logic and stored procedures from SQL Server to Snowflake-compatible SQL or ELT workflows. Integrate AWS Glue to automate and orchestrate data workflows. Work with structured and semi-structured data formats (e.g., JSON, Parquet, Avro, XML). Optimize Snowflake performance and cost through effective query tuning and warehouse resource management. Design data models that support business intelligence and analytics use cases. Ensure high standards of data quality, validation, and consistency during migration and transformation processes. Enforce data governance, security, and access control policies to ensure compliance with organizational standards. Collaborate with data architects, business stakeholders, and analytics teams to understand requirements and deliver data solutions. Maintain up-to-date technical documentation, including data flows, mapping specifications, and operational procedures.

Skills & Qualifications: 5+ years of experience in data engineering or a similar role. Hands-on experience with Snowflake and cloud-based data platforms (AWS preferred). Strong expertise in SQL and at least one scripting language (preferably Python). Experience with ETL/ELT tools like DBT, Airflow, Matillion, or Talend. Familiarity with AWS Glue and other cloud-native data services. Proven ability to work with semi-structured data. Solid understanding of data modeling, data warehousing concepts, and BI tools. Strong focus on performance tuning, data validation, and data quality. Excellent communication and documentation skills. (ref:hirist.tech)
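
For illustration, a hedged sketch of the semi-structured-data work mentioned above: flattening a JSON VARIANT column in Snowflake with LATERAL FLATTEN. Table and column names are invented; connection parameters are placeholders.

    # Explode an array of line items nested inside a VARIANT payload column.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="...",  # placeholders
        warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
    )
    cur = conn.cursor()
    cur.execute("""
        SELECT r.order_id,
               o.value:sku::STRING AS sku,
               o.value:qty::NUMBER AS qty
        FROM raw_orders r,
             LATERAL FLATTEN(input => r.payload:line_items) o
    """)
    for order_id, sku, qty in cur.fetchall():
        print(order_id, sku, qty)
    conn.close()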

Technical Lead - Frontend Architecture Chennai, Tamil Nadu, India 6 - 8 years Not disclosed On-site Full Time

Job Description: Collaborate with the internal team during the design, implementation, and testing of enterprise software products. Work with different stakeholders to understand customer requirements and develop solutions that address them. Be involved in UI design and development, unit tests, and integration tests. Should have used versioning tools and have a clear understanding of the product life cycle. Understanding of Agile methodology is a plus.

Qualification: 6-8 years of experience in software development/support of software products. Computer Science or equivalent engineering degree (BE/MTech/MCA) or B.Sc from a reputed university/college with good academic records. Should have experience building front-end framework applications using Angular 7 and Angular 10 & above. Strong understanding of HTML5, CSS3, and JavaScript/TypeScript. Proficient in responsive and mobile-first design principles. Familiarity with RESTful APIs and asynchronous request handling. Experience with version control systems (e.g., Git) and CI/CD pipelines. Knowledge of browser developer tools for debugging and performance optimization. Should possess strong analytical skills and be able to quickly adapt to a changing environment. Excellent debugging and troubleshooting skills on web interfaces and services. Ability to quickly learn new concepts and software is necessary.

Desired Skill Set: Primary Skill: Angular 7 (mandatory) and Angular 10 and above, HTML 5.0, CSS 3.0. Secondary Skill: SQL Server, cloud deployment, micro frontends. (ref:hirist.tech)

Big Data Specialist Chennai, Tamil Nadu, India 7 - 9 years Not disclosed On-site Full Time

Job Title: Senior Cloud Systems Engineer (Big Data)
Technology: Google/AWS/Azure public cloud platforms; BigQuery/Airflow/Dataflow/Dataproc/PySpark; Terraform, Ansible, Jenkins, Linux, and Git.
Location: Chennai/Hyderabad, India
Experience: 7 to 9 years overall

Job Summary: We are seeking a Senior Big Data Systems Engineer to lead a high-performing team of cloud and data engineers for a large Big Data upstream environment hosted on-premises and in the cloud. The candidate should possess in-depth knowledge of Red Hat Linux on Google Cloud.

Job Description (skills, roles, and responsibilities across Google/AWS/Azure public cloud, PySpark, BigQuery, and Google Airflow):
Participate in 24x7x365 SAP environment rotational shift support and operations.
As a team lead, maintain the upstream Big Data environment, through which millions of financial records flow daily, consisting of PySpark, BigQuery, Dataproc, and Google Airflow.
Streamline and tune existing Big Data systems and pipelines and build new ones; making sure the systems run efficiently and at minimal cost is a top priority.
Manage the operations team in your respective shift; you will be making changes to the underlying systems.
Provide day-to-day support, enhance platform functionality through DevOps practices, and collaborate with application development teams to optimize database operations.
Architect and optimize data warehouse solutions using BigQuery to ensure efficient data storage and retrieval.
Install, build, patch, upgrade, and configure Big Data applications.
Manage and configure BigQuery environments, datasets, and tables. Ensure data integrity, accessibility, and security on the BigQuery platform.
Implement and manage partitioning and clustering for efficient data querying (see the sketch after this list).
Define and enforce access policies for BigQuery datasets. Implement query usage caps and alerts to avoid unexpected expenses.
Be very comfortable troubleshooting issues and failures on Linux-based systems, with a good grasp of the Linux command line.
Create and maintain dashboards and reports to track key metrics such as cost and performance.
Integrate BigQuery with other Google Cloud Platform (GCP) services such as Dataflow, Pub/Sub, and Cloud Storage.
Enable BigQuery access through tools like Jupyter Notebook, Visual Studio Code, and other CLIs.
Implement data quality checks and data validation processes to ensure data integrity.
Manage and monitor data pipelines using Airflow and CI/CD tools (e.g., Jenkins, Screwdriver) for automation.
Collaborate with data analysts and data scientists to understand data requirements and translate them into technical solutions.
Provide consultation and support to application development teams for database design, implementation, and monitoring.
Proficiency in Unix/Linux OS fundamentals, shell/Perl/Python scripting, and Ansible for automation.
Disaster recovery and high availability: expertise in planning and coordinating disaster recovery, including backup/restore operations; experience with geo-redundant databases and Red Hat clusters.
Accountable for ensuring that delivery stays within the defined SLA and agreed milestones (projects) by following best practices and processes for continuous service improvement.
Work closely with other support organizations (DB, Google, PySpark data engineering, and infrastructure teams).
Follow Incident Management, Change Management, Release Management, and Problem Management processes based on the ITIL framework.
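
For illustration, a short google-cloud-bigquery sketch of the partitioning-and-clustering duty listed above; the project, dataset, table, and schema are placeholders, not details from the posting.

    # Create a date-partitioned, clustered BigQuery table for cost-efficient queries.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials
    table_id = "my-project.finance.transactions"  # hypothetical table

    schema = [
        bigquery.SchemaField("txn_id", "STRING"),
        bigquery.SchemaField("account", "STRING"),
        bigquery.SchemaField("amount", "NUMERIC"),
        bigquery.SchemaField("event_date", "DATE"),
    ]

    table = bigquery.Table(table_id, schema=schema)
    # Partition by date so queries scan only the days they need (cost control)...
    table.time_partitioning = bigquery.TimePartitioning(field="event_date")
    # ...and cluster by account so per-account queries prune blocks efficiently.
    table.clustering_fields = ["account"]

    client.create_table(table, exists_ok=True)
    print("Created", table_id)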

Oracle Retail Consultant Karnataka 4 - 8 years Not disclosed On-site Full Time

You are a skilled Oracle Retail Technical Consultant with 3-8 years of experience, seeking to join a dynamic team and contribute to exciting integration projects. Your primary responsibility will be to work on various integration projects using your expertise in Oracle Retail Technical Integration and Data Flow Architecture. Your key skills will include a strong understanding of Oracle Retail Technical Integration and Data Flow Architecture; proficiency in integration patterns such as file-based, RFI, and API-based; experience integrating MFCS with boundary applications through middleware or point-to-point connections; advanced PL/SQL skills; and APEX knowledge (preferable). Additionally, you should have a good grasp of Oracle Retail modules and functionalities. If you are enthusiastic about Oracle Retail technology and enjoy taking on challenging projects, we are eager to have you join our team. Apply now or reach out for more details.

Data Engineer - Snowflake DB Chennai, Tamil Nadu 3 - 7 years Not disclosed On-site Full Time

As a Snowflake Data Engineer with 3-5 years of experience, you will be responsible for designing, developing, and optimizing cloud-based data warehousing solutions. This is an exciting opportunity to work on a flagship data initiative for a premier Big 4 consulting client, offering ample scope for technical innovation, learning, and career growth.

Your key responsibilities will include:
- Designing and developing high-performance data pipelines in Snowflake for data ingestion, transformation, and storage, with a focus on external tables, semi-structured data handling, and transformation logic.
- Optimizing Snowflake workloads to ensure optimal query execution and cost-effective utilization of compute and storage resources, tuning performance across large-scale datasets and implementing workload management strategies.
- Developing robust ETL processes using SQL, Python, and orchestration tools like DBT, Apache Airflow, Matillion, or Talend, with automation, data transformation, and pipeline reliability as your focus.
- Integrating with AWS Glue by utilizing capabilities such as crawlers, jobs, and external tables for seamless integration with Snowflake, ensuring consistent and automated data ingestion and cataloging.
- Enforcing data governance, role-based access control, and compliance protocols within Snowflake to ensure secure handling of sensitive data and privacy adherence.
- Handling diverse data formats, including structured and semi-structured formats like JSON, Parquet, Avro, and XML, to enable flexibility in data consumption across reporting and analytics.
- Designing dimensional models optimized for Snowflake architecture, including fact and dimension tables, to enable efficient querying and integration with BI tools.
- Collaborating with business stakeholders, data analysts, and BI developers to translate business requirements into scalable data solutions.
- Monitoring end-to-end data workflows, ensuring system reliability, and proactively troubleshooting failures and performance bottlenecks.

Key Skills & Qualifications:
- Hands-on experience with Snowflake development and architecture.
- Proficiency in SQL, Python, and cloud-native ETL/ELT tools.
- Experience with AWS Glue, S3, and Snowflake integration.
- Strong knowledge of data modeling, performance tuning, and cost optimization.
- Familiarity with handling semi-structured data.
- Good understanding of data governance, access control, and security best practices.
- Excellent problem-solving and communication skills.

Nice To Have:
- Experience working with Big 4 consulting clients or large enterprise environments.
- Exposure to DevOps practices, CI/CD pipelines, and data quality frameworks.

If you are looking to leverage your expertise in Snowflake and cloud-based data warehousing to drive technical innovation and deliver scalable solutions, this role offers an exciting opportunity to grow your career and make a significant impact.

Sr BA - Lead Bengaluru, Karnataka, India 5 years Not disclosed On-site Full Time

We’re Hiring: Production Support Sr. Business Analyst / Lead
Location: [Pune / Chennai] | Experience: 5+ years | Industry: Insurance

About the Role: We are looking for a Production Support Sr. Business Analyst / Lead to join our team and ensure smooth operations of our insurance policy administration system (ORIGAMI). This role involves incident management, system troubleshooting, and driving enhancements to improve functionality and user experience. If you have strong insurance domain expertise, love solving complex problems, and enjoy working at the intersection of business and technology, this role is for you!

What You’ll Do:
✅ Incident & Problem Management – Triage issues, analyze logs, identify root causes, and implement solutions to restore services quickly.
✅ Enhancements & Change Requests – Gather business requirements, work with configuration teams, and ensure successful delivery of system changes.
✅ Testing & UAT Support – Validate fixes and enhancements through functional testing and coordinate UAT activities with business teams.

What We’re Looking For:
✔ 5+ years of Business Analyst experience in the insurance industry
✔ Solid knowledge of commercial insurance processes and policy lifecycle development
✔ Prior experience with PAS systems and rating programs
✔ Strong analytical skills and ability to recommend technical solutions
✔ Excellent communication & stakeholder management skills
✔ Hands-on experience with DevOps

What Sets You Apart:
✅ Proactive & self-driven with a collaborative mindset
✅ Problem-solving attitude and ability to work independently

If you’re passionate about delivering value, ensuring system stability, and enhancing user experience, we’d love to hear from you!
📩 Apply now or reach out via DM for more details.