Job Description: RPA Developer (8 Years Experience, Healthcare Domain Preferred)

Position Summary:
We are looking for an experienced RPA Developer with at least 8 years of professional software development experience, a strong focus on Robotic Process Automation (RPA), and, preferably, healthcare domain expertise. The ideal candidate will design, develop, test, and deploy end-to-end automation solutions, particularly using UiPath. Prior exposure to healthcare data and workflows is highly desirable.

Key Responsibilities:
- Design and develop scalable, end-to-end RPA solutions to automate key business processes, with a focus on healthcare workflows and compliance.
- Analyze automation requirements, perform feasibility studies, and suggest process improvements.
- Create, document, and review process design documents (PDDs), solution design documents (SDDs), test cases, and support documentation.
- Manage the full automation lifecycle: requirements gathering, development, testing, deployment, and post-implementation support.
- Develop robust frameworks with effective exception handling, logging, and error recovery processes (see the brief sketch after this posting).
- Integrate RPA bots with external healthcare systems, APIs, and databases, ensuring HIPAA compliance when required.
- Mentor and guide junior RPA developers, participate in code reviews, and foster a collaborative development environment.
- Stay current on RPA trends and best practices; proactively suggest adoption of new tools and techniques.

Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8+ years of overall software development experience; minimum 5+ years in RPA development (UiPath, Blue Prism, or Automation Anywhere).
- Deep understanding of healthcare industry processes (EMR/EHR, claims processing, eligibility, compliance, etc.) is highly preferred.
- Proven ability to manage full RPA deployments, including bots, orchestrators, and system integrations in enterprise environments.
- UiPath Advanced Developer or similar RPA certifications are a plus.
- Solid problem-solving, analytical, and communication skills.

Preferred Skills:
- Integration of RPA with AI/ML technologies or cloud platforms (Azure, AWS, GCP).
- Strong experience with scripting (Python, PowerShell, etc.), SQL, and integration via REST/SOAP APIs.
- Understanding of process improvement methodologies (Lean, Six Sigma).
- Track record of designing reusable components and frameworks, especially in a regulated healthcare environment.
- Strong documentation and stakeholder management abilities.

Soft Skills:
- Strong communication and interpersonal skills for cross-functional collaboration.
- Ability to mentor, lead, and motivate technical teams.
- Exceptional organizational and multitasking skills, especially in fast-paced and dynamic work environments.

Example Technologies Used:
- UiPath
- Azure Storage, SQL Server, mainframe applications
- API integration tools, OCR, Excel automation
- .NET framework, Python

This role offers the opportunity to drive automation for impactful healthcare operations, lead projects, and advance digital transformation within a mission-driven environment.
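To give a feel for the exception handling, logging, and error recovery work mentioned above, here is a minimal Python sketch of a retry-with-logging wrapper around a single bot step. The process, the claim record fields, and the `submit_claim_record` step are hypothetical placeholders; in UiPath itself this pattern would normally be modeled with Try Catch and Retry Scope activities rather than hand-written Python.

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("claims-bot")  # hypothetical bot name

def submit_claim_record(record: dict) -> None:
    """Hypothetical automation step, e.g. keying a claim into a payer portal."""
    ...

def run_with_recovery(record: dict, max_attempts: int = 3, backoff_seconds: float = 5.0) -> bool:
    """Retry one bot step with logging; return True on success, False if every attempt fails."""
    for attempt in range(1, max_attempts + 1):
        try:
            submit_claim_record(record)
            log.info("Record %s processed on attempt %d", record.get("claim_id"), attempt)
            return True
        except Exception:
            log.exception("Attempt %d failed for record %s", attempt, record.get("claim_id"))
            time.sleep(backoff_seconds * attempt)  # simple linear backoff before retrying
    log.error("Record %s routed to the manual exception queue", record.get("claim_id"))
    return False
```

The point of the pattern is that transient failures are retried automatically, every attempt is logged for audit purposes, and records that still fail are routed to a human queue instead of silently dropping out of the process.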
Job Description: Senior Data Engineer (Healthcare Data)

Key Responsibilities:
- Data Pipeline Development: Design, implement, and optimize scalable data pipelines for extracting, transforming, and loading (ETL) healthcare data from various sources (see the sketch after this posting).
- Healthcare Data Integration: Work with sensitive healthcare datasets, ensuring compliance with industry standards such as FHIR, HL7, and Epic Clarity extracts.
- Workflow Orchestration: Use tools like Apache Airflow (or similar) to orchestrate and automate complex healthcare data workflows at scale.
- Cross-Functional Collaboration: Partner closely with data scientists, backend engineers, and stakeholders to integrate and support data systems across the organization.
- Technical Problem Solving: Tackle domain-specific challenges with innovative solutions that enhance the quality, accessibility, and usability of healthcare data to improve patient outcomes.

Minimum Qualifications:
- Bachelor's degree in Computer Science, Data Science, Engineering, or a related discipline.
- At least 4 years of hands-on experience in data engineering roles, with strong proficiency in Python and MongoDB.
- Proven experience building and maintaining robust data pipelines using data from hospital EHR systems.
- Familiarity with data lakes and scalable data architecture principles for storing and managing large datasets.

Preferred Qualifications:
- Experience with Microsoft Azure cloud services and tools for scalable data processing and storage.
- Strong understanding of data validation and quality assurance practices in healthcare data workflows.
- Familiarity with anomaly and outlier detection techniques in high-volume datasets.
- Proficiency in using Apache Spark or similar tools to analyze and query large-scale data.
- Experience developing and maintaining RESTful APIs to facilitate data integration and accessibility.
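As a rough illustration of the ETL orchestration work described above, here is a minimal Apache Airflow sketch (assuming a recent Airflow 2.x release). The DAG name, task names, and the extract/transform/load functions are hypothetical placeholders, not an actual pipeline from this role.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_ehr_records(**context):
    """Hypothetical step: pull new records from an EHR export or staging area."""
    ...

def transform_to_canonical(**context):
    """Hypothetical step: validate records and map them to a canonical (e.g. FHIR-aligned) schema."""
    ...

def load_to_warehouse(**context):
    """Hypothetical step: write the transformed records to the analytics store."""
    ...

with DAG(
    dag_id="ehr_daily_etl",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_ehr_records)
    transform = PythonOperator(task_id="transform", python_callable=transform_to_canonical)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    extract >> transform >> load  # simple linear dependency chain
```

In practice each callable would be replaced by real extraction, validation, and loading logic, and the scheduler handles retries, backfills, and alerting around the same structure.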
We're Hiring! HR Intern Wanted!

Are you passionate about people, culture, and making a difference in the workplace? Looking to kickstart your career in Human Resources? Join our dynamic HR team as an HR Intern and gain hands-on experience in:
- Talent Acquisition & Recruitment
- Employee Engagement & Communication
- HR Operations & Data Management
- Onboarding & Documentation
- Fun HR Projects & More!

Who can apply?
- 2025 MBA graduates (freshers welcome!)
- Specialization in HR or Business Administration
- Excellent communication & interpersonal skills
- Proactive attitude with a willingness to learn
- Basic proficiency in MS Office

Shift Timings: 11:00 AM to 8:00 PM IST
Location: Hyderabad
Duration: 3 Months
Job Description: AI Engineer

Role:
The AI Engineer at Genzeon will develop and deploy artificial intelligence solutions with a focus on healthcare applications. You'll work with cross-functional teams to create AI solutions that improve clinical outcomes and operational efficiency.

Key Responsibilities:
- Create and implement AI solutions to automate complex workflows
- Clean and preprocess data for AI model training and validation (see the brief sketch after this posting)
- Deploy models to production environments and monitor their performance
- Collaborate with team members to brainstorm innovative solutions
- Document models, processes, and research findings
- Support business users and clients in resolving technical queries
- Participate in proof-of-concept projects for potential clients

Requirements:
- 3-6 months of Python programming experience
- Knowledge of machine learning libraries
- Familiarity with AI orchestration frameworks
- Experience building AI agents and agentic systems
- Understanding of core machine learning concepts and algorithms
- Experience with SQL databases and data manipulation techniques
- Strong analytical and problem-solving skills
- Effective communication and teamwork abilities
- Interest in healthcare industry applications of AI

Preferred Qualifications:
- Experience with cloud platforms (AWS, Google Cloud, Azure)
- Knowledge of data visualization tools
- Understanding of DevOps practices for model deployment
- Experience with version control systems (Git)

Mode of Interview:
1. Assessment Test
2. Technical Interviews (2 rounds, face to face)
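As a rough illustration of the data preparation and model training/validation work listed above, here is a minimal scikit-learn sketch. The file name, target column, assumption of numeric features, and the choice of a logistic regression baseline are all hypothetical and purely illustrative.

```python
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical dataset: numeric feature columns plus a binary outcome column.
df = pd.read_csv("encounters.csv")           # placeholder file name
X = df.drop(columns=["readmitted"])          # placeholder target column
y = df["readmitted"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Basic preprocessing (impute missing values, scale) chained with a simple baseline model.
model = Pipeline([
    ("impute", SimpleImputer(strategy="median")),
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])

model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

The same split-preprocess-train-evaluate structure carries over when the baseline is swapped for a more capable model or wrapped in an agentic workflow.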
Key Responsibilities:
- Design and develop Snowflake pipelines, data models, and transformations
- Provide L2/L3 production support for Snowflake jobs, queries, and integrations
- Troubleshoot failed jobs, resolve incidents, and conduct root cause analysis (RCA); one common troubleshooting query is sketched after this posting
- Tune queries, monitor warehouses, and help optimize Snowflake usage and cost
- Handle service requests such as user provisioning, access changes, and role management
- Participate in code reviews, deployment pipelines, and continuous improvement
- Document issues, enhancements, and standard procedures (runbooks)

Required Skills & Experience:
- 4+ years of hands-on experience in Snowflake development and support
- Strong SQL, data modeling, and performance tuning experience
- Exposure to CI/CD pipelines and scripting languages (e.g., Python, Shell)
- Understanding of Snowflake security (RBAC), warehouse sizing, and cost controls
- Experience with data pipelines and orchestration tools (Airflow, dbt, ADF)

Preferred:
- SnowPro Core Certification
- Experience with ticketing systems (ServiceNow, Jira)
- Cloud experience with AWS, Azure, or GCP
- Basic understanding of ITIL processes
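As a rough illustration of the L2/L3 troubleshooting work described above, here is a minimal Python sketch using the snowflake-connector-python package to list recently failed queries from ACCOUNT_USAGE.QUERY_HISTORY. The warehouse and role names are placeholders, and in practice credentials would come from a secrets manager rather than plain environment variables.

```python
import os

import snowflake.connector

# Connection parameters are placeholders; the role must have access to ACCOUNT_USAGE views.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="SUPPORT_WH",   # hypothetical warehouse
    role="SUPPORT_ROLE",      # hypothetical role
)

FAILED_QUERIES_SQL = """
    SELECT query_id, user_name, error_code, error_message, start_time
    FROM snowflake.account_usage.query_history
    WHERE execution_status = 'FAIL'
      AND start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
    ORDER BY start_time DESC
    LIMIT 50
"""

try:
    with conn.cursor() as cur:
        for row in cur.execute(FAILED_QUERIES_SQL):
            # Each row is a tuple in the column order selected above.
            print(row)
finally:
    conn.close()
```

A query like this is typically the first step of an RCA: it surfaces the error codes and messages for recent failures so incidents can be matched to the offending jobs and documented in the runbook.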
Key Responsibilities:
- Lead the design and implementation of cloud-native microservices using Python FastAPI, Pydantic, and Async I/O (a brief sketch follows after this posting).
- Architect, build, and optimize APIs, worker services, and event-driven systems leveraging Confluent Kafka.
- Define and enforce coding standards, testing strategies, and development best practices.
- Implement CI/CD pipelines using GitHub Actions or other tools and manage secure and scalable deployments.
- Work with Docker, Terraform, and GCP infrastructure services including Cloud Run, Pub/Sub, Secret Manager, Artifact Registry, and Eventarc.
- Guide the integration of monitoring and observability tools such as New Relic, Cloud Logging, and Cloud Monitoring.
- Drive initiatives around performance tuning, caching (Redis), and data transformation, including XSLT and XML/XSD processing.
- Support version control and code collaboration using Git/GitHub.
- Mentor team members, conduct code reviews, and ensure quality through unit testing frameworks like Pytest or unittest.
- Collaborate with stakeholders to translate business needs into scalable and maintainable solutions.

Mandatory Skills:
Programming & Frameworks:
- Expert in Python and experienced with FastAPI or equivalent web frameworks
- Strong knowledge of Async I/O and Pydantic Settings
- Hands-on with Pytest or unittest
- Experience with Docker, Terraform, and Kafka (Confluent)
Version Control & DevOps:
- Experience with any version control system
- Proven CI/CD pipeline implementation experience
Cloud & Infrastructure:
- Hands-on experience with any cloud provider
Data Processing:
- Knowledge of XSLT transformations and XML/XSD processing
Monitoring & Observability:
- Familiarity with integrating monitoring/logging solutions; New Relic preferred
Databases & Storage:
- Experience with any database/storage solution
- Understanding of caching mechanisms

Good to Have Skills:
Development & Testing:
- PyPI package creation
- Tox
- Ruff (Python linting)
- Basic Bash scripting
Data Formats for Kafka:
- JSON, Avro, Protobuf
Version Control & CI/CD:
- Proficient with GitHub and GitHub Actions
GCP Experience:
- Cloud Storage, Eventarc, Cloud Run, GKE Autopilot
- Secret Manager, Artifact Registry, Workload Identity Federation
- Pub/Sub, Cloud Logging, Cloud Monitoring
- IAM, VPC networking
Databases & Storage:
- Familiarity with PostgreSQL, MS SQL Server, IBM AS/400, Redis

Soft Skills:
- Strong leadership and mentoring skills
- Excellent communication and cross-functional collaboration
- Analytical mindset with problem-solving capabilities
- Drive for continuous learning and improvement
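As a rough illustration of the FastAPI, Pydantic, and Async I/O style of service described above (assuming Pydantic v2 with the separate pydantic-settings package), here is a minimal sketch. The resource name, model fields, and settings values are hypothetical placeholders rather than anything specific to this role.

```python
from fastapi import FastAPI
from pydantic import BaseModel
from pydantic_settings import BaseSettings

class Settings(BaseSettings):
    """Environment-driven configuration via Pydantic Settings (default values are placeholders)."""
    service_name: str = "orders-service"
    kafka_bootstrap_servers: str = "localhost:9092"

class OrderIn(BaseModel):
    """Hypothetical request payload, validated automatically by Pydantic."""
    sku: str
    quantity: int

settings = Settings()
app = FastAPI(title=settings.service_name)

@app.post("/orders")
async def create_order(order: OrderIn) -> dict:
    # In a real service this handler would publish an event to Kafka and/or write to a
    # database using async clients, so the event loop is never blocked on I/O.
    return {"status": "accepted", "sku": order.sku, "quantity": order.quantity}

@app.get("/healthz")
async def healthcheck() -> dict:
    return {"status": "ok"}
```

A service in this shape would typically be run under an ASGI server such as Uvicorn, containerized with Docker, and deployed to Cloud Run, with tests written against the endpoints using Pytest.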