Role : We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data.
About the Role : The candidate will be responsible for leading data modeling initiatives and ensuring compliance with healthcare regulations while collaborating with various stakeholders to translate business requirements into technical solutions.
Responsibilities :
Data Architecture & Modeling : Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management. Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment). Create and maintain data lineage documentation and data dictionaries for healthcare datasets. Establish data modeling standards and best practices across the organization.
Technical Leadership : Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica. Architect scalable data solutions that handle large volumes of healthcare transactional data. Collaborate with data engineers to optimize data pipelines and ensure data quality.
Healthcare Domain Expertise : Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI). Design data models that support analytical, reporting, and AI/ML needs. Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations. Partner with business stakeholders to translate healthcare business requirements into technical data solutions.
Data Governance & Quality : Implement data governance frameworks specific to healthcare data privacy and security requirements. Establish data quality monitoring and validation processes for critical health plan metrics. Lead efforts to standardize healthcare data definitions across multiple systems and data sources.
Required Qualifications & Skills : 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data. Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches. Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing. Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks). Proficiency with data modeling tools (Hackolade, ERwin, or similar).
Healthcare Industry Knowledge : Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data. Experience with healthcare data standards and medical coding systems. Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment). Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI).
Leadership & Communication : Proven track record of leading data modeling projects in complex healthcare environments. Strong analytical and problem-solving skills with the ability to work with ambiguous requirements. Excellent communication skills with the ability to explain technical concepts to business stakeholders. Experience mentoring team members and establishing technical standards.
Preferred Qualifications : Experience with Medicare Advantage, Medicaid, or Commercial health plan operations. Cloud platform certifications (AWS, Azure, or GCP). Experience with real-time data streaming and modern data lake architectures. Knowledge of machine learning applications in healthcare analytics. Previous experience in a lead or architect role within healthcare organizations. (ref:hirist.tech)
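The dimensional-modeling skills this role calls for are easiest to picture with a toy star schema. The sketch below is illustrative only: the table and column names are hypothetical, and stdlib sqlite3 stands in for an enterprise warehouse such as Databricks or Exadata. It shows one claims fact table joined to a member dimension, queried the way a regulatory report might aggregate it.

```python
import sqlite3

# Minimal star-schema sketch: one fact table (claims) joined to one
# dimension (members). All names are illustrative, not a real model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_member (
    member_key INTEGER PRIMARY KEY,
    member_id  TEXT,
    plan_type  TEXT          -- e.g. HMO, PPO
);
CREATE TABLE fact_claim (
    claim_key   INTEGER PRIMARY KEY,
    member_key  INTEGER REFERENCES dim_member(member_key),
    icd10_code  TEXT,        -- diagnosis code on the claim line
    paid_amount REAL
);
""")
conn.executemany("INSERT INTO dim_member VALUES (?, ?, ?)",
                 [(1, "M001", "HMO"), (2, "M002", "PPO")])
conn.executemany("INSERT INTO fact_claim VALUES (?, ?, ?, ?)",
                 [(10, 1, "E11.9", 120.0), (11, 1, "I10", 80.0), (12, 2, "J45", 50.0)])

# Typical analytical query shape: paid amount rolled up by plan type.
rows = conn.execute("""
    SELECT m.plan_type, SUM(f.paid_amount)
    FROM fact_claim f JOIN dim_member m USING (member_key)
    GROUP BY m.plan_type ORDER BY m.plan_type
""").fetchall()
print(rows)  # [('HMO', 200.0), ('PPO', 50.0)]
```

The same shape scales out: surrogate keys on dimensions, additive measures on the fact, and roll-ups expressed as joins plus GROUP BY.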
We are seeking an experienced Data Catalog Lead to lead the implementation and ongoing development of an enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and multi-system data integration challenges unique to health plan operations.
Key Responsibilities :
Data Catalog Implementation & Development : Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements. Develop automated data discovery and cataloging processes for healthcare data assets, including claims, eligibility, provider networks, and member information. Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms.
Data Governance : Build specialized data catalog structures for healthcare data domains, including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies. Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements. Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management. Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment).
Technical Integration & Automation : Integrate Collibra with healthcare payer core systems, including claims processing platforms, eligibility systems, provider directories, and clinical data repositories. Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata. Configure Collibra's REST APIs to enable integration with existing data governance tools and business intelligence platforms.
Required Qualifications :
Collibra Platform Expertise : 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration. Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities. Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup. Experience with Collibra Connect for automated metadata harvesting and system integration. Strong understanding of Collibra's REST APIs and custom development capabilities.
Healthcare Payer Industry Knowledge : 4+ years of experience working with healthcare payer/health plan data environments. Deep understanding of healthcare data types, including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics. Knowledge of healthcare industry standards, including HL7, X12 EDI transactions, and FHIR specifications. Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care). Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT).
Technical Skills : Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems). Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services. Understanding of data modeling principles and healthcare data warehouse design patterns.
Data Governance & Compliance : Experience implementing data governance frameworks in regulated healthcare environments. Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools. Understanding of data classification, data quality management, and master data management principles. Experience with audit trail requirements and compliance reporting in healthcare organizations.
Preferred Qualifications :
Advanced Healthcare Experience : Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms). Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations. Understanding of value-based care arrangements and their data requirements. Experience with clinical data integration and population health analytics.
Technical Certifications & Skills : Collibra certification (Data Citizen, Data Steward, or Technical User). Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog). Knowledge of data virtualization tools and their integration with data catalog platforms. Experience with healthcare interoperability standards and API management.
Job Title : ML/AI Engineer (GCP + Vertex AI)
Experience : 5+ years
Type of Hire : Contract-to-Hire
Duration : 6 months, extendable
Location : Hyderabad/Pune (Hybrid) / Remote
Overview : We are seeking a talented ML/AI Engineer to join our innovative team and drive the development of cutting-edge machine learning solutions. This role offers the opportunity to work with state-of-the-art AI technologies while making a meaningful impact in our organization.
Position Summary : We are looking for an ML/AI Engineer with 5+ years of experience and strong expertise in building machine learning models and generative AI solutions. The ideal candidate will have hands-on experience with agentic AI systems, cloud-based ML platforms, and modern AI frameworks. Healthcare industry experience is highly valued but not required.
Professional Experience : 3+ years of experience in machine learning engineering or a related field. Demonstrated experience shipping ML models to production environments. Experience with MLOps practices and CI/CD pipelines for ML. Strong understanding of data engineering principles and practices.
Key Responsibilities :
Model Development & Implementation : Design, develop, and deploy robust machine learning models for various business applications. Build and optimize generative AI solutions using the latest frameworks and techniques. Implement agentic AI systems that can autonomously perform complex tasks. Develop and maintain ML pipelines from data ingestion to model deployment.
Cloud & Platform Management : Leverage Google Cloud Platform (GCP) services for scalable ML infrastructure. Utilize Vertex AI for model training, deployment, and management. Implement AutoML solutions for rapid prototyping and model development. Ensure model security and compliance using Model Armor and related tools.
Technical Excellence : Write clean, efficient Python code for ML applications and data processing. Optimize model performance, accuracy, and computational efficiency. Implement MLOps best practices for continuous integration and deployment. Collaborate with cross-functional teams to integrate ML solutions into existing systems.
Innovation & Research : Stay current with the latest developments in ML/AI, particularly in generative AI and agentic systems. Experiment with new technologies and frameworks to enhance capabilities. Contribute to technical documentation and knowledge-sharing initiatives.
Required Qualifications & Skills : Strong experience in building ML models with a proven track record of successful deployments. Extensive experience in generative AI, including LLMs, diffusion models, and related technologies. Experience in agentic AI and understanding of autonomous agent architectures. Proficiency with the Model Context Protocol (MCP) for agent communication and control. Advanced Python programming with expertise in ML libraries (scikit-learn, TensorFlow, PyTorch, etc.). Google Cloud Platform (GCP) experience with ML-focused services. Hands-on Vertex AI experience for model lifecycle management. AutoML experience for automated machine learning workflows. Model Armor or similar model security and protection frameworks.
Soft Skills : Excellent problem-solving abilities and analytical thinking. Strong communication skills for technical and non-technical stakeholders. Ability to work independently and manage multiple projects simultaneously. Collaborative mindset for cross-functional team environments.
Preferred Qualifications :
Healthcare Industry Experience (Bonus) : Experience developing ML solutions for healthcare applications. Understanding of healthcare data standards (FHIR, HL7, DICOM). Knowledge of healthcare compliance requirements (HIPAA, FDA regulations). Experience with clinical decision support systems or medical imaging.
Education : Bachelor's degree in Computer Science, Machine Learning, Data Science, or a related field. Master's degree preferred, but not required with equivalent experience.
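Stripped of the platform tooling named above, model training reduces to a loop that frameworks like scikit-learn or Vertex AI training jobs industrialize. The dependency-free sketch below is illustrative only (it mimics no particular library's API): fitting a line by gradient descent on mean squared error.

```python
# Dependency-free sketch of the core training loop that ML platforms
# industrialize: fit y = w*x + b by gradient descent on squared error.
def fit_linear(xs, ys, lr=0.01, epochs=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Synthetic, noise-free data generated from y = 3x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [3 * x + 1 for x in xs]
w, b = fit_linear(xs, ys)
print(round(w, 2), round(b, 2))  # 3.0 1.0
```

Production work adds data pipelines, validation, and deployment around this loop, but the optimization step itself stays this shape.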
Job Title : Databricks Migration Architect / Consultant
Experience : 8+ years
Location : Remote
Contract directly with Genzeon | Duration : 2-3
Overview : We are seeking a specialized Databricks Architect with deep expertise in cost optimization and migration strategies, particularly focused on transitioning away from Databricks platforms. The ideal candidate will have extensive experience in Spark cluster solutions and a proven track record of reducing Databricks operational costs while architecting successful migration paths to alternative platforms.
Key Responsibilities :
Cost Optimization : Conduct comprehensive cost analysis and auditing of existing Databricks deployments across multiple workspaces. Develop and implement aggressive cost-reduction strategies targeting 30-50% savings through cluster optimization. Design and deploy automated cost monitoring solutions with real-time alerts and budget controls. Optimize cluster configurations, auto-scaling policies, and job scheduling to minimize compute costs. Implement spot instance strategies and preemptible VM usage for non-critical workloads. Establish cost allocation frameworks and implement chargeback mechanisms for business unit accountability. Create cost governance policies and developer guidelines to prevent cost overruns. Analyze and optimize storage costs, including Delta Lake table optimization and data lifecycle management.
Migration from Databricks Architecture : Lead strategic initiatives to migrate workloads away from Databricks to cost-effective alternatives. Assess existing Databricks implementations and create detailed migration roadmaps to target platforms. Design migration architectures for transitioning to open-source Spark on Kubernetes, EMR, or other platforms. Develop automated migration tools and frameworks to minimize business disruption. Create comprehensive migration strategies, including data export, job conversion, and dependency mapping. Establish parallel running environments to ensure zero-downtime migrations. Lead post-migration validation and performance benchmarking against original Databricks solutions. Document lessons learned and create reusable migration playbooks for future migrations.
Spark Cluster Solutions : Design high-performance, cost-optimized Spark cluster architectures outside of the Databricks ecosystem. Implement custom Spark solutions on Kubernetes, YARN, and standalone cluster managers. Optimize Spark job performance through advanced tuning of memory management, serialization, and parallelism. Develop custom Spark operators and applications for specialized business use cases. Troubleshoot complex Spark performance bottlenecks and implement optimization strategies. Create cluster auto-scaling solutions and dynamic resource allocation frameworks. Design fault-tolerant Spark architectures with disaster recovery and high availability. Implement monitoring and alerting for Spark cluster health and job performance.
Planning & Execution : Collaborate with finance teams to develop multi-year cost-reduction roadmaps. Evaluate and recommend alternative platforms based on cost-benefit analysis. Create business cases for migration projects with detailed ROI calculations. Establish technical debt reduction strategies related to Databricks dependencies. Partner with procurement teams on contract negotiations and vendor management.
Required Qualifications :
Experience : 8+ years of experience in big data architecture with a focus on cost optimization. 5+ years of hands-on Databricks experience with proven cost-reduction achievements. Demonstrated experience architecting and executing complete platform migrations from Databricks to alternative solutions with successful outcomes. 6+ years of advanced Apache Spark development and cluster management experience. Track record of achieving significant cost savings (minimum 40%+) in cloud data platforms.
Cost Optimization Expertise : Expert knowledge of Databricks pricing models, compute types, and cost drivers. Experience with FinOps practices and cloud cost management tools. Proven ability to implement automated cost controls and budget management systems. Knowledge of alternative platforms and their cost structures (EMR, HDInsight, GCP Dataproc).
Migration & Spark Technical Skills : Deep expertise in migrating complex data workloads between different Spark platforms. Advanced knowledge of Spark internals, the Catalyst optimizer, and performance tuning. Experience with Kubernetes-based Spark deployments and container orchestration. Proficiency in infrastructure-as-code for multi-cloud Spark cluster provisioning. Strong background in data pipeline migration and ETL/ELT conversion.
Programming & Platform Skills : Expert-level proficiency in Scala, Python, and Java for Spark development. Advanced SQL skills and experience with multiple database technologies. Experience with open-source alternatives to Databricks (Apache Spark, Delta Lake OSS, MLflow OSS). Knowledge of streaming platforms (Kafka, Kinesis, Pulsar) and real-time architectures. Proficiency with monitoring tools (Prometheus, Grafana, ELK).
Preferred Qualifications : Databricks certifications combined with experience in competitive platforms. Cloud cost management certifications (AWS Cost Optimization, Azure Cost Management). Experience with vendor negotiations and contract optimization. Background in building business cases for platform migrations. Knowledge of data governance during platform transitions. Experience with Apache Iceberg, Hudi, or other open table formats as Delta Lake alternatives.
Education : Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Master's degree preferred, but not required with equivalent experience.
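The 30-50% savings targets above come down to unit economics: node-hours times rate, reduced by right-sizing, auto-stop, and spot coverage. The toy model below uses entirely hypothetical rates and cluster sizes (not actual Databricks or cloud pricing); it exists only to show how the levers compound.

```python
# Toy cluster-cost model. All hourly rates, node counts, and discount
# ratios are hypothetical, chosen only to illustrate the arithmetic.
def monthly_cost(nodes, hours, on_demand_rate, spot_fraction=0.0, spot_discount=0.7):
    # spot_fraction of node-hours run on spot capacity, billed at
    # (1 - spot_discount) of the on-demand rate; the rest at full rate.
    node_hours = nodes * hours
    spot_cost = node_hours * spot_fraction * on_demand_rate * (1 - spot_discount)
    on_demand_cost = node_hours * (1 - spot_fraction) * on_demand_rate
    return spot_cost + on_demand_cost

# Always-on, all on-demand baseline vs. right-sized cluster with
# auto-stop (fewer hours) and 40% spot coverage.
baseline = monthly_cost(nodes=20, hours=720, on_demand_rate=2.0)
optimized = monthly_cost(nodes=18, hours=600, on_demand_rate=2.0, spot_fraction=0.4)
savings = 1 - optimized / baseline
print(f"Estimated savings: {savings:.0%}")  # Estimated savings: 46%
```

The point of such a model in practice is sensitivity analysis: varying one lever at a time shows which change buys the most before any migration work starts.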
Job Title : Jr AI/ML Engineer
Location : Pune (Onsite) - work from office
Experience : 2 to 4 years
Role : The AI Engineer at Genzeon will develop and deploy artificial intelligence solutions with a focus on healthcare applications. You'll work with cross-functional teams to create AI solutions that improve clinical outcomes and operational efficiency.
Key Responsibilities : Create and implement AI solutions to automate complex workflows. Clean and preprocess data for AI model training and validation. Deploy models to production environments and monitor their performance. Collaborate with team members to brainstorm innovative solutions. Document models, processes, and research findings. Support business users and clients in resolving technical queries. Participate in proof-of-concept projects for potential clients.
Requirements : 2-4 years of Python programming experience. Knowledge of machine learning libraries. Familiarity with AI orchestration frameworks. Experience building AI agents and agentic systems. Understanding of core machine learning concepts and algorithms. Experience with SQL databases and data manipulation techniques. Strong analytical and problem-solving skills. Effective communication and teamwork abilities. Interest in healthcare industry applications of AI.
Preferred Qualifications : Experience with cloud platforms (AWS, Google Cloud, Azure). Knowledge of data visualization tools. Understanding of DevOps practices for model deployment. Experience with version control systems (Git).
Job Title : AI/ML - Trainee
Location : Pune (Onsite) - work from office
Experience : 0-1 years
Role : The AI Engineer at Genzeon will develop and deploy artificial intelligence solutions with a focus on healthcare applications. You'll work with cross-functional teams to create AI solutions that improve clinical outcomes and operational efficiency.
Key Responsibilities : Create and implement AI solutions to automate complex workflows. Clean and preprocess data for AI model training and validation. Deploy models to production environments and monitor their performance. Collaborate with team members to brainstorm innovative solutions. Document models, processes, and research findings. Support business users and clients in resolving technical queries. Participate in proof-of-concept projects for potential clients.
Requirements : 3-12 months of Python programming experience. Knowledge of machine learning libraries. Familiarity with AI orchestration frameworks. Experience building AI agents and agentic systems. Understanding of core machine learning concepts and algorithms. Experience with SQL databases and data manipulation techniques. Strong analytical and problem-solving skills. Effective communication and teamwork abilities. Interest in healthcare industry applications of AI.
Preferred Qualifications : Experience with cloud platforms (AWS, Google Cloud, Azure). Knowledge of data visualization tools. Understanding of DevOps practices for model deployment. Experience with version control systems (Git).
Job Title : Product Manager
Experience : 7-10 years
Location : Pune, work from office
About The Role : We are looking for an experienced Product Manager to lead and enhance our Medical Review Product. The ideal candidate will have 7-10 years of experience in product development, with expertise in user and industry research, journey mapping, usability testing, and agile methodologies. The role will be responsible for translating the product vision and strategy into product features, writing user stories, managing development interactions, and collaborating with cross-functional teams to deliver products that exceed customer expectations and regulatory requirements.
Key Responsibilities :
Feature Specification & Development Interaction : Write detailed feature specifications that clearly define the requirements, acceptance criteria, and dependencies for each product feature. Interact closely with the development team to ensure alignment with product requirements.
User Story Writing & Acceptance Testing : Write clear, concise user stories and define acceptance criteria. Lead user acceptance testing (UAT) to validate product features before release.
Cross-Functional Collaboration : Work closely with UX/UI designers, engineers, sales, and marketing teams to deliver high-quality products. Collaborate with the Scrum Master to improve product development processes and remove roadblocks.
User Research & Usability Testing : Conduct user interviews, usability testing, and feedback sessions to gather insights and validate features. Incorporate findings to enhance user experience and optimize product functionality.
Performance Monitoring & Adoption Strategies : Track product performance and user adoption through analytics and user feedback. Develop strategies to improve adoption rates and optimize product performance.
Market Analysis : Conduct market research to identify trends, opportunities, and the competitive landscape within the medical review industry.
Stakeholder Engagement : Act as the main point of contact for stakeholders, providing internal demos, training, and support.
Sales Support : Support sales proposals and presentations, and deliver exceptional product demos to clients, prospects, and partners.
Requirements :
Aptitude and Attitude : Demonstrate a strong aptitude for learning new technologies and domains. Must have a positive and collaborative attitude, be proactive, and have a detail-oriented approach.
Education : Bachelor of Engineering degree in Computer Science or related fields, and an advanced degree (MBA).
Experience : 7-10 years of experience in developing and delivering commercial products, with a proven track record of conducting user interviews, usability testing, and managing product lifecycles.
Agile Methodology : Experience collaborating with Scrum Masters and working within agile frameworks.
Communication Skills : Excellent verbal and written communication skills for effective stakeholder and team interaction.
Domain Knowledge : Familiarity with the USA healthcare domain is highly desired.
Preferred Qualifications : Experience with medical review software or clinical support tools. Knowledge of GenAI and machine learning applications in healthcare. Proven ability to launch products that enhance productivity and user satisfaction.
Job Role : Full-Stack Software Engineer Associate
Location : Pune, 5 days a week - WFO
Employment Type : Full-time
Experience : 0-3 years
About the Role : We're hiring an Associate Software Engineer to learn, build, and ship features across the stack with mentorship. You'll work primarily with React (front end), Python (services/APIs), and PostgreSQL (database), while gaining exposure to CI/CD, basic SRE practices, and team delivery. This role is ideal for someone with solid fundamentals, eagerness to learn, and a strong ownership mindset.
What You'll Do : Implement UI components in ReactJS (hooks, routing, forms, basic state). Build and maintain Python endpoints (FastAPI/Flask/Django REST) with unit tests. Write and optimize SQL queries; assist with PostgreSQL schema changes and stored procedures/functions under guidance. Automate routine tasks with UNIX shell scripting (scripts for setup, data utilities, simple deploy steps). Contribute to Azure DevOps (ADO) pipelines and follow CI/CD workflows, code quality gates, and branching strategies. Add logs/metrics and follow runbooks to help troubleshoot non-critical issues with senior support. Participate in code reviews, pair programming, and design discussions; write concise docs. Collaborate closely with designers, product managers, and QA to deliver incremental, well-tested features.
Skills & Knowledge (Foundational Must-Have) : Programming fundamentals : data structures, HTTP/REST, git, debugging. ReactJS : components, props/state, hooks, basic performance hygiene. Python : building simple APIs, environment management, packaging basics. SQL/PostgreSQL : joins, indexes, transactions; comfort reading/writing stored procedures/functions. Shell scripting (UNIX) : basic scripting for automation and tooling. Azure DevOps (ADO) basics : running pipelines, reading logs, using boards/repos. Quality practices : unit/integration tests (PyTest/Jest), linting, static analysis, secure coding basics.
Nice to Have (Learn on the Job) : TypeScript on the front end; component libraries (MUI/Ant/Tailwind). Auth (OAuth2/OIDC/JWT), OpenAPI/Swagger, basic API gateways. Containers & Cloud : Docker; fundamentals of Azure; Infra-as-Code exposure (Bicep/Terraform). Caching & Messaging : Redis; high-level awareness of queues/streams (RabbitMQ/Kafka). Observability : logs/metrics/traces (e.g., OpenTelemetry), Sentry, Grafana. Security : secrets management, dependency scanning, least privilege. AI/ML familiarity (optional) : consuming model endpoints, embeddings/vector stores, OCR.
How You Work (Values & Behaviors) : Team collaborator : communicates clearly, asks good questions, accepts feedback, unblocks others. Ownership & initiative : takes tasks from ticket to done, raises risks early, proposes small improvements. Responsiveness : provides updates, meets commitments, follows through. Growth mindset : learns quickly, documents findings, shares knowledge. Whole-picture awareness : considers users, reliability, and cost, not just code. Resilience : willing to support critical moments; sees problems through to resolution.
Mentorship & Support : Paired onboarding with a mentor; clear growth path from task-level execution to owning small modules. Participation in out-of-hours support only when shadowing or pre-agreed with the team during critical releases.
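The SQL fundamentals listed above (joins, indexes, transactions) can be practiced without a full PostgreSQL install. In the minimal sketch below, stdlib sqlite3 stands in for PostgreSQL and the `account` table is invented for illustration; it shows why transactions matter: a transfer that would violate a constraint rolls back entirely.

```python
import sqlite3

# Transaction sketch: sqlite3 stands in for PostgreSQL here, and the
# `account` table is a made-up example, not a real schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id TEXT PRIMARY KEY, balance REAL CHECK (balance >= 0))")
conn.executemany("INSERT INTO account VALUES (?, ?)", [("a", 100.0), ("b", 50.0)])
conn.commit()

def transfer(src, dst, amount):
    try:
        with conn:  # commits on success, rolls back on any exception
            # Credit first, then debit, so a failed debit demonstrates
            # that the already-applied credit is undone by the rollback.
            conn.execute("UPDATE account SET balance = balance + ? WHERE id = ?", (amount, dst))
            conn.execute("UPDATE account SET balance = balance - ? WHERE id = ?", (amount, src))
        return True
    except sqlite3.IntegrityError:  # CHECK fired: debit would overdraw src
        return False

assert transfer("a", "b", 30.0)       # succeeds: a 100 -> 70, b 50 -> 80
assert not transfer("a", "b", 500.0)  # fails: credit to b is rolled back too
balances = dict(conn.execute("SELECT id, balance FROM account ORDER BY id"))
print(balances)  # {'a': 70.0, 'b': 80.0}
```

In PostgreSQL the same guarantee comes from BEGIN/COMMIT/ROLLBACK; the mental model carries over directly.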
Job Title : Full-Stack Software Engineer
Location : Pune, 5 days a week - WFO
Experience : 3-8 years
We're looking for a hands-on Full-Stack Software Engineer who can build, ship, and operate production systems end-to-end. You'll work across the stack (React on the front end, Python services on the back end, and PostgreSQL in the data layer) while contributing to reliability, automation, and continuous delivery. The ideal candidate pairs strong technical depth with ownership, teamwork, and a growth mindset.
What You'll Do : Design, build, and maintain ReactJS single-page applications (state management, routing, performance). Develop Python services and APIs (REST/GraphQL), including data models, validation, and testing. Write performant SQL (PostgreSQL), including stored procedures/functions, query tuning, and schema evolution. Automate workflows with UNIX shell scripting (build, deploy, data/backups, environment automation). Set up and improve CI/CD pipelines (Azure DevOps/ADO preferred) with quality gates and automated tests. Partner with SRE/DevOps on observability (metrics, logs, traces), SLIs/SLOs, capacity, and incident response. Collaborate closely with designers, product managers, and peers to deliver incremental value with high quality. Contribute to code reviews, technical design docs, and engineering standards. Own outcomes: help troubleshoot production issues, and support occasional extended hours during critical releases/incidents.
Core Skills & Experience (Must-Have) : ReactJS : modern hooks, component patterns, state management (Context/Redux/Zustand), performance basics. Python : strong hands-on experience with one or more frameworks (e.g., FastAPI, Flask, Django REST). SQL/PostgreSQL : advanced querying, indexes, transactions, stored procedures/functions, migrations. Shell scripting (UNIX) : practical automation, tooling, and safe scripting practices. Azure DevOps (ADO) : pipelines, boards, artifacts; Git workflows; branch strategies; release management. SRE fundamentals : instrumentation, alerting, on-call etiquette, incident handling, post-mortems, basic capacity planning. Software engineering practices : TDD/automated testing (PyTest/Jest), code reviews, documentation, secure coding.
Nice To Have (Added Advantage) : TypeScript on the front end; component libraries (MUI/Ant/Tailwind). API design (OpenAPI/Swagger), AuthN/Z (OAuth2/OIDC/JWT), rate limiting, and API gateways. Containers & Cloud : Docker; Azure (preferred). Databases : NoSQL (MongoDB, Cassandra, DynamoDB, etc.). Caching & Messaging : Redis; basic knowledge of queues/streams (RabbitMQ/Kafka) and webhooks. Observability stack : Prometheus/Grafana, OpenTelemetry, ELK/EFK, Sentry. Security & Compliance : secrets management, dependency scanning, least privilege, threat-model basics. Data & Integration : ETL basics, file parsers, batch jobs, cron/schedulers. AI/ML familiarity (not mandatory) : calling model endpoints, embeddings, vector stores, OCR pipelines. Kubernetes (nice to have) : workloads, services, config, and deployments.
How You Work (Values & Behaviors) : Team-first collaborator : communicates clearly, gives/receives feedback, unblocks others. Ownership & initiative : picks up ambiguous problems, proposes options, drives to closure. Responsive & reliable : proactive updates; manages expectations; holds a high bar for quality. Growth mindset : learns quickly, shares knowledge, improves systems and processes. Whole-picture thinking : considers product, users, reliability, cost, and operational impact. Resilience : willing to support critical paths and occasional extended hours to see problems through to resolution.
Qualifications : 3-8 years of full-stack engineering experience (or equivalent impact). Proven delivery of production React + Python + PostgreSQL applications. Preferred : experience with CI/CD and at least basic SRE/operability practices.
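The TDD/automated-testing practice listed above is easy to illustrate in miniature. The function below is a hypothetical example, not part of any product; the bare `assert` statements are the same style PyTest collects and reports when placed in test functions.

```python
# TDD-style sketch: a small, invented pagination helper plus the kind
# of plain-assert tests PyTest runs. Written test-first, the asserts
# pin down the contract before the implementation exists.
def paginate(items, page, per_page):
    """Return the slice of items for 1-indexed page `page`."""
    if page < 1 or per_page < 1:
        raise ValueError("page and per_page must be >= 1")
    start = (page - 1) * per_page
    return items[start:start + per_page]

# Happy path
assert paginate(list(range(10)), page=1, per_page=3) == [0, 1, 2]
assert paginate(list(range(10)), page=4, per_page=3) == [9]
# Edge cases: past the end, and invalid input
assert paginate(list(range(10)), page=5, per_page=3) == []
try:
    paginate([], page=0, per_page=3)
    assert False, "expected ValueError"
except ValueError:
    pass
print("all tests passed")
```

Under PyTest these asserts would live in `test_*` functions and run via `pytest`, gaining rich failure diffs for free.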
Role Overview: As a Senior CRM Developer with over 7 years of experience, you will be responsible for developing, customizing, and integrating CRM solutions to meet the organization's needs. Your role will involve implementing, optimizing, and maintaining CRM platforms to ensure they deliver maximum value. Additionally, you will collaborate with stakeholders to gather requirements and provide mentorship to junior developers. Key Responsibilities: - Requirements Gathering: Collaborate with business stakeholders to analyze CRM system requirements and translate them into functional specifications. - Customization: Lead the customization and configuration of CRM platforms, designing custom entities, forms, workflows, business rules, and dashboards. - Development: Design, code, and maintain custom CRM functionalities using JavaScript, C#, .NET, and other relevant technologies. - Integration: Develop and manage integrations between CRM systems and other business applications using APIs and middleware. - Data Management: Lead data migration efforts to ensure data integrity within the CRM system. - Security and Compliance: Implement security measures, manage access controls, and ensure compliance with data protection laws. - Performance Optimization: Monitor and optimize system performance, troubleshooting technical issues to ensure efficiency. - Mentorship: Provide guidance to junior developers, ensuring best practices in coding and project execution. - Documentation: Create detailed technical documentation for CRM configurations, customizations, and integrations. - Support: Offer technical support and collaborate with IT teams for continuous improvement of CRM operations. Qualifications Required: - Bachelor's degree in computer science, information technology, or related field. - 7+ years of experience as a CRM Developer with expertise in CRM implementations. - Proficiency in CRM platforms like Microsoft Dynamics CRM or Dynamics 365. 
- Strong skills in CRM customization using JavaScript, C#, .NET, and related tools.
- Experience with API development and CRM integrations with other systems.
- Knowledge of data migration, management, and reporting within CRM environments.
- Familiarity with Agile methodologies and DevOps practices.
- Problem-solving skills with attention to detail and effective communication skills.