Job Title: Senior Data Engineer – Healthcare (Databricks)

About the Role
As a Senior Data Engineer in the healthcare domain, you will architect, develop, and manage large-scale, secure, high-performance data pipelines on Databricks using Spark, Delta Lake, and cloud-native tools, and design healthcare-specific data models that support analytics, AI/ML, and operational reporting. You will lead data governance, quality, and security initiatives, collaborate with cross-functional stakeholders to understand their data needs, mentor junior and mid-level engineers, and own best practices for CI/CD, version control, and deployment automation.

Key Responsibilities:
- Architect, develop, and manage large-scale, secure, and high-performance data pipelines on Databricks using Spark, Delta Lake, and cloud-native tools.
- Design and implement healthcare-specific data models to support analytics, AI/ML, and operational reporting.
- Ingest and transform complex data types such as 837/835 claims, EHR/EMR records, provider/member files, lab results, and clinical notes.
- Lead data governance, quality, and security initiatives, ensuring compliance with HIPAA, HITECH, and organizational policies.
- Collaborate with cross-functional stakeholders to understand data needs and deliver robust, scalable solutions.
- Mentor junior and mid-level engineers, providing code reviews and technical guidance.
- Identify performance bottlenecks and implement optimizations in Spark jobs and SQL transformations.
- Own and evolve best practices for CI/CD, version control, and deployment automation.
- Stay up to date with industry standards (e.g., FHIR, HL7, OMOP) and evaluate new tools/technologies.

Required Qualifications:
- 5+ years of experience in data engineering, with 3+ years in the healthcare or life sciences domain.
- Deep expertise with Databricks, Apache Spark (preferably PySpark), and Delta Lake.
- Proficiency in SQL, Python, and data modeling (dimensional/star schema, normalized models).
- Strong command of 837/835 EDI formats, CPT/ICD-10/DRG/HCC coding, and data regulatory frameworks.
- Experience with cloud platforms such as Azure, AWS, or GCP and cloud-native data services (e.g., S3, ADLS, Glue, Data Factory).
- Familiarity with orchestration tools such as Airflow, dbt, or Azure Data Factory.
- Proven ability to work in agile environments, manage stakeholder expectations, and deliver end-to-end data products.
- Experience implementing monitoring, observability, and alerting for data pipelines.
- Strong written and verbal communication skills for both technical and non-technical audiences.
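For candidates gauging fit, here is a minimal sketch of the kind of PySpark/Delta Lake ingestion step this role involves. It is illustrative only: the landing path, table name, and claim_id column are hypothetical, and a real 837/835 claims pipeline would add EDI parsing, validation, and PHI controls beyond this.

```python
# Illustrative sketch (hypothetical paths/columns): load a batch of already-parsed
# 835 remittance records and upsert them into a Delta table.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("claims-ingest-sketch").getOrCreate()

# Read a hypothetical batch of parsed remittance records landed as Parquet.
remits = (
    spark.read.parquet("/landing/claims/835/2024-06-01/")
    .withColumn("ingested_at", F.current_timestamp())
)

# Merge on claim ID so reprocessed batches stay idempotent.
target = DeltaTable.forName(spark, "silver.claim_remittance")
(
    target.alias("t")
    .merge(remits.alias("s"), "t.claim_id = s.claim_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```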
AVEVA System Platform / OMI Specialist
📍 Location: Hyderabad
🕒 Experience: 8+ Years
🖥️ Role: System Platform – AVEVA System Platform (OMI)

About the Role
We are looking for an experienced AVEVA System Platform / OMI Engineer who can design, configure and support advanced industrial automation and SCADA solutions. The ideal candidate will have strong hands-on expertise with AVEVA technologies and solid skills in databases and cloud integration.

Key Responsibilities
- Design, configure and maintain AVEVA System Platform and OMI applications
- Develop and manage UOC (Unified Operations Center) dashboards and workflows
- Integrate plant / field data with SQL databases and cloud environments
- Collaborate with operations, IT and OT teams to implement reliable, scalable solutions
- Troubleshoot system issues and optimize performance, security and availability
- Prepare technical documentation and provide end-user training/support

Required Skills & Experience
- 8+ years of total experience in industrial automation / SCADA / HMI
- Strong hands-on experience in:
  - AVEVA System Platform
  - AVEVA OMI
  - UOC (Unified Operations Center)
- Working knowledge of SQL (queries, reporting, integrations)
- Experience with cloud-based deployments or integrations
- Good communication, problem-solving and teamwork skills

How to Apply
If this opportunity matches your profile, please:
📩 Email your updated CV to: madhur@blacklake.in
Job Title: SME Lead – eRTO (Supply Chain & Refining)
Location: Hyderabad
Experience: 8 – 20 Years

About the Role
We are looking for an experienced SME Lead – eRTO (Supply Chain) with strong refinery domain knowledge and expertise in planning, scheduling and process optimisation. The role focuses on developing and maintaining LP/eRTO models and supporting refinery and petrochemical planning teams.

Key Responsibilities
- Apply a thorough understanding of refining and refinery processes with relevant experience in trading, planning, scheduling, refining and process optimisation in the refining industry or similar consulting environment.
- Build models for various process units/plants and perform techno-economic analyses using industry tools such as RPMS, PIMS, Haverly, Petroleum Scheduler or other LP solutions.
- Use process simulation tools and accurately interpret process flow diagrams (PFDs) and piping and instrumentation diagrams (P&IDs).
- Develop and build planning/scheduling models for refineries, petrochemical plants and related facilities, including:
  - Design and model building
  - Implementation using state-of-the-art USC planning and scheduling tools

Required Skills & Profile
- 8–20 years of experience in the refining / petrochemical / oil & gas domain or related consulting.
- Strong exposure to trading, planning, scheduling and process optimisation.
- Hands-on experience with LP / planning tools such as RPMS, PIMS, Haverly, Petroleum Scheduler or similar.
- Working knowledge of process simulation tools with the ability to read and interpret PFDs and P&IDs.
- Strong analytical thinking, communication skills and the ability to work with cross-functional teams.
Hiring: Water Management Consultant – Water Management & Digital Transformation
📍 Location: Hyderabad
🕒 Experience: 15+ Years

About the Role
We are looking for a senior Water Management Consultant with strong domain expertise in oil & gas and digital transformation of water systems. The role focuses on water balance, loss management, pipeline integrity and leak detection across complex networks.

Key Expertise
- Water balance and system understanding across large industrial / pipeline networks
- Strong knowledge of water balance concepts – inflow/outflow mapping, mass-balance calculations and identification of system losses
- Experience with Leak Detection Systems (LDS) and overall loss management
- Hands-on expertise in PIMS (Pipeline Integrity Management System)
- Proven oil & gas industry experience

Desired Profile
- 15+ years of experience in water management / pipeline operations / oil & gas
- Strong analytical and problem-solving skills
- Ability to work with cross-functional operations, integrity and digital teams
- Excellent communication and consulting skills

#Hiring #WaterManagement #DigitalTransformation #LDS #PIMS #OilAndGas #Consultant #HyderabadJobs
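For candidates unfamiliar with the term, the mass-balance work mentioned under Key Expertise boils down to accounting for every cubic metre in a network segment. A tiny illustrative sketch follows; all figures are invented for illustration, not drawn from any real system.

```python
# Hypothetical daily water balance for one network segment (figures invented):
# losses are whatever inflow is not explained by metered outflow and storage change.
inflow_m3 = 12_500.0           # metered intake for the day
metered_outflow_m3 = 11_300.0  # deliveries recorded at offtake points
storage_change_m3 = 150.0      # net rise in tank / linepack inventory

losses_m3 = inflow_m3 - metered_outflow_m3 - storage_change_m3
loss_pct = 100.0 * losses_m3 / inflow_m3

print(f"Unaccounted-for water: {losses_m3:.0f} m3 ({loss_pct:.1f}% of inflow)")
# -> Unaccounted-for water: 1050 m3 (8.4% of inflow)
```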