7.0 - 12.0 years
0 Lacs
Maharashtra
On-site
As a Lead Data Engineer, you will be responsible for leveraging your 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL, Data Warehousing, Data Mart, Data Lake, Big Data, Cloud (AWS), and Data Governance domains. Your expertise in a modern programming language such as Scala, Python, or Java, with a preference for Spark/PySpark, will be crucial in this role. You will need experience with configuration management and version control tools like Git, along with familiarity working within a CI/CD framework. Experience in building frameworks will be considered a significant advantage. A minimum of 8 years of recent hands-on SQL programming experience in a Big Data environment is necessary, with a preference for experience in Hadoop/Hive. Proficiency in PostgreSQL, RDBMS, NoSQL, and columnar databases will be beneficial for this role. Your hands-on experience with AWS Cloud data engineering components, including API Gateway, Glue, IoT Core, EKS, ECS, S3, RDS, Redshift, and EMR, will play a vital role in developing and maintaining ETL applications and data pipelines using big data technologies. Experience with Apache Kafka, Spark, and Airflow is a must-have for this position. If you are excited about this opportunity and possess the required skills and experience, please share your CV with us at omkar@hrworksindia.com. We look forward to potentially welcoming you to our team. Regards, Omkar
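The core of the pipeline work this listing describes is Spark-based ETL over AWS storage. As a rough illustration of that kind of task, the sketch below reads raw CSV from S3, applies a simple transformation, and writes partitioned Parquet back to S3; the bucket, table and column names are placeholders rather than details from the listing.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical batch ETL step: raw CSV in, curated partitioned Parquet out.
spark = SparkSession.builder.appName("orders-batch-etl").getOrCreate()

orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-bucket/orders/")            # placeholder path
)

curated = (
    orders
    .filter(F.col("order_status") == "COMPLETED")      # drop incomplete records
    .withColumn("order_date", F.to_date("order_ts"))   # derive a partition column
    .withColumn("amount", F.col("amount").cast("double"))
)

(
    curated.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")    # placeholder path
)
```

On an EMR cluster the `s3://` paths resolve through EMRFS, so the same script runs unchanged whether it is submitted locally for testing or as a cluster step.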
Posted 3 weeks ago
8.0 - 13.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role: Data Engineer - 1 (Experience 0-2 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank and manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake, managed compute and orchestration frameworks including serverless data solutions, managing the central data warehouse for extremely high-concurrency use cases, building connectors for different sources, building the customer feature repository, building cost-optimization solutions like EMR optimizers, performing automations and building observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers and all analytics use cases.
Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platforms, Data Privacy, Data Security, Data Stewardship and the Data Quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
Posted 3 weeks ago
10.0 - 17.0 years
25 - 40 Lacs
Noida
Remote
Candidate should have 8+ years of relevant experience in Project Management. US Healthcare, US Hospital, EMR, EHR, HIS experience is a must. Interested candidates, please share your resume at: ankita.shrivastava@elevancesysyems.com
Posted 3 weeks ago
5.0 - 10.0 years
7 - 17 Lacs
Kolkata, Hyderabad, Pune
Work from Office
Airflow Data Engineer on the AWS platform. Job Title: Apache Airflow Data Engineer ("Role" as per TCS Role Master).
• 4-8 years of experience in AWS, Apache Airflow (on the Astronomer platform), Python, PySpark, SQL
• Good hands-on knowledge of SQL and the Data Warehousing life cycle is an absolute requirement.
• Experience in creating data pipelines and orchestrating them using Apache Airflow
• Significant experience with data migrations and development of Operational Data Stores, Enterprise Data Warehouses, Data Lakes and Data Marts.
• Good to have: experience with cloud ETL and ELT in a tool such as DBT, Glue, EMR, Matillion or any other ELT tool
• Excellent communication skills to liaise with business and IT stakeholders.
• Expertise in planning project execution and effort estimation.
• Exposure to Agile ways of working.
Candidates for this position will be offered employment with TAIC or TCSL as the entity.
Keywords: Data Warehousing, PySpark, GitHub, AWS data platform, Glue, EMR, Redshift, Databricks, Data Marts, DBT/Glue/EMR or Matillion, data engineering, data modelling, data consumption.
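To ground the orchestration requirement above, here is a minimal Airflow DAG sketch that schedules a daily ETL step; the DAG id, callable and schedule are illustrative placeholders, and on Astronomer or MWAA the Python task would typically be replaced by an EMR or Glue operator that hands the heavy lifting to the cluster.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_orders_etl(**context):
    # Placeholder for the real transformation, e.g. a spark-submit to EMR
    # or a Glue job trigger; here we only log the logical date being processed.
    print(f"Running orders ETL for {context['ds']}")


with DAG(
    dag_id="orders_daily_etl",          # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["etl", "aws"],
) as dag:
    extract_transform_load = PythonOperator(
        task_id="run_orders_etl",
        python_callable=run_orders_etl,
    )
```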
Posted 3 weeks ago
8.0 - 10.0 years
13 - 18 Lacs
Chandigarh
Work from Office
Job Description: Full-stack Architect. Experience: 8-10 years. Architect, design, and oversee the development of full-stack applications using modern JS frameworks and cloud-native tools. Lead microservice architecture design, ensuring system scalability, reliability, and performance. Evaluate and implement AWS services (Lambda, ECS, Glue, Aurora, API Gateway, etc.) for backend solutions. Provide technical leadership to engineering teams across all layers (frontend, backend, database). Guide and review code, perform performance optimization, and define coding standards. Collaborate with DevOps and Data teams to integrate services (Redshift, OpenSearch, Batch). Translate business needs into technical solutions and communicate with cross-functional stakeholders.
Posted 3 weeks ago
6.0 - 11.0 years
11 - 16 Lacs
Gurugram
Work from Office
Project description: We are looking for a star Python Developer who is not afraid of hard work and challenges! Having partnered with a well-known financial institution, we are gathering a team of professionals with a wide range of skills to successfully deliver business value to the client. Responsibilities: Analyse existing SAS DI pipelines and SQL-based transformations. Translate and optimize SAS SQL logic into Python code using frameworks such as PySpark. Develop and maintain scalable ETL pipelines using Python on AWS EMR. Implement data transformation, cleansing, and aggregation logic to support business requirements. Design modular and reusable code for distributed data processing tasks on EMR clusters. Integrate EMR jobs with upstream and downstream systems, including AWS S3, Snowflake, and Tableau. Develop Tableau reports for business reporting. Skills - Must have: 6+ years of experience in ETL development, with at least 5 years working with AWS EMR. Bachelor's degree in Computer Science, Data Science, Statistics, or a related field. Proficiency in Python for data processing and scripting. Proficient in SQL and experience with one or more ETL tools (e.g., SAS DI, Informatica). Hands-on experience with AWS services: EMR, S3, IAM, VPC, and Glue. Familiarity with data storage systems such as Snowflake or RDS. Excellent communication skills and ability to work collaboratively in a team environment. Strong problem-solving skills and ability to work independently. Nice to have: N/A
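As a sketch of the SAS-SQL-to-PySpark translation work described above, the snippet below shows one way a SQL-style aggregation might be expressed with the DataFrame API so it can run on an EMR cluster; the table, column and output names are invented for illustration, not taken from the project.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas-sql-migration").getOrCreate()

# Assume the source table was landed in the lake as Parquet during migration.
txns = spark.read.parquet("s3://example-lake/transactions/")   # placeholder path

# Original SQL-style logic being translated:
#   SELECT customer_id, SUM(amount) AS total_amount, COUNT(*) AS txn_count
#   FROM transactions WHERE status = 'POSTED' GROUP BY customer_id
summary = (
    txns
    .filter(F.col("status") == "POSTED")
    .groupBy("customer_id")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Write the result where a downstream consumer (e.g. a Snowflake load or a
# Tableau extract) can pick it up.
summary.write.mode("overwrite").parquet("s3://example-lake/marts/customer_txn_summary/")
```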
Posted 3 weeks ago
4.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Senior Data Engineer with a deep focus on data quality, validation frameworks, and reliability engineering. This role will be instrumental in ensuring the accuracy, integrity, and trustworthiness of data assets across our cloud-native infrastructure. The ideal candidate combines expert-level Python programming with practical experience in data pipeline engineering, API integration, and managing cloud-native workloads on AWS and Kubernetes. Roles and Responsibilities: Design, develop, and deploy automated data validation and quality frameworks using Python. Build scalable and fault-tolerant data pipelines that support quality checks across data ingestion, transformation, and delivery. Integrate with REST APIs to validate and enrich datasets across distributed systems. Deploy and manage validation workflows using AWS services (EKS, EMR, EC2) and Kubernetes clusters. Collaborate with data engineers, analysts, and DevOps to embed quality checks into CI/CD and ETL pipelines. Develop monitoring and alerting systems for real-time detection of data anomalies and inconsistencies. Write clean, modular, and reusable Python code for automated testing, validation, and reporting. Lead root cause analysis for data quality incidents and design long-term solutions. Maintain detailed technical documentation of data validation strategies, test cases, and architecture. Promote data quality best practices and evangelize a culture of data reliability within the engineering teams. Required Skills: Experience with data quality platforms such as Great Expectations, Collibra Data Quality, or similar tools. Proficiency in Docker and container lifecycle management. Familiarity with serverless compute environments (e.g., AWS Lambda, Azure Functions), Python, and PySpark. Relevant certifications in AWS, Kubernetes, or data quality technologies. Prior experience working in big data ecosystems and real-time data environments.
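A minimal sketch of the kind of automated validation this role describes, assuming a PySpark pipeline: it runs uniqueness and null-rate checks on a curated dataset and fails fast so a CI/CD or orchestration layer can alert. The dataset path, key column and thresholds are placeholders rather than details from the listing.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-quality-checks").getOrCreate()

df = spark.read.parquet("s3://example-curated-bucket/orders/")   # placeholder path
total = df.count()

# Check 1: the primary key must be unique.
duplicate_keys = total - df.select("order_id").distinct().count()

# Check 2: critical columns must stay below a 1% null rate.
null_rates = {
    c: df.filter(F.col(c).isNull()).count() / max(total, 1)
    for c in ["order_id", "customer_id", "amount"]
}

failures = []
if duplicate_keys > 0:
    failures.append(f"{duplicate_keys} duplicate order_id values")
failures += [f"{c} null rate {r:.2%}" for c, r in null_rates.items() if r > 0.01]

if failures:
    # Raising makes the surrounding Airflow task or CI step fail, which
    # is what drives alerting and blocks downstream consumers.
    raise ValueError("Data quality checks failed: " + "; ".join(failures))

print(f"All checks passed on {total} rows")
```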
Posted 3 weeks ago
2.0 - 4.0 years
5 - 11 Lacs
Uttar Pradesh
Work from Office
Create the future of e-health together with us by becoming an Interface Analyst. As a pioneer in digital health, our heart beats for the development and implementation of new technologies. Become part of a cooperative Agile team working with the latest technologies, Angular and .NET, developing PC-based, cloud-based and mobile-based solutions. For the next level of e-health evolution, we are looking for creative minds who enjoy working with a variety of technologies, their own design freedom and professional development. We empower employees to solve challenging problems with the full support of their peers. What you can expect from us – An extensive group health and accidental insurance program. A safe digital application and a structured and streamlined onboarding process. Our progressive transportation model allows you to choose: you can either receive a self-transport allowance, or we can pick you up and drop you off on your way from or to the office. Subsidized meal facility. Fun at Work: tons of engagement activities and entertaining games for everyone to participate in. Various career growth opportunities as well as a lucrative merit increment policy in a work environment where we promote Diversity, Equity, and Inclusion. Best HR practices along with an open-door policy to ensure a very employee-friendly environment. A recession-proof and secure workplace for our entire workforce. What you can do for us: You are responsible for building the next physician office experience and helping build an amazing application used by healthcare providers and patients across the country. If you want to make a difference for physicians, nurses and patients with the code you write, and not just work on the next chat app, this opportunity is for you. Your main duties will include working with vendors and customers to implement important interfaces in the realm of labs (ORM, ORU), PM (ADT, SIU and DFT) and completely custom ones depending on need. You and the team will participate in all phases of design and development: from high-level design, to defining the REST APIs, to writing the code and tests. The application is built with most of the UI implemented using Angular. There is a messaging layer between the Angular and C# code. The backend is a REST API implemented using ASP.NET Web API. As a member of the team, you will work with all these technologies. Your Qualifications: 2+ years developing software with any OOP language, for example C#, Microsoft .NET, or .NET Core. SQL knowledge is a strong plus. Experience designing or working with REST APIs is a strong plus. Good knowledge of the HL7 standard (ADT, SIU, DFT, ORM, ORU and others) as well as experience working with an HL7 interface engine (C#, Mirth, Corepoint). Flexibility to work shifts per client requirements. Convinced? Submit your persuasive application now (including desired salary and earliest possible starting date). We create the future of e-health. Become part of a significant mission.
Posted 3 weeks ago
3.0 - 5.0 years
3 - 7 Lacs
Chennai
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: Electronic Medical Records (EMR)
Good to have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education
Cardiology PACS
About The Role: The Cardiology PACS (Picture Archiving and Communication System) Administrator is responsible for managing and supporting the digital imaging systems used in the Cardiology department. This includes the maintenance, integration, and optimization of systems such as CVIS (Cardiovascular Information System), PACS, ECG Management, and imaging modalities. The role ensures continuous availability, performance, and compliance of imaging systems with healthcare standards and regulatory requirements.
Key Responsibilities:
- Administer and support Cardiology PACS and CVIS systems (e.g., FUJI Synapse Cardiovascular 6 & 7, GE Muse, Philips-Xper, eCare Manager, etc.)
- Coordinate with Cardiology, Radiology, and IT teams to ensure optimal performance and uptime of imaging systems.
- Manage storage, retrieval, archival, and transmission of DICOM images and cardiology reports.
- Monitor system performance, backup schedules, and perform routine maintenance.
- Troubleshoot issues related to image availability, modality connectivity, and workflow interruptions.
- Work with vendors to resolve technical issues and participate in system upgrades and patch deployments.
- Support integration with EMRs (e.g., Epic, Cerner) and ensure HL7/DICOM interfaces function properly.
- Conduct detailed Root Cause Analysis (RCA) for PACS outages, data inconsistencies, and workflow failures; document findings and drive remediation plans.
- Maintain documentation of system configuration, standard operating procedures, and change management.
- Train and support end users (cardiologists, techs, nurses) on system features and workflows.
- Ensure compliance with HIPAA, hospital policies, and industry regulations regarding data security and patient privacy.
Qualifications & Skills:
- Bachelor's degree in Health Informatics, IT, Biomedical Engineering, or a related field (preferred).
- Minimum 3+ years of experience supporting any PACS/CVIS systems in a hospital or clinical environment.
- Strong knowledge of DICOM, HL7, IHE standards, and network protocols.
- Familiarity with radiology workflows, RIS integration, and medical imaging regulatory compliance.
- Experience with cardiology imaging modalities: ECHO, Cath Lab, ECG, Holter, Stress Test, Nuclear, etc.
- Proven ability to manage SLAs and perform in-depth Root Cause Analysis (RCA).
- Familiarity with applications like FUJI Synapse Cardiovascular, GE Muse, Philips products, or similar.
- Good problem-solving, communication, and vendor management skills.
- Ability to participate in on-call support and respond to critical incidents as needed.
Qualification: 15 years full time education
Posted 3 weeks ago
5.0 - 10.0 years
3 - 7 Lacs
Chennai
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: Electronic Medical Records (EMR)
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Radiology PACS
About The Role: The Radiology PACS (Picture Archiving and Communication System) Administrator is responsible for managing and supporting the digital imaging systems used in the Radiology department. This includes the maintenance, integration, and optimization of systems such as CVIS (Cardiovascular Information System), PACS, ECG Management, and imaging modalities. The role ensures continuous availability, performance, and compliance of imaging systems with healthcare standards and regulatory requirements.
Key Responsibilities:
- Administer and support Radiology PACS and CVIS systems (e.g., FUJI Synapse Cardiovascular 6 & 7, GE Muse, Philips-Xper, eCare Manager, etc.)
- Coordinate with Cardiology, Radiology, and IT teams to ensure optimal performance and uptime of imaging systems.
- Manage storage, retrieval, archival, and transmission of DICOM images and cardiology reports.
- Monitor system performance, backup schedules, and perform routine maintenance.
- Troubleshoot issues related to image availability, modality connectivity, and workflow interruptions.
- Work with vendors to resolve technical issues and participate in system upgrades and patch deployments.
- Support integration with EMRs (e.g., Epic, Cerner) and ensure HL7/DICOM interfaces function properly.
- Conduct detailed Root Cause Analysis (RCA) for PACS outages, data inconsistencies, and workflow failures; document findings and drive remediation plans.
- Maintain documentation of system configuration, standard operating procedures, and change management.
- Train and support end users (cardiologists, techs, nurses) on system features and workflows.
- Ensure compliance with HIPAA, hospital policies, and industry regulations regarding data security and patient privacy.
Qualifications & Skills:
- Bachelor's degree in Health Informatics, IT, Biomedical Engineering, or a related field (preferred).
- Minimum 5 years of experience supporting any PACS/CVIS systems in a hospital or clinical environment.
- Strong knowledge of DICOM, HL7, IHE standards, and network protocols.
- Familiarity with radiology workflows, RIS integration, and medical imaging regulatory compliance.
- Experience with cardiology imaging modalities: ECHO, Cath Lab, ECG, Holter, Stress Test, Nuclear, etc.
- Proven ability to manage SLAs and perform in-depth Root Cause Analysis (RCA).
- Familiarity with applications like FUJI Synapse Cardiovascular, GE Muse, Philips products, or similar.
- Good problem-solving, communication, and vendor management skills.
- Ability to participate in on-call support and respond to critical incidents as needed.
Qualification: 15 years full time education
Posted 3 weeks ago
15.0 - 20.0 years
3 - 7 Lacs
Chennai
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: Electronic Medical Records (EMR)
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Application Support: Virtual Health
Over 3+ years of experience in supporting virtual healthcare platforms such as Teladoc, Caregility, Zoom, eCareManager and Vibe, ensuring seamless telehealth services. Work actively on new device replacements and upgrades to maintain reliability and compatibility with evolving platform requirements.
Vendor Coordination & Device Configuration:
- Coordinate with vendors for timely support, software updates, and resolution of platform-specific issues.
- Configure and test telehealth devices including video carts, tablets, and connected peripherals.
- Work closely with field support teams to validate physical setup, user access, and audio/video functionality.
- Provide support and troubleshooting for Zoom meetings/webinars used in virtual health.
- Maintain inventory and conduct device audits to ensure consistent uptime and availability.
Incident and SLA Management:
- Manage incidents and service requests using ServiceNow tools, ensuring resolution within defined SLAs.
- Log and escalate issues as required, performing root cause analysis and preventive recommendations.
- Generate reports to track incident types, resolution times, and SLA performance metrics.
Network Collaboration & Site Enablement:
- Work closely with the network team to validate connectivity, firewall configurations, and access for Caregility and Teladoc devices/endpoints.
- Support Wi-Fi/LAN testing during device onboarding and site readiness phases.
Documentation & Reporting:
- Maintain detailed configuration documentation, SOPs, and knowledge base articles.
- Provide regular updates and reports to leadership and stakeholders on implementation progress, incidents, and device performance.
Qualification: 15 years full time education
Posted 3 weeks ago
5.0 - 10.0 years
3 - 7 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: Electronic Medical Records (EMR)
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Cardiology PACS
About The Role: The Cardiology PACS (Picture Archiving and Communication System) Administrator is responsible for managing and supporting the digital imaging systems used in the Cardiology department. This includes the maintenance, integration, and optimization of systems such as CVIS (Cardiovascular Information System), PACS, ECG Management, and imaging modalities. The role ensures continuous availability, performance, and compliance of imaging systems with healthcare standards and regulatory requirements.
Key Responsibilities:
- Administer and support Cardiology PACS and CVIS systems (e.g., FUJI Synapse Cardiovascular 6 & 7, GE Muse, Philips-Xper, eCare Manager, etc.)
- Coordinate with Cardiology, Radiology, and IT teams to ensure optimal performance and uptime of imaging systems.
- Manage storage, retrieval, archival, and transmission of DICOM images and cardiology reports.
- Monitor system performance, backup schedules, and perform routine maintenance.
- Troubleshoot issues related to image availability, modality connectivity, and workflow interruptions.
- Work with vendors to resolve technical issues and participate in system upgrades and patch deployments.
- Support integration with EMRs (e.g., Epic, Cerner) and ensure HL7/DICOM interfaces function properly.
- Conduct detailed Root Cause Analysis (RCA) for PACS outages, data inconsistencies, and workflow failures; document findings and drive remediation plans.
- Maintain documentation of system configuration, standard operating procedures, and change management.
- Train and support end users (cardiologists, techs, nurses) on system features and workflows.
- Ensure compliance with HIPAA, hospital policies, and industry regulations regarding data security and patient privacy.
Qualifications & Skills:
- Bachelor's degree in Health Informatics, IT, Biomedical Engineering, or a related field (preferred).
- Minimum 5 years of experience supporting any PACS/CVIS systems in a hospital or clinical environment.
- Strong knowledge of DICOM, HL7, IHE standards, and network protocols.
- Familiarity with radiology workflows, RIS integration, and medical imaging regulatory compliance.
- Experience with cardiology imaging modalities: ECHO, Cath Lab, ECG, Holter, Stress Test, Nuclear, etc.
- Proven ability to manage SLAs and perform in-depth Root Cause Analysis (RCA).
- Familiarity with applications like FUJI Synapse Cardiovascular, GE Muse, Philips products, or similar.
- Good problem-solving, communication, and vendor management skills.
- Ability to participate in on-call support and respond to critical incidents as needed.
Qualification: 15 years full time education
Posted 3 weeks ago
3.0 - 5.0 years
3 - 7 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: Electronic Medical Records (EMR)
Good to have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education
Cardiology PACS
About The Role: The Cardiology PACS (Picture Archiving and Communication System) Administrator is responsible for managing and supporting the digital imaging systems used in the Cardiology department. This includes the maintenance, integration, and optimization of systems such as CVIS (Cardiovascular Information System), PACS, ECG Management, and imaging modalities. The role ensures continuous availability, performance, and compliance of imaging systems with healthcare standards and regulatory requirements.
Key Responsibilities:
- Administer and support Cardiology PACS and CVIS systems (e.g., FUJI Synapse Cardiovascular 6 & 7, GE Muse, Philips-Xper, eCare Manager, etc.)
- Coordinate with Cardiology, Radiology, and IT teams to ensure optimal performance and uptime of imaging systems.
- Manage storage, retrieval, archival, and transmission of DICOM images and cardiology reports.
- Monitor system performance, backup schedules, and perform routine maintenance.
- Troubleshoot issues related to image availability, modality connectivity, and workflow interruptions.
- Work with vendors to resolve technical issues and participate in system upgrades and patch deployments.
- Support integration with EMRs (e.g., Epic, Cerner) and ensure HL7/DICOM interfaces function properly.
- Conduct detailed Root Cause Analysis (RCA) for PACS outages, data inconsistencies, and workflow failures; document findings and drive remediation plans.
- Maintain documentation of system configuration, standard operating procedures, and change management.
- Train and support end users (cardiologists, techs, nurses) on system features and workflows.
- Ensure compliance with HIPAA, hospital policies, and industry regulations regarding data security and patient privacy.
Qualifications & Skills:
- Bachelor's degree in Health Informatics, IT, Biomedical Engineering, or a related field (preferred).
- Minimum 3+ years of experience supporting any PACS/CVIS systems in a hospital or clinical environment.
- Strong knowledge of DICOM, HL7, IHE standards, and network protocols.
- Familiarity with radiology workflows, RIS integration, and medical imaging regulatory compliance.
- Experience with cardiology imaging modalities: ECHO, Cath Lab, ECG, Holter, Stress Test, Nuclear, etc.
- Proven ability to manage SLAs and perform in-depth Root Cause Analysis (RCA).
- Familiarity with applications like FUJI Synapse Cardiovascular, GE Muse, Philips products, or similar.
- Good problem-solving, communication, and vendor management skills.
- Ability to participate in on-call support and respond to critical incidents as needed.
Qualification: 15 years full time education
Posted 3 weeks ago
7.0 - 12.0 years
3 - 7 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: Electronic Medical Records (EMR)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full time education
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. Your typical day involves troubleshooting and resolving software issues to ensure seamless operations.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Ensure effective communication within the team.
- Implement best practices for software support.
- Conduct regular performance evaluations for team members.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Electronic Medical Records (EMR).
- Strong understanding of software troubleshooting methodologies.
- Experience in diagnosing and resolving software issues.
- Knowledge of database management systems.
- Familiarity with the ITIL framework for service management.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Electronic Medical Records (EMR).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years of full time education
Posted 3 weeks ago
5.0 - 10.0 years
3 - 7 Lacs
Gurugram
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: EPIC Systems
Good to have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. You will play a crucial role in ensuring the smooth functioning of the applications and resolving any technical glitches that may arise. Your expertise in EPIC Systems and problem-solving skills will be instrumental in maintaining the efficiency and reliability of the systems.
Roles & Responsibilities:
- The Epic Analyst will provide primary support for their designated application/module.
- Take on more advanced issues that arise during the project for their application area, and take on more complex tasks with respect to system configuration, testing and administration.
- Provide ongoing system support and maintenance based on the support roster.
- Respond in a timely manner to system issues and requests.
- Conduct investigation, assessment and evaluation, and deliver solutions and fixes to resolve system issues.
- Handle and deliver Service Requests / Change Requests / New Builds.
- Perform system monitoring, such as error queues, alerts, batch jobs, etc., and execute the required actions or SOPs.
- Perform/support regular and periodic system patching, maintenance and verification.
- Perform/support planned system upgrade work, cutover to production, and post-cutover support and stabilization.
- Perform/support the work required to comply with audit and security requirements.
- Required to overlap with client business or office hours.
- Comply with compliance requirements as mandated by the project.
Professional & Technical Skills:
- Must To Have Skills: Certified in Epic modules (RWB, EpicCare Link, Haiku, Healthy Planet, MyChart, Rover, Willow Ambulatory, Cogito, Ambulatory, ClinDoc, Orders, ASAP, RPB, RHB, HIM Identity, HIM ROI, HIM DT, Cadence, Prelude, GC, OpTime, Anesthesia, Beacon, Willow Imp, Cupid, Phoenix, Radiant, Beaker AP, Beaker CP, Bridges, Clarity, Radar, RWB)
- Experience in troubleshooting and resolving application issues.
Additional Information:
- The candidate should have a minimum of 5 years of experience in EPIC Systems.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
Qualification: 15 years full time education
Posted 3 weeks ago
7.0 - 12.0 years
3 - 7 Lacs
Navi Mumbai
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: EPIC Systems
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. Your day will involve troubleshooting and resolving software-related issues to ensure seamless operations.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior team members.
- Implement best practices for software support.
- Conduct regular performance evaluations.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in EPIC Systems.
- Strong understanding of software architecture principles.
- Experience in troubleshooting complex software issues.
- Knowledge of the ITIL framework for service management.
- Hands-on experience with incident management tools.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in EPIC Systems.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Qualification: 15 years full time education
Posted 3 weeks ago
3.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: EPIC Systems
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. Your typical day will involve troubleshooting and resolving software-related issues to ensure seamless operations.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Proactively identify and resolve software issues.
- Collaborate with cross-functional teams to address system challenges.
- Develop and implement software solutions to enhance system performance.
- Conduct regular system audits to ensure data integrity and security.
- Provide technical support and guidance to end users.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in EPIC Systems.
- Strong understanding of software troubleshooting methodologies.
- Experience in system monitoring and performance optimization.
- Knowledge of database management and SQL queries.
- Hands-on experience in software deployment and configuration.
Additional Information:
- The candidate should have a minimum of 3 years of experience in EPIC Systems.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Qualification: 15 years full time education
Posted 3 weeks ago
5.0 - 10.0 years
3 - 7 Lacs
Pune
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: EPIC Systems
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. Your day will involve troubleshooting and resolving software-related issues to ensure seamless operations.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and coordinate software investigations.
- Implement solutions to enhance system performance.
- Conduct system analysis and recommend improvements.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in EPIC Systems.
- Strong understanding of system architecture.
- Experience in troubleshooting software issues.
- Knowledge of database management systems.
- Hands-on experience in system integration.
- Good To Have Skills: Experience with the ITIL framework.
Additional Information:
- The candidate should have a minimum of 5 years of experience in EPIC Systems.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Qualification: 15 years full time education
Posted 3 weeks ago
5.0 - 10.0 years
3 - 7 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: EPIC Systems
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. You will play a crucial role in ensuring the smooth functioning of the applications and resolving any technical glitches that may arise. Your expertise in EPIC Systems and problem-solving skills will be instrumental in maintaining the efficiency and reliability of the systems.
Roles & Responsibilities:
- The Epic Analyst will provide primary support for their designated application/module.
- Take on more advanced issues that arise during the project for their application area, and take on more complex tasks with respect to system configuration, testing and administration.
- Provide ongoing system support and maintenance based on the support roster.
- Respond in a timely manner to system issues and requests.
- Conduct investigation, assessment and evaluation, and deliver solutions and fixes to resolve system issues.
- Handle and deliver Service Requests / Change Requests / New Builds.
- Perform system monitoring, such as error queues, alerts, batch jobs, etc., and execute the required actions or SOPs.
- Perform/support regular and periodic system patching, maintenance and verification.
- Perform/support planned system upgrade work, cutover to production, and post-cutover support and stabilization.
- Perform/support the work required to comply with audit and security requirements.
- Required to overlap with client business or office hours.
- Comply with compliance requirements as mandated by the project.
Professional & Technical Skills:
- Must To Have Skills: Certified in Epic modules (RWB, EpicCare Link, Haiku, Healthy Planet, MyChart, Rover, Willow Ambulatory, Cogito, Ambulatory, ClinDoc, Orders, ASAP, RPB, RHB, HIM Identity, HIM ROI, HIM DT, Cadence, Prelude, GC, OpTime, Anesthesia, Beacon, Willow Imp, Cupid, Phoenix, Radiant, Beaker AP, Beaker CP, Bridges, Clarity, Radar, RWB)
- Experience in troubleshooting and resolving application issues.
Additional Information:
- The candidate should have a minimum of 5 years of experience in EPIC Systems.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Qualification: 15 years full time education
Posted 3 weeks ago
6.0 - 7.0 years
27 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities:
Provide technical leadership and mentorship to data engineering teams. Architect, design, and deploy scalable, secure, and high-performance data pipelines. Collaborate with stakeholders, clients, and cross-functional teams to deliver end-to-end data solutions. Drive technical strategy and implementation plans in alignment with business needs. Oversee project execution using tools like JIRA, ensuring timely delivery and adherence to best practices. Implement and maintain CI/CD pipelines and automation tools to streamline development workflows. Promote best practices in data engineering and AWS implementations across the team.
Preferred candidate profile:
Strong hands-on expertise in Python, PySpark, and Spark architecture, including performance tuning and optimization. Advanced proficiency in SQL and experience in writing optimized stored procedures. In-depth knowledge of the AWS data engineering stack, including AWS Glue, Lambda, API Gateway, EMR, S3, Redshift, and Athena. Experience with Infrastructure as Code (IaC) using CloudFormation and Terraform. Familiarity with Unix/Linux scripting and system administration is a plus. Proven ability to design and deploy robust, production-grade data solutions.
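One small illustration of how the AWS pieces named above are commonly wired together (an assumption for illustration, not a detail from this listing): an S3-triggered Lambda function that kicks off a Glue ETL job via boto3, passing the incoming object as a job argument.

```python
import json

import boto3

glue = boto3.client("glue")

GLUE_JOB_NAME = "curated-orders-etl"   # hypothetical Glue job name


def handler(event, context):
    """Triggered by an S3 ObjectCreated event; starts the Glue ETL job."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    response = glue.start_job_run(
        JobName=GLUE_JOB_NAME,
        Arguments={
            "--source_path": f"s3://{bucket}/{key}",   # read by the Glue script
        },
    )
    return {
        "statusCode": 200,
        "body": json.dumps({"glue_job_run_id": response["JobRunId"]}),
    }
```

In practice the Lambda, its IAM permissions and the S3 notification would be declared in CloudFormation or Terraform, in line with the IaC expectation in the profile.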
Posted 3 weeks ago
5.0 - 10.0 years
3 - 7 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: EPIC Systems
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. You will play a crucial role in ensuring the smooth functioning of the applications and resolving any technical glitches that may arise. Your expertise in EPIC Systems and problem-solving skills will be instrumental in maintaining the efficiency and reliability of the systems.
Roles & Responsibilities:
- The Epic Analyst will provide primary support for their designated application/module.
- Take on more advanced issues that arise during the project for their application area, and take on more complex tasks with respect to system configuration, testing and administration.
- Provide ongoing system support and maintenance based on the support roster.
- Respond in a timely manner to system issues and requests.
- Conduct investigation, assessment and evaluation, and deliver solutions and fixes to resolve system issues.
- Handle and deliver Service Requests / Change Requests / New Builds.
- Perform system monitoring, such as error queues, alerts, batch jobs, etc., and execute the required actions or SOPs.
- Perform/support regular and periodic system patching, maintenance and verification.
- Perform/support planned system upgrade work, cutover to production, and post-cutover support and stabilization.
- Perform/support the work required to comply with audit and security requirements.
- Required to overlap with client business or office hours.
- Comply with compliance requirements as mandated by the project.
Professional & Technical Skills:
- Must To Have Skills: Certified in Epic modules (RWB, EpicCare Link, Haiku, Healthy Planet, MyChart, Rover, Willow Ambulatory, Cogito, Ambulatory, ClinDoc, Orders, ASAP, RPB, RHB, HIM Identity, HIM ROI, HIM DT, Cadence, Prelude, GC, OpTime, Anesthesia, Beacon, Willow Imp, Cupid, Phoenix, Radiant, Beaker AP, Beaker CP, Bridges, Clarity, Radar, RWB)
- Experience in troubleshooting and resolving application issues.
Additional Information:
- The candidate should have a minimum of 5 years of experience in EPIC Systems.
- This position is based at our Hyderabad office.
- Work from office is mandatory for all working days.
- 15 years of full-time education is required.
Qualification: 15 years full time education
Posted 3 weeks ago
10.0 - 15.0 years
22 - 37 Lacs
Bengaluru
Work from Office
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips. As an AWS Data Engineer at Kyndryl, you will be responsible for designing, building, and maintaining scalable, secure, and high-performing data pipelines using AWS cloud-native services. This role requires extensive hands-on experience with both real-time and batch data processing, expertise in cloud-based ETL/ELT architectures, and a commitment to delivering clean, reliable, and well-modeled datasets.
Key Responsibilities:
- Design and develop scalable, secure, and fault-tolerant data pipelines utilizing AWS services such as Glue, Lambda, Kinesis, S3, EMR, Step Functions, and Athena.
- Create and maintain ETL/ELT workflows to support both structured and unstructured data ingestion from various sources, including RDBMS, APIs, SFTP, and streaming.
- Optimize data pipelines for performance, scalability, and cost-efficiency.
- Develop and manage data models, data lakes, and data warehouses on AWS platforms (e.g., Redshift, Lake Formation).
- Collaborate with DevOps teams to implement CI/CD and infrastructure as code (IaC) for data pipelines using CloudFormation or Terraform.
- Ensure data quality, validation, lineage, and governance through tools such as AWS Glue Data Catalog and AWS Lake Formation.
- Work in concert with data scientists, analysts, and application teams to deliver data-driven solutions.
- Monitor, troubleshoot, and resolve issues in production pipelines.
- Stay abreast of AWS advancements and recommend improvements where applicable.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.
Required Skills and Experience
- Bachelor's or master's degree in Computer Science, Engineering, or a related field
- Over 8 years of experience in data engineering
- More than 3 years of experience with the AWS data ecosystem
- Strong experience with Java, PySpark, SQL, and Python
- Proficiency in AWS services: Glue, S3, Redshift, EMR, Lambda, Kinesis, CloudWatch, Athena, Step Functions
- Familiarity with data modelling concepts, dimensional models, and data lake architectures
- Experience with CI/CD, GitHub Actions, CloudFormation/Terraform
- Understanding of data governance, privacy, and security best practices
- Strong problem-solving and communication skills
Preferred Skills and Experience
- Experience working as a Data Engineer and/or in cloud modernization.
- Experience with AWS Lake Formation and Data Catalog for metadata management.
- Knowledge of Databricks, Snowflake, or BigQuery for data analytics.
- AWS Certified Data Engineer or AWS Certified Solutions Architect is a plus.
- Strong problem-solving and analytical thinking.
- Excellent communication and collaboration abilities.
- Ability to work independently and in agile teams.
- A proactive approach to identifying and addressing challenges in data workflows.
Being You
Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.
Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
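For the Glue-based ETL work referenced in this role, a bare-bones AWS Glue job script might look like the sketch below, which reads a catalogued table and writes Parquet to S3. The database, table and bucket names are placeholders, and the awsglue imports are only available inside the Glue runtime, so this is a shape-of-the-solution sketch rather than a ready-to-run local script.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap: the job name arrives as a runtime argument.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (placeholder names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db",
    table_name="orders",
)

# Convert to a Spark DataFrame for a simple cleaning step, then land Parquet.
df = source.toDF()
cleaned = df.filter(df["order_id"].isNotNull())
cleaned.write.mode("overwrite").parquet("s3://example-curated-bucket/orders/")

job.commit()
```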
Posted 3 weeks ago
6.0 - 11.0 years
15 - 25 Lacs
Kolkata, Hyderabad, Bengaluru
Work from Office
Responsibilities:
Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various data sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift). Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost. Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services such as Amazon AWS. Build data pipelines by building ETL processes (Extract-Transform-Load). Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data. Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements/user stories in business meetings and strategize the impact of requirements on different platforms/applications; convert business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules and/or potential problems. Understand current application infrastructure and suggest cloud-based solutions which reduce operational cost and require minimal maintenance while providing high availability with improved security. Perform unit testing on modified software to ensure that new functionality works as expected while existing functionalities continue to work in the same way. Coordinate with release management and other supporting teams to deploy changes in the production environment.
Qualifications we seek in you!
Minimum Qualifications:
Experience in designing and implementing data pipelines, building data applications, and data migration on AWS. Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift. Experience with Databricks will be an added advantage. Strong experience in Python and SQL. Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance. Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams.
Preferred Qualifications / Skills:
Master's Degree in Computer Science, Electronics, or Electrical Engineering. AWS Data Engineering and Cloud certifications, Databricks certifications. Experience with multiple data integration technologies and cloud platforms. Knowledge of Change & Incident Management processes.
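Since this role pairs Spark with Kafka for real-time processing, here is a minimal Structured Streaming sketch of that pattern; the broker address, topic and checkpoint/output paths are assumptions for illustration, and running it requires the spark-sql-kafka connector package on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Read raw events from a Kafka topic (placeholder broker and topic names).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers bytes; cast the value to string and keep the event timestamp.
parsed = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_time"),
)

# Continuously append the parsed events to a data-lake location.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "s3://example-lake/streams/orders/")            # placeholder
    .option("checkpointLocation", "s3://example-lake/checkpoints/orders/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```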
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
The ideal candidate for this position should have advanced proficiency in Python, with a solid understanding of inheritance and classes. Additionally, the candidate should be well-versed in EMR, Athena, Redshift, AWS Glue, IAM roles, CloudFormation (CFT is optional), Apache Airflow, Git, SQL, PySpark, OpenMetadata, and Data Lakehouse concepts. Experience with metadata management is highly desirable, particularly with AWS services such as S3. The candidate should possess the following key skills:
- Creation of ETL pipelines
- Deploying code in EMR
- Querying in Athena
- Creating Airflow DAGs for scheduling ETL pipelines
- Knowledge of AWS Lambda and the ability to create Lambda functions
This role is for an individual contributor; as such, the candidate is expected to autonomously manage client communication and proactively resolve technical issues without external assistance.
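As an illustration of the "querying in Athena" skill listed above, the sketch below submits a query with boto3 and polls until it finishes; the database name, query text and results bucket are placeholders, not details from the listing.

```python
import time

import boto3

athena = boto3.client("athena")


def run_athena_query(sql: str) -> str:
    """Submit a query and return its execution id once it succeeds."""
    start = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "curated_db"},                  # placeholder
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    query_id = start["QueryExecutionId"]

    # Simple polling loop; production code would add a timeout and backoff.
    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {query_id} ended in state {state}")
    return query_id


query_id = run_athena_query(
    "SELECT customer_id, COUNT(*) AS txn_count FROM orders GROUP BY customer_id"
)
results = athena.get_query_results(QueryExecutionId=query_id)
for row in results["ResultSet"]["Rows"][1:]:   # the first row is the header
    print([col.get("VarCharValue") for col in row["Data"]])
```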
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be joining Coders Brain Technology Pvt. Ltd., a global leader in services, digital, and business solutions. At Coders Brain, we partner with our clients to simplify, strengthen, and transform their businesses. We are committed to providing the highest levels of certainty and satisfaction through our comprehensive industry expertise and global network of innovation and delivery centers. As a Data Engineer with a minimum of 5 years of experience, you will be working remotely. Your role will involve collaborating with other developers to define and refine solutions. You will work closely with the business to deliver data and analytics projects. Your responsibilities will include working on data integration with various tools such as Apache Spark, EMR, Glue, Kafka, Kinesis, and Lambda in an AWS Cloud environment. You should have strong real-life experience in Python development, especially in PySpark within AWS Cloud. Designing, developing, testing, deploying, maintaining, and improving data integration pipelines will be a key part of your role. Additionally, you should have experience with Python and common libraries, Perl, Unix scripts, and analytical skills with databases. Proficiency in source control systems like Git and Bitbucket, and continuous integration tools like Jenkins, is required. Experience with continuous deployment (CI/CD), Databricks, Airflow, and Apache Spark will be beneficial. Knowledge of databases such as Oracle, SQL Server, PostgreSQL, Redshift, MySQL, or similar is essential. Exposure to ETL tools including Informatica is preferred. A degree in Computer Science, Computer Engineering, or Electrical Engineering is desired. If you are interested in this opportunity, click on the apply button. Alternatively, you can send your resume to prerna.jain@codersbrain.com or pooja.gupta@codersbrain.com.
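Deploying PySpark work to EMR, as this role describes, often comes down to submitting a spark-submit step to a running cluster. The sketch below does that with boto3; the cluster id, script path and job argument are placeholders.

```python
import boto3

emr = boto3.client("emr")

# Placeholder cluster id and script location; in practice these usually come
# from configuration or are created by IaC / an orchestration layer.
CLUSTER_ID = "j-EXAMPLECLUSTER"
SCRIPT_PATH = "s3://example-artifacts/jobs/orders_etl.py"

response = emr.add_job_flow_steps(
    JobFlowId=CLUSTER_ID,
    Steps=[
        {
            "Name": "orders-etl",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "spark-submit",
                    "--deploy-mode", "cluster",
                    SCRIPT_PATH,
                    "--run-date", "2024-01-01",   # example job argument
                ],
            },
        }
    ],
)

print("Submitted step:", response["StepIds"][0])
```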
Posted 3 weeks ago