2.0 - 5.0 years
18 - 21 Lacs
Hyderabad
Work from Office
Overview: Annalect is currently seeking a data engineer to join our technology team. In this role you will build Annalect products that sit atop cloud-based data infrastructure. We are looking for people who share a passion for technology, design and development, and data, and for fusing these disciplines together to build cool things. You will work on one or more software and data products in the Annalect Engineering Team, participating in technical architecture, design, and development of software products as well as research and evaluation of new technical solutions.

Responsibilities:
- Design, build, test, and deploy scalable and reusable systems that handle large amounts of data.
- Collaborate with product owners and data scientists to build new data products.
- Ensure data quality and reliability.

Qualifications:
- Experience designing and managing data flows.
- Experience designing systems and APIs to integrate data into applications.
- 4+ years of Linux, Bash, Python, and SQL experience.
- 2+ years using Spark and other frameworks to process large volumes of data.
- 2+ years using Parquet, ORC, or other columnar file formats.
- 2+ years using cloud data-processing services, e.g. AWS Glue, EMR, Athena, Redshift; GCP Dataflow, Dataproc, BigQuery; or Azure Data Factory, HDInsight.
- Passion for technology: excitement for new technology, bleeding-edge applications, and a positive attitude towards solving real-world challenges.
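The columnar formats this posting asks for (Parquet, ORC) speed up analytics because an aggregate over one column only touches that column's data. A minimal pure-Python sketch of the idea (the data is invented for illustration; real columnar files add compression and encoding on top):

```python
# Row-oriented storage: each record kept together, as in a CSV or OLTP table.
rows = [
    {"user": "a", "country": "IN", "spend": 120.0},
    {"user": "b", "country": "US", "spend": 80.0},
    {"user": "c", "country": "IN", "spend": 45.5},
]

# Column-oriented storage: each field stored contiguously, as Parquet/ORC
# lay data out on disk.
columns = {key: [r[key] for r in rows] for key in rows[0]}

# An aggregate like SUM(spend) scans a single contiguous list ...
total_spend = sum(columns["spend"])

# ... instead of walking every row and discarding the unused fields.
assert total_spend == sum(r["spend"] for r in rows)
print(total_spend)  # 245.5
```

The same access pattern is why engines such as Spark or Athena can skip entire column chunks when a query selects only a few fields.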
Posted 1 month ago
2.0 - 7.0 years
3 - 7 Lacs
Pune
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: EPIC Systems
Good-to-have skills: NA
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your day will involve troubleshooting and resolving software-related problems to ensure seamless operations.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Proactively identify and resolve software issues.
- Collaborate with cross-functional teams to troubleshoot system problems.
- Develop and implement software solutions to enhance system performance.
- Document software defects and resolutions for future reference.
- Provide technical support and guidance to end users.

Professional & Technical Skills:
- Must-have skills: proficiency in EPIC Systems.
- Strong understanding of software troubleshooting methodologies.
- Experience in software development and implementation.
- Knowledge of database management and SQL queries.
- Familiarity with IT service management tools.

Additional Information:
- The candidate should have a minimum of 2 years of experience in EPIC Systems.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 5.0 years
3 - 5 Lacs
Navi Mumbai
Work from Office
Skill required: Order to Cash - Account Management
Designation: Order to Cash Operations Analyst
Qualifications: Any Graduation
Years of Experience: 3 to 5 years
Language ability: English (international) - advanced

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do? You will be aligned with our Finance Operations vertical and will help us determine financial outcomes by collecting operational data/reports, conducting analysis, and reconciling transactions. The work involves optimizing working capital, providing real-time visibility and end-to-end management of revenue and cash flow, and streamlining billing processes. This team oversees the entire process from customer inquiry, sales order, and delivery through to invoicing. The Cash Application Processing team focuses on solving queries related to cash applications and coordinating with customers. The role requires a good understanding of cash applications, the process of applying unapplied cash, and reconciliation of the suspense account in cash application, processing payments from receipt to finalization. You will implement client account plans through relationship development and opportunity pursuits that build deeper client relationships, including monitoring existing services to identify opportunities that provide additional and innovative value to the client.
What are we looking for?
- Analytical thinking: the ability to read and understand issues/problems.
- Healthcare experience: EPIC or ORMB.

Roles and Responsibilities:
- In this role you are required to analyse and solve lower-complexity problems.
- Your day-to-day interaction is with peers within Accenture before updating supervisors.
- You may have limited exposure to clients and/or Accenture management.
- You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments.
- The decisions you make impact your own work and may impact the work of others.
- You will be an individual contributor as part of a team, with a focused scope of work.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation
Posted 1 month ago
0.0 - 3.0 years
4 - 5 Lacs
Hyderabad
Work from Office
We are seeking a dynamic and client-facing Healthcare IT Solutions Specialist to bridge the gap between healthcare delivery and technology. This role is ideal for professionals with a healthcare background who are passionate about implementing integrated marketing strategies, supporting sales, and improving client engagement in a healthcare IT environment.

Key Responsibilities:
- Develop impactful marketing content such as product sheets, presentations, videos, newsletters, case studies, and blogs.
- Collaborate with sales and product teams to support customer acquisition and client retention.
- Organize and coordinate participation in healthcare events, webinars, and trade shows.
- Conduct market research and competitor analysis to enhance outreach efforts.
- Track, analyze, and report on the performance of marketing campaigns, focusing on ROI and engagement metrics.
- Assist with RFP responses, client presentations, and branding activities.
- Regularly interact with healthcare stakeholders and visit customer sites as needed.

Requirements:
- Educational qualification: Bachelor's degree in medicine, dentistry, healthcare administration, or a related field (MBBS or BDS preferred). An MBA in Hospital & Healthcare Management is mandatory.
- Experience & knowledge: understanding of hospital workflows across clinical, non-clinical, and support departments; familiarity with healthcare quality and accreditation standards such as NABH, NABL, and JCI; working knowledge of data analytics with the ability to interpret engagement and performance metrics.
- Communication & language skills: excellent communication skills with the ability to present ideas clearly and confidently; fluency in English and Hindi (spoken and written) is a must.
- Other skills: strong stakeholder engagement and a client-facing approach; willingness to travel for meetings, demos, and client visits.

Preferred Qualifications: Knowledge of HIS/EHR/EMR systems and other healthcare IT platforms.
Additional Notes: This is a client-facing role requiring strong interpersonal skills. Willingness to travel is essential. Only male candidates will be considered due to specific operational requirements.
Posted 1 month ago
0.0 - 3.0 years
2 - 6 Lacs
Mumbai
Work from Office
Note: This role involves rotational shifts (mornings and nights).

Roles and Responsibilities:
- Patient care
- Maintain compliance
- Be clinically sound
- Assist the surgeon/consultant
Posted 1 month ago
3.0 - 6.0 years
20 - 30 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer II (Python, SQL)
Experience: 3 to 6 years
Location: Bangalore, Karnataka (work from office, 5 days a week)

Role: As a Data Engineer II, you will work on designing, building, and maintaining scalable data pipelines. You'll collaborate across data analytics, marketing, data science, and product teams to drive insights and AI/ML integration using robust and efficient data infrastructure.

Key Responsibilities:
- Design, develop, and maintain end-to-end data pipelines (ETL/ELT).
- Ingest, clean, transform, and curate data for analytics and ML usage.
- Work with orchestration tools like Airflow to schedule and manage workflows.
- Implement data extraction using batch, CDC, and real-time tools (e.g., Debezium, Kafka Connect).
- Build data models and enable real-time and batch processing using Spark and AWS services.
- Collaborate with DevOps and architects on system scalability and performance.
- Optimize Redshift-based data solutions for performance and reliability.

Must-Have Skills & Experience:
- 3+ years in Data Engineering or Data Science with strong ETL and pipeline experience.
- Expertise in Python and SQL.
- Strong experience in data warehousing, data lakes, data modeling, and ingestion.
- Working knowledge of Airflow or similar orchestration tools.
- Hands-on experience with data extraction techniques such as CDC and batch, using Debezium, Kafka Connect, or AWS DMS.
- Experience with AWS services: Glue, Redshift, Lambda, EMR, Athena, MWAA, SQS, etc.
- Knowledge of Spark or similar distributed systems.
- Experience with queuing/messaging systems like SQS, Kinesis, or RabbitMQ.
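The extract-transform-load pattern at the heart of this role can be sketched in a few lines of standard-library Python. This is an illustrative toy (sqlite3 stands in for a warehouse such as Redshift; the event data and table names are invented), not any employer's actual pipeline:

```python
import sqlite3

def extract():
    # In production this would be a batch pull, a CDC stream (e.g. Debezium),
    # or a Kafka topic; here it is a hard-coded sample batch.
    return [("2024-01-01", "signup", 3),
            ("2024-01-01", "purchase", 1),
            ("2024-01-02", "signup", 5)]

def transform(events):
    # Curate: keep well-formed records and normalise the event name.
    return [(day, name.upper(), count) for day, name, count in events if count > 0]

def load(records, conn):
    # Idempotent-ish load into the target store.
    conn.execute("CREATE TABLE IF NOT EXISTS events (day TEXT, name TEXT, cnt INTEGER)")
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(cnt) FROM events").fetchone()[0]
print(total)  # 9
```

An orchestrator like Airflow would schedule each of these three functions as a task and manage retries and dependencies between them.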
Posted 1 month ago
10.0 - 15.0 years
17 - 20 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Key Responsibilities:
- Design end-to-end data workflow architecture using Alteryx Designer, Server, and Connect.
- Translate HLS use cases into technical workflows, including patient journey analytics, claims automation, and real-world evidence processing.
- Integrate Alteryx with cloud platforms, EMR systems, SQL-based data warehouses, and visualization tools.
- Ensure compliance with HIPAA, GxP, FDA, and other regulatory standards.
- Guide the data governance, security, and quality framework across solutions.
- Collaborate with clinical analysts, data scientists, and IT to deliver analytics solutions aligned with business goals.
- Provide thought leadership, mentor junior Alteryx resources, and contribute to CoEs.

Required Skills & Qualifications:
- Deep expertise in Alteryx Designer, Alteryx Server, and related tools.
- Proficient in SQL, Python, and integrating workflows with cloud ecosystems (AWS, Azure, GCP).
- Strong understanding of healthcare data models (claims, EMR/EHR, HL7/FHIR).
- Mandatory: minimum 3-5 years of experience working with healthcare or life sciences datasets.
- Familiarity with regulatory frameworks (HIPAA, FDA, GxP).
- Strong communication and stakeholder management skills.
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad
Hybrid
Dear Applicants,

AstroSoft Technologies is hiring a Sr. AWS Data Engineer: Hyderabad (hybrid), 3 technical rounds, early joiners preferred. Apply by email to Karthik.Jangam@astrosofttech.com with your updated resume and the following details:
- Total experience (AWS DE services)
- Current location
- Current company
- Current CTC
- Expected CTC
- Offer in hand (Y/N)
- Notice period (max 20 days)
- Ready to relocate to Hyderabad (Y/N)

Company: AstroSoft Technologies (https://www.astrosofttech.com/)
AstroSoft is an award-winning company that specializes in Data, Analytics, Cloud, AI/ML, Innovation, and Digital. We have a customer-first mindset and take extreme ownership in delivering solutions and projects for our customers, and we have consistently been recognized by our clients as a premium partner to work with. We bring top-tier talent, a robust and structured project execution framework, and significant experience built up over the years, with an impeccable record of delivering solutions and projects for our clients. Founded in 2004; headquartered in Florida, USA; corporate office in Hyderabad, India.

Benefits Program:
- H-1B sponsorship (depends on project and performance)
- Lunch and dinner (every day)
- Group health insurance coverage
- Industry-standard leave policy
- Skill-enhancement certification
- Hybrid mode

Job Summary:
- Strong experience with and understanding of streaming architecture and development practices using Kafka, Spark, Flink, etc.
- Strong AWS development experience using S3, SNS, SQS, MWAA (Airflow), Glue, DMS, and EMR.
- Strong knowledge of one or more programming languages: Python/Java/Scala (ideally Python).
- Experience using Terraform to build IaC components in AWS.
- Strong experience with ETL tools in AWS; ODI experience is a plus.
- Strong experience with database platforms: Oracle, AWS Redshift.
- Strong experience in SQL tuning, tuning ETL solutions, and physical optimization of databases.
- Very familiar with SRE concepts, including evaluating and implementing monitoring and observability tools such as Splunk, Datadog, and CloudWatch, and other job, log, or dashboard tooling for customer support and application health checks.
- Ability to collaborate with business partners to understand and implement their requirements.
- Excellent interpersonal skills and the ability to build consensus across teams.
- Strong critical thinking and out-of-the-box problem solving.
- Self-motivated and able to perform under pressure.
- AWS certified (preferred).

Thanks & Regards,
Karthik Kumar (HR TAG Lead - India)
Astrosoft Technologies India Private Limited
Contact: +91-8712229084
Email: karthik.jangam@astrosofttech.com
Posted 1 month ago
5.0 - 10.0 years
35 - 60 Lacs
Katihar
Work from Office
Manage patients' conditions, including prescribing medications, adjusting life-support equipment, and coordinating with specialists. Monitor patient progress, interpret data from various diagnostic tools, and adjust treatment strategies as needed.
Posted 1 month ago
0.0 - 4.0 years
9 - 10 Lacs
Chennai
Work from Office
Role: As a doctor with Amura Health, you'll experience the perfect blend of clinical experience and work-life balance. Here's what we offer:
- Practice and learn: work on real-world cases with a team of senior doctors who support your growth.
- Leadership pathways: the opportunity to rise into leadership roles and make a significant impact as we scale our practice.
- Recognition and respect: we deeply value your contributions; every effort is acknowledged and celebrated.
- A unique model of care: practice Natural Molecular Therapy (NMT), a breakthrough treatment approach for chronic diseases, with patients under your care from the comfort of a digital environment.
- Global reach: deliver healthcare to patients worldwide from a digital platform; think of us as a hospital in the cloud.

Preferred candidate profile:
- Fresh and experienced MBBS graduates ready to dive into the future of medicine and healthcare at Amura Health.
- Passionate about learning, growth, and clinical excellence.
- Comfortable working in a tech-driven, innovative environment with an openness to virtual care.
- Committed to hard work, and looking for a place where your efforts are recognized and respected.
- Interested in becoming part of a rockstar team at Amura Health that holds itself to the highest standards of professionalism, empathy, and innovation.
- MBBS with medical council (MCI) registration.

Type: Online consultations from our office
Location: Perungudi, OMR, Chennai
Timing: 12-hour shifts, with one weekly day off (6 days a week)

Perks and benefits:
- Competitive salary: 10.8 Lakh per annum (90,000 CTC per month).
- Unmatched learning opportunities: get firsthand exposure to NMT, a revolutionary form of healthcare.
Posted 1 month ago
2.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Flexing It is a freelance consulting marketplace that connects freelancers and independent consultants with organisations seeking independent talent. Flexing It has partnered with our client, a global leader in energy management and automation, which is seeking a data engineer to prepare data and make it available in an efficient, optimized format for its different data consumers, ranging from BI and analytics to data science applications. The role works with current technologies, in particular Apache Spark, Lambda and Step Functions, Glue Data Catalog, and Redshift, in an AWS environment.

Key Responsibilities:
- Design and develop new data ingestion patterns into the IntelDS Raw and/or Unified data layers, based on the requirements and needs for connecting new data sources or building new data objects. Working with ingestion patterns allows the data pipelines to be automated.
- Participate in and apply DevSecOps practices by automating the integration and delivery of data pipelines in a cloud environment. This can include the design and implementation of end-to-end data integration tests and/or CI/CD pipelines.
- Analyze existing data models, and identify and implement performance optimizations for data ingestion and data consumption. The objective is to accelerate data availability within the platform and to consumer applications.
- Support client applications in connecting to and consuming data from the platform, and ensure they follow our guidelines and best practices.
- Participate in monitoring the platform and debugging detected issues and bugs.

Skills required:
- Minimum of 3 years' prior experience as a data engineer, with proven experience with Big Data and data lakes in a cloud environment.
- Bachelor's or Master's degree in computer science or applied mathematics (or equivalent).
- Proven experience working with data pipelines / ETL / BI, regardless of the technology.
- Proven experience working with AWS, including at least 3 of: Redshift, S3, EMR, CloudFormation, DynamoDB, RDS, Lambda.
- Big Data technologies and distributed systems: one of Spark, Presto, or Hive.
- Python: scripting and object-oriented programming.
- Fluency in SQL for data warehousing (Redshift in particular is a plus).
- Good understanding of data warehousing and data modelling concepts.
- Familiarity with Git, Linux, and CI/CD pipelines is a plus.
- Strong systems/process orientation with demonstrated analytical thinking, organization skills, and problem-solving skills.
- Ability to self-manage, prioritize, and execute tasks in a demanding environment.
- Strong consultancy orientation and experience, with the ability to form collaborative, productive working relationships across diverse teams and cultures.
- Willingness and ability to train and teach others.
- Ability to facilitate meetings and follow up on resulting action items.
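A core part of the raw-to-unified ingestion work this listing describes is validating records before promoting them, so malformed data never reaches consumers. A hedged stdlib sketch of that gate (the field names and records are illustrative assumptions, not the client's schema):

```python
from datetime import datetime

REQUIRED = {"id", "ts", "value"}

def validate(record):
    """Return True if the record has all required fields and a parseable ISO timestamp."""
    if not REQUIRED.issubset(record):
        return False
    try:
        datetime.fromisoformat(record["ts"])
    except ValueError:
        return False
    return True

raw_layer = [
    {"id": 1, "ts": "2024-05-01T10:00:00", "value": 42},
    {"id": 2, "ts": "not-a-date", "value": 7},      # rejected: bad timestamp
    {"ts": "2024-05-01T11:00:00", "value": 9},      # rejected: missing id
]

# Only valid records are promoted to the unified layer; rejects would
# normally be routed to a quarantine location for inspection.
unified_layer = [r for r in raw_layer if validate(r)]
print(len(unified_layer))  # 1
```

In the AWS setup the posting names, the same check would typically run inside a Glue or Spark job, with an integration test over sample batches as part of the CI/CD pipeline.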
Posted 1 month ago
4.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
We're Hiring: Delivery Solution Architect - Data Analytics (AWS)
Location: Remote
Level: Mid to Senior (3-5+ years' experience)

Are you passionate about turning complex data challenges into elegant, scalable solutions on the AWS cloud? We're looking for a Delivery Solution Architect - Data Analytics to join our growing team and take the lead in architecting and delivering next-gen data platforms that drive real business impact.

About the Role:
As a Delivery Solution Architect, you will play a pivotal role in designing and implementing end-to-end data analytics solutions on AWS. You'll collaborate with cross-functional teams and lead a group of 5-10 consultants and engineers to bring modern data architectures to life, powering business intelligence, machine learning, and operational insights.

Key Responsibilities:
- Lead the design and delivery of data analytics solutions using AWS services (Redshift, EMR, Glue, Kinesis, etc.).
- Collaborate with project teams, clients, and sales stakeholders to craft technical proposals and solution blueprints.
- Design scalable, secure, high-performance data models and ETL pipelines.
- Optimize data platforms for cost efficiency, query performance, and concurrency.
- Ensure data governance, security, and compliance with best practices.
- Troubleshoot technical issues and provide mentorship to engineering teams.
- Stay ahead of industry trends and bring innovative solutions to the table.
- Report to practice leads, contribute to documentation, and support deployment activities.

Qualifications:
- 3-5 years of experience as a Solution Architect or Technical Lead in data analytics delivery.
- Hands-on expertise with AWS data tools (Redshift, EMR, Glue, Kinesis, etc.).
- Proficiency in SQL and Python; strong data modeling and ETL experience.
- Knowledge of Microsoft Azure data analytics tools is a plus.
- Experience working in Agile teams and using version control (e.g., Git).
- Strong communication skills and the ability to collaborate with technical and non-technical stakeholders.
- AWS certifications (Solutions Architect and Data Analytics - Specialty) are required.

Preferred Skills:
- Team leadership in project delivery environments.
- Familiarity with data governance, data quality, and metadata management.
- Documentation, proposal writing, and client engagement skills.

What's In It For You?
- Opportunity to work with advanced AWS data technologies.
- Be part of a collaborative, innovation-focused team.
- Shape data strategies that directly impact enterprise decision-making.
- Career growth in a cloud-first, analytics-driven environment.

Ready to architect the future of data? Apply now or reach out to learn more!
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Mangaluru
Work from Office
Job Summary:
As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability.

Technical Skills:
- 3+ years of experience in Data Warehousing and BI.
- Experience working with the Snowflake database.
- In-depth knowledge of data warehouse concepts.
- Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools.
- Experience with large or partitioned relational databases (Aurora / MySQL / DB2).
- Very strong SQL and data analysis capabilities.
- Familiarity with billing and payment data is a plus.
- Agile development (Scrum) experience.
- Other preferred experience includes DevOps practices, SaaS, IaaS, code management (CodeCommit, Git), deployment tools (CodeBuild, CodeDeploy, Jenkins, shell scripting), and continuous delivery.
- Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation.

Responsibilities:
- Be a key contributor to the design and development of a scalable, cost-effective cloud-based data platform based on a data lake design.
- Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties.
- Build automated pipelines and data services to validate, catalog, aggregate, and transform ingested data.
- Build automated data delivery pipelines and services to integrate data from the data lake into internal and external consuming applications and services.
- Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models.
- Keep deep knowledge and skills current with the latest cloud services, features, and best practices.

Who We Are:
unifyCX is an emerging global business process outsourcing company with a strong presence in the U.S., Colombia, the Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele.

At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision.

unifyCX is a certified minority-owned business and an EOE employer that welcomes diversity.
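The "validate, catalog, aggregate and transform" stages this listing describes can be modelled as composable steps over a batch of records. A minimal sketch under invented assumptions (the field names and records are illustrative; a real catalog would be a service like Glue Data Catalog, not an in-memory dict):

```python
from collections import defaultdict

def validate(batch):
    # Drop records with missing or negative amounts.
    return [r for r in batch if "amount" in r and r["amount"] >= 0]

def catalog(batch, registry):
    # A real catalog records schema and partition metadata; here we just
    # register the observed field names per source system.
    for r in batch:
        registry[r["source"]].update(r.keys())
    return batch

def aggregate(batch):
    # Roll amounts up by source system.
    totals = defaultdict(float)
    for r in batch:
        totals[r["source"]] += r["amount"]
    return dict(totals)

registry = defaultdict(set)
batch = [
    {"source": "billing", "amount": 100.0},
    {"source": "billing", "amount": -5.0},   # dropped by validation
    {"source": "payments", "amount": 20.0},
]
totals = aggregate(catalog(validate(batch), registry))
print(totals)  # {'billing': 100.0, 'payments': 20.0}
```

Keeping each stage a pure function over a batch is what makes the pipeline easy to automate and test in a CI/CD flow.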
Posted 1 month ago
3.0 - 7.0 years
8 - 12 Lacs
Mangaluru
Work from Office
Job Summary:
As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability.

Technical Skills:
- 3+ years of experience in Data Warehousing and BI.
- Experience working with the Snowflake database.
- In-depth knowledge of data warehouse concepts.
- Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools.
- Experience with large or partitioned relational databases (Aurora / MySQL / DB2).
- Very strong SQL and data analysis capabilities.
- Familiarity with billing and payment data is a plus.
- Agile development (Scrum) experience.
- Other preferred experience includes DevOps practices, SaaS, IaaS, code management (CodeCommit, Git), deployment tools (CodeBuild, CodeDeploy, Jenkins, shell scripting), and continuous delivery.
- Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation.

Responsibilities:
- Be a key contributor to the design and development of a scalable, cost-effective cloud-based data platform based on a data lake design.
- Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties.
- Build automated pipelines and data services to validate, catalog, aggregate, and transform ingested data.
- Build automated data delivery pipelines and services to integrate data from the data lake into internal and external consuming applications and services.
- Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models.
- Keep deep knowledge and skills current with the latest cloud services, features, and best practices.

Who We Are:
unifyCX is an emerging global business process outsourcing company with a strong presence in the U.S., Colombia, the Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele.

At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision.

unifyCX is a certified minority-owned business and an EOE employer that welcomes diversity.
Posted 1 month ago
8.0 - 13.0 years
12 - 22 Lacs
Pune
Hybrid
- Experience developing applications using Python with AWS services: Glue (ETL), Lambda, Step Functions, EKS, S3, EMR, RDS data stores, CloudFront, and API Gateway.
- Experience with AWS services such as Amazon Elastic Compute Cloud (EC2), Glue, Amazon S3, EKS, and Lambda.

Required Candidate Profile:
- 10+ years of experience in software development and technical leadership, preferably with strong financial knowledge from building complex trading applications.
- 5+ years of people management experience.
Posted 1 month ago
2.0 - 3.0 years
5 - 10 Lacs
Gurugram
Work from Office
Role & Responsibilities:
- Gather and collate inputs from stakeholders and end users for designing and improving clinical HIS, EMR, and other medical IT solutions.
- Engage proactively with the development team and other cross-functional teams to support the design, modification, and upgrade of functionality in modules.
- Complete documentation of processes, design, training modules, and changes in modules as per scope/new requirements and guidelines.
- Take responsibility for and engage in setting up new applications, preparatory and support activities, project planning and execution, and vendor and customer engagement.
- Prepare, test, and manage master data at the hospital and central level.
- Carry out and facilitate end-to-end user acceptance testing as per the project timeline and when required.
- Provide onsite and online training to end users.
- Identify, escalate, and resolve risks/issues at units and complete post-implementation support.
- Prepare analyses and reports on various business parameters as and when required.

Preferred candidate profile: 2-3 years' experience in hospital/healthcare operations; MBBS/BDS with MHA/MBA.
Posted 1 month ago
4.0 - 9.0 years
12 - 16 Lacs
Kochi
Work from Office
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies, for various use cases built on the platform.
- Develop streaming pipelines.
- Work with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, and DynamoDB.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers like Kafka.

Preferred technical and professional experience: certification in AWS and Databricks, or Cloudera Spark certified developers.
Posted 1 month ago
5.0 - 8.0 years
7 - 11 Lacs
Chennai, Malaysia
Work from Office
Responsibilities for Data Engineer Create and maintain the optimal data pipeline architecture. Assemble large, complex data sets that meet functional / non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics. Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs. Keep our data separated and secure across national boundaries through multiple data centers and AWS regions. Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader. Work with data and analytics experts to strive for greater functionality in our data systems. Qualifications for Data Engineer We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools: Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases. Experience building and optimizing big data pipelines, architectures and data sets. Experience performing root cause analysis of internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Strong analytic skills related to working with unstructured data sets. Build processes supporting data transformation, data structures, metadata, dependency and workload management. A successful history of manipulating, processing and extracting value from large disconnected datasets. Working knowledge of message queuing, stream processing, and highly scalable big data stores. Strong project management and organizational skills. Experience supporting and working with cross-functional teams in a dynamic environment. Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including MongoDB, Postgres, Cassandra, AWS Redshift, Snowflake Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. Experience with AWS cloud services: EC2, EMR, ETL, Glue, RDS, Redshift Experience with stream-processing systems: Storm, Spark Streaming, etc. Experience with object-oriented/object function scripting languages: Python, Java, etc. Knowledge of data pipelines and workflow management tools like Airflow Location: Chennai, India / Kuala Lumpur, Malaysia
Posted 1 month ago
1.0 - 4.0 years
3 - 7 Lacs
Kochi
Work from Office
Market Development Representative RCM Services: Not a Lead Tracker. A Market Activator. Let's be real: RCM is crowded. Everyone claims to do billing, coding, AR follow-up, and denial management like it's revolutionary. We don't need someone to parrot the pitch. We need someone who can open doors in closed markets, make providers stop and listen, and drive the first wedge in crowded conversations. At blueBriX, we deliver end-to-end RCM services for U.S.-based healthcare providers who are done losing revenue to broken systems, slow follow-up, and outsourced chaos. Your mission? Create awareness. Build demand. Start the conversation. This role is not for: People who love templates and mass emails Folks who panic at "cold" anything Anyone who needs a lead handed to them with a bow on it This role is for: Strategic prospectors who understand healthcare lingo and provider pain points People who can speak to billing managers, practice owners, and CFOs without blinking Professionals who know how to turn curiosity into a calendar booking The first line of offense in the sales engine What you'll do: Identify and qualify new prospects through research, outreach, and hustle Send personalized, high-converting emails and messages that don't end up in Trash Make cold calls that actually land Collaborate with Sales and Marketing to refine targeting and messaging Educate prospects on how blueBriX RCM can stop revenue leaks and speed up cashflow Track every interaction like a hawk, because data builds momentum You'll thrive here if: You have 1-3 years of experience in sales or market development, preferably in RCM, healthcare BPO, or B2B services You're a research ninja, a strong communicator, and an even better listener You can turn "just exploring" into "let's schedule a demo" You don't need handholding, but know how to loop in the team when the moment's right You bring hunger, humility, and relentless follow-up Bonus if: You've prospected into U.S.
physician practices, medical groups, or ambulatory care centers You understand RCM basics: denials, aging AR, clean claims, clearinghouses, and the real revenue killers You've worked with CRM tools (Zoho, HubSpot, Salesforce; we don't care, as long as you use it well) Location: Kochi, India (in-office) Reports To: RCM Sales Lead Vertical: U.S. Healthcare Revenue Cycle Management You're not just warming up leads. You're lighting the fuse. If you love the chase, know how to get attention in a crowded inbox, and want to be the reason a deal starts, this is your shot. Let's open the market, one conversation at a time.
Posted 1 month ago
8.0 - 12.0 years
20 - 22 Lacs
Chennai
Hybrid
This position is in US Shift timings Position Summary This position is responsible for the development, implementation, and support of interfaces between Quest Diagnostics applications and customer applications, or between Quest lab systems and Quest laboratory instruments. Responsibilities Performs and facilitates research and testing, and provides overall project leadership to ensure interfaces conform to the Quest HL7 standards. Responsible for multiple implementation or conversion projects. Participates in information gathering for interface requirements. Educates and directly interacts with stakeholders to help them understand the benefits and limitations of their specific EMR/LIS interface during implementation. Provides direct interaction with the client and the client's vendor Designs and oversees the interface implementation project plan. Responsible for providing project status updates. Will provide communications such as go-live notices, support documentation and operational checklists to our internal organization relative to any new or changed client interface. Implementation of communication scripts and protocols to include dial-up, VPN, frame and web services. Provides 3rd-tier support (as defined in the Support SOP) for EMR/LIS interface related problems. Provides post-implementation support Responsible for escalating implementation and support issues to the appropriate persons or groups for resolution. Coordinate user-group sessions with EMR/LIS-interfaced clients and internal staff to provide a forum for the exchange of ideas to improve the overall delivery of our services. Meets productivity requirements for this position Required Skills HL7 (Health Level 7) implementation experience required EMR (Electronic Medical Records) and LIS (Lab Information System) software experience required Interface project implementation experience required Interface project testing experience preferred/helpful Interface project go-live experience required
Posted 1 month ago
3.0 - 8.0 years
7 - 16 Lacs
Kolkata, Chennai, Bengaluru
Hybrid
Project Role : Application Support Engineer Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills : Electronic Medical Records (EMR) Summary: As an Application Support Engineer, you will be responsible for identifying and solving issues within multiple components of critical business systems related to Electronic Medical Records (EMR). Your typical day will involve providing dynamic service and support to ensure seamless functioning of the systems. Roles & Responsibilities: - Provide technical support and troubleshooting for EMR systems, identifying and resolving issues within multiple components of critical business systems. - Collaborate with cross-functional teams to ensure seamless functioning of EMR systems, including software development, testing, and deployment teams. - Develop and maintain technical documentation related to EMR systems, including user manuals, troubleshooting guides, and knowledge base articles. - Conduct regular system audits and performance monitoring to identify potential issues and proactively address them before they impact system functionality. - Stay updated with the latest advancements in EMR systems and technologies, integrating innovative approaches for sustained competitive advantage. Professional & Technical Skills: - Must To Have Skills: Strong understanding of Electronic Medical Records (EMR) systems and related technologies. - Good To Have Skills: No Technology Specialization. - Experience in providing technical support and troubleshooting for critical business systems. - Experience in collaborating with cross-functional teams, including software development, testing, and deployment teams. - Strong technical documentation skills, including user manuals, troubleshooting guides, and knowledge base articles. 
- Experience in conducting system audits and performance monitoring to identify potential issues and proactively address them before they impact system functionality.
Posted 1 month ago
3.0 - 8.0 years
7 - 17 Lacs
Noida
Work from Office
About The Job Position Title: Big Data Admin Department: Product & Engineering Job Scope: India Location: Noida, India Reporting to: Director- DevOps Work Setting: Onsite Purpose of the Job We are working on the Big Data stack as well, where we need to set up, optimise, and maintain multiple services across Big Data clusters. We have expertise in AWS, cloud, security, clusters, etc., but we now need a specialist (Big Data Admin) who can help us set up and maintain the Big Data environment in a better way and keep it live with a multi-cluster setup in the production environment. Key Responsibilities Participate in requirements analysis. Write clean, scalable jobs. Collaborate with internal teams to produce solutions and architecture. Test and deploy applications and systems. Revise, update, refactor, and debug code. Improve existing software. Serve as an expert on the Big Data stack and provide technical support Qualifications Requirement: Experience, Skills & Education Graduate with 3+ years of experience in Big Data technology Expertise in Hadoop, Yarn, Spark, Airflow, Cassandra, ELK, Redis, Grafana etc Expertise in cloud-managed Big Data stacks like MWAA, EMR, EKS etc Good knowledge of Python and scripting Knowledge of optimization and performance tuning for the Big Data stack Troubleshooting skill is a must. Must have good knowledge of Linux OS and troubleshooting. Desired Skills Big Data stack. Linux OS and troubleshooting Why Explore a Career: Be a Part of the Revolution in Healthcare Marketing. Innovate with Us to Unite and Transform the Healthcare Providers (HCPs) Ecosystem for Improved Patient Outcomes. It has been recognized and certified two times in a row: Best Places to Work NJ 2023 and Great Place to Work 2023. If you are passionate about health technology and have a knack for turning complex concepts into compelling narratives, we invite you to apply for this exciting opportunity to contribute to the success of our innovative health tech company.
Below are the competitive benefits that will be provided to the selected candidates basis their location. Competitive Salary Package Generous Leave Policy Flexible Working Hours Performance-Based Bonuses Health Care Benefits
Posted 1 month ago
8.0 - 13.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role Data Engineer -1 (Experience 0-2 years) What we offer Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is the central data org for Kotak Bank which manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering and Data Governance charters. The org sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charter.
As a member of this team, you get the opportunity to learn the fintech space, which is the most sought-after domain in the current world, be an early member in the digital transformation journey of Kotak, learn and leverage technology to build complex data platform solutions including real-time, micro-batch, batch and analytics solutions in a programmatic way, and also be futuristic and build systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, managed compute and orchestration frameworks including concepts of serverless data solutions, managing the central data warehouse for extremely high concurrency use cases, building connectors for different sources, building a customer feature repository, building cost optimization solutions like EMR optimizers, performing automations and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge sharing sessions with the large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers and all analytics use cases. Data Governance The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship and the Data Quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer/ SDE in Data Bachelor's degree in Computer Science, Engineering, or a related field Experience in data engineering Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR Experience with data pipeline tools such as Airflow and Spark Experience with data modeling and data quality best practices Excellent problem-solving and analytical skills Strong communication and teamwork skills Experience in at least one modern scripting or programming language, such as Python, Java, or Scala Strong advanced SQL skills PREFERRED QUALIFICATIONS AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow Prior experience in the Indian Banking segment and/or Fintech is desired.
Experience with Non-relational databases and data stores Building and operating highly available, distributed data processing systems for large datasets Professional software engineering and best practices for the full software development life cycle Designing, developing, and implementing different types of data warehousing layers Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions Building scalable data infrastructure and understanding distributed systems concepts SQL, ETL, and data modelling Ensuring the accuracy and availability of data to customers Proficient in at least one scripting or programming language for handling large volume data processing Strong presentation and communications skills.
Posted 1 month ago
9.0 - 14.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role Data Engineer -2 (Experience 2-5 years) What we offer Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is the central data org for Kotak Bank which manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering and Data Governance charters. The org sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charter.
As a member of this team, you get the opportunity to learn the fintech space, which is the most sought-after domain in the current world, be an early member in the digital transformation journey of Kotak, learn and leverage technology to build complex data platform solutions including real-time, micro-batch, batch and analytics solutions in a programmatic way, and also be futuristic and build systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, managed compute and orchestration frameworks including concepts of serverless data solutions, managing the central data warehouse for extremely high concurrency use cases, building connectors for different sources, building a customer feature repository, building cost optimization solutions like EMR optimizers, performing automations and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge sharing sessions with the large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers and all analytics use cases. Data Governance The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship and the Data Quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer/ SDE in Data Bachelor's degree in Computer Science, Engineering, or a related field Experience in data engineering Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR Experience with data pipeline tools such as Airflow and Spark Experience with data modeling and data quality best practices Excellent problem-solving and analytical skills Strong communication and teamwork skills Experience in at least one modern scripting or programming language, such as Python, Java, or Scala Strong advanced SQL skills PREFERRED QUALIFICATIONS AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow Prior experience in the Indian Banking segment and/or Fintech is desired.
Experience with Non-relational databases and data stores Building and operating highly available, distributed data processing systems for large datasets Professional software engineering and best practices for the full software development life cycle Designing, developing, and implementing different types of data warehousing layers Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions Building scalable data infrastructure and understanding distributed systems concepts SQL, ETL, and data modelling Ensuring the accuracy and availability of data to customers Proficient in at least one scripting or programming language for handling large volume data processing Strong presentation and communications skills.
Posted 1 month ago
7.0 - 12.0 years
20 - 35 Lacs
Pune
Hybrid
Job Duties and Responsibilities: We are looking for a self-starter to join our Data Engineering team. You will work in a fast-paced environment where you will get an opportunity to build and contribute to the full lifecycle development and maintenance of the data engineering platform. With the Data Engineering team you will get an opportunity to - Design and implement data engineering solutions that are scalable, reliable and secure in the Cloud environment Understand and translate business needs into data engineering solutions Build large-scale data pipelines that can handle big data sets using distributed data processing techniques that support the efforts of the data science and data application teams Partner with cross-functional stakeholders including Product managers, Architects, Data Quality engineers, Application and Quantitative Science end users to deliver engineering solutions Contribute to defining data governance across the data platform Basic Requirements: A minimum of a BS degree in computer science, software engineering, or a related scientific discipline is desired 3+ years of work experience in building scalable and robust data engineering solutions Strong understanding of Object Oriented programming and proficiency with programming in Python (TDD) and PySpark to build scalable algorithms 3+ years of experience in distributed computing and big data processing using the Apache Spark framework, including Spark optimization techniques 2+ years of experience with Databricks, Delta tables, Unity Catalog, Delta Sharing, Delta Live Tables (DLT) and incremental data processing Experience with Delta Lake, Unity Catalog Advanced SQL coding and query optimization experience, including the ability to write analytical and nested queries 3+ years of experience in building scalable ETL/ELT data pipelines on Databricks and AWS (EMR) 2+ years of experience orchestrating data pipelines using Apache Airflow/MWAA Understanding and experience of AWS services that include ADX,
EC2, S3 3+ years of experience with data modeling techniques for structured/unstructured datasets Experience with relational/columnar databases - Redshift, RDS and interactive querying services - Athena/Redshift Spectrum Passion for healthcare and improving patient outcomes Demonstrate analytical thinking with strong problem-solving skills Stay on top of emerging technologies and possess a willingness to learn. Bonus Experience (optional) Experience with an Agile environment Experience operating in a CI/CD environment Experience building HTTP/REST APIs using popular frameworks Healthcare experience
Posted 1 month ago