522 EMR Jobs - Page 3

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 7.0 years

15 - 30 Lacs

Gurugram

Remote

Design, develop, and maintain robust data pipelines and ETL/ELT processes on AWS. Leverage AWS services such as S3, Glue, Lambda, Redshift, Athena, EMR, and others to build scalable data solutions. Write efficient and reusable Python code for data ingestion, transformation, and automation tasks. Collaborate with cross-functional teams including data analysts, data scientists, and software engineers to support data needs. Monitor, troubleshoot, and optimize data workflows for performance, reliability, and cost efficiency. Ensure data quality, security, and governance across all systems. Communicate technical solutions clearly and effectively with both technical and non-technical stakeholders.

Required Skills & Qualifications
- 5+ years of experience in data engineering roles.
- Strong hands-on experience with Amazon Web Services (AWS), particularly data-related services (e.g., S3, Glue, Lambda, Redshift, EMR, Athena).
- Proficiency in Python for scripting and data processing.
- Experience with SQL and relational databases.
- Solid understanding of data architecture, data modeling, and data warehousing concepts.
- Experience with CI/CD pipelines and version control tools (e.g., Git).
- Excellent verbal and written communication skills.
- Proven ability to work independently in a fully remote environment.

Preferred Qualifications
- Experience with workflow orchestration tools like Apache Airflow or AWS Step Functions.
- Familiarity with big data technologies such as Apache Spark or Hadoop.
- Exposure to infrastructure-as-code tools like Terraform or CloudFormation.
- Knowledge of data privacy and compliance standards.
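To make the day-to-day concrete, here is a minimal sketch of the kind of ingestion-and-transform job this listing describes: PySpark reading raw CSV from S3 and writing partitioned Parquet for Athena/Redshift Spectrum consumers. Bucket paths and column names are hypothetical, not from the posting.

```python
# Minimal PySpark ETL sketch; all paths and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest raw CSV files landed in S3
raw = (spark.read.option("header", True)
       .csv("s3://example-raw-bucket/orders/"))

# Basic cleansing and transformation
clean = (raw.dropDuplicates(["order_id"])
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("order_date", F.to_date("order_ts"))
         .filter(F.col("amount").cast("double") > 0))

# Write partitioned Parquet for downstream Athena / Redshift Spectrum queries
(clean.write.mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://example-curated-bucket/orders/"))
```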

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Responsibilities:

ETL Development - The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data processes. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML.

Implementations & Onboarding - Will work with the team to onboard new clients onto the ZMP/CDP+ platform. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach to development and will be able to document processes and workflows.

Incremental Change Requests - The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach to implementing and executing the request. This requires the engineer to have a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.

Change Data Management - The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests are presented and approved. Prior to introducing a change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to do peer-to-peer code reviews and solution reviews before production code deployment.

Collaboration & Process Improvement - The engineer will be asked to participate in knowledge share sessions, engaging with peers to discuss solutions, best practices, overall approach, and process. The candidate will look for opportunities to streamline processes with an eye toward building a repeatable model that reduces implementation duration.

Job Requirements: The CDP ETL & Database Engineer will be well versed in the following areas:
- Relational data modeling
- ETL and FTP concepts
- Advanced analytics using SQL functions
- Cloud technologies - AWS, Snowflake
- Able to decipher requirements, provide recommendations, and implement solutions within predefined timeframes
- Ability to work independently while also contributing in a team setting
- Able to confidently communicate status, raise exceptions, and voice concerns to their direct manager
- Participate in internal client project status meetings with the Solution/Delivery management teams
- When required, collaborate with the Business Solutions Analyst (BSA) to solidify requirements
- Ability to work in a fast-paced, agile environment, with a sense of urgency when escalated issues arise
- Strong communication and interpersonal skills; ability to multitask and prioritize workload based on client demand
- Familiarity with Jira for workflow management and time allocation
- Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives, etc.

Required Skills:
- ETL tools: Talend (preferred, not required); DMExpress (nice to have); Informatica (nice to have)
- Databases - hands-on experience with: Snowflake (required); MySQL/PostgreSQL (nice to have); familiarity with NoSQL DB methodologies (nice to have)
- Programming languages - can demonstrate knowledge of any of the following: PL/SQL; JavaScript (strong plus); Python (strong plus); Scala (nice to have)
- AWS - knowledge of the following services: S3, EMR (concepts), EC2 (concepts), Systems Manager / Parameter Store
- Understands JSON data structures and key-value pairs (see the sketch below)
- Working knowledge of code repositories such as Git, WinCVS, SVN
- Workflow management tools such as Apache Airflow, Kafka, Automic/Appworx
- Jira

Minimum Qualifications:
- Bachelor's degree or equivalent
- 2-4 years' experience
- Excellent verbal & written communication skills
- Self-starter, highly motivated
- Analytical mindset
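Since the role stresses JSON file layouts and key-value structures, here is a small illustrative Python sketch of the kind of validation-and-flattening step an onboarding ETL might perform before a database load; the field names and file name are hypothetical.

```python
# Hypothetical sketch: validate required keys, then flatten nested
# key-value attributes into flat columns for a staging table.
import json

REQUIRED = {"customer_id", "email", "attributes"}

def flatten_record(rec: dict) -> dict:
    """Validate required keys, then flatten nested key-value attributes."""
    missing = REQUIRED - rec.keys()
    if missing:
        raise ValueError(f"record missing fields: {sorted(missing)}")
    row = {"customer_id": rec["customer_id"], "email": rec["email"]}
    # Promote nested key-value pairs to flat, prefixed columns
    for key, value in rec.get("attributes", {}).items():
        row[f"attr_{key}"] = value
    return row

with open("feed.json") as fh:
    rows = [flatten_record(r) for r in json.load(fh)]
print(rows[:3])
```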

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

Role Description: As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying, and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists, and business analysts to understand business requirements and design scalable, reliable, and cost-effective solutions that meet those requirements.

Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Amazon Web Services (AWS)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines (see the sketch after this listing)
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues

Technical Skills Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, AWS Redshift, and AWS Lambda
- Experience in building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Must understand the company's long-term vision and align with it
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
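One common wiring on a platform like this (an assumed pattern, not this employer's actual design) is an S3 "object created" event triggering a Lambda that starts a Glue ETL job. The job name and argument key below are hypothetical.

```python
# Hedged sketch: Lambda handler that starts a Glue job for a new S3 object.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]
    # Pass the newly arrived object to a (hypothetical) Glue ETL job
    response = glue.start_job_run(
        JobName="curate-landing-data",
        Arguments={"--source_path": f"s3://{bucket}/{key}"},
    )
    return {"jobRunId": response["JobRunId"]}
```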

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Maharashtra

On-site

As a Solutions Architect with over 7 years of experience, you will have the opportunity to leverage your expertise in cloud data solutions to architect scalable and modern solutions on AWS. In this role at Quantiphi, you will be a key member of our high-impact engineering teams, working closely with clients to solve complex data challenges and design cutting-edge data analytics solutions. Your responsibilities will include acting as a trusted advisor to clients, leading discovery/design workshops with global customers, and collaborating with AWS subject matter experts to develop compelling proposals and Statements of Work (SOWs). You will also represent Quantiphi in various forums such as tech talks, webinars, and client presentations, providing strategic insights and solutioning support during pre-sales activities. To excel in this role, you should have a strong background in AWS Data Services including DMS, SCT, Redshift, Glue, Lambda, EMR, and Kinesis. Your experience in data migration and modernization, particularly from Oracle, Teradata, and Netezza to AWS, will be crucial. Hands-on experience with ETL tools such as SSIS, Informatica, and Talend, as well as a solid understanding of OLTP/OLAP, Star & Snowflake schemas, and data modeling methodologies, are essential for success in this position. Additionally, familiarity with backend development using Python, APIs, and stream processing technologies like Kafka, along with knowledge of distributed computing concepts including Hadoop and MapReduce, will be beneficial. A DevOps mindset with experience in CI/CD practices and Infrastructure as Code is also desired. Joining Quantiphi as a Solutions Architect is more than just a job: it's an opportunity to shape digital transformation journeys and influence business strategies across various industries. If you are a cloud data enthusiast looking to make a significant impact in the field of data analytics, this role is perfect for you.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You will be responsible for building the most personalized and intelligent news experiences for India's next 750 million digital users. As our Principal Data Engineer, your main tasks will include designing and maintaining data infrastructure to power personalization systems and analytics platforms. This involves ensuring seamless data flow from source to consumption, architecting scalable data pipelines to process massive volumes of user interaction and content data, and developing robust ETL processes for large-scale transformations and analytical processing. You will also create and maintain data lakes/warehouses that consolidate data from multiple sources, optimized for ML model consumption and business intelligence. Additionally, you will implement data governance practices and collaborate with the ML team to ensure the right data availability for recommendation systems. To excel in this role, you should have a Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field, along with 8-12 years of data engineering experience, including at least 3 years in a senior role. You must possess expert-level SQL skills and have strong experience in the Apache Spark ecosystem (Spark SQL, Streaming, SparkML), as well as proficiency in Python/Scala. Experience with the AWS data ecosystem (Redshift, S3, Glue, EMR, Kinesis, Lambda, Athena) and ETL frameworks (Glue, Airflow) is essential. A proven track record of building large-scale data pipelines in production environments, particularly in high-traffic digital media, will be advantageous. Excellent communication skills are also required, as you will need to collaborate effectively across teams in a fast-paced environment that demands engineering agility.
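As a flavor of the Spark SQL work described, here is a hedged sketch that rolls raw user-interaction events up into per-article engagement counts for downstream ML and BI consumers; the lake paths, view, and column names are all hypothetical.

```python
# Spark SQL rollup sketch; names and paths are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("engagement-rollup").getOrCreate()

events = spark.read.parquet("s3://example-lake/events/interactions/")
events.createOrReplaceTempView("interactions")

daily_engagement = spark.sql("""
    SELECT article_id,
           to_date(event_ts)       AS event_date,
           count(*)                AS views,
           count(DISTINCT user_id) AS unique_users
    FROM interactions
    WHERE event_type = 'view'
    GROUP BY article_id, to_date(event_ts)
""")

# Consolidated back into the lake, partitioned for analytical consumers
(daily_engagement.write.mode("overwrite")
 .partitionBy("event_date")
 .parquet("s3://example-lake/marts/daily_engagement/"))
```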

Posted 2 weeks ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Gurugram

Work from Office

Company Overview: Incedo is a US-based consulting, data science and technology services firm with over 3000 people helping clients from our six offices across the US, Mexico, and India. We help our clients achieve competitive advantage through end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and design capabilities coupled with deep domain understanding. We combine services and products to maximize business impact for our clients in the telecom, banking, wealth management, product engineering, and life science & healthcare industries. Working at Incedo will provide you an opportunity to work with industry-leading client organizations, deep technology and domain experts, and global teams. Incedo University, our learning platform, provides ample learning opportunities, starting with a structured onboarding program and carrying through various stages of your career. A variety of fun activities is also an integral part of our friendly work environment. Our flexible career paths allow you to grow into a program manager, a technical architect, or a domain expert based on your skills and interests. Our mission is to enable our clients to maximize business impact from technology by harnessing the transformational impact of emerging technologies and bridging the gap between business and technology.

Role Description: As an AWS Data Engineer, your role will be to design, develop, and maintain scalable data pipelines on AWS. You will work closely with technical analysts, client stakeholders, data scientists, and other team members to ensure data quality and integrity while optimizing data storage solutions for performance and cost-efficiency. This role requires leveraging AWS native technologies and Databricks for data transformations and scalable data processing.

Responsibilities:
- Lead and support the delivery of data platform modernization projects.
- Design and develop robust and scalable data pipelines leveraging AWS native services.
- Optimize ETL processes, ensuring efficient data transformation.
- Migrate workflows from on-premise to the AWS cloud, ensuring data quality and consistency.
- Design automations and integrations to resolve data inconsistencies and quality issues.
- Perform system testing and validation to ensure successful integration and functionality.
- Implement security and compliance controls in the cloud environment.
- Ensure data quality pre- and post-migration through validation checks, addressing issues of completeness, consistency, and accuracy of data sets.
- Collaborate with data architects and lead developers to identify and document manual data movement workflows and design automation strategies.

Qualifications:
- 7+ years of experience with a core data engineering skillset leveraging AWS native technologies (AWS Glue, Python, Snowflake, S3, Redshift).
- Experience in the design and development of robust and scalable data pipelines leveraging AWS native services.
- Proficiency in leveraging Snowflake for data transformations, optimization of ETL pipelines, and scalable data processing.
- Experience with streaming and batch data pipeline/engineering architectures.
- Familiarity with DataOps concepts and tooling for source control and setting up CI/CD pipelines on AWS.
- Hands-on experience with Databricks and a willingness to grow capabilities.
- Experience with data engineering and storage solutions (AWS Glue, EMR, Lambda, Redshift, S3).
- Strong problem-solving and analytical skills.
- Knowledge of Dataiku is needed.
- Graduate/Post-Graduate degree in Computer Science or a related field.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Zeta Global is looking for an experienced Machine Learning Engineer with industry-proven, hands-on experience of delivering machine learning models to production to solve business problems. To be a good fit to join our AI/ML team, you should ideally:
- Be a thought leader who can work with cross-functional partners to foster a data-driven organisation.
- Be a strong team player with experience contributing to a large project as part of a collaborative team effort.
- Have extensive knowledge and expertise with machine learning engineering best practices and industry standards.
- Empower the product and engineering teams to make data-driven decisions.

What you need to succeed:
- 2 to 4 years of proven experience as a Machine Learning Engineer in a professional setting.
- Proficiency in any programming language (Python preferable).
- Prior experience in building and deploying machine learning systems.
- Experience with containerization: Docker & Kubernetes.
- Experience with AWS cloud services like EKS, ECS, EMR, Lambda, and others.
- Fluency with workflow management tools like Airflow or dbt.
- Familiarity with distributed batch compute technologies such as Spark.
- Experience with modern data warehouses like Snowflake or BigQuery.
- Knowledge of MLflow, Feast, and Terraform is a plus.
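Since the posting names MLflow as a plus, here is a minimal, self-contained experiment-tracking sketch; the experiment name, model, and parameters are illustrative, not from the posting.

```python
# Minimal MLflow tracking sketch on a toy sklearn model.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-classifier")
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    # Record the run's parameters, metric, and model artifact
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")
```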

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Mumbai

Work from Office

Role Purpose: The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Responsibilities:
- Design and implement the data modeling, data ingestion, and data processing for various datasets
- Design, develop, and maintain an ETL framework for various new data sources
- Develop data ingestion using AWS Glue/EMR and data pipelines using PySpark, Python, and Databricks
- Build orchestration workflows using Airflow & Databricks Job workflows (see the DAG sketch after this listing)
- Develop and execute ad hoc data ingestion to support business analytics
- Proactively interact with vendors on any questions and report status accordingly
- Explore and evaluate tools/services to support business requirements
- Ability to learn to create a data-driven culture and impactful data strategies
- Aptitude for learning new technologies and solving complex problems

Qualifications:
- Minimum of a bachelor's degree, preferably in Computer Science, Information Systems, or Information Technology
- Minimum 5 years of experience on cloud platforms such as AWS, Azure, GCP
- Minimum 5 years of experience in Amazon Web Services like VPC, S3, EC2, Redshift, RDS, EMR, Athena, IAM, Glue, DMS, Data Pipeline & API, Lambda, etc.
- Minimum of 5 years of experience in ETL and data engineering using Python, AWS Glue, AWS EMR/PySpark, and Airflow for orchestration
- Minimum 2 years of experience in Databricks, including Unity Catalog, data engineering Job workflow orchestration, and dashboard generation based on business requirements
- Minimum 5 years of experience in SQL, Python, and source control such as Bitbucket, with CI/CD for code deployment
- Experience in PostgreSQL, SQL Server, MySQL & Oracle databases
- Experience in MPP systems such as AWS Redshift, AWS EMR, Databricks SQL warehouse & compute clusters
- Experience in distributed programming with Python, Unix scripting, MPP, and RDBMS databases for data integration
- Experience building distributed high-performance systems using Spark/PySpark and AWS Glue, and developing applications for loading/streaming data into Databricks SQL warehouse & Redshift
- Experience in Agile methodology
- Proven skills in writing technical specifications for data extraction and good quality code
- Experience with big data processing techniques using Sqoop, Spark, Hive is an additional plus
- Experience in data visualization tools including Power BI, Tableau
- Nice to have: experience in UI development using the Python Flask framework and Angular

Mandatory Skills: Python for Insights. Experience: 5-8 years.
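A minimal sketch of the Airflow orchestration pattern the responsibilities describe: a daily DAG with an ingestion task followed by a transform task. The DAG id and task bodies are placeholders, not this employer's actual pipeline.

```python
# Airflow 2.x DAG sketch; task bodies are stand-ins for real jobs.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull data from source into the landing zone")

def transform():
    print("run the PySpark/Databricks transformation job")

with DAG(
    dag_id="daily_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    # Run transformation only after ingestion succeeds
    ingest_task >> transform_task
```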

Posted 2 weeks ago

Apply

1.0 - 3.0 years

2 - 3 Lacs

Chennai

Work from Office

Ensure production & accuracy targets are met as per client expectations. Daily learning & updating of changes in client protocols. Utilize AI tools effectively and ensure the process is efficient & effective. Maintain daily annotation records. Required Candidate Profile: 1 to 3 years of experience in a nursing/hospital/annotation environment is an added advantage. Strong verbal & written communication skills in English. Strong comprehension & analytical skills.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

15 - 30 Lacs

Gurugram

Work from Office

Role Description:
- Lead and mentor a team of data engineers to design, develop, and maintain high-performance data pipelines and platforms.
- Architect scalable ETL/ELT processes, streaming pipelines, and data lake/warehouse solutions (e.g., Redshift, Snowflake, BigQuery).
- Own the roadmap and technical vision for the data engineering function, ensuring best practices in data modeling, governance, quality, and security.
- Drive adoption of modern data stack tools (e.g., Airflow, Kafka, Spark) and foster a culture of continuous improvement.
- Ensure the platform is reliable, scalable, and cost-effective across batch and real-time use cases.
- Champion data observability, lineage, and privacy initiatives to ensure trust in data across the org.

Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 8+ years of hands-on experience in data engineering, with at least 2+ years in a leadership or managerial role.
- Proven experience with distributed data processing frameworks such as Apache Spark, Flink, or Kafka.
- Strong SQL skills and experience in data modeling, data warehousing, and schema design.
- Proficiency with cloud platforms (AWS/GCP/Azure) and their native data services (e.g., AWS Glue, Redshift, EMR, BigQuery).
- Solid grasp of data architecture, system design, and performance optimization at scale.
- Experience working in an agile development environment and managing sprint-based delivery cycles.
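For the streaming side of the stack named above, here is a minimal consumer sketch using the confluent-kafka Python client; the broker address, topic, and group id are hypothetical.

```python
# Hedged Kafka consumer sketch; connection details are illustrative.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "pipeline-metrics",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["user-events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # Hand each event to downstream processing (placeholder)
        print(msg.key(), msg.value())
finally:
    consumer.close()
```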

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru, Karnataka

Work from Office

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. If you've got the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS (Data Engineer):
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficient in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers:
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams, and streamline communication
- Prior work experience executing large-scale data engineering projects

Posted 2 weeks ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Hyderabad

Work from Office

About the Role: Grade Level (for internal use): 10. One of the most valuable assets in today's financial industry is data, which can provide businesses the intelligence essential to making business and financial decisions with conviction. This role will give you the opportunity to work on Ratings and Research related data. You will work on cutting-edge big data technologies and will be responsible for development of both data feeds and API work. Location: Hyderabad.

The Team: RatingsXpress is at the heart of financial workflows when it comes to providing and analyzing data. We provide Ratings and Research information to clients. Our work deals with content ingestion, data feed generation, and exposing the data to clients via API calls. This position is part of the RatingsXpress team and is focused on providing clients the critical data they need to make the most informed investment decisions possible.

Impact: As a member of the Xpressfeed Team in S&P Global Market Intelligence, you will work with a group of intelligent and visionary engineers to build impactful content management tools for investment professionals across the globe. Our software engineers are involved in the full product life cycle, from design through release. You will be expected to participate in application design, write high-quality code, and innovate on how to improve overall system performance and customer experience. If you are a talented developer who wants to help drive the next phase for Data Management Solutions at S&P Global, can contribute great ideas, solutions, and code, and understands the value of cloud solutions, we would like to talk to you.

What's in it for you: We are currently seeking a Software Developer with a passion for full-stack development. In this role, you will have the opportunity to work on cutting-edge cloud technologies such as Databricks, Snowflake, and AWS, while also engaging in Scala and SQL Server based database development. This position offers a unique opportunity to grow both as a Full Stack Developer and as a Cloud Engineer, expanding your expertise across modern data platforms and backend development.

Responsibilities:
- Analyze, design, and develop solutions within a multi-functional Agile team to support key business needs for the data feeds
- Design, implement, and test solutions using AWS EMR for content ingestion (see the sketch after this listing)
- Work on complex SQL Server projects involving high-volume data
- Engineer components and common services based on standard corporate development models, languages, and tools
- Apply software engineering best practices while also leveraging automation across all elements of solution delivery
- Collaborate effectively with technical and non-technical stakeholders
- Document and demonstrate technical solutions by developing documentation, diagrams, code comments, etc.

Basic Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field
- 3-6 years of experience in application development
- Minimum of 2 years of hands-on experience with Scala
- Minimum of 2 years of hands-on experience with Microsoft SQL Server
- Solid understanding of Amazon Web Services (AWS) and cloud-based development
- In-depth knowledge of system architecture, object-oriented programming, and design patterns
- Excellent communication skills, with the ability to convey complex ideas clearly both verbally and in writing

Preferred Qualifications:
- Familiarity with AWS services: EMR, Auto Scaling, EKS
- Working knowledge of Snowflake
- Preferred experience in Python development
- Familiarity with the Financial Services domain and Capital Markets is a plus
- Experience developing systems that handle large volumes of data and require high computational performance
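As an illustration of the EMR content-ingestion work (and the preferred Python experience), here is a hedged boto3 sketch that submits a Spark step, packaged from a Scala assembly, to an existing EMR cluster; the cluster ID, class name, and S3 paths are hypothetical.

```python
# Hedged sketch: submit a Spark ingestion step to a running EMR cluster.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.add_job_flow_steps(
    JobFlowId="j-EXAMPLE123",  # hypothetical cluster ID
    Steps=[{
        "Name": "content-ingestion",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": [
                "spark-submit",
                "--class", "com.example.ContentIngestion",
                "s3://example-artifacts/ingestion-assembly.jar",
            ],
        },
    }],
)
print(response["StepIds"])
```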

Posted 2 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Gurugram

Work from Office

Primary Responsibilities:
- Solicit, review, and analyze business requirements
- Write business and technical requirements
- Communicate and validate requirements with stakeholders
- Validate that the solution meets business needs
- Work with application users to develop test scripts and facilitate testing to validate application functionality and configuration
- Participate in organizational projects and/or manage small/medium projects related to assigned applications
- Translate customer needs into quality system solutions and ensure effective operational outcomes
- Focus on the business value proposition; apply understanding of 'As Is' and 'To Be' processes to develop solutions
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Role Focus Areas:

Core Expertise Required: Provider Management, Utilization Management, Care Management

Domain Knowledge: Value-Based Care; Clinical & Care Management; familiarity with medical terminology; experience with EMR (Electronic Medical Records) and claims processing

Technical/Clinical Understanding: admission & discharge processes; CPT codes, procedure codes, diagnosis codes

Core AI Understanding:
- AI/ML Fundamentals: understanding of supervised, unsupervised, and reinforcement learning
- Model Lifecycle Awareness: familiarity with model training, evaluation, deployment, and monitoring
- Data Literacy: ability to interpret data, understand data quality issues, and collaborate with data scientists

AI Product Strategy:
- AI Use Case Identification: ability to identify and validate AI opportunities aligned with business goals
- Feasibility Assessment: understanding of what's technically possible with current AI capabilities
- AI/ML Roadmapping: planning features and releases that depend on model development cycles

Collaboration with Technical Teams:
- Cross-functional Communication: ability to translate business needs into technical requirements and vice versa
- Experimentation & A/B Testing: understanding of how to run and interpret experiments involving AI models
- MLOps Awareness: familiarity with CI/CD for ML, model versioning, and monitoring tools

AI Tools & Platforms:
- Prompt Engineering (for LLMs): crafting effective prompts for tools like ChatGPT, Copilot, or Claude

Responsible AI & Ethics:
- Bias & Fairness: understanding of how bias can enter models and how to mitigate it
- Explainability: familiarity with tools like SHAP, LIME, or model cards
- Regulatory Awareness: knowledge of AI-related compliance (e.g., HIPAA, the AI Act)

AI-Enhanced Product Management:
- AI in SDLC: using AI tools for user story generation, backlog grooming, and documentation
- AI for User Insights: leveraging NLP for sentiment analysis, user feedback clustering, etc.
- AI-Driven Personalization: understanding recommendation systems, dynamic content delivery, etc.

Required Qualifications:
- Undergraduate degree or equivalent experience
- 5+ years of experience in business analysis in healthcare, including providing overall support, maintenance, configuration, troubleshooting, system upgrades, and more for healthcare applications
- Good experience with EMR / RCM systems
- Experience working with stakeholders, gathering requirements, and taking action based on their business needs
- Demonstrated success in supporting EMR / RCM / UM, CM, and DM systems across requirements, UAT, and deployment
- Proven ability to work independently without direct supervision
- Proven ability to effectively manage time and competing priorities
- Proven ability to work with cross-functional teams

Posted 2 weeks ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Chennai

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Job Function Description: Jobs in this function provide coding and coding auditing services directly to providers. This includes the analysis and translation of medical and clinical diagnoses, procedures, injuries, or illnesses into designated numerical codes. *Employees in jobs labeled with 'SCA' must support a government Service Contract Act (SCA) agreement.

General Job Profile: Coordinates, supervises, and is accountable for the daily activities of a business support, technical, or production team or unit. Impact of work is most often at the team level.

Primary Responsibilities:
- Owns output at the task level
- Work is generally limited to own function
- Sets priorities for the team to ensure task completion
- Coordinates work activities with other supervisors
- Develops plans to meet short-term objectives
- Identifies and resolves operational problems using defined processes, expertise, and judgment
- Decisions are guided by policies, procedures, and the business plan
- Product, service, or process decisions are most likely to impact individual employees and/or customers (internal or external)
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualification: Graduate degree or equivalent experience

Values Based Competencies:
- Integrity Value - Act Ethically: comply with applicable laws, regulations, and policies; demonstrate integrity
- Compassion Value - Focus on Customers: identify and exceed customer expectations; improve the customer experience
- Relationships Value - Act as a Team Player: collaborate with others; demonstrate diversity awareness; learn and develop
- Relationships Value - Communicate Effectively: influence others; listen actively; speak and write clearly
- Innovation Value - Support Change and Innovation: contribute innovative ideas; work effectively in a changing environment
- Performance Value - Make Fact-Based Decisions: apply business knowledge; use sound judgment
- Performance Value - Deliver Quality Results: drive for results; manage time effectively; produce high-quality work

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

8 - 12 Lacs

Gurugram

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Solicit, review, and analyze business requirements
- Write business and technical requirements
- Communicate and validate requirements with stakeholders
- Validate that the solution meets business needs
- Work with application users to develop test scripts and facilitate testing to validate application functionality and configuration
- Participate in organizational projects and/or manage small/medium projects related to assigned applications
- Translate customer needs into quality system solutions and ensure effective operational outcomes
- Focus on the business value proposition; apply understanding of ‘As Is’ and ‘To Be’ processes to develop solutions
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Role Focus Areas:

Core Expertise Required: Provider Management, Utilization Management, Care Management

Domain Knowledge: Value-Based Care; Clinical & Care Management; familiarity with medical terminology; experience with EMR (Electronic Medical Records) and claims processing

Technical/Clinical Understanding: admission & discharge processes; CPT codes, procedure codes, diagnosis codes

Core AI Understanding:
- AI/ML Fundamentals: understanding of supervised, unsupervised, and reinforcement learning
- Model Lifecycle Awareness: familiarity with model training, evaluation, deployment, and monitoring
- Data Literacy: ability to interpret data, understand data quality issues, and collaborate with data scientists

AI Product Strategy:
- AI Use Case Identification: ability to identify and validate AI opportunities aligned with business goals
- Feasibility Assessment: understanding of what’s technically possible with current AI capabilities
- AI/ML Roadmapping: planning features and releases that depend on model development cycles

Collaboration with Technical Teams:
- Cross-functional Communication: ability to translate business needs into technical requirements and vice versa
- Experimentation & A/B Testing: understanding of how to run and interpret experiments involving AI models
- MLOps Awareness: familiarity with CI/CD for ML, model versioning, and monitoring tools

AI Tools & Platforms:
- Prompt Engineering (for LLMs): crafting effective prompts for tools like ChatGPT, Copilot, or Claude

Responsible AI & Ethics:
- Bias & Fairness: understanding of how bias can enter models and how to mitigate it
- Explainability: familiarity with tools like SHAP, LIME, or model cards
- Regulatory Awareness: knowledge of AI-related compliance (e.g., HIPAA, the AI Act)

AI-Enhanced Product Management:
- AI in SDLC: using AI tools for user story generation, backlog grooming, and documentation
- AI for User Insights: leveraging NLP for sentiment analysis, user feedback clustering, etc.
- AI-Driven Personalization: understanding recommendation systems, dynamic content delivery, etc.

Required Qualifications:
- Undergraduate degree or equivalent experience
- 5+ years of experience in business analysis in healthcare, including providing overall support, maintenance, configuration, troubleshooting, system upgrades, and more for healthcare applications
- Good experience with EMR / RCM systems
- Experience working with stakeholders, gathering requirements, and taking action based on their business needs
- Demonstrated success in supporting EMR / RCM / UM, CM, and DM systems across requirements, UAT, and deployment
- Proven ability to work independently without direct supervision
- Proven ability to effectively manage time and competing priorities
- Proven ability to work with cross-functional teams

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

17 - 27 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Location: Anywhere. Notice period: 45 days max. Shift: 6.30 PM-3.30 AM. Project Manager with experience in EMR/EHR (healthcare domain is mandatory). Email: bushra@bharatjobs.com

Key Responsibility Areas:
- Perform all responsibilities of a Project Manager
- Develop, plan, and schedule the development, introduction, communication, and maintenance of projects
- Organize and control the activities of the area, assign personnel to projects, and direct their activities
- Manage highly complex, cross-functional system implementation, maintenance, and integration projects
- Understand and meet project sponsor expectations
- Monitor progress, and monitor and control quality, risks, issues, and project changes, to ensure project objectives are delivered on time and within budget and business results are realized
- Adhere to, support, and contribute to the development or enhancement of applicable project management standards and processes
- Determine the impact of project changes on the business case and re-forecast value creation
- Develop sponsorship/support for projects (at the executive level) within affected organizations and establish a governance organization
- Monitor and maintain project team morale
- Resolve issues escalated by the management team
- Escalate unresolved issues via the governance framework

Experience/Educational Qualification:
- Bachelor's/Master's in Business Administration, Computer Science, or an equivalent combination of education and/or experience
- 3-5 years of experience managing large-scale initiatives in an engineering or technology environment; 7+ years of project management experience
- Expert knowledge in the assigned business area discipline, such as engineering or information technology
- Project Management Professional (PMP) certification strongly desired
- Advanced proficiency in automated project management tools such as Microsoft Project, including financial and scheduling performance reporting
- Advanced proficiency in analytical, organizational, project management, interpersonal, and communication skills (verbal and written)
- Customer- and relationship-focused, process-driven, metric-focused, result-oriented, organized, self-directed
- Ability to multi-task and solve problems innovatively

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Who We Are: We are a digitally native company that helps organizations reinvent themselves and unleash their potential. We are the place where innovation, design, and engineering meet scale. Globant is 20 years old, a NYSE-listed public organization with more than 33,000 employees worldwide working out of 35 countries globally. www.globant.com

Job location: Pune/Hyderabad/Bangalore. Work mode: Hybrid. Experience: 5 to 10 years.

Must-have skills:
1) AWS (EC2, EMR & EKS)
2) Redshift
3) Lambda functions
4) Glue
5) Python
6) PySpark
7) SQL
8) CloudWatch
9) NoSQL database - DynamoDB/MongoDB or any other

We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have a strong background in designing, developing, and managing data pipelines, working with cloud technologies, and optimizing data workflows. You will play a key role in supporting our data-driven initiatives and ensuring the seamless integration and analysis of large datasets.
- Design Scalable Data Models: Develop and maintain conceptual, logical, and physical data models for structured and semi-structured data in AWS environments.
- Optimize Data Pipelines: Work closely with data engineers to align data models with AWS-native data pipeline design and ETL best practices.
- AWS Cloud Data Services: Design and implement data solutions leveraging AWS Redshift, Athena, Glue, S3, Lake Formation, and AWS-native ETL workflows.
- Design, develop, and maintain scalable data pipelines and ETL processes using AWS services (Glue, Lambda, Redshift).
- Write efficient, reusable, and maintainable Python and PySpark scripts for data processing and transformation.
- Write complex SQL queries and optimize them for performance and scalability.
- Monitor, troubleshoot, and improve data pipelines for reliability and performance.
- Focus on ETL automation using Python and PySpark: design, build, and maintain efficient data pipelines, ensuring data quality and integrity for various applications.
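To tie several of the must-have services together, here is a hedged sketch of a Lambda handler that persists a processed record to DynamoDB and emits a CloudWatch metric; the table name, metric namespace, and event shape are hypothetical.

```python
# Hedged Lambda sketch touching DynamoDB and CloudWatch via boto3.
import boto3

dynamodb = boto3.resource("dynamodb")
cloudwatch = boto3.client("cloudwatch")
table = dynamodb.Table("processed-events")  # hypothetical table

def handler(event, context):
    # Persist the processed record (event shape is illustrative)
    item = {"event_id": event["id"], "payload": event["payload"]}
    table.put_item(Item=item)
    # Emit a simple custom metric for pipeline observability
    cloudwatch.put_metric_data(
        Namespace="DataPipeline",
        MetricData=[{"MetricName": "EventsProcessed",
                     "Value": 1, "Unit": "Count"}],
    )
    return {"status": "ok"}
```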

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Amazon Web Services (AWS). Good-to-have skills: NA.
Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress, address any challenges that arise, and facilitate communication among team members to foster a productive work environment.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Mentor junior team members to support their professional growth.

Professional & Technical Skills:
- Primary: AWS + Python. Secondary: DevOps, Terraform. Good to have: AWS CDK.
- 3-4 years of overall software development experience with strong hands-on skills in AWS and Python.
- Hands-on experience with AWS services: EC2, Lambda, SNS, SQS, Glue, Step Functions, CloudWatch, API Gateway, EMR, S3, DynamoDB, RDS, Athena.
- Hands-on experience writing Python code for AWS services like Glue jobs, Lambda, and AWS CDK (see the CDK sketch after this listing).
- Strong technical and debugging hands-on skills.
- Strong DevOps experience in Terraform, Git, and CI/CD.
- Experience working in Agile development environments.
- Strong verbal and written communication skills, with the ability to engage directly with clients.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Amazon Web Services (AWS).
- The candidate should have a minimum of 3 years of experience in Amazon Web Services (AWS).
- This position is based at our Bengaluru office.
- A 15-year full time education is required.
- Shift timing: 12:30 PM to 9:30 PM IST (weekdays).

Qualification: 15 years full time education
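Here is a hedged AWS CDK (Python) sketch of the kind of infrastructure code this listing names: an S3 bucket plus a Lambda function granted read access to it. Construct names and the asset path are illustrative.

```python
# Hedged CDK v2 (Python) sketch; resource names and paths are illustrative.
from aws_cdk import App, Stack, aws_s3 as s3, aws_lambda as _lambda
from constructs import Construct

class DataIngestStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Landing bucket for raw files
        bucket = s3.Bucket(self, "LandingBucket")
        # Lambda that would process objects from the bucket
        fn = _lambda.Function(
            self, "IngestFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="app.handler",
            code=_lambda.Code.from_asset("lambda/"),  # hypothetical asset dir
        )
        bucket.grant_read(fn)

app = App()
DataIngestStack(app, "DataIngestStack")
app.synth()
```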

Posted 2 weeks ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: EPIC Systems. Good-to-have skills: NA.
Minimum 2 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. You will play a crucial role in ensuring the smooth functioning of the applications and resolving any technical glitches that may arise. Your expertise in EPIC Systems and problem-solving skills will be instrumental in maintaining the efficiency and reliability of the systems.

Roles & Responsibilities:
- The Epic Analyst will provide primary support for their designated application/module.
- Take on more advanced issues that arise during the project for their application area, along with more complex tasks with respect to system configuration, testing, and administration.
- Provide ongoing system support and maintenance based on the support roster.
- Respond in a timely manner to system issues and requests.
- Conduct investigation, assessment, and evaluation, and deliver solutions and fixes to resolve system issues.
- Handle and deliver Service Requests / Change Requests / New Builds.
- Perform system monitoring (error queues, alerts, batch jobs, etc.) and execute the required actions or SOPs.
- Perform/support regular and periodic system patching, maintenance, and verification.
- Perform/support planned system upgrade work, cutover to production, and post-cutover support and stabilization.
- Perform/support the work required to comply with audit and security requirements.
- Required to overlap with client business or office hours.
- Comply with compliance requirements as mandated by the project.

Professional & Technical Skills:
- Must have: certification in Epic modules (RWB, EpicCare Link, Haiku, Healthy Planet, MyChart, Rover, Willow Ambulatory, Cogito, Ambulatory, ClinDoc, Orders, ASAP, RPB, RHB, HIM Identity, HIM ROI, HIM DT, Cadence, Prelude, GC, OpTime, Anesthesia, Beacon, Willow Imp, Cupid, Phoenix, Radiant, Beaker AP, Beaker CP, Bridges, Clarity, Radar, RWB)
- Experience in troubleshooting and resolving application issues.

Additional Information:
- The candidate should have a minimum of 5 years of experience in EPIC Systems.
- This position is based at our Chennai office; work from office is mandatory for all working days.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

2.0 - 5.0 years

30 - 32 Lacs

Bengaluru

Work from Office

Data Engineer - 2 (Experience: 2-5 years)

What we offer: Our mission is simple - building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank, managing the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and deliver one of the best-in-class data lake house solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charter. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions in a programmatic way; and be futuristic, building systems which can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing the central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building the customer feature repository; building cost-optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. If you've got the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS (Data Engineer / SDE in Data):
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficient in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
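The Data Engineering vertical's "config-based and programmatic" data models can be pictured with a small sketch: dataset definitions live in a config list, and one generic PySpark routine materialises each model. All names and paths here are hypothetical, not Kotak's actual configuration.

```python
# Config-driven ETL sketch; config entries and paths are illustrative.
from pyspark.sql import SparkSession

PIPELINES = [
    {"name": "accounts_daily",
     "source": "s3://example-lake/raw/accounts/",
     "target": "s3://example-lake/curated/accounts_daily/",
     "dedupe_keys": ["account_id"],
     "partition_by": "as_of_date"},
]

spark = SparkSession.builder.appName("config-driven-etl").getOrCreate()

def run_pipeline(cfg: dict) -> None:
    # One generic routine handles every dataset defined in config
    df = spark.read.parquet(cfg["source"])
    df = df.dropDuplicates(cfg["dedupe_keys"])
    (df.write.mode("overwrite")
       .partitionBy(cfg["partition_by"])
       .parquet(cfg["target"]))

for cfg in PIPELINES:
    run_pipeline(cfg)
```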

Posted 2 weeks ago

Apply

5.0 - 9.0 years

30 - 32 Lacs

Bengaluru

Work from Office

Data Engineering Manager (5-9 years) / Software Development Manager (9+ years)
Kotak Mahindra Bank, Bengaluru, Karnataka, India (On-site)

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by adopting a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank and manages the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists the opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, SparkSQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to be futuristic in building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch users, RMs, branch managers, and all analytics use cases.
Data Governance
This team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems spanning multiple source systems, then this is the team for you.

Your day-to-day role will include:
Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
10+ years of engineering experience, most of it in the data domain. 5+ years of engineering team management experience. 10+ years of planning, designing, developing, and delivering consumer software. Experience partnering with product and program management teams. 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists. Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems. Experience managing multiple concurrent programs, projects, and development teams in an Agile environment. Strong understanding of Data Platform, Data Engineering, and Data Governance. Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.

For managers: customer centricity and obsession for the customer. Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and to coach agile ways of working. Ability to structure and organize teams and streamline communication. Prior work experience executing large-scale data engineering projects.
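Since this role manages MWAA (managed Airflow) among its AWS resources, here is a minimal sketch of the kind of DAG such a team would operate. This is illustrative only: the DAG id, schedule, and commands are invented, not taken from the posting.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Illustrative only: DAG id, schedule, and commands are hypothetical.
    with DAG(
        dag_id="example_daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = BashOperator(task_id="extract",
                               bash_command="python extract.py")
        load = BashOperator(task_id="load",
                            bash_command="python load.py")
        # Run extraction before loading.
        extract >> load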

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 22 Lacs

Kochi, Hyderabad, Thiruvananthapuram

Hybrid

Job Role: Senior Cloud Engineer
Job Location: Trivandrum, Kochi, Hyderabad
Primary Skills: AWS services, EKS, Terraform, AWS EMR

Role Summary:
We are seeking a highly skilled Senior AWS Cloud Engineer with proven expertise in designing, implementing, and managing cloud infrastructure on AWS. The ideal candidate will have hands-on experience in cloud migration projects, especially migrating legacy or mainframe systems to AWS. This role demands strong technical leadership, infrastructure automation proficiency, and effective stakeholder collaboration.

Role Proficiency:
Apply creative solutions to develop applications and select appropriate technical options. Optimize application development, maintenance, and performance using design patterns and reusable solutions. Guide and oversee the development activities of peers and junior team members.

Key Responsibilities:
Cloud & Infrastructure: Lead end-to-end cloud migration projects (rehosting, replatforming, refactoring). Design, implement, and manage scalable, secure AWS architectures. Collaborate with development teams to modernize and migrate .NET/mainframe applications to AWS. Develop Infrastructure as Code (IaC) using Terraform or CloudFormation. Leverage AWS services such as EC2, S3, Lambda, EKS, RDS, VPC, IAM, Glue, CloudWatch, etc. Optimize AWS usage and cloud costs through automation and monitoring.

Development & Testing: Code as per design, following defined standards and best practices. Review peers' code and ensure quality in deliverables. Create and execute unit test cases; assist the testing team with integration and validation. Contribute to the creation and review of design documentation and technical artifacts.

Project Management & Delivery: Estimate time and effort for features, modules, or full solutions. Manage delivery of modules/user stories and ensure on-time releases. Perform Root Cause Analysis (RCA) of defects and initiate corrective actions.

Stakeholder Interaction: Interface with customers to clarify requirements and propose solutions. Conduct product demos and participate in solution design discussions. Address customer queries and build trust through timely, quality deliverables.

Team Management & Leadership: Set FAST (Focused, Accountable, Specific, Transparent) goals for self and team. Mentor junior team members and provide technical guidance. Engage team members, address aspirations, and ensure healthy team dynamics.

Expected Outcomes & Measures:
Performance Metrics: Adherence to coding standards and SDLC processes. Delivery within estimated timelines. Number and severity of defects pre- and post-delivery. Completion of compliance and domain trainings. Customer satisfaction and engagement feedback.

Key Skills & Competencies:
Cloud & DevOps: AWS core services (EC2, S3, Lambda, IAM, VPC, RDS, Glue, SQS/SNS); AWS EMR Serverless, Aurora PostgreSQL, MWAA; Terraform (intermediate), CloudFormation; CI/CD pipelines: GitHub Actions, Jenkins, AWS CodePipeline; monitoring tools: CloudWatch, ELK, Datadog.

Programming & Automation: Scripting in Python, Bash, PowerShell. Infrastructure automation and IaC. Docker, Kubernetes (EKS), ECS. Serverless architecture.

Project & Stakeholder Management: Effective communication with customers and internal teams. Proactive risk identification and mitigation. Ability to handle pressure and multiple tasks simultaneously.

Desirable Experience:
Experience with mainframe systems (migration or integration). Familiarity with Agile methodologies (Scrum/Kanban).
Prior domain certifications in cloud technologies or customer-specific platforms.

Certifications (Preferred): AWS Certified Solutions Architect / DevOps Engineer / SysOps Administrator; relevant domain certifications.
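The posting asks for IaC with Terraform or CloudFormation. As a Python-flavored illustration of the same idea (a deliberate substitution, not one of the tools the role names), here is a minimal AWS CDK sketch declaring a versioned, encrypted S3 bucket; the stack and bucket names are hypothetical.

    from aws_cdk import App, Stack, aws_s3 as s3
    from constructs import Construct

    # Illustrative only: stack and bucket names are hypothetical.
    class DataLakeStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            # Declare infrastructure as code: a versioned, encrypted bucket.
            s3.Bucket(self, "RawZone",
                      versioned=True,
                      encryption=s3.BucketEncryption.S3_MANAGED)

    app = App()
    DataLakeStack(app, "DataLakeStack")
    app.synth()

Running `cdk deploy` synthesizes this into a CloudFormation template, which is why CDK is often a convenient on-ramp for teams standardizing on CloudFormation.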

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

This role requires 2-7 years of experience, is based in Noida, Gurugram, Indore, Pune, or Bangalore, and is open to candidates currently serving their notice period or available to join immediately. Your primary qualifications should include 2-6 years of hands-on experience with Big Data technologies such as PySpark (DataFrame and SparkSQL), Hadoop, and Hive. Additionally, you should have good experience with Python and Bash scripts, a solid understanding of SQL and data warehouse concepts, and strong analytical, problem-solving, data analysis, and research skills. You should also demonstrate the ability to think creatively and independently, along with excellent communication, presentation, and interpersonal skills. Hands-on experience with cloud-platform Big Data services such as IAM, Glue, EMR, Redshift, S3, and Kinesis is beneficial. Experience in orchestration with Airflow or another job scheduler, as well as experience migrating workloads from on-premise to cloud and cloud to cloud, would be considered a plus.
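To illustrate the PySpark DataFrame and SparkSQL experience this posting asks for, here is a minimal sketch showing both APIs against the same data. It is hypothetical throughout: the path, table, and column names are invented.

    from pyspark.sql import SparkSession, functions as F

    # Illustrative only: path and column names are hypothetical.
    spark = (SparkSession.builder
             .appName("sparksql-demo")
             .enableHiveSupport()
             .getOrCreate())

    orders = spark.read.parquet("s3://example-bucket/orders/")
    orders.createOrReplaceTempView("orders")

    # SparkSQL: the same engine as the DataFrame API, expressed in SQL.
    top_customers = spark.sql("""
        SELECT customer_id, SUM(amount) AS total_spend
        FROM orders
        GROUP BY customer_id
        ORDER BY total_spend DESC
        LIMIT 10
    """)

    # Equivalent DataFrame formulation of the same query.
    top_customers_df = (orders.groupBy("customer_id")
                        .agg(F.sum("amount").alias("total_spend"))
                        .orderBy(F.desc("total_spend"))
                        .limit(10))

    top_customers.show()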

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

As a Lead Software Engineer at HERE Technologies, you will be responsible for developing full-stack software solutions and building extensive ETL pipelines. Joining the HERE Analytics group, you will play a key role in strengthening the infrastructure of big data visualization tools used to view complex, large-scale location attributes on a map.

Your responsibilities will cover all aspects of the software development lifecycle, from refining product vision and gathering requirements to coding, testing, release, and support. Working collaboratively with team members located worldwide, you will tackle challenging problems in large-scale data extraction, transformation, and enrichment. In this role, you will implement tools to enhance automated and semi-automated map data processing, involving both a backend/service-based software stack and front-end visualization components for big data analysis. You will also utilize CI/CD tools, taking end-to-end ownership of the software you develop, including DevOps and testing. Collaborating closely with full-stack and frontend engineers, you will refine APIs and system integrations. Additionally, you will engage with other engineering teams and internal customers to identify new opportunities, address critical needs, and solve complex problems using your backend development expertise. Becoming an expert in leveraging internal platform resources and APIs, you will work in the AWS cloud computing environment.

To be successful in this role, you should have at least 8 years of software development experience and be proficient in Java, Python, Scala, or a similar programming language. You should also have experience working with relational databases, cloud computing services such as AWS, continuous integration (CI) and continuous deployment, ETL systems built on Big Data processing engines such as Hadoop, Spark, and EMR, NoSQL databases, and SOAP/REST web services.

If you are passionate about driving innovation, creating positive change, and working on cutting-edge technologies in a collaborative global environment, we invite you to join our team at HERE Technologies.
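For a sense of the EMR-based ETL work described above, here is a minimal boto3 sketch that submits a Spark step to an existing EMR cluster. It is illustrative only: the region, cluster id, and script location are placeholders, not values from the posting.

    import boto3

    # Illustrative only: region, cluster id, and script path are hypothetical.
    emr = boto3.client("emr", region_name="us-east-1")

    response = emr.add_job_flow_steps(
        JobFlowId="j-EXAMPLE123",
        Steps=[{
            "Name": "enrich-locations",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                # command-runner.jar lets EMR run spark-submit as a step.
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "--deploy-mode", "cluster",
                         "s3://example-bucket/jobs/enrich.py"],
            },
        }],
    )
    print(response["StepIds"])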

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer at our company, you will be an integral part of a skilled data engineering team focused on developing reusable capabilities and tools to automate various data processing pipelines. Your responsibilities will include contributing to data acquisition, ingestion, processing, pipeline monitoring, and data validation. Your role is pivotal in keeping the data ingestion and processing pipelines running smoothly, ensuring that data in the data lake is up-to-date, valid, and usable at all times.

With a minimum of 3 years of experience in data engineering, you should be proficient in Python programming and have a strong background in working with both RDBMS and NoSQL systems. Experience in the AWS ecosystem, including components like Airflow, EMR, Redshift, S3, Athena, and PySpark, is essential. Additionally, you should have expertise in developing REST APIs using Python frameworks such as Flask and FastAPI. Familiarity with crawling libraries like BeautifulSoup in Python would be advantageous. Your skill in writing complex SQL queries to retrieve key metrics and working with various data lake storage formats will be key to your success in this role.

Key Responsibilities:
- Design and implement scalable data pipelines capable of handling large data volumes.
- Develop ETL/ELT pipelines to extract data from upstream sources and synchronize it with the data lake in formats like Parquet, Iceberg, and Delta.
- Optimize and maintain data pipelines to ensure smooth operation and business continuity.
- Collaborate with cross-functional teams to source data for various business use cases.
- Stay informed about emerging data technologies and trends to continuously enhance our data infrastructure and architecture.
- Adhere to best practices in data querying and manipulation to uphold data integrity.

If you are a motivated Data Engineer with a passion for building robust data pipelines and ensuring data quality, we invite you to join our dynamic team and contribute to the success of our data engineering initiatives.
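As a flavor of the REST API work this posting mentions, here is a minimal FastAPI sketch exposing a table metric. It is hypothetical throughout: the endpoint shape and the in-memory store are invented, and a real service would query the lake (for example via Athena) rather than a dict.

    from fastapi import FastAPI, HTTPException

    app = FastAPI()

    # Illustrative only: a stand-in for metrics that would come from the lake.
    ROW_COUNTS = {"orders": 1_250_000, "customers": 48_000}

    @app.get("/tables/{table_name}/row-count")
    def get_row_count(table_name: str) -> dict:
        # Return a 404 for tables the service does not know about.
        if table_name not in ROW_COUNTS:
            raise HTTPException(status_code=404, detail="unknown table")
        return {"table": table_name, "row_count": ROW_COUNTS[table_name]}

Assuming the file is saved as app.py, this runs locally with `uvicorn app:app --reload`.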

Posted 2 weeks ago

Apply