
1265 Azure Databricks Jobs - Page 23

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 10.0 years

6 - 10 Lacs

Noida

Work from Office

JD: Staff Software Data Engineer

R1 RCM Inc. is a leading provider of technology-enabled revenue cycle management services that transform and solve challenges across health systems, hospitals, and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and multiple India locations. Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is to always do what is best for our clients, patients, and each other. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience.

Description: We are seeking a Staff Data Engineer with 7-10 years of experience to join our Data Platform team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.

Qualifications:
- Deep knowledge of and experience with SSIS and T-SQL
- Experienced in Azure Data Factory, Azure Databricks, and Azure Data Lake
- Experience with a programming language such as Python
- Experience with SQL and NoSQL database systems such as MongoDB
- Experience in distributed system architecture design
- Experience with cloud environments (Azure preferred)
- Experience acquiring and preparing data from primary and secondary disparate data sources (real-time preferred)
- Experience with large-scale data product implementations, responsible for technical delivery and for mentoring and managing peer engineers
- Experience with Power BI preferred
- Experience with agile methodology preferred
- Healthcare industry experience preferred

Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.
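The ETL work these data engineering roles describe (extract from a source system, conform types and values, load idempotently into a warehouse table) can be sketched in a few lines. This is a hypothetical illustration only, using Python's standard-library sqlite3 as a stand-in for the SSIS/T-SQL stack named above; the table, columns, and values are invented.

```python
import sqlite3

# Hypothetical staging records pulled from a source system (extract).
source_rows = [
    {"claim_id": 101, "amount": "1250.00", "status": " paid "},
    {"claim_id": 102, "amount": "340.50", "status": "DENIED"},
]

def transform(row):
    # Conform types and standardize categorical values (transform).
    return (row["claim_id"], float(row["amount"]), row["status"].strip().lower())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, amount REAL, status TEXT)")

# Idempotent load: INSERT OR REPLACE keyed on claim_id keeps reruns from duplicating rows (load).
conn.executemany("INSERT OR REPLACE INTO claims VALUES (?, ?, ?)",
                 [transform(r) for r in source_rows])
conn.commit()

print(conn.execute(
    "SELECT status, SUM(amount) FROM claims GROUP BY status ORDER BY status"
).fetchall())
# → [('denied', 340.5), ('paid', 1250.0)]
```

Keying the load on the natural key is what makes the step safe to re-run after a partial failure, which is the usual goal of a warehouse load.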

Posted 1 month ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Noida

Work from Office

R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve the patient experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation, and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow; a place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts who work together to go beyond for all those we serve, because we know that all this adds up to something more: a place where we're all together better. R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion, and diversity is demonstrated through prestigious recognitions, with R1 India ranked amongst the Best in Healthcare, the Top 100 Best Companies for Women by Avtar & Seramount, and the Top 10 Best Workplaces in Health & Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are more than 16,000 strong in India, with a presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

Description: We are seeking a Software Engineer with 4-7 years of experience to join our ETL Development team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.

Qualifications:
- Deep knowledge of and experience with SSIS, T-SQL, Azure Databricks, Azure Data Lake, and Azure Data Factory
- Experienced in writing SQL objects: stored procedures, UDFs, and views
- Experienced in data modeling
- Experience with MS SQL databases and with NoSQL systems and columnar formats such as Apache Parquet
- Experience in Scala, Spark SQL, and Airflow preferred
- Experience acquiring and preparing data from primary and secondary disparate data sources
- Experience with large-scale data product implementations
- Experience with agile methodology preferred
- Healthcare industry experience preferred

Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with other teams with deep experience in ETL processes and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.

Posted 1 month ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Noida

Work from Office

R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve the patient experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation, and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow; a place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts who work together to go beyond for all those we serve, because we know that all this adds up to something more: a place where we're all together better. R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion, and diversity is demonstrated through prestigious recognitions, with R1 India ranked amongst the Best in Healthcare, the Top 100 Best Companies for Women by Avtar & Seramount, and the Top 10 Best Workplaces in Health & Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are more than 16,000 strong in India, with a presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

R1 RCM Inc. is a leading provider of technology-enabled revenue cycle management services that transform and solve challenges across health systems, hospitals, and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and multiple India locations. Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is to always do what is best for our clients, patients, and each other. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience.

Description: We are seeking a Data Engineer with 3-5 years of experience to join our Data Platform team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.

Qualifications:
- Deep knowledge of and experience with Python/Scala and Spark
- Experienced in Azure Data Factory, Azure Databricks, Azure Blob Storage, Azure Data Lake, and Delta Lake
- Experience with Unity Catalog and Apache Parquet
- Experience with Azure cloud environments
- Experience acquiring and preparing data from primary and secondary disparate data sources
- Experience with large-scale data product implementations, responsible for technical delivery
- Experience with agile methodology preferred
- Healthcare industry experience preferred

Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.
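A recurring requirement in these postings is acquiring and preparing data from primary and secondary disparate sources. As a minimal, hypothetical sketch (standard library only, invented records, standing in for an Azure Data Factory/Databricks pipeline), joining a CSV extract with a JSON extract looks like:

```python
import csv
import io
import json

# Primary source: a CSV extract (hypothetical encounter records).
csv_extract = io.StringIO("encounter_id,patient_id\nE1,P1\nE2,P2\n")
encounters = list(csv.DictReader(csv_extract))

# Secondary source: a JSON extract keyed by patient (hypothetical demographics).
json_extract = '{"P1": {"state": "IL"}, "P2": {"state": "TX"}}'
demographics = json.loads(json_extract)

# Conform both sources into one analysis-ready record set,
# tolerating patients missing from the secondary source.
prepared = [
    {**row, "state": demographics.get(row["patient_id"], {}).get("state")}
    for row in encounters
]
print(prepared[0])  # → {'encounter_id': 'E1', 'patient_id': 'P1', 'state': 'IL'}
```

The `dict.get` chain is the sketch's stand-in for a left join: primary records survive even when the secondary source has no match.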

Posted 1 month ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Noida

Work from Office

Role Objective:

R1 RCM Inc. is a leading provider of technology-enabled revenue cycle management services that transform and solve challenges across health systems, hospitals, and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and multiple India locations. Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is to always do what is best for our clients, patients, and each other. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience.

Description: We are seeking a Software Data Engineer with 4-7 years of experience to join our Data Platform team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.

Qualifications:
- Deep knowledge of and experience with Python/Scala and Apache Spark
- Experienced in Azure Data Factory, Azure Databricks, Azure Blob Storage, Azure Data Lake, and Delta Lake
- Experienced with the orchestration tool Apache Airflow
- Experience with SQL and NoSQL database systems such as MongoDB and Apache Parquet
- Experience with Azure cloud environments
- Experience acquiring and preparing data from primary and secondary disparate data sources
- Experience with large-scale data product implementations
- Experience with agile methodology preferred
- Healthcare industry experience preferred

Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with other teams with deep experience in ETL processes and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.
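This posting asks for experience with the orchestration tool Apache Airflow. The central concept is a DAG of dependent tasks run in dependency order. A conceptual sketch in plain Python (this is not the Airflow API, and the task names are invented):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract must finish before transform, transform before load.
dag = {"transform": {"extract"}, "load": {"transform"}}

tasks = {
    "extract": lambda: "raw rows",
    "transform": lambda: "conformed rows",
    "load": lambda: "rows written",
}

# Run tasks in topological order, as an orchestrator would schedule them.
order = list(TopologicalSorter(dag).static_order())
results = {name: tasks[name]() for name in order}
print(order)  # → ['extract', 'transform', 'load']
```

In Airflow itself the same chain would be declared as dependencies between operators, and the scheduler, rather than a loop, decides when each task runs, retries, and backfills.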

Posted 1 month ago

Apply

4.0 - 6.0 years

3 - 7 Lacs

Noida

Work from Office

R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For™ 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India, with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

Description: We are seeking a Data Engineer with 4-6 years of experience to join our Data Platform team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.

Qualifications:
- Deep knowledge of and experience with Scala and Spark
- Experienced in Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake
- Experience with full-stack development in .NET and Angular
- Experience with SQL and NoSQL database systems such as MongoDB and Couchbase
- Experience in distributed system architecture design
- Experience with cloud environments (Azure preferred)
- Experience acquiring and preparing data from primary and secondary disparate data sources (real-time preferred)
- Experience with large-scale data product implementations, responsible for technical delivery and for mentoring and managing peer engineers
- Experience with Databricks preferred
- Experience with agile methodology preferred
- Healthcare industry experience preferred

Job Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.

Posted 1 month ago

Apply

9.0 - 12.0 years

11 - 14 Lacs

Noida

Work from Office

R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For™ 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India, with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

JD: Lead Data Engineer

R1 RCM Inc. is a leading provider of technology-enabled revenue cycle management services that transform and solve challenges across health systems, hospitals, and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and multiple India locations. Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is to always do what is best for our clients, patients, and each other. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience.

Description: We are seeking a Lead Data Engineer with 9-12 years of experience to join our Data Platform team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.

Qualifications:
- Deep knowledge of and experience with Python/Scala and Spark
- Experienced in Azure Data Factory, Azure Databricks, Azure Data Lake, Blob Storage, Delta Lake, and Airflow
- Experience with SQL and NoSQL database systems such as MongoDB and Apache Parquet
- Experience in distributed system architecture design
- Experience with Azure cloud environments
- Experience acquiring and preparing data from primary and secondary disparate data sources
- Experience with large-scale data product implementations, responsible for technical delivery and for mentoring peer engineers
- Knowledge of Azure DevOps and version control systems, ideally Git
- Experience with agile methodology preferred
- Excellent understanding of OOP and design patterns
- Healthcare industry experience preferred
- Excellent communication and presentation skills

Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.

Posted 1 month ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Noida

Work from Office

Key duties & responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems

Qualification: B.E/B.Tech/MCA or equivalent professional degree

Experience, Skills and Knowledge:
- Deep knowledge of and experience with SSIS and T-SQL
- Experienced in Azure Data Factory, Azure Databricks, and Azure Data Lake
- Experience with a programming language such as Python or Scala
- Experience with SQL and NoSQL database systems such as MongoDB
- Experience in distributed system architecture design
- Experience with cloud environments (Azure preferred)
- Experience acquiring and preparing data from primary and secondary disparate data sources (real-time preferred)
- Experience with large-scale data product implementations, responsible for technical delivery and for mentoring and managing peer engineers
- Experience with Databricks preferred
- Experience with agile methodology preferred
- Healthcare industry experience preferred

Key competency profile:
- Spot new opportunities by anticipating change and planning accordingly
- Find ways to better serve customers and patients; be accountable for customer service of the highest quality
- Create connections across teams by valuing differences and including others
- Own your development by implementing and sharing your learnings
- Motivate each other to perform at our highest level
- Help people improve by learning from successes and failures
- Work the right way by acting with integrity and living our values every day
- Succeed by proactively identifying problems and solutions for yourself and others

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Overview: The Gen AI Engineer will be part of a team that designs, builds, and operates the AI services of Siemens Healthineers (SHS). The ideal candidate will have experience with AI services. The role requires developing and maintaining artificial intelligence systems and applications that help businesses and organizations solve complex problems, along with expertise in machine learning, deep learning, natural language processing, computer vision, and other AI technologies.

Tasks and Responsibilities: The Generative AI Engineer is responsible for the design, architecture, and development of an AI product/service. The main responsibilities include:
- Designing, developing, and deploying Azure-based AI solutions, including machine learning models, cognitive services, and data analytics solutions
- Collaborating with cross-functional teams, such as data scientists, business analysts, and developers, to design and implement AI solutions that meet business requirements
- Building and training machine learning models using Azure Machine Learning, and tuning models for optimal performance
- Developing and deploying custom AI models using Azure Cognitive Services, such as speech recognition, language understanding, and computer vision
- Creating data pipelines to collect, process, and prepare data for analysis and modeling using Azure data services such as Azure Data Factory and Azure Databricks
- Implementing data analytics solutions using Azure Synapse Analytics or other Azure data services
- Deploying and managing Azure services and resources using Azure DevOps or other deployment tools
- Monitoring and troubleshooting deployed solutions to ensure optimal performance and reliability
- Ensuring compliance with security and regulatory requirements related to AI solutions
- Staying up to date with the latest Azure AI technologies and industry developments, and sharing knowledge and best practices with the team

Qualifications:
- Overall 5+ years of combined experience in IT, with the most recent 3 years as an AI engineer
- Bachelor's or master's degree in computer science, information technology, or a related field
- Experience in designing, developing, and delivering successful AI services
- Experience with cloud computing technologies such as Azure
- Relevant industry certifications, such as Microsoft Certified: Azure AI Engineer or Azure Solutions Architect, are a plus
- Excellent written and verbal communication skills to collaborate with cross-functional teams and communicate technical information to non-technical stakeholders

Technical skills:
- Proficiency in programming languages: You should be proficient in programming languages such as Python and R.
- Experience with Azure AI services: You should have experience with Azure AI services such as Azure Machine Learning, Azure Cognitive Services, and Azure Databricks.
- Data handling and processing: You should be proficient in data handling and processing techniques such as data cleaning, data normalization, and feature extraction. Knowledge of SQL, NoSQL, and big data technologies such as Hadoop and Spark is also beneficial.
- Experience with cloud platforms: You should have a good understanding of cloud computing concepts and experience working with Azure services such as containers, Kubernetes, Web Apps, Azure Front Door, CDN, and web application firewalls.
- DevOps and CI/CD: You should be familiar with DevOps practices and CI/CD pipelines, including tools such as Azure DevOps and Git.
- Security and compliance: You should be aware of security and compliance considerations when building and deploying AI models in the cloud, including knowledge of Azure security services, compliance frameworks such as HIPAA and GDPR, and best practices for securing data and applications in the cloud.
- Machine learning algorithms and frameworks: Knowledge of machine learning algorithms and frameworks such as TensorFlow, Keras, PyTorch, and scikit-learn is a plus.
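The data handling skills this role lists (cleaning, normalization, feature extraction) can be sketched minimally in plain Python; the readings and the -999 sentinel below are invented for illustration:

```python
# Hypothetical raw sensor readings with noise: blanks and a -999 sentinel value.
raw = ["12.0", "", "18.5", "-999", "15.0"]

# Cleaning: drop blanks and sentinel values.
cleaned = [float(x) for x in raw if x and float(x) > -900]

# Normalization: rescale to [0, 1] (min-max).
lo, hi = min(cleaned), max(cleaned)
normalized = [(x - lo) / (hi - lo) for x in cleaned]

# Feature extraction: summary statistics usable as model inputs.
features = {"mean": sum(cleaned) / len(cleaned), "range": hi - lo}
print([round(v, 3) for v in normalized])  # → [0.0, 1.0, 0.462]
```

In practice these steps would run inside the Azure data pipelines the posting describes; the transformations themselves are the same regardless of where they execute.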

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Overview: We are seeking a highly skilled and experienced Azure OpenAI Architect to join our growing team. You will play a key role in designing, developing, and implementing Gen AI solutions across various domains, including chatbots. The ideal candidate will have experience with the latest natural language processing and generative AI technologies and the ability to produce diverse content such as text, audio, images, or video. You will be responsible for integrating general-purpose AI models into our systems and ensuring they serve a variety of purposes effectively.

Tasks and Responsibilities:
- Collaborate with cross-functional teams to design and implement Gen AI solutions that meet business requirements
- Develop, train, test, and validate the AI system to ensure it meets the required standards and performs as intended
- Design, develop, and deploy Gen AI solutions using advanced LLMs such as OpenAI models and open-source LLMs (Llama2, Mistral, etc.), and frameworks such as Langchain and Pandas
- Leverage expertise in Transformer/neural network models and vector/graph databases to build robust and scalable AI systems
- Integrate AI models into existing systems to enhance their capabilities
- Create data pipelines to ingest, process, and prepare data for analysis and modeling using Azure services such as Azure AI Document Intelligence and Azure Databricks
- Integrate speech-to-text functionality using Azure native services to create user-friendly interfaces for chatbots
- Deploy and manage Azure services and resources using Azure DevOps or other deployment tools
- Monitor and troubleshoot deployed solutions to ensure optimal performance and reliability
- Ensure compliance with security and regulatory requirements related to AI solutions
- Stay up to date with the latest Azure AI technologies and industry developments, and share knowledge and best practices with the team

Qualifications:
- Overall 8+ years of combined experience in IT, with the most recent 5 years as an AI engineer
- Bachelor's or master's degree in computer science, information technology, or a related field
- Experience in designing, developing, and delivering successful Gen AI solutions
- Experience with the Azure cloud platform and Azure AI services such as Azure AI Search, Azure OpenAI, Document Intelligence, Speech, and Vision
- Experience with Azure infrastructure and solutioning
- Familiarity with OpenAI models, open-source LLMs, and Gen AI frameworks such as Langchain and Pandas
- Solid understanding of Transformer/neural network architectures and their application in Gen AI
- Hands-on experience with vector/graph databases and their use in semantic and vector search
- Proficiency in programming languages such as Python (essential)
- Relevant industry certifications, such as Microsoft Certified: Azure AI Engineer or Azure Solutions Architect, are a plus
- Excellent problem-solving, analytical, and critical thinking skills
- Strong communication and collaboration skills to work effectively in a team environment
- A passion for innovation and a desire to push the boundaries of what's possible with Gen AI
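The vector-search requirement in this role ultimately reduces to one operation: ranking stored embeddings by cosine similarity to a query embedding. A minimal sketch in plain Python (the vectors here are invented toy values, not real model embeddings):

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    mag = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / mag

# Toy document "embeddings" (a real system would use model-generated vectors).
index = {
    "billing FAQ": [0.9, 0.1, 0.0],
    "clinical notes": [0.1, 0.8, 0.3],
}
query = [0.85, 0.2, 0.05]

# Rank documents by similarity to the query, as a vector database would.
ranked = sorted(index, key=lambda doc: cosine(query, index[doc]), reverse=True)
print(ranked[0])  # → billing FAQ
```

A vector database performs this same ranking at scale with approximate nearest-neighbor indexes instead of a full scan, which is what makes semantic search over large corpora practical.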

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Microsoft Azure Databricks Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application specifications and user guides. - Engage in continuous learning to stay updated with the latest technologies and best practices. Professional & Technical Skills: - Must Have Skills: Proficiency in Microsoft Azure Databricks. - Strong understanding of cloud computing concepts and services. - Experience with data integration and ETL processes. - Familiarity with application development frameworks and methodologies. - Ability to troubleshoot and resolve application issues efficiently. Additional Information: - The candidate should have a minimum of 3 years of experience in Microsoft Azure Databricks. - This position is based at our Bengaluru office. - A 15 years full time education is required. Qualification 15 years full time education

Posted 1 month ago

Apply

10.0 - 15.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Responsibilities: Design, develop, and maintain data pipelines and ETL processes using Databricks. Manage and optimize data solutions on cloud platforms such as Azure and AWS. Implement big data processing workflows using PySpark. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions. Ensure data quality and integrity through rigorous testing and validation. Optimize and tune big data solutions for performance and scalability. Stay updated with the latest industry trends and technologies in big data and cloud computing. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven experience as a Big Data Engineer or similar role. Strong proficiency in Databricks and cloud platforms (Azure/AWS). Expertise in PySpark and big data processing. Experience with data modeling, ETL processes, and data warehousing. Familiarity with cloud services and infrastructure. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Preferred Qualifications: Experience with other big data technologies and frameworks. Knowledge of machine learning frameworks and libraries. Certification in cloud platforms or big data technologies.
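The "ensure data quality and integrity through rigorous testing and validation" responsibility above can be sketched in a framework-agnostic way. This is a hypothetical, plain-Python illustration only; in a Databricks pipeline the same rules would typically be expressed as PySpark DataFrame operations (`dropDuplicates`, `filter`, etc.), and the field names are invented:

```python
# Minimal data-quality gate: reject rows with missing required values,
# then drop duplicate business keys. Illustrative only -- a production
# pipeline would implement these checks in PySpark on Databricks.

def validate_batch(rows, key="id", required=("id", "amount")):
    """Split a batch of dict records into (clean_rows, rejected_rows)."""
    seen, clean, rejected = set(), [], []
    for row in rows:
        if any(row.get(col) is None for col in required):
            rejected.append(row)   # fails completeness check
        elif row[key] in seen:
            rejected.append(row)   # duplicate business key
        else:
            seen.add(row[key])
            clean.append(row)
    return clean, rejected

batch = [
    {"id": 1, "amount": 100.0},
    {"id": 1, "amount": 100.0},   # duplicate of the first row
    {"id": 2, "amount": None},    # missing required value
    {"id": 3, "amount": 50.0},
]
clean, rejected = validate_batch(batch)
print(len(clean), len(rejected))  # 2 2
```

Keeping rejected rows (rather than silently dropping them) supports the error-handling and auditability concerns these roles typically emphasise.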

Posted 1 month ago

Apply

9.0 - 14.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Kafka Data Engineer We are looking for a Data Engineer to build and manage data pipelines that support batch and streaming data solutions. The role requires expertise in creating seamless data flows across platforms like Data Lake/Lakehouse in Cloudera, Azure Databricks, and Kafka, for both batch and stream data pipelines. Responsibilities Develop, test, and maintain data pipelines (batch and stream) using Cloudera, Spark, Kafka, and Azure services like ADF, Cosmos DB, Databricks, and NoSQL databases such as MongoDB. Strong programming skills in Spark, Python or Scala, and SQL. Optimize data pipelines to improve speed, performance, and reliability, ensuring that data is available for data consumers as required. Create ETL pipelines for downstream consumers by transforming data as per business logic. Work closely with Data Architects and Data Analysts to align data solutions with business needs and ensure the accuracy and accessibility of data. Implement data validation checks and error handling processes to maintain high data quality and consistency across data pipelines. Strong analytical and problem-solving skills, with a focus on optimizing data flows and addressing impacts in the data pipeline. Qualifications 8+ years of IT experience with at least 5+ years in data engineering and cloud-based data platforms. Strong experience with Cloudera or any data lake, Confluent/Apache Kafka, and Azure data services (ADF, Databricks, Cosmos DB). Deep knowledge of NoSQL databases (Cosmos DB, MongoDB) and data modeling for performance and scalability. Proven expertise in designing and implementing batch and streaming data pipelines using Databricks, Spark, or Kafka. Experience in creating scalable, reliable, and high-performance data solutions with robust data governance policies. Strong collaboration skills to work with stakeholders, mentor junior Data Engineers, and translate business needs into actionable solutions. Bachelor's or master's degree in computer science, IT, or a related field.

Posted 1 month ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Pune

Work from Office

Must have 5+ years of experience in a data engineer role. Strong background in relational databases (Microsoft SQL Server) and strong ETL (Microsoft SSIS) experience. Strong hands-on T-SQL programming skills. Ability to develop reports using Microsoft Reporting Services (SSRS). Familiarity with C# is preferred. Strong analytical and logical reasoning skills. Should be able to build processes that support data transformation, workload management, data structures, dependency, and metadata. Should be able to develop data models to answer questions for the business users. Should be good at performing root cause analysis on internal/external data and processes to answer specific business data questions. Excellent communication skills to work with business users independently.

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Hyderabad

Work from Office

1. Data Engineer - Azure Data Services
2. Data Modelling - NoSQL and SQL
3. Good understanding of Spark and Spark Streaming
4. Hands-on with Python / Pandas / Data Factory / Cosmos DB / Databricks / Event Hubs / Stream Analytics
5. Knowledge of medallion architecture, data vaults, data marts, etc.
6. Preferably Azure Data Associate exam certified.
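The posting above asks for knowledge of the medallion architecture. As a toy, framework-agnostic illustration of the bronze/silver/gold layering it refers to (on Azure these layers would typically be Delta tables transformed with PySpark; every name and record here is hypothetical):

```python
# Toy medallion flow: bronze = raw data as landed, silver = cleansed and
# typed, gold = business-level aggregate. Illustrative only.

bronze = [  # raw events exactly as ingested, including a bad record
    {"order_id": "A1", "amount": "10.5", "country": " in "},
    {"order_id": "A2", "amount": "7.0",  "country": "IN"},
    {"order_id": None, "amount": "3.0",  "country": "US"},  # missing key
]

# Silver layer: filter invalid rows, normalise types and casing.
silver = [
    {"order_id": r["order_id"],
     "amount": float(r["amount"]),
     "country": r["country"].strip().upper()}
    for r in bronze if r["order_id"] is not None
]

# Gold layer: aggregate revenue per country for reporting consumers.
gold = {}
for r in silver:
    gold[r["country"]] = gold.get(r["country"], 0.0) + r["amount"]

print(gold)  # {'IN': 17.5}
```

The key design point is that each layer is derived from the one below it, so bad records can be quarantined at silver without losing the raw bronze history.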

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Educational: Bachelor of Engineering, BCom, BBA, BSc, MSc, MCA. Service Line: Data & Analytics Unit. Responsibilities A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: Knowledge of more than one technology. Basics of architecture and design fundamentals. Knowledge of testing tools. Knowledge of agile methodologies. Understanding of project life cycle activities on development and maintenance projects. Understanding of one or more estimation methodologies. Knowledge of quality processes. Basics of the business domain to understand the business requirements. Analytical abilities, strong technical skills, and good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods. Awareness of the latest technologies and trends. Excellent problem-solving, analytical, and debugging skills. Technical and Professional: Primary skills: Technology - Cloud Platform - Azure Development & Solution Architecting. Preferred Skills: Technology - Cloud Platform - Azure Development & Solution Architecting.

Posted 1 month ago

Apply

2.0 - 5.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Educational: Bachelor of Engineering, BCom, BSc, BBA, MSc, MCA. Service Line: Data & Analytics Unit. Responsibilities A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: Knowledge of more than one technology. Basics of architecture and design fundamentals. Knowledge of testing tools. Knowledge of agile methodologies. Understanding of project life cycle activities on development and maintenance projects. Understanding of one or more estimation methodologies. Knowledge of quality processes. Basics of the business domain to understand the business requirements. Analytical abilities, strong technical skills, and good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods. Awareness of the latest technologies and trends. Excellent problem-solving, analytical, and debugging skills. Technical and Professional: Primary skills: Technology - Cloud Platform - Azure Development & Solution Architecting. Preferred Skills: Technology - Cloud Platform - Azure Development & Solution Architecting.

Posted 1 month ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Chennai

Remote

Skills: Azure Data Factory, Data Lake, Synapse SQL DWH, Databricks, Airflow, Python, PySpark, SQL, Terraform. Experience in designing and building highly scalable data platforms and pipelines. Hands-on experience in developing end-to-end big data pipelines using Azure, AWS, and/or open-source big data tools and technologies. Experience in data lake and data warehousing solutions. Experience in extracting data from APIs and cloud services (Salesforce, Eloqua, S3, SQL and NoSQL on-prem/cloud databases) using Azure Data Factory, Glue, or open-source data ingestion tools. Experience in creating complex data processing pipelines (ETLs) using PySpark/Scala on Databricks, Glue, or EMR. In-depth knowledge of Spark architecture and experience in improving performance and optimization. Experience using cloud data warehouses (e.g. Synapse/SQL DWH, Redshift, Snowflake) to build and manage data models and to present data securely. Good understanding of distributed data processing and MPP. Knowledge of common DevOps skills and methodologies. Experience in Azure DevOps, GitHub, GitLab. Experience in using Azure ARM templates or infrastructure as code (Terraform). In-depth understanding of OLTP, OLAP, data warehousing, and data modeling, plus strong analytical and problem-solving skills. Hands-on experience using MySQL, MS SQL Server, Oracle, or a similar RDBMS platform. Highly self-motivated, self-directed, and attentive to detail. Ability to effectively prioritize and execute tasks. Additional skillsets: Good to have experience in Docker and Kubernetes. Experience working across Azure IaaS, Azure PaaS, Azure Networking, and other areas of the platform. Databricks Delta Lake and Lakehouse. Familiarity with data quality management methodology and supporting technology tools. Familiarity with data visualization tools, e.g. Power BI, Tableau, etc.

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Role and Responsibilities Develop, execute, and maintain automated test scripts using Selenium, Python, or Java. Conduct thorough testing of various modules, including Reporting, Chart of Accounts, General Ledger, Accounting Model, IFRS17 and Actuarial, Insurance, Re-insurance, Investment, Payroll, Procurement, Expense Allocation, and Consolidation. Ensure seamless integration with our Azure Databricks data repository. Collaborate with Agile teams to ensure comprehensive test coverage and timely delivery. Identify, document, and track defects and issues. Perform regression testing to ensure existing functionality is not affected by new changes. Work closely with developers, business analysts, and other stakeholders to understand requirements and provide feedback. Continuously improve test processes and methodologies. Qualifications Bachelor's degree in Computer Science, Information Technology, Finance, or a related field. Proven experience as a Test Engineer, preferably in the finance or insurance industry. Strong knowledge of test automation tools and frameworks, particularly Selenium, Python, and Java. Familiarity with Oracle Fusion and its modules. Experience with Azure Databricks and data integration processes. Strong analytical and problem-solving abilities. Excellent communication and interpersonal skills. Ability to work under pressure and manage multiple priorities. Preferred skills Certification in software testing (e.g., ISTQB). Experience with IFRS17 and actuarial processes. Knowledge of insurance and re-insurance operations. Familiarity with investment, payroll, procurement, and expense allocation processes.
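One kind of check the General Ledger testing responsibility above implies is the double-entry invariant: every journal entry's debits must equal its credits. This is a hypothetical, standalone sketch of such a regression assertion; the real suite would drive the application UI with Selenium and read the data from the Azure Databricks repository, and all field names here are invented:

```python
# Hypothetical GL regression check: a journal entry balances when its
# total debits equal its total credits (within a rounding tolerance).

def entry_is_balanced(lines, tolerance=0.005):
    """Return True if debit and credit totals match for one journal entry."""
    debits = sum(l["amount"] for l in lines if l["side"] == "D")
    credits = sum(l["amount"] for l in lines if l["side"] == "C")
    return abs(debits - credits) <= tolerance

journal = [
    {"account": "1000", "side": "D", "amount": 250.00},  # cash (debit)
    {"account": "4000", "side": "C", "amount": 200.00},  # revenue (credit)
    {"account": "2200", "side": "C", "amount": 50.00},   # tax payable
]
print(entry_is_balanced(journal))  # True
```

In a pytest suite this would become one parametrised assertion per journal entry pulled from the data repository.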

Posted 1 month ago

Apply

6.0 - 7.0 years

14 - 18 Lacs

Pune

Work from Office

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total Exp: 6-7 Yrs (Relevant: 4-5 Yrs) Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Ahmedabad

Work from Office

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Microsoft Azure Analytics Services Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : BE Summary: As an Application Lead for Packaged Application Development, you will be responsible for designing, building, and configuring applications using Microsoft Azure Analytics Services. Your typical day will involve leading the effort to deliver high-quality applications, acting as the primary point of contact for the project team, and ensuring timely delivery of project milestones. Roles & Responsibilities: - Lead the effort to design, build, and configure applications using Microsoft Azure Analytics Services. - Act as the primary point of contact for the project team, ensuring timely delivery of project milestones. - Collaborate with cross-functional teams to ensure the successful delivery of high-quality applications. - Provide technical guidance and mentorship to team members, ensuring adherence to best practices and standards. Professional & Technical Skills: - Must Have Skills: Strong experience with Microsoft Azure Analytics Services. - Good To Have Skills: Experience with other Azure services such as Azure Data Factory, Azure Databricks, and Azure Synapse Analytics. - Experience in designing, building, and configuring applications using Microsoft Azure Analytics Services. - Must have Databricks and PySpark skills. - Strong understanding of data warehousing concepts and best practices. - Experience with ETL processes and tools such as SSIS or Azure Data Factory. - Experience with SQL and NoSQL databases. - Experience with Agile development methodologies. 
Additional Information:- The candidate should have a minimum of 5 years of experience in Microsoft Azure Analytics Services.- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality applications.- This position is based at our Bengaluru office. Qualification BE

Posted 1 month ago

Apply

6.0 - 7.0 years

14 - 18 Lacs

Bengaluru

Work from Office

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total Exp: 6-7 Yrs (Relevant: 4-5 Yrs) Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.

Posted 1 month ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Microsoft Azure Databricks Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project objectives with organizational goals, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will require you to balance technical oversight with team management, fostering an environment of innovation and collaboration. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must Have Skills: Proficiency in Microsoft Azure Databricks. - Good To Have Skills: Experience with cloud computing platforms. - Strong understanding of application development methodologies. - Experience in managing cross-functional teams. - Familiarity with Agile and DevOps practices. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Databricks. - This position is based at our Bengaluru office. - A 15 years full time education is required. 
Qualification 15 years full time education

Posted 1 month ago

Apply

2.0 - 5.0 years

13 - 17 Lacs

Gurugram

Work from Office

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Strong MS SQL and Azure Databricks experience. Implement and manage data models in DBT, handling data transformation and alignment with business requirements. Ingest raw, unstructured data as structured datasets into a cloud object store. Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting. Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance. Preferred technical and professional experience Establish DBT best practices to improve performance, scalability, and reliability. Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Databricks. Proven interpersonal skills while contributing to team effort by accomplishing related results as required.
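The DBT responsibilities above (turning raw source tables into structured staging models with SQL) can be sketched with an in-memory SQLite database standing in for the warehouse. Table and column names are invented for illustration, and a real dbt model would live in a `.sql` file with Jinja configuration rather than an inline string:

```python
import sqlite3

# Stand-in for a dbt staging model: select from a "raw" source table,
# trim and retype the columns, and materialise the cleaned result.
# In dbt this SELECT would be e.g. models/staging/stg_orders.sql.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id TEXT, amount TEXT, placed_at TEXT);
    INSERT INTO raw_orders VALUES
        (' A1 ', '10.50', '2024-01-05'),
        ('A2',   '7',     '2024-01-06');
""")
conn.execute("""
    CREATE VIEW stg_orders AS
    SELECT TRIM(order_id)       AS order_id,
           CAST(amount AS REAL) AS amount,
           DATE(placed_at)      AS placed_at
    FROM raw_orders
""")
rows = conn.execute(
    "SELECT order_id, amount FROM stg_orders ORDER BY order_id").fetchall()
print(rows)  # [('A1', 10.5), ('A2', 7.0)]
```

The pattern mirrors what dbt automates at scale: each model is a SELECT over upstream sources, and dbt handles materialisation, dependencies, and testing around it.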

Posted 1 month ago

Apply

2.0 - 5.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Apply your engineering expertise using the latest Data Engineering and Data Science technologies as a Business Intelligence Development Analyst at the JLL Technologies Global Centre of Expertise in Bangalore, India. The JLL Technologies Product Engineering team aims to bring successful technology-based products to market in a high-growth environment. The team's mission is focused on accelerating technology adoption in commercial real estate by bringing creative, innovative and technical solutions to solve large, complex problems for our clients. Shape the future of real estate for a better world by contributing to the creation of globally scalable products used by JLL's clients and customers, the most respected brands in the world. Experience & Education Experience with one or more public clouds such as Azure, AWS and GCP (Azure preferred). Reliable, self-motivated and self-disciplined individual capable of planning and executing multiple projects simultaneously within a fast-paced environment. Bachelor's degree in Computer Science or a related discipline, or Electronics & Communication Engineering. Advanced degree preferred. 6+ months of experience. Capable of rapid self-learning of new software applications and programming languages. Effective written and verbal communication skills, including technical documentation. Excellent technical, analytical, time management, and organizational skills. Requires excellent collaboration, presentation and communication skills. Technical Skills & Competencies Strong experience in data tools and technologies, particularly MS Azure Databricks and SQL. Strong experience in Python and PySpark. Strong experience in building and maintaining data pipelines. Strong knowledge and working experience in DW/BI, Data Engineering and/or Data Science using different tools and in different domains. Good knowledge of GitHub, Agile methodologies and tools. 
Nice to have: Experience in Data Science, AI/ML - good understanding and demonstrated application of the concepts, various models and algorithms. Nice to have: Experience in Tableau, Power BI or other reporting tools.

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Mumbai

Work from Office

The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging the knowledge of market drivers and competition to effectively anticipate trends and opportunities. Besides, the leader must demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, and take the lead towards raising the performance bar, build capability and bring out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes. Associate Program Manager Role and responsibilities: Represent eClerx in client pitches and external forums. Own platform and expertise through various COE activities and content generation to promote practice and business development. Lead continuous research and assessments to explore best and latest platforms, approaches, and methodologies. Contribute to developing the practice area through best practices, ideas, and Point of Views in the form of white papers and micro articles. Lead/partner in multi-discipline assessments and workshops at client sites to identify new opportunities. Lead key projects and provide development/technical leadership to junior resources. Drive solution design and build to ensure scalability, performance, and reuse. Design robust data architectures, considering performance, data quality, scalability, and data latency requirements. 
Recommend and drive consensus around preferred data integration and platform approaches, including Azure and Snowflake. Anticipate data bottlenecks (latency, quality, speed) and recommend appropriate remediation strategies. This is a hands-on position with a significant development component, and the ideal candidate is expected to lead the technical development and delivery of highly visible and strategic projects. Technical and Functional skills: Bachelor's Degree with at least 2-3 large-scale Cloud implementations within Retail, Manufacturing, or Technology industries. 10+ years of overall experience with data management and cloud engineering. Expertise in Azure Cloud, Azure Data Lake, Databricks, Snowflake, Teradata, and compatible ETL technologies. Strong attention to detail and ability to collaborate with multiple parties, including analysts, data subject matter experts, external labs, etc.

Posted 1 month ago

Apply