
1016 ETL Process Jobs - Page 36

JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

12.0 - 15.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: Databricks Unified Data Analytics Platform
Good to Have Skills: NA
Minimum Experience: 12 years
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable enough to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Mentor junior professionals in best practices for data engineering.
- Continuously evaluate and improve data processes to enhance efficiency.

Professional & Technical Skills:
- Must Have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data pipeline architecture and design.
- Experience with ETL processes and data integration techniques.
- Familiarity with data quality frameworks and data governance practices.
- Knowledge of cloud platforms and services related to data analytics.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
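The role above centres on the extract-transform-load pattern. As a rough, hedged illustration of that pattern in plain Python with invented records and an in-memory SQLite target (not the Databricks API the posting actually requires):

```python
import sqlite3

def extract():
    # In a real pipeline this would read from files, APIs, or source tables;
    # these records are invented for illustration.
    return [
        {"id": 1, "name": " Alice ", "amount": "120.50"},
        {"id": 2, "name": "Bob", "amount": "80.00"},
    ]

def transform(rows):
    # Clean whitespace and cast types so the load step gets consistent data.
    return [(r["id"], r["name"].strip(), float(r["amount"])) for r in rows]

def load(rows, conn):
    # Land the cleaned rows in the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
result = conn.execute("SELECT name, amount FROM sales").fetchall()
print(result)
# → [('Alice', 120.5), ('Bob', 80.0)]
```

The same extract/transform/load split scales up directly: in Databricks the three functions would become reads, DataFrame transformations, and writes to Delta tables.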

Posted 2 months ago

Apply

5.0 - 9.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Data Engineering Lead Analyst - HIH - Evernorth

About Evernorth: Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Position Summary: The Data Engineering Lead Analyst will be a key contributor to the successful delivery of ETL Production Services and Application Support. To succeed in this role, you must understand ETL processes and have experience supporting Production Services applications. You must also have excellent communication skills and be able to build and maintain relationships with both internal and external stakeholders. They will be responsible for partnering with key players in Customer Service, Client Service, Provider Service, and Workforce Planning, while drawing on support from Technology, Finance, Strategy, Operational Readiness, and Solutions Delivery.

Essential Functions:
- Production support activities such as troubleshooting, operational efficiencies, release management, etc.
- Develop SSIS code, steward existing ETL jobs, manage database space, and tune queries.
- Provide support for defect resolution and triage.
- Participate in proofs of concept for software tools/technologies related to application or process development.
- Provide support on CI/CD pipelines and automation.
- Ultimately responsible for maintaining solutions and documentation related to production systems, including standard operating procedures and troubleshooting guides.

Qualifications:
- Experience in a production support role.
- Experience tuning and optimizing SQL and PL/SQL code to reduce execution time or improve efficiency.
- Hands-on experience with SSIS, SQL Server, Oracle, and SQL.
- Experience with scheduling tools such as CA Workload Automation.
- Strong understanding of database concepts and experience with query optimization and performance tuning.
- Excellent troubleshooting and problem-solving skills, with the ability to analyze complex issues and implement effective solutions.
- Strong communication and interpersonal skills, with the ability to work collaboratively in a fast-paced environment.

Primary Skills: SSIS; SQL (SQL Server, Oracle, Teradata); job scheduling (Airflow DAGs, Control-M, etc.); performance tuning; site reliability engineering; issue resolution; Python; DevOps; basic cloud experience (AWS/Azure/GCP).

Additional Skills: Databricks, cloud certification (AWS Cloud Practitioner or similar), Apache Spark.

Join us in driving growth and improving lives.
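The posting above stresses query optimization and performance tuning. A minimal, generic sketch of the classic fix, adding an index and checking the execution plan, shown here with Python's built-in sqlite3 for portability (the production stack in the role is SQL Server/Oracle, and the table and column names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, member_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                 [(i, i % 1000, float(i)) for i in range(10000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reveals whether SQLite scans the table or uses an index;
    # the human-readable detail is the fourth column of each plan row.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM claims WHERE member_id = 42"
before = plan(query)   # full table scan: every row is examined
conn.execute("CREATE INDEX idx_claims_member ON claims(member_id)")
after = plan(query)    # now a seek via idx_claims_member
print(before)
print(after)
```

The same workflow applies on SQL Server or Oracle with their own plan viewers (execution plans, `EXPLAIN PLAN`); the tuning step is reading the plan, spotting the scan, and giving the optimizer an access path.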

Posted 2 months ago

Apply

5.0 - 8.0 years

11 - 16 Lacs

Bengaluru

Work from Office

With 54 facilities worldwide, Eurofins BioPharma Product Testing (BPT) is the largest network of bio/pharmaceutical GMP product testing laboratories, providing comprehensive laboratory services for the world's largest pharmaceutical, biopharmaceutical, and medical device companies. Behind the scenes, BPT is enabled by global engineering teams working on next-generation applications like the Eurofins Quality Management System (eQMS). eQMS is a sophisticated web application that will be used by our scientists, engineers, and technicians to manage several quality and compliance management processes. This role reports to an Engineering Manager.

Required Experience and Skills:
- 4 to 7 years of experience designing, planning, and executing large-scale data migrations.
- Advanced skills with ETL (extract, transform and load) tools such as Azure Data Factory.
- Strong working knowledge of relational database technologies like SQL Server and NoSQL database technologies like MongoDB.
- Strong working knowledge of, and proficiency in, scripting and programming languages such as PowerShell and SQL.
- Familiarity with shell scripting for automation tasks.
- Experience with version control systems (e.g., TFS, Git).
- Able to provide technical recommendations and solve technical problems.
- Should have worked in an Agile methodology (preferably Scrum).
- Excellent problem-solving and debugging skills, and attention to detail.
- Experience with performance tuning of data migration processes.
- Experience in data validation, cleansing, and testing for accuracy.
- Understanding of data security and compliance requirements.
- Attention to detail and a passion for creating exceptional user experiences.

Desirable Experience:
- Knowledge of data governance practices and a strong understanding of data privacy and ethical considerations.
- Familiarity with database backup strategies, data validation, and quality assurance.
- Ability to optimize the data migration process to minimize downtime and maximize efficiency.
- Good working knowledge of databases (MongoDB, Cosmos DB, MS SQL Server, etc.).

Performance Appraisal Criteria: Eurofins has a strong focus on its performance management system, which includes quarterly calibrations, half-yearly reviews, and annual reviews. KPIs will be set and may vary slightly between projects; these will be clearly communicated and documented during your first 30 days.

Qualifications: Bachelor's in Engineering, Computer Science, or equivalent.

Posted 2 months ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Kolkata

Work from Office

Capgemini Invent: Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your role:
- Develop and maintain data pipelines tailored to Azure environments, ensuring security and compliance with client data standards.
- Collaborate with cross-functional teams to gather data requirements, translate them into technical specifications, and develop data models.
- Leverage Python libraries for data handling, enhancing processing efficiency and robustness.
- Ensure SQL workflows meet client performance standards and handle large data volumes effectively.
- Build and maintain reliable ETL pipelines, supporting full and incremental loads and ensuring data integrity and scalability in ETL processes.
- Implement CI/CD pipelines for automated deployment and testing of data solutions.
- Optimize and tune data workflows and processes to ensure high performance and reliability.
- Monitor, troubleshoot, and optimize data processes for performance and reliability.
- Document data infrastructure and workflows, and maintain industry knowledge in data engineering and cloud technology.

Your profile:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 4+ years of data engineering experience with a strong focus on Azure data services for client-centric solutions.
- Extensive expertise in Azure Synapse, Data Lake Storage, Data Factory, Databricks, and Blob Storage, ensuring secure, compliant data handling for clients.
- Good interpersonal communication skills.
- Skilled in designing and maintaining scalable data pipelines tailored to client needs in Azure environments.
- Proficient in SQL and PL/SQL for complex data processing and client-specific analytics.

What you will love about working here: We recognize the significance of flexible work arrangements. Be it remote work or flexible work hours, you will get an environment that supports a healthy work-life balance. At the heart of our mission is your career growth; our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as generative AI.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
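The role above calls for ETL pipelines "supporting full and incremental loads". One common incremental-load technique is the high-watermark check: only pull source rows newer than the newest row already loaded. A hedged stdlib sketch with invented table names (a real Azure pipeline would express this in Data Factory or Databricks):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source (id INTEGER, updated_at TEXT);
    CREATE TABLE target (id INTEGER, updated_at TEXT);
    INSERT INTO source VALUES (1, '2024-01-01'), (2, '2024-01-05'), (3, '2024-02-01');
""")

def incremental_load(conn):
    # High-watermark pattern: the newest updated_at already in the target
    # bounds what must be fetched from the source this run.
    (watermark,) = conn.execute(
        "SELECT COALESCE(MAX(updated_at), '') FROM target").fetchone()
    rows = conn.execute(
        "SELECT id, updated_at FROM source WHERE updated_at > ?",
        (watermark,)).fetchall()
    conn.executemany("INSERT INTO target VALUES (?, ?)", rows)
    return len(rows)

first = incremental_load(conn)   # initial run behaves like a full load
second = incremental_load(conn)  # nothing new, so nothing is moved
print(first, second)
# → 3 0
```

A full load is the degenerate case (empty target, empty watermark); the watermark is what keeps repeat runs cheap and idempotent.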

Posted 2 months ago

Apply

9.0 - 12.0 years

4 - 8 Lacs

Bengaluru

Work from Office

DataStage administrator responsible for provisioning user accounts and access, and for installing DataStage on Linux or Windows servers.

Installation, Configuration, and Upgrades:
- Install, configure, and upgrade IBM InfoSphere DataStage (IIS) and related products.
- Configure DataStage for various data sources, including plug-ins, connectors, and ODBC configurations.
- Manage and maintain the DataStage environment, including servers and infrastructure.

User and Security Management:
- Create, manage, and delete DataStage projects.
- Assign roles and permissions to users within the DataStage environment.
- Manage user access and security, including creating users and groups.

Performance Monitoring and Optimization:
- Monitor the performance of DataStage jobs and processes.
- Identify and resolve performance bottlenecks.
- Implement performance tuning techniques.

Troubleshooting and Problem Resolution:
- Diagnose and resolve DataStage issues, including job failures, locks, and other problems.
- Work with IBM support to resolve complex issues.

Skills Required:
- Strong understanding of ETL concepts and processes.
- Experience with IBM InfoSphere DataStage (IIS).
- Experience with Linux/Unix systems administration.
- Experience with scripting languages (e.g., Bash, Python, PowerShell).
- Experience with SQL and databases.
- Excellent problem-solving and troubleshooting skills.
- Good communication and interpersonal skills.

Tools and Technologies: IBM InfoSphere DataStage (IIS), DataStage Director, DataStage Administrator client, Linux/Unix, scripting languages (e.g., Bash, Python, PowerShell), database tools and technologies.
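The admin role above pairs job-failure troubleshooting with scripting. A hypothetical stdlib-Python sketch of the kind of automation involved, scanning a job log for aborted runs (the log format, job names, and statuses here are invented; real DataStage monitoring would typically go through the dsjob command line or the Director client):

```python
import re

# Invented log excerpt; a real script would read job logs from disk or dsjob output.
log = """\
2024-05-01 02:00:11 job=LOAD_CUSTOMERS status=FINISHED
2024-05-01 02:05:42 job=LOAD_ORDERS status=ABORTED
2024-05-01 02:10:03 job=LOAD_ITEMS status=FINISHED
"""

def failed_jobs(text):
    # Collect the names of jobs whose status line indicates an abort.
    return [m.group(1) for m in re.finditer(r"job=(\S+) status=ABORTED", text)]

failures = failed_jobs(log)
print(failures)
# → ['LOAD_ORDERS']
```

In practice a script like this would feed alerting or an automatic restart rather than just print.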

Posted 2 months ago

Apply

0.0 - 1.0 years

0 Lacs

Ahmedabad

Work from Office

Job Title: Data Engineer Intern
Location: Ahmedabad (Work from Office)
Duration: 5 months
Start Date: Immediate or as per availability
Company: FX31 Labs

Role Overview: We are looking for a curious and detail-oriented Data Engineer Intern to join our data team. This internship will give you the opportunity to work on large-scale data projects, build ETL pipelines, manage databases, and support the team in building data solutions that power decision-making.

Key Responsibilities:
- Assist in designing and developing ETL pipelines for collecting, transforming, and loading data.
- Support data integration from multiple sources like APIs, files, and databases.
- Clean, validate, and preprocess large datasets for analytical or operational use.
- Help maintain and optimize data warehouses or data lakes.
- Collaborate with Data Scientists and Analysts to provide necessary datasets.
- Document data flows, data dictionaries, and pipeline architecture.

Required Skills:
- Basic knowledge of SQL and data querying.
- Exposure to Python, especially for data handling (pandas, NumPy, etc.).
- Understanding of databases (MySQL, PostgreSQL, MongoDB, etc.).
- Awareness of ETL processes, data modeling, and data wrangling.
- Familiarity with cloud platforms (AWS, GCP, or Azure) is a plus.
- Strong analytical mindset and eagerness to learn new data technologies.

Good to Have:
- Exposure to big data tools like Spark, Hadoop, or Airflow.
- Experience with Linux/shell scripting for automation.
- Knowledge of version control (Git) and basic DevOps practices.
- Interest in data visualization or reporting tools (Tableau, Power BI, etc.).

Eligibility Criteria:
- Pursuing or recently completed a degree in Computer Science, IT, Data Science, or a related field.
- Available to work full-time from the Ahmedabad office for the duration of the internship.

Perks:
- Certificate of Internship and Letter of Recommendation upon successful completion.
- Mentorship from experienced engineers on real-time data projects.
- Opportunity to receive a PPO (Pre-Placement Offer) based on performance.
- Practical exposure to industry-grade tools, data challenges, and teamwork.

About FX31 Labs: FX31 Labs is a fast-growing tech company delivering cutting-edge solutions in AI, data engineering, and digital product development. Our mission is to provide a learning-rich environment, encouraging innovation and hands-on experience with real-world data challenges.

Warm Regards, Team FX31 Labs
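The internship above lists cleaning and validating datasets among its core tasks. A stdlib-only sketch of simple row validation before loading (the field names and rules are invented for illustration; in practice the team might reach for pandas):

```python
import csv
import io

# Invented sample data with one bad email and one non-numeric age.
raw = """id,email,age
1,alice@example.com,30
2,not-an-email,25
3,carol@example.com,abc
"""

def validate(reader):
    # Keep rows with a plausible email and an integer age; quarantine the rest.
    good, bad = [], []
    for row in reader:
        ok = "@" in row["email"] and row["age"].isdigit()
        (good if ok else bad).append(row["id"])
    return good, bad

good, bad = validate(csv.DictReader(io.StringIO(raw)))
print(good, bad)
# → ['1'] ['2', '3']
```

Routing rejects to a quarantine list instead of silently dropping them is the usual practice, so data quality problems stay visible.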

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Coimbatore

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: SAP Data Services Development
Good to Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring that the applications are developed according to specifications and delivered on time. Your typical day will involve collaborating with the team, making team decisions, and engaging with multiple teams to contribute to key decisions. You will also provide solutions to problems for your immediate team and across multiple teams, showcasing your expertise in SAP Data Services Development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Design, build, and configure applications to meet business process and application requirements.
- Ensure that applications are developed according to specifications.
- Deliver applications on time.

Professional & Technical Skills:
- Must Have: Proficiency in SAP Data Services Development.
- Strong understanding of data integration and ETL processes.
- Experience in designing and implementing data migration solutions.
- Knowledge of SAP Data Services architecture and components.
- Experience in troubleshooting and resolving data quality issues.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Data Services Development.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Ab Initio
Good to Have Skills: Data Warehouse ETL Testing
Minimum Experience: 5 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to ensure the successful development of applications.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct code reviews and ensure best practices are followed.

Professional & Technical Skills:
- Must Have: Proficiency in Ab Initio.
- Good to Have: Experience with Data Warehouse ETL Testing.
- Strong understanding of data integration and ETL processes.
- Hands-on experience in developing and implementing data pipelines.
- Knowledge of data quality and data governance principles.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Posted 2 months ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Pune

Work from Office

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must Have Skills: SAP BusinessObjects Data Services
Good to Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years full time education

Summary: As a Software Development Lead, you will be responsible for developing and configuring software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams, applying your knowledge of technologies and methodologies, and ensuring that the software solutions meet client requirements effectively and efficiently. You will engage in problem-solving and decision-making processes that contribute to the overall success of the projects you oversee.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring within the team to enhance overall performance.
- Monitor project progress and ensure alignment with project goals and timelines.

IDQ Skills:
- Utilize the IDQ platform to profile ECC and S4 data and implement data rules in IDQ.
- Measure whether the SAP ECC source data technically complies with the S4 data model (e.g., all mandatory fields filled, data type and length valid).
- Prior knowledge of IDQ (and ideally the IDP platform as a whole).

SAP Data Migration Skills:
- Knowledge of data modeling; SAP data model knowledge (typical concepts, tables, etc.) is highly valued.
- SAP functional/configuration knowledge is less relevant.
- Strong analytical skills.
- Ability to communicate with client personnel.

Professional & Technical Skills:
- Must Have: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and ETL processes.
- Experience with data quality management and data profiling.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve technical issues efficiently.
- Experience with ECC and S/4HANA and migration projects.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.
- This position is based in Pune.
- A 15 years full time education is required.

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Ab Initio
Good to Have Skills: Unix Shell Scripting, Hadoop Administration, PySpark
Minimum Experience: 3 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement efficient and scalable application solutions.
- Collaborate with cross-functional teams to analyze and address technical issues.
- Conduct code reviews and provide constructive feedback to team members.
- Stay updated on industry trends and best practices to enhance application development processes.
- Assist in troubleshooting and resolving application-related issues.

Professional & Technical Skills:
- Must Have: Proficiency in Ab Initio.
- Good to Have: Experience with Unix Shell Scripting, Hadoop Administration, PySpark.
- Strong understanding of ETL processes and data integration.
- Experience in developing and optimizing data pipelines.
- Knowledge of data warehousing concepts and methodologies.
- Familiarity with database technologies and SQL queries.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Ab Initio
Good to Have Skills: Data Warehouse ETL Testing
Minimum Experience: 5 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct code reviews and ensure adherence to coding standards.

Professional & Technical Skills:
- Must Have: Proficiency in Ab Initio.
- Good to Have: Experience with Data Warehouse ETL Testing.
- Strong understanding of data integration and ETL processes.
- Experience in developing and implementing data solutions.
- Knowledge of data modeling and database design.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Ab Initio
Good to Have Skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement efficient Ab Initio applications.
- Collaborate with team members to troubleshoot and resolve application issues.
- Conduct regular code reviews to ensure quality and efficiency.
- Stay updated on industry trends and best practices in application development.
- Provide mentorship and guidance to junior team members.

Professional & Technical Skills:
- Must Have: Proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration.
- Experience with data quality and data governance principles.
- Knowledge of SQL and database management systems.
- Good to Have: Experience with data modeling and data warehousing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 2 months ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must Have Skills: SAP BusinessObjects Data Services
Good to Have Skills: NA
Minimum Experience: 12 years
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing project progress, coordinating with teams, and ensuring successful application delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact.
- Ensure successful application delivery.
- Coordinate with cross-functional teams.

Professional & Technical Skills:
- Must Have: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and ETL processes.
- Experience in leading application development projects.
- Knowledge of the SAP BusinessObjects platform.
- Hands-on experience in configuring and optimizing applications.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 2 months ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must Have Skills: Data Engineering
Good to Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full time education

Summary: As a Data Engineering Application Lead, you will be responsible for leading the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve working with data pipelines, ETL processes, and data warehousing to ensure data quality and integrity.

Roles & Responsibilities:
- Lead the design, development, and implementation of data pipelines, ETL processes, and data warehousing solutions to ensure data quality and integrity.
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and design scalable solutions.
- Develop and maintain data models, schemas, and data dictionaries to ensure consistency and accuracy across the organization.
- Manage and mentor a team of data engineers, providing technical guidance and support to ensure successful project delivery.
- Stay updated with the latest advancements in data engineering and big data technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must Have: Strong experience in data engineering, including data pipelines, ETL processes, and data warehousing.
- Must Have: Proficiency in programming languages such as Python, Java, or Scala.
- Good to Have: Experience with big data technologies such as Hadoop, Spark, or Kafka.
- Good to Have: Experience with cloud-based data platforms such as AWS, Azure, or GCP.
- Strong understanding of database technologies such as SQL, NoSQL, and columnar databases.
- Solid grasp of data modeling, schema design, and data dictionary management.
- Experience with data visualization tools such as Tableau or Power BI.
- Experience with agile development methodologies and project management tools such as JIRA or Trello.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in data engineering.
- The ideal candidate will possess a strong educational background in computer science, software engineering, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Ab Initio
Good to Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop and enhance applications for various business needs.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the application development process.
- Implement best practices for application design and development.
- Ensure timely delivery of high-quality applications.

Professional & Technical Skills:
- Must Have: Proficiency in Ab Initio, with a minimum of 5 years of experience in Ab Initio.
- Strong understanding of ETL processes.
- Experience with data integration and data warehousing.
- Hands-on experience in designing and developing applications using Ab Initio.
- Knowledge of data modeling and database concepts.

Additional Information:
- This position is based at our Pune, Bengaluru & Chennai offices.
- A 15 years full-time education is required.

Posted 2 months ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-Have Skills: SAP BW/4HANA
Good-to-Have Skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full-time education
Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of enhancements and maintenance tasks, while also contributing to the development of new features that meet client needs. You will be responsible for troubleshooting issues and ensuring the quality of the application through rigorous testing and validation processes, all while adapting to the evolving requirements of the projects you are involved in.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.
Professional & Technical Skills:
- Must-Have: Proficiency in SAP BW/4HANA.
- Strong understanding of data modeling and ETL processes.
- Experience with the SAP HANA database and its functionalities.
- Familiarity with reporting tools and techniques within the SAP ecosystem.
- Ability to troubleshoot and optimize performance issues in SAP BW/4HANA.
Additional Information:
- The candidate should have a minimum of 2 years of experience in SAP BW/4HANA.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years full-time education

Posted 2 months ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: AWS Glue
Good-to-Have Skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will require a balance of technical expertise and leadership skills to drive project success and foster a collaborative team environment.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.
Professional & Technical Skills:
- Must-Have: Proficiency in AWS Glue.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data warehousing concepts and best practices.
- Ability to troubleshoot and optimize data workflows.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in AWS Glue.
- This position is based in Pune.
- 15 years of full-time education is required.
Qualification: 15 years full-time education

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving discussions, contribute to the overall project strategy, and continuously refine your skills to enhance application performance and user experience.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.
Professional & Technical Skills:
- Must-Have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and analytics.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data visualization techniques and tools.
Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Qualification: 15 years full-time education

Posted 2 months ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Good-to-have skills: Cloud, SQL, data analysis
Location: Pune - Kharadi - WFO - 3 days/week
Job Description: We are seeking a highly skilled and experienced Python Lead to join our team. The ideal candidate will have strong expertise in Python coding and development, along with good-to-have skills in cloud technologies, SQL, and data analysis.
Key Responsibilities:
- Lead the development of high-quality, scalable, and robust Python applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Develop RESTful applications using frameworks like Flask, Django, or FastAPI.
- Utilize Databricks, PySpark SQL, and strong data analysis skills to drive data solutions.
- Implement and manage modern data solutions using Azure Data Factory, Data Lake, and Databricks.
Mandatory Skills:
- Proven experience with cloud platforms (e.g., AWS)
- Strong proficiency in Python, PySpark, and R, and familiarity with additional programming languages such as C++, Rust, or Java
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure)
- Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus
Good-to-Have Skills:
- Experience with modern data solutions on Azure
- Knowledge of the principles summarized in the Microsoft Cloud Adoption Framework
- Additional expertise in SQL and data analysis
Educational Qualifications: Bachelor's/Master's degree or equivalent with a focus on software engineering.
If you are a passionate Python developer with a knack for cloud technologies and data analysis, we would love to hear from you. Join us in driving innovation and building cutting-edge solutions!

Posted 2 months ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Role Proficiency: Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.
Outcomes:
- Interpret the application/feature/component design and develop it in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimize efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on team members' FAST goals.
Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during project execution
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings
Outputs Expected:
- Code: Code as per design; follow coding standards, templates, and checklists; review code for team and peers.
- Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, requirements, and test cases/results.
- Configure: Define and govern the configuration management plan; ensure compliance from the team.
- Test: Review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
- Domain relevance: Advise software developers on the design and development of features and components, with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to add value for customers; complete relevant domain certifications.
- Manage Project: Manage delivery of modules and/or user stories.
- Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
- Estimate: Create and provide input for effort estimation for projects.
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review reusable documents created by the team.
- Release: Execute and monitor the release process.
- Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, features, business components, and data models.
- Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
- Manage Team: Set FAST goals and provide feedback; understand aspirations of team members and provide guidance and opportunities; ensure the team is engaged in the project.
- Certifications: Obtain relevant domain/technology certifications.
Skill Examples:
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Break down complex problems into logical components.
- Develop user interfaces and business software components; use data models.
- Estimate the time, effort, and resources required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Make quick decisions on technical/project-related challenges.
- Manage a team, mentor, and handle people-related issues; maintain high motivation levels and positive dynamics in the team.
- Interface with other teams, designers, and other parallel practices.
- Set goals for self and team; provide feedback to team members.
- Create and articulate impactful technical presentations.
- Follow a high level of business etiquette in emails and other business communication.
- Drive conference calls with customers, addressing customer questions.
- Proactively ask for and offer help.
- Work under pressure; determine dependencies and risks, facilitate planning, and handle multiple tasks.
- Build confidence with customers by meeting deliverables on time and with quality.
- Make appropriate utilization of software and hardware.
- Strong analytical and problem-solving abilities.
Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical design
- Programming languages: proficiency in multiple skill clusters
- DBMS, operating systems, and software platforms
- Software Development Life Cycle
- Agile: Scrum or Kanban methods
- Integrated development environments (IDE)
- Rapid application development (RAD)
- Modelling technology and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved
Additional Comments:
Mandatory Skills, Knowledge, and Experience:
- Python Development (6+ years): Strong backend development experience, including RESTful APIs and FastAPI.
- Generative AI & OpenAI: Practical experience working with Gen AI models and integrating OpenAI APIs into real-world applications.
- API Development: Proven track record of building and maintaining REST APIs with FastAPI, including authentication, authorization, and rate-limiting.
- Data Engineering: Expertise in ETL processes, data transformation, and analysis using Pandas.
- LLM Prompt Engineering: Experience in prompt design and optimization for large language models.
- Python Data Science Libraries: Proficient in Pandas, NumPy, and other data tools for processing and analysis.
- Version Control & CI/CD: Proficient with Git and CI/CD pipelines for automated deployment and testing.
- Agile/Scrum: 3+ years of experience working in Agile/Scrum environments.
- Testing & Automation: Experience in unit, integration, and automated testing with pytest and unittest.
- Communication: Strong verbal and written communication, with the ability to explain technical concepts to diverse stakeholders.
- Non-Functional Requirements: Experience with performance optimization, scalability, and security in data-centric applications.
Nice-to-Have Skills:
- Cloud Platforms: Familiarity with AWS or GCP, particularly scalable APIs, serverless architecture, and data storage.
- Data Pipelines: Knowledge of Apache Airflow, Kafka, or similar tools for data workflow orchestration.
- ML Frameworks: Experience with scikit-learn, TensorFlow, or PyTorch for model training and deployment.
- Code Quality Tools: Familiarity with SonarQube, ESLint, or similar tools for maintaining high code quality.
Required Skills: Python, Generative AI, API, ETL Tools
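The role above asks for REST APIs with rate-limiting. Framework aside, the core of one common approach, a token bucket, is small. This is an illustrative sketch (the class name, capacities, and fake clock are invented for the example, not taken from the posting):

```python
# Token-bucket rate limiter sketch: each client gets `capacity` tokens that
# refill at `rate` tokens per second; a request is allowed if a token remains.
import time

class TokenBucket:
    def __init__(self, capacity, rate, clock=time.monotonic):
        self.capacity = capacity
        self.rate = rate          # tokens refilled per second
        self.clock = clock        # injectable clock for testability
        self.tokens = float(capacity)
        self.last = clock()

    def allow(self):
        now = self.clock()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Usage with a fake clock so behaviour is deterministic:
t = [0.0]
bucket = TokenBucket(capacity=2, rate=1.0, clock=lambda: t[0])
print(bucket.allow(), bucket.allow(), bucket.allow())  # True True False
t[0] += 1.0   # one second later, one token has refilled
print(bucket.allow())  # True
```

In a FastAPI service this logic would typically live in a dependency or middleware keyed by client identity; the injectable clock keeps the limiter unit-testable without sleeping.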

Posted 2 months ago

Apply

8.0 - 13.0 years

12 - 22 Lacs

Bengaluru

Work from Office

Your Responsibilities:
- Designing and implementing scalable and reliable data pipelines on the Azure platform
- Developing and maintaining data integration solutions using Azure Data Factory, Azure Databricks, and other Azure services
- Ensuring data quality and integrity by implementing best practices in data collection, processing, and storage
- Collaborating with data scientists, data analysts, and other stakeholders to understand their data needs and deliver actionable insights
- Managing and optimizing Azure data storage solutions such as Azure SQL Database, Azure Data Lake, and Azure Cosmos DB
- Monitoring the performance of data pipelines and implementing strategies for continuous improvement
- Developing and maintaining ETL processes to support data warehousing and analytics
- Implementing best practices for data governance, security, and compliance
- Staying up to date with the latest industry trends and technologies to continuously improve data engineering practices and methodologies
- Living Hitachi Energy's core values of safety and integrity, which means taking responsibility for your own actions while caring for your colleagues and the business
Your Background:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 8+ years of experience in data engineering, with a focus on Azure data services
- Relevant certifications in Azure data services or cloud computing are an added advantage
- Proficiency in programming and scripting languages such as Python, SQL, or Scala
- Experience with Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake
- Strong understanding of data modeling, ETL processes, and data warehousing concepts
- Experience with big data technologies such as Hadoop and Spark
- Knowledge of data governance, security, and compliance best practices
- Familiarity with monitoring and logging tools such as Azure Monitor and Log Analytics
- Strong problem-solving and troubleshooting skills
- Excellent communication and collaboration skills to work effectively with cross-functional teams
- Strong attention to detail and organizational skills
- Ability to articulate and present ideas to senior management
- Problem-solving mindset with the ability to work independently and as part of a team
- Eagerness to learn and enhance knowledge unassisted
- Strong networking skills and global orientation
- Ability to coach and mentor team members
- Effective collaboration with internal and external stakeholders
- Adaptability to manage and lead transformational projects
- Proficiency in both spoken and written English

Posted 2 months ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
- Lead the migration of ETLs from the on-premises SQL Server based data warehouse to Azure Cloud, Databricks, and Snowflake
- Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, and Databricks (PySpark)
- Review and analyze existing on-premises ETL processes developed in SSIS and T-SQL
- Implement DevOps practices and CI/CD pipelines using GitHub Actions
- Collaborate with cross-functional teams to ensure seamless integration and data flow
- Optimize and troubleshoot data pipelines and workflows
- Ensure data security and compliance with industry standards
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications:
- 6+ years of experience as a Cloud Data Engineer
- Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage) and Databricks
- Solid experience in ETL development using on-premises databases and ETL technologies
- Experience with Python or other scripting languages for data processing
- Experience with Agile methodologies
- Proficiency in DevOps and CI/CD practices using GitHub Actions
- Proven excellent problem-solving skills and ability to work independently
- Proven solid communication and collaboration skills
- Proven solid analytical skills and attention to detail
- Proven ability to adapt to new technologies and learn quickly
Preferred Qualifications:
- Certification in Azure or Databricks
- Experience with data modeling and database design
- Experience with development in Snowflake for data engineering and analytics workloads
- Knowledge of data governance and data quality best practices
- Familiarity with other cloud platforms (e.g., AWS, Google Cloud)

Posted 2 months ago

Apply

4.0 - 7.0 years

12 - 17 Lacs

Noida

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
The CMDB Analyst is essential in maintaining the integrity and accuracy of the ServiceNow Configuration Management Database (CMDB). This role focuses on enhancing data integration and management through robust ETL processes and detailed data analysis.
Primary Responsibilities:
- Oversee ServiceNow CMDB operations and data integrity
- Develop and refine ETL processes for optimal data integration
- Regularly audit CMDB data to ensure accuracy and reliability
- Collaborate with IT teams to manage configuration item relationships in the CMDB
- Provide training and support to internal customers on CMDB functions
- Align CMDB management processes with organizational goals
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications:
- Bachelor's degree or equivalent
- Proven experience in managing ServiceNow CMDB
- Expertise in SQL and PL/SQL, with a focus on ETL process optimization
- Solid background in database development and data analysis within large-scale environments
- Familiarity with modern, open-source technologies and their integration into existing systems
- Demonstrated ability to develop and deploy database solutions in distributed systems
- Proven excellent customer service skills and the ability to influence successful outcomes
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
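A CMDB audit of the kind this role describes often boils down to SQL checks for duplicate or orphaned configuration items. As a self-contained sketch against an illustrative SQLite table (the schema, table name, and records are invented for the example, not ServiceNow's actual data model):

```python
# Audit sketch: flag duplicate configuration items (same name + class) in a
# toy CMDB table. Schema and records are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cmdb_ci (sys_id TEXT, name TEXT, ci_class TEXT)")
conn.executemany("INSERT INTO cmdb_ci VALUES (?, ?, ?)", [
    ("a1", "web-server-01", "server"),
    ("a2", "web-server-01", "server"),   # duplicate of a1
    ("a3", "db-server-01", "server"),
])

# Group by the attributes that should uniquely identify a CI and report
# any group with more than one record.
dupes = conn.execute("""
    SELECT name, ci_class, COUNT(*) AS n
    FROM cmdb_ci
    GROUP BY name, ci_class
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [('web-server-01', 'server', 2)]
```

The same GROUP BY/HAVING pattern carries over directly to the SQL and PL/SQL work the qualifications list.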

Posted 2 months ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Chennai

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
Technical Leadership:
- Technical Guidance: Provide technical direction and guidance to the development team, ensuring that best practices are followed in coding standards, architecture, and design patterns
- Architecture Design: Design and oversee the architecture of software solutions to ensure they are scalable, reliable, and performant
- Technology Stack: Make informed decisions on the technology stack (.NET for backend services, React for frontend development) to ensure it aligns with project requirements
- Code Reviews: Conduct regular code reviews to maintain code quality and provide constructive feedback to team members
- Hands-on Development: Engage in hands-on coding and development tasks, particularly in complex or critical areas of the project
Project Management:
- Task Planning: Break down project requirements into manageable tasks, assign them to team members, and track progress
- Milestone Tracking: Monitor project milestones and deliverables to ensure timely completion of projects
Data Pipeline & ETL Management:
- Data Pipeline Design: Design robust data pipelines that can handle large volumes of data efficiently using appropriate technologies (e.g., Apache Kafka)
- ETL Processes: Develop efficient ETL processes to extract, transform, and load data from various sources into the analytics platform
Product Development:
- Feature Development: Lead the development of new features from concept through implementation while ensuring they meet user requirements
- Integration Testing: Ensure thorough testing (unit tests, integration tests) is conducted for all features before deployment
Collaboration:
- Cross-functional Collaboration: Collaborate closely with product managers, UX/UI designers, QA engineers, and other stakeholders to deliver high-quality products
- Stakeholder Communication: Communicate effectively with stakeholders regarding project status updates, technical challenges, and proposed solutions
Quality Assurance:
- Performance Optimization: Identify performance bottlenecks within applications or data pipelines and implement optimizations
- Bug Resolution: Triage bugs reported by users or QA teams promptly and ensure timely resolution
Innovation & Continuous Improvement:
- Stay Updated with Trends: Keep abreast of emerging technologies in .NET, React, and data pipeline/ETL tools (such as Apache Kafka or Azure Data Factory) that could benefit the product
- Process Improvement: Continuously seek ways to improve engineering processes for increased efficiency and productivity within the team
Mentorship & Team Development:
- Mentorship: Mentor junior developers by providing guidance on their technical growth and career development opportunities
- Team Building: Foster a positive team environment through regular stand-ups and brainstorming sessions or workshops focused on problem-solving techniques for the team's tech stack (.NET, React, data pipelines)
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Together, these responsibilities ensure that the Lead Software Engineer not only contributes technically but also guides the team toward successful delivery of advanced data analytics products: .NET backend services and React frontend interfaces, connected by robustly engineered pipelines whose ETL processes power insightful analytics for end users.
Required Qualifications:
- Bachelor's Degree: A Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field
- Professional Experience: 8+ years of experience in software development, with significant time spent on both backend (.NET) and frontend (React) technologies
- Leadership Experience: Proven experience in a technical leadership role leading projects or teams
- Technical Expertise: Extensive experience with the .NET framework (C#) for backend development; proficiency with React for frontend development; solid knowledge and hands-on experience with data pipeline technologies (e.g., Apache Kafka); solid understanding of ETL processes and tools such as Databricks, ADF, and Scala/Spark
- Architectural Knowledge: Experience designing scalable and high-performance architectures
- Cloud Services: Experience with cloud platforms such as Azure, AWS, or Google Cloud Platform
- Software Development Lifecycle: Comprehensive understanding of the software development lifecycle (SDLC), including Agile methodologies
- Database Management: Proficiency with SQL and NoSQL databases (e.g., SQL Server, MongoDB)
- Leadership Abilities: Proven solid leadership skills with the ability to inspire and motivate teams
- Communication Skills: Proven superior verbal and written communication skills for effective collaboration with cross-functional teams and stakeholders
- Problem-Solving Abilities: Proven solid analytical and problem-solving skills
Preferred Qualification:
- Advanced Degree (optional): A Master's degree in a relevant field

Posted 2 months ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Gurugram

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a highly skilled and experienced Senior Cloud Data Engineer to join our team for a Cloud Data Modernization project. The successful candidate will be responsible for migrating our on-premises Enterprise Data Warehouse (SQL Server) to a modern cloud-based data platform built on Azure Cloud data tools, Delta Lake, and Snowflake.

Primary Responsibilities
- Lead the migration of ETLs from the on-premises SQL Server data warehouse to Azure Cloud, Databricks, and Snowflake
- Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, and Databricks (PySpark)
- Review and analyze existing on-premises ETL processes developed in SSIS and T-SQL
- Implement DevOps practices and CI/CD pipelines using GitHub Actions
- Collaborate with cross-functional teams to ensure seamless integration and data flow
- Optimize and troubleshoot data pipelines and workflows
- Ensure data security and compliance with industry standards
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, changes in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
- 6+ years of experience as a Cloud Data Engineer
- Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage) and Databricks
- Solid experience in ETL development using on-premises databases and ETL technologies
- Experience with Python or other scripting languages for data processing
- Experience with Agile methodologies
- Proficiency in DevOps and CI/CD practices using GitHub Actions
- Proven problem-solving skills and the ability to work independently
- Solid communication and collaboration skills
- Solid analytical skills and attention to detail
- Ability to adapt to new technologies and learn quickly

Preferred Qualifications
- Certification in Azure or Databricks
- Experience with data modeling and database design
- Experience with development in Snowflake for data engineering and analytics workloads
- Knowledge of data governance and data quality best practices
- Familiarity with other cloud platforms (e.g., AWS, Google Cloud)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
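The CI/CD responsibility above is commonly met with a GitHub Actions workflow that tests pipeline code and deploys notebooks to a Databricks workspace. A hedged sketch only; the workflow name, secret names, paths, and test command are assumptions, not details from the posting:

```yaml
name: data-pipeline-ci

on:
  push:
    branches: [main]

jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Run pipeline unit tests
        run: |
          pip install -r requirements.txt
          pytest tests/
      - name: Deploy notebooks to Databricks workspace
        env:
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
        run: |
          pip install databricks-cli
          databricks workspace import_dir notebooks/ /Shared/etl --overwrite
```

Credentials are injected via repository secrets rather than committed to source, which is the usual practice for workspace tokens in CI.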
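A core pattern behind migrating SSIS/T-SQL ETLs to cloud pipelines of this kind is watermark-driven incremental extraction: each run loads only rows changed since the last successful load. A minimal, hedged sketch in plain Python (standing in for PySpark; the function, column, and variable names are illustrative assumptions, not taken from the posting):

```python
from datetime import datetime

def incremental_extract(rows, watermark):
    """Return rows modified after the last watermark, plus the new watermark.

    rows      -- list of dicts, each with a 'modified_at' datetime column
    watermark -- datetime recorded by the last successful load
    """
    # Keep only rows changed since the previous run
    fresh = [r for r in rows if r["modified_at"] > watermark]
    # Advance the watermark to the newest change seen; keep the old one
    # if nothing changed, so the next run picks up from the same point
    new_watermark = max((r["modified_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

# Example: two of three source rows changed since the last load
rows = [
    {"id": 1, "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "modified_at": datetime(2024, 1, 5)},
    {"id": 3, "modified_at": datetime(2024, 1, 7)},
]
fresh, wm = incremental_extract(rows, datetime(2024, 1, 2))
```

In an ADF/Databricks implementation the same logic typically becomes a filter on a change-tracking column in PySpark, with the watermark persisted between runs (for example in a control table or Delta Lake metadata).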

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

