
340 ETL Development Jobs - Page 8

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

30 - 40 Lacs

Hyderabad

Work from Office

Support enhancements to the MDM platform. Develop pipelines using Snowflake, Python, SQL, and Airflow. Track system performance, troubleshoot issues, and resolve production issues. Required candidate profile: 5+ years of hands-on, expert-level experience with Snowflake, Python, and orchestration tools like Airflow; good understanding of the investment domain; experience with dbt, cloud platforms (AWS, Azure), and DevOps.
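For illustration, here is a minimal sketch of the kind of Snowflake/Python/SQL/Airflow pipeline this role describes, assuming Airflow 2.4+ with the Snowflake provider installed; the DAG name, connection ID, stage, and table names are hypothetical:

```python
# Hypothetical daily load: stage raw files into Snowflake, then merge them
# into a curated MDM table. Connection IDs and object names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="mdm_daily_load",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    load_raw = SnowflakeOperator(
        task_id="load_raw",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO raw.mdm_accounts FROM @raw.mdm_stage "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);",
    )

    merge_curated = SnowflakeOperator(
        task_id="merge_curated",
        snowflake_conn_id="snowflake_default",
        sql="""
            MERGE INTO curated.mdm_accounts t
            USING raw.mdm_accounts s ON t.account_id = s.account_id
            WHEN MATCHED THEN UPDATE SET t.name = s.name
            WHEN NOT MATCHED THEN INSERT (account_id, name)
                VALUES (s.account_id, s.name);
        """,
    )

    load_raw >> merge_curated  # run the stage load before the merge
```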

Posted 1 month ago

Apply

3.0 - 6.0 years

12 - 13 Lacs

Bengaluru

Hybrid

Hi all, we are looking for an ETL Developer. Experience: 3-6 years. Notice period: Immediate to 15 days. Location: Bengaluru. Core Technical Expertise. Data warehousing & migration: deep expertise in ETL tools like Informatica PowerCenter, relational databases, data modeling, data cleansing, SQL optimization, and performance tuning. Programming & scripting: strong SQL programming skills, shell scripting (Unix), debugging, and handling large datasets. Toolset: experience with JIRA, Confluence, and Git; working knowledge of scheduling tools and integration of multiple data sources. Bonus skills: familiarity with Talend Enterprise and Azure/cloud/Big Data technologies.

Posted 1 month ago

Apply

6.0 - 11.0 years

18 - 33 Lacs

Kolkata, Bengaluru, Delhi / NCR

Work from Office

Minimum 6 years of experience in building ETL pipelines using Azure Data Factory and Azure Synapse. Minimum 6 years of ETL development using PL/SQL. Exposure to Azure Databricks is an added advantage. Above-average communication skills. IC role; can join in 2-3 weeks. Required candidate profile: expert in SSIS and Azure Synapse ETL development; significant experience in developing Python or PySpark; significant experience with databases and data storage platforms, including Azure Data Lake.
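As a rough sketch of the PySpark/Data Lake side of this stack (storage paths, container names, and columns are hypothetical), a batch job might read a raw extract, apply a basic quality rule, and write curated Parquet:

```python
# Hypothetical batch ETL: raw CSV in ADLS -> cleaned, partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.option("header", "true").csv(
    "abfss://raw@examplelake.dfs.core.windows.net/orders/"  # placeholder path
)

curated = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount").cast("double") > 0)  # drop obviously bad rows
       .dropDuplicates(["order_id"])
)

curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/orders/"
)
```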

Posted 1 month ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Noida

Work from Office

We are looking for a skilled ETL Ab Initio Developer with 7 to 12 years of hands-on experience in Ab Initio ETL development to join our team for a high-impact mainframe-to-Ab Initio data transformation project. The ideal candidate will have deep technical knowledge and hands-on experience in Ab Initio and will play a critical role in designing, developing, and optimizing complex ETL workflows.

Roles and Responsibilities: Lead and contribute to the development of large-scale mainframe-to-Ab Initio transformation projects. Design, develop, and maintain robust ETL workflows using Ab Initio tools for data extraction, transformation, and loading from various structured/unstructured sources to target platforms. Build reusable, generic Ab Initio components and leverage ExpressIt and continuous and batch flows effectively. Collaborate with business analysts, data architects, and stakeholders to understand data requirements and translate them into effective ETL solutions. Perform performance tuning and optimization of existing Ab Initio graphs to ensure scalability and efficiency. Implement complex data cleansing, transformation, and aggregation logic. Ensure code quality through unit testing, debugging, and peer code reviews. Troubleshoot and resolve production issues with a strong sense of urgency and accountability. Continuously seek process improvement and automation opportunities in ETL workflows.

Requirements: Minimum 7 years of hands-on experience in Ab Initio ETL development. Strong experience in designing and building modular and reusable Ab Initio components. In-depth knowledge of Ab Initio GDE, EME, ExpressIt, Continuous Flows, and testing frameworks. Solid understanding of data warehousing concepts, data modeling, and performance tuning. Excellent analytical and problem-solving skills with strong attention to detail. Ability to work independently with minimal supervision and collaborate in a team setting. Effective communication and stakeholder management skills. Experience with production support and real-time issue resolution. Familiarity with Agile methodologies and working in Agile/Scrum teams. Experience with mainframe data sources and legacy systems integration is preferred. Prior experience in large enterprise-scale ETL transformation initiatives is preferred. Exposure to cloud platforms or data migration to cloud-based data lakes is a plus.

Posted 1 month ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Gurugram

Work from Office

What this job involves: Are you comfortable working independently without close supervision? We offer an exciting role where you can enhance your skills and play a crucial part in delivering consistent, high-quality administrative and support tasks for the EPM team. The Senior ETL Developer/SSIS Administrator will lead the design of logical data models for JLL's EPM Landscape system. This role is responsible for implementing physical database structures and constructs, as well as developing operational data stores and data marts. The role entails developing and fine-tuning SQL procedures to enhance system performance. The individual will support functional tasks of medium-to-high technological complexity and build SSIS packages and transformations to meet business needs. This position contributes to maximizing the value of SSIS within the organization and collaborates with cross-functional teams to align data integration solutions with business objectives.

Responsibilities. The Senior ETL Developer will be responsible for: Gathering requirements and processing information to design data transformations that will effectively meet end-user needs. Designing, developing, and testing ETL processes for large-scale data extraction, transformation, and loading from source systems to the data warehouse and data marts. Creating SSIS packages to clean, prepare, and load data into the data warehouse and transfer data to EPM, ensuring data integrity and consistency throughout the ETL process. Monitoring and optimizing ETL performance and data quality. Creating routines for importing data using CSV files. Mapping disparate data sources (relational DBs, text files, Excel files) onto the target schema. Scheduling the packages to extract data at specific time intervals. Planning, coordinating, and supporting ETL processes, including architecting table structures, building ETL processes, documentation, and long-term preparedness. Extracting complex data from multiple data sources into usable and meaningful reports and analyses by implementing PL/SQL queries. Ensuring that the data architecture is scalable and maintainable. Troubleshooting data integration and data quality issues and bugs, analyzing reasons for failure, implementing optimal solutions, and revising procedures and documentation as needed. Utilizing hands-on SQL features: stored procedures, indexes, partitioning, bulk loads, DB configuration, security/roles, and maintenance. Developing queries and procedures, creating custom reports/views, and assisting in debugging. The developer will also be responsible for designing SSIS packages and ensuring their stability, reliability, and performance.

Sounds like you? To apply, you need to have: 8+ years of experience in Microsoft SQL Server Management Studio administration and development. Bachelor's degree or equivalent. Competency in Microsoft Office and Smart View. Experience with Microsoft SQL databases and SSIS/SSAS development. Experience working with Microsoft SSIS to create and deploy packages for ETL processes. Experience in writing and troubleshooting SQL statements and creating stored procedures, views, and SQL functions. Experience with data analytics and development. Strong SQL coding experience with performance optimization for data queries. Experience creating and supporting SSAS cubes. Knowledge of Microsoft PowerShell and batch scripting.

Good to have: Power BI development experience. Strong critical and analytical thinking and problem-solving skills. Ability to multi-task and thrive in a fast-paced, rapidly changing, and complex environment. Good written and verbal communication skills. Ability to learn new skills quickly to make a measurable difference. Strong team player with proven success contributing to a team-oriented environment. Excellent communication (written and oral) and interpersonal skills. Excellent troubleshooting and problem-resolution skills.

Posted 1 month ago

Apply

6.0 - 11.0 years

6 - 9 Lacs

Gurugram

Work from Office

The Business Intelligence (BI) Specialist is responsible for the design, development, implementation, management, and support of mission-critical enterprise BI reporting and Extract, Transform, Load (ETL) processes and environments.

Job Description: Exposure to one or more implementations using OBIEE development and administration. Must have 6+ years of development experience in PL/SQL. Experience in developing the OBIEE repository at all three layers (Physical, Business Model, and Presentation), interactive dashboards, and drill-down capabilities using global and local filters and security setups. Must have 3+ years of experience in data modeling, ETL development (preferably OWB), ETL and BI tools installation and configuration, and Oracle APEX. Experience in developing OBIEE Analytics interactive dashboards with drill-down capabilities using global and local filters; OBIEE security setup (users/groups, access/query privileges); and configuring OBIEE Analytics metadata objects (Subject Area, Table, Column) and Presentation Services/Web Catalog objects (dashboards, pages, folders, reports). Hands-on development experience on OBIEE (version 11g or higher) and data modeling. Experience in installing and configuring Oracle OBIEE in multiple life-cycle environments. Experience creating system architecture design documentation. Experience presenting system architectures to management and technical stakeholders. Technical and functional understanding of Oracle OBIEE technologies. Good knowledge of OBIEE administration, best practices, and DWBI implementation challenges. Understanding and knowledge of data warehousing. Must have OBIEE certification on version 11g or higher. Experience with ETL tools. Experience on HP Vertica. Domain knowledge of supply chain, retail, and manufacturing. Developing architectural solutions utilizing OBIEE. Work with project management to provide effort estimates and timelines. Interact with business and IT team members to move the project forward on a daily basis. Lead the development of OBIEE dashboards and reports. Work with internal stakeholders and development teams during the project lifecycle.

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd. Industry: Employment Firms/Recruitment Services Firms. Experience: 5 to 12 years. Job Title: Azure Data Engineer. Job Type: Full Time. Job Location: Bangalore. Ref: 6566567.

We are looking for a skilled Azure Data Engineer to design, develop, and maintain data solutions on the Microsoft Azure cloud platform. The ideal candidate will have experience in data engineering, data pipeline development, ETL/ELT processes, and cloud-based data services. They will be responsible for implementing scalable and efficient data architectures, ensuring data quality, and optimizing data workflows.

Key Responsibilities: Design and implement data pipelines using Azure Data Factory (ADF), Azure Databricks, and Azure Synapse Analytics. Develop and optimize ETL/ELT processes to extract, transform, and load data from various sources into Azure Data Lake, Azure SQL Database, and Azure Synapse. Work with Azure Data Lake Storage (ADLS) and Azure Blob Storage to manage large-scale structured and unstructured data. Implement data modeling, data partitioning, and indexing techniques for optimized performance in Azure-based databases. Develop and maintain real-time and batch processing solutions using Azure Stream Analytics and Event Hubs. Implement data governance, data security, and compliance best practices using Azure Purview, RBAC, and encryption mechanisms. Optimize query performance and improve data accessibility through SQL tuning and indexing strategies. Collaborate with data scientists, analysts, and business stakeholders to define and implement data solutions that support business insights and analytics. Monitor and troubleshoot data pipeline failures, performance issues, and cloud infrastructure challenges. Stay updated with the latest advancements in Azure data services, big data technologies, and cloud computing.

Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field. 5-8 years of experience in data engineering, cloud data platforms, and ETL development. Strong expertise in Azure services such as: Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks (PySpark, Scala, or Python), Azure Data Lake Storage (ADLS), Azure Blob Storage, Azure SQL Database / Cosmos DB, Azure Functions & Logic Apps, and Azure DevOps for CI/CD automation. Proficiency in SQL, Python, Scala, or Spark for data transformation and processing. Experience with big data frameworks (Apache Spark, Hadoop) and data pipeline orchestration. Hands-on experience with data warehousing concepts, dimensional modelling, and performance optimization. Understanding of data security, governance, and compliance frameworks. Experience with CI/CD pipelines, Terraform, ARM templates, or Infrastructure as Code (IaC). Knowledge of Power BI or other visualization tools is a plus. Strong problem-solving and troubleshooting skills.

Preferred Qualifications: Familiarity with SAP, Salesforce, or third-party APIs for data integration.
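For the real-time bullet above, one common pattern is to consume Azure Event Hubs through its Kafka-compatible endpoint with Spark Structured Streaming. The sketch below assumes that setup; the namespace, event hub (topic) name, connection string, and storage paths are placeholders:

```python
# Hypothetical stream: read events from Event Hubs via the Kafka protocol
# and land them in ADLS as Parquet.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_stream").getOrCreate()

EH_CONN = "Endpoint=sb://example-ns.servicebus.windows.net/;..."  # placeholder
jaas = (
    'org.apache.kafka.common.security.plain.PlainLoginModule required '
    f'username="$ConnectionString" password="{EH_CONN}";'
)

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "example-ns.servicebus.windows.net:9093")
    .option("subscribe", "orders")  # event hub name acts as the Kafka topic
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.sasl.jaas.config", jaas)
    .load()
)

query = (
    events.selectExpr("CAST(value AS STRING) AS body", "timestamp")
    .writeStream.format("parquet")
    .option("path", "abfss://stream@examplelake.dfs.core.windows.net/orders/")
    .option("checkpointLocation",
            "abfss://stream@examplelake.dfs.core.windows.net/_chk/orders/")
    .start()
)
query.awaitTermination()
```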

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd. Industry: Employment Firms/Recruitment Services Firms. Experience: 3 to 8 years. Job Title: IBM TM1 Consultant. Job Location: Gurgaon/Mumbai (Hybrid). Job Type: Full Time. Ref: 6566427.

6-8 years of experience in developing Planning Analytics reports for budgeting, forecasting, and actuals, and providing ongoing maintenance. Collaborate with various internal and external stakeholders to understand the business requirements and develop financial reports and reporting solutions. Connect to various data sources as needed for developing the necessary reports. Ensure the accuracy of deliverables through quality assurance practices. Develop a strong understanding of business requirements to create the calculation logic needed for various metrics. Run various queries to check for report/data issues, enabling punctual reporting and escalation. Support and troubleshoot Planning Analytics/TM1-related issues. Ability to explain and elaborate complex ideas effectively to various stakeholders. Experience in ETL development. Work timings are approximately 11 AM to 8 PM.

Posted 1 month ago

Apply

4.0 - 7.0 years

8 - 18 Lacs

Bengaluru

Remote

Role & Responsibilities: Must-Have Skills: 1. Data Transformation & Correction: proven experience executing complex data migrations, implementing data corrections, and performing large-scale data transformations with accuracy and efficiency. 2. SQL Mastery: over 5 years of hands-on experience writing advanced, high-performance T-SQL across diverse platforms, including Microsoft SQL Server. 3. ETL/ELT Development: demonstrated expertise in architecting, developing, and maintaining robust, scalable ETL/ELT pipelines in enterprise-grade environments. 4. Scripting & Workflow Orchestration: proficiency in scripting languages such as Python, with practical knowledge of orchestration frameworks like Apache Airflow. 5. CI/CD & Version Control: deep understanding of Git-based workflows and best practices, with experience building and managing automated CI/CD pipelines for database deployments. 6. Customer Engagement: adept at working directly with clients to gather requirements, communicate technical solutions clearly, and ensure timely project delivery. Work Timings: 2 PM - 11 PM India Standard Time (IST).
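As a sketch of the data-correction work item above, assuming SQL Server reached through pyodbc (the connection string, table, and correction rule are hypothetical), a batched, transaction-per-batch update keeps locks and log growth manageable:

```python
# Hypothetical data correction: normalise a legacy status code in batches.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=example;DATABASE=sales;"
    "Trusted_Connection=yes;"
)
conn.autocommit = False  # make each batch an explicit transaction

try:
    cur = conn.cursor()
    while True:
        # UPDATE TOP (n) limits each batch so the transaction stays small.
        cur.execute(
            "UPDATE TOP (5000) dbo.orders "
            "SET status = 'CANCELLED' WHERE status = 'CANX'"
        )
        if cur.rowcount == 0:
            break  # nothing left to correct
        conn.commit()
except Exception:
    conn.rollback()
    raise
finally:
    conn.close()
```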

Posted 1 month ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Bengaluru

Work from Office

Experience: 4-6 Years. Location: Bangalore (Hybrid). Shift: Night Shift. Employment Type: Full-time.

About the Role: We are seeking a skilled and motivated Analytics Engineer with 4-6 years of experience to join our data team in Bangalore. The ideal candidate will possess a strong mix of data engineering, analytics, and stakeholder collaboration skills. You will play a key role in designing scalable data solutions and enabling data-driven decision-making across the organization.

Key Responsibilities: Collaborate with business and technical stakeholders to gather requirements and deliver analytics solutions. Design and implement scalable data models using star schema and dimensional modeling approaches. Develop and optimize ETL pipelines using Apache Spark for both batch and real-time processing (experience with Apache Pulsar is preferred). Write efficient, production-grade Python scripts and advanced SQL queries for data transformation and analysis. Manage workflows and ensure data pipeline reliability using tools like Airflow, DBT, or similar orchestration frameworks. Implement best practices in data quality, testing, and observability across all data layers. Work with cloud-native data lakes/warehouses such as Redshift, BigQuery, Cassandra, and cloud storage platforms (S3, Azure Blob, GCS). Leverage relational databases such as PostgreSQL/MySQL for operational data tasks.

Nice to Have: Exposure to containerization technologies like Docker and Kubernetes for scalable deployment. Experience working in cloud-native analytics ecosystems.

Required Skills: Strong experience in data modeling, ETL development, and data warehouse design. Proven expertise in Python and SQL. Hands-on experience with Apache Spark (ETL tuning), Airflow, DBT, or similar tools. Practical knowledge of data quality frameworks, monitoring, and data observability. Familiarity with both batch and streaming data architectures.

What We Offer: Opportunity to work on cutting-edge data platforms. Collaborative and inclusive team culture. Competitive salary and benefits. Career growth in the modern data engineering space.
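As a hedged sketch of the star-schema work described above (table locations, the surrogate-key column, and the schema are hypothetical), building a fact table against a conformed dimension in PySpark might look like:

```python
# Hypothetical fact build: join staged orders to a customer dimension and
# write a partitioned fact table. Paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fact_sales_build").getOrCreate()

orders = spark.read.parquet("s3://example-lake/staging/orders/")
dim_customer = spark.read.parquet("s3://example-lake/warehouse/dim_customer/")

fact_sales = (
    orders.join(dim_customer, on="customer_id", how="left")
          .select(
              "customer_sk",  # surrogate key carried from the dimension
              "order_id",
              F.to_date("order_ts").alias("order_date"),
              F.col("amount").cast("decimal(18,2)").alias("amount"),
          )
          # Late-arriving customers map to an "unknown" member, a common
          # dimensional-modeling convention.
          .fillna({"customer_sk": -1})
)

fact_sales.write.mode("append").partitionBy("order_date").parquet(
    "s3://example-lake/warehouse/fact_sales/"
)
```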

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Role & responsibilities: Designing and developing Teradata databases. Configuring and testing Teradata systems. Troubleshooting Teradata systems. Liaising with Teradata support staff and other technical teams. Providing training and support to end users.

Requirements & Skills: B.Tech/BE in Computer Science or a related field. 5+ years of experience in Teradata development. Strong experience with SQL. Good understanding of data warehousing concepts. Experience in using Teradata utilities. Excellent problem-solving skills.

Preferred candidate profile: Immediate joiners only; open to the Bangalore location.

Posted 1 month ago

Apply

5.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Role & responsibilities: 5+ years of experience as a Data Engineer, focusing on ETL development. 3+ years of experience in SQL and writing complex queries for data retrieval and manipulation. 3+ years of experience with the Linux command line and bash scripting. Familiarity with data modelling in analytical databases. Strong understanding of backend data structures, with experience collaborating with data engineers (Teradata, Databricks, AWS S3 parquet/CSV). Experience with RESTful APIs and AWS services like S3, Glue, and Lambda. Experience using Confluence for tracking documentation. Strong communication and collaboration skills, with the ability to interact effectively with stakeholders at all levels. Ability to work independently and manage multiple tasks and priorities in a dynamic environment. Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
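For the AWS bullets, a minimal boto3 sketch (the bucket, key, and Glue job name are hypothetical) that lands an extract in S3 and starts the Glue job that processes it:

```python
# Hypothetical hand-off: upload an extract to S3, then trigger a Glue job.
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Land today's extract in the S3 landing zone.
s3.upload_file(
    "daily_extract.csv", "example-landing-bucket", "extracts/daily_extract.csv"
)

# Start the Glue job that converts the extract to Parquet.
run = glue.start_job_run(
    JobName="daily-extract-to-parquet",  # placeholder job name
    Arguments={"--source_key": "extracts/daily_extract.csv"},
)
print("Started Glue job run:", run["JobRunId"])
```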

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Gurugram

Work from Office

About the Role: Grade Level (for internal use): 09. S&P Global Mobility. The Role: ETL Developer.

The Team: The ETL team forms an integral part of Global Data Operations (GDO) and caters to the North America & EMEA automotive business line. Core responsibilities include translating business requirements into technical design and ETL jobs, along with unit testing, integration testing, regression testing, deployments, and production operations. The team is an energetic and dynamic group of individuals, always looking to work through a challenge. Ownership, raising the bar, and innovation are what the team runs on!

The Impact: The ETL team, being part of GDO, caters to the automotive business line and helps stakeholders with optimum solutions for their data needs. The role requires close coordination with global teams such as other development teams, research analysts, quality assurance analysts, architects, etc. The role is vital for the automotive business as it involves providing highly efficient data solutions with high accuracy to various stakeholders. The role forms a bridge between the business and technical stakeholders.

What's in it for you: Constant learning, working in a dynamic and challenging environment! Total rewards: monetary, beneficial, and developmental rewards! Work-life balance: you can't do a good job if your job is all you do! Diversity & inclusion: HeForShe! Internal mobility: grow with us!

Responsibilities: Using prior experience with file loading, cleansing, and standardization, translate business requirements into ETL designs and efficient ETL solutions using Informatica PowerCenter (mandatory) and Talend Enterprise (preferred); knowledge of TIBCO is also a preferred skill. Understand relational database technologies and data warehousing concepts and processes. Drawing on prior experience with high-volume data processing, deal with complex technical issues. Work closely with all levels of management and employees across the automotive business line. Participate in cross-functional teams responsible for investigating issues, proposing solutions, and implementing corrective actions. Good communication skills are required for interfacing with various stakeholder groups; detail-oriented with analytical skills.

What We're Looking For: The ETL development team within the Mobility domain is looking for a Software Engineer to work on design, development, and operations efforts in the ETL (Informatica) domain. Primary skills and qualifications required: Experience with Informatica and/or Talend ETL tools. Bachelor's degree in Computer Science, with at least 3+ years of development and maintenance of ETL systems on Informatica PowerCenter and 1+ years of SQL experience. 3+ years of Informatica design and architecture experience and 1+ years of optimization and performance tuning of ETL code on Informatica. 1+ years of Python development experience, plus SQL and XML experience. Working knowledge (or greater) of cloud-based technologies, development, and operations is a plus.

About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow.
For more information, visit www.spglobal.com/mobility.

What's In It For You. Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People, Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Health & Wellness: health care coverage designed for the mind and body. Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: it's not just about you; S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 9 Lacs

Pune, Chennai, Bengaluru

Work from Office

Data Engineer. Number of Open Positions: 4.

Key Responsibilities: Develop and maintain ETL pipelines for multiple source systems, for example SAP, Korber WMS, OpSuite ePOS, and internal BI systems. Design and implement SSIS-based data workflows, including staging, data quality rules, and CDC logic. Collaborate with the DBA on schema design, indexing, and performance tuning for SQL Server 2022. Build reusable components and scripts for data loading, transformation, and validation. Support development of CDC (Change Data Capture) solutions for near real-time updates. Perform unit testing, documentation, and version control of data solutions using GitLab/Jenkins CI/CD. Ensure data security, masking, and encryption in accordance with project policies (TLS 1.3). Work closely with backend developers and analysts to align data models with reporting needs. Troubleshoot and resolve data-related issues during development and post-deployment.

Required Skills & Experience: 4-6 years of experience in data engineering or ETL development. Strong hands-on expertise in SSIS (SQL Server Integration Services), SQL Server 2019/2022 (T-SQL, stored procedures, indexing, CDC), and ETL development for ERP/warehouse systems (SAP preferred). Experience working with source systems like SAP, WMS, POS, or retail systems is highly desirable. Proficiency in data quality frameworks, staging strategies, and workflow orchestration. Familiarity with CI/CD for data workflows (GitLab, Jenkins, etc.). Good understanding of data warehousing concepts and performance optimization. Strong communication and documentation skills.
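To make the CDC responsibility concrete, here is a hedged sketch of polling SQL Server's change tables from Python via pyodbc, assuming CDC is already enabled on dbo.orders with capture instance dbo_orders (connection details and columns are hypothetical):

```python
# Hypothetical CDC poll: read all changes between the capture instance's
# low and high LSN watermarks.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=example;DATABASE=wms;"
    "Trusted_Connection=yes;"
)
cur = conn.cursor()

cur.execute(
    """
    SET NOCOUNT ON;
    DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_orders');
    DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
    SELECT __$operation, order_id, status
    FROM cdc.fn_cdc_get_all_changes_dbo_orders(@from_lsn, @to_lsn, 'all');
    """
)
for operation, order_id, status in cur.fetchall():
    # __$operation: 1 = delete, 2 = insert, 4 = update (after image)
    print(operation, order_id, status)

conn.close()
```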

Posted 1 month ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Noida

Work from Office

R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve patients' experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation, and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow. A place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts that work together to go beyond for all those we serve. Because we know that all this adds up to something more, a place where we're all together better. R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion, and diversity is demonstrated through prestigious recognitions, with R1 India ranked amongst Best in Healthcare, Top 100 Best Companies for Women by Avtar & Seramount, and amongst the Top 10 Best Workplaces in Health & Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 16,000+ strong in India, with a presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

Description: We are seeking a Software Engineer with 4-7 years of experience to join our ETL Development team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.

Qualifications: Deep knowledge and experience working with SSIS, T-SQL, Azure Databricks, Azure Data Lake, and Azure Data Factory. Experienced in writing SQL objects: stored procedures, UDFs, and views. Experienced in data modeling. Experience working with MS-SQL and NoSQL database systems such as Apache Parquet. Experience in Scala, Spark SQL, and Airflow is preferred. Experience acquiring and preparing data from primary and secondary disparate data sources. Experience working on large-scale data product implementations. Experience working with agile methodology preferred. Healthcare industry experience preferred.

Responsibilities: Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions. Work with teams that have deep experience in ETL processes and data science domains to understand how to centralize their data. Share your passion for experimenting with and learning new technologies. Perform thorough data analysis, uncover opportunities, and address business problems.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com or visit us on Facebook.

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit.

Responsibilities: A day in the life of an Infoscion. As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities: Knowledge of more than one technology. Basics of architecture and design fundamentals. Knowledge of testing tools. Knowledge of agile methodologies. Understanding of project life cycle activities on development and maintenance projects. Understanding of one or more estimation methodologies. Knowledge of quality processes. Basics of the business domain to understand the business requirements. Analytical abilities, strong technical skills, and good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods. Awareness of the latest technologies and trends. Excellent problem-solving, analytical, and debugging skills.

Technical and Professional Requirements: Primary skills: Technology - Data Management - Data Integration - DataStage. Preferred skills: Technology - Data Management - Data Integration - DataStage.

Posted 1 month ago

Apply

8.0 - 13.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Data Warehouse. In this role you will be part of a team working to develop solutions enabling the business to leverage data as an asset at the bank. As a Lead ETL Developer, you will lead teams to develop, maintain, and enhance code, ensuring all IT SDLC processes are documented and practiced, working closely with multiple technology teams across the enterprise. The Lead ETL Developer should have extensive knowledge of data warehousing cloud technologies. If you consider data a strategic asset, evangelize the value of good data and insights, and have a passion for learning and continuous improvement, this role is for you.

Key Responsibilities: Translate requirements and data mapping documents into a technical design. Develop, enhance, and maintain code following best practices and standards. Create and execute unit test plans. Support regression and system testing efforts. Debug and problem-solve issues found during testing and/or production. Communicate status, issues, and blockers to the project team. Support continuous improvement by identifying and solving opportunities.

Basic Qualifications: Bachelor's degree or military experience in a related field (preferably computer science). At least 5 years of experience in ETL development within a data warehouse. Deep understanding of enterprise data warehousing best practices and standards. Strong experience in software engineering comprising designing, developing, and operating robust and highly scalable cloud infrastructure services. Strong experience with Python/PySpark, DataStage ETL, and SQL development. Proven experience in cloud infrastructure projects with hands-on migration expertise on public clouds such as AWS and Azure, preferably Snowflake. Knowledge of cybersecurity organization practices, operations, risk management processes, principles, architectural requirements, engineering, and threats and vulnerabilities, including incident response methodologies. Understanding of Authentication & Authorization Services and Identity & Access Management. Strong communication and interpersonal skills. Strong organization skills and the ability to work independently as well as with a team.

Preferred Qualifications: AWS Certified Solutions Architect Associate, AWS Certified DevOps Engineer Professional, and/or AWS Certified Solutions Architect Professional. Experience defining future-state roadmaps for data warehouse applications. Experience leading teams of developers within a project. Experience in the financial services (banking) industry.

Mandatory Skills: ETL/data warehouse concepts; AWS Glue; SQL; Python; Snowflake; CI/CD tools (Jenkins, GitHub). Secondary Skills: Zena, PySpark, Infogix.
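As a small illustration of the Python-plus-Snowflake combination in this stack, using the official snowflake-connector-python (the account, credentials, file path, and table are hypothetical):

```python
# Hypothetical load: stage a local extract to the table stage, then COPY it in.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",
    user="etl_user",
    password="...",            # in practice, fetch from a secrets manager
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    # PUT uploads the file to the table's internal stage (@%DAILY_EXTRACT).
    cur.execute("PUT file:///tmp/daily_extract.csv @%DAILY_EXTRACT")
    # With no FROM clause, COPY INTO reads from the table stage by default.
    cur.execute(
        "COPY INTO DAILY_EXTRACT FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
finally:
    cur.close()
    conn.close()
```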

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Design, develop, and deploy ETL processes using SSIS/Azure Data Factory. 6+ years of experience in ETL development using SSIS, or 2+ years of experience in Azure Data Factory. Monitor and troubleshoot ETL jobs and data flows. Implement data quality checks and ensure data integrity. Maintain documentation of ETL processes and data flow diagrams. Design, develop, and maintain interactive Power BI reports and dashboards to visualize key performance indicators (KPIs) and business metrics. Translate complex business requirements into technical specifications for data extraction, transformation, and reporting. Collaborate with cross-functional teams to understand their data and reporting needs. Write complex SQL queries for data extraction, manipulation, and analysis from various relational databases. The resource needs to have good insurance domain knowledge.
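To illustrate the data-quality bullet, here is a hedged Python sketch that runs lightweight post-load checks and fails loudly on a breach (the DSN, tables, and rules are hypothetical):

```python
# Hypothetical post-load checks: each entry is (description, sql, passes_if).
import pyodbc

CHECKS = [
    ("claims table is non-empty",
     "SELECT COUNT(*) FROM dbo.claims", lambda n: n > 0),
    ("no null policy ids",
     "SELECT COUNT(*) FROM dbo.claims WHERE policy_id IS NULL", lambda n: n == 0),
]

conn = pyodbc.connect("DSN=warehouse")  # placeholder DSN
cur = conn.cursor()

failures = []
for name, sql, passes in CHECKS:
    value = cur.execute(sql).fetchone()[0]
    if not passes(value):
        failures.append(f"{name} (got {value})")

conn.close()
if failures:
    # Failing here keeps bad loads out of downstream Power BI reports.
    raise RuntimeError("Data quality checks failed: " + "; ".join(failures))
```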

Posted 1 month ago

Apply

8.0 - 13.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Data Analyst. Work Mode: Hybrid. Work Location: Chennai / Hyderabad / Bangalore / Pune. Work Timing: 2 PM to 11 PM.

Primary skills: Minimum 6 years of experience as a Data Analyst, with at least 3+ years of experience in data migration initiatives. Experience migrating COTS/legacy systems, including large volumes of data, without compromising accuracy or completeness. Technical expertise regarding data models, database design and development, data mining, and segmentation techniques. Experience with ETL development both on premises and in the cloud. Strong functional understanding of RDBMS and conceptual knowledge of DWH-BI.
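A common way to show a migration has not compromised completeness is a source-versus-target reconciliation. Below is a minimal sketch via pyodbc, with hypothetical DSNs and table names:

```python
# Hypothetical reconciliation: compare row counts per table after migration.
import pyodbc

TABLES = ["customers", "policies", "claims"]

src = pyodbc.connect("DSN=legacy_src").cursor()
tgt = pyodbc.connect("DSN=new_dwh").cursor()

for table in TABLES:
    # Table names come from a trusted, hard-coded list (no user input).
    src_n = src.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt_n = tgt.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    status = "OK" if src_n == tgt_n else "MISMATCH"
    print(f"{table}: source={src_n} target={tgt_n} {status}")
```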

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

In this role you will be part of a team working to develop solutions enabling the business to leverage data as an asset at the bank. The Senior ETL Developer should have extensive knowledge of data warehousing cloud technologies. If you consider data a strategic asset, evangelize the value of good data and insights, and have a passion for learning and continuous improvement, this role is for you.

Responsibilities: Translate requirements and data mapping documents into a technical design. Develop, enhance, and maintain code following best practices and standards. Create and execute unit test plans. Support regression and system testing efforts. Debug and problem-solve issues found during testing and/or production. Communicate status, issues, and blockers to the project team. Support continuous improvement by identifying and solving opportunities.

Basic Qualifications: Bachelor's degree or military experience in a related field (preferably computer science). At least 5 years of experience in ETL development within a data warehouse. Deep understanding of enterprise data warehousing best practices and standards. Strong experience in software engineering comprising designing, developing, and operating robust and highly scalable cloud infrastructure services. Strong experience with Python/PySpark, DataStage ETL, and SQL development. Proven experience in cloud infrastructure projects with hands-on migration expertise on public clouds such as AWS and Azure, preferably Snowflake. Knowledge of cybersecurity organization practices, operations, risk management processes, principles, architectural requirements, engineering, and threats and vulnerabilities, including incident response methodologies. Understanding of Authentication & Authorization Services and Identity & Access Management. Strong communication and interpersonal skills. Strong organization skills and the ability to work independently as well as with a team.

Preferred Qualifications: AWS Certified Solutions Architect Associate, AWS Certified DevOps Engineer Professional, and/or AWS Certified Solutions Architect Professional. Experience defining future-state roadmaps for data warehouse applications. Experience leading teams of developers within a project. Experience in the financial services (banking) industry.

Mandatory Skills: ETL/data warehouse concepts; Snowflake; AWS Glue; CI/CD tools (Jenkins, GitHub); Python; DataStage. Secondary Skills: Zena, PySpark, Infogix.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Process Manager - AWS Data Engineer. Mumbai/Pune | Full-time (FT) | Technology Services. Shift Timings: EMEA (1 pm - 9 pm) | Management Level: PM | Travel: NA.

The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role enables one to identify discrepancies and propose optimal solutions by using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager Roles and Responsibilities: Understand clients' requirements and provide effective and efficient solutions in AWS using Snowflake. Assemble large, complex sets of data that meet non-functional and functional business requirements. Use Snowflake/Redshift architecture and design to create data pipelines and consolidate data in the data lake and data warehouse. Demonstrated strength and experience in data modeling, ETL development, and data warehousing concepts. Understand data pipelines and modern ways of automating data pipelines using cloud-based tooling. Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions. Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL.

Technical and Functional Skills: AWS Services: strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc. Programming Languages: proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java. Data Warehousing: experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift. ETL Tools: familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines. Database Management: knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts. Big Data Technologies: understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS. Version Control: proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform). Problem-solving Skills: ability to analyze complex technical problems and propose effective solutions. Communication Skills: strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders. Education and Experience: typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.

About eClerx: eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

About eClerx Technology: eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com.

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Hyderabad

Hybrid

5+ years of strong experience in Informatica Cloud. Experience in Informatica Cloud Designer and the Informatica Cloud Portal. Experience in transformations, mapping configuration tasks, task flows, and parameterized templates. Experience in Informatica PowerCenter and Designer. Good knowledge of Oracle, SQL, and PL/SQL. Should have experience in scheduling Informatica Cloud ETL mappings. Experience in integrating Informatica Cloud with SFDC, SAP, etc. as sources. Experience in Business Objects and other business intelligence platforms is an advantage. Should be good at understanding functional requirements and the business.

Posted 1 month ago

Apply

7.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Urgent requirement for a DataStage Developer. Experience: 7+ years. Location: Pan India. Role Overview: Design and development of ETL and BI applications in DataStage. Design/develop testing processes to ensure end-to-end performance, data integrity, and usability. Carry out performance testing, integration testing, and system testing. Good SQL knowledge is mandatory. Basic Unix knowledge is required.

Posted 1 month ago

Apply

6.0 - 9.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Job Details: Skill: Sr. Developer - ODI. Experience: 6-9 years. Location: PAN India. Notice Period: Immediate Joiners. Employee Type: C2H.

Job Description: Sr. Developer - ODI. The candidate should have technical knowledge and experience with Oracle Data Integrator and Oracle Middleware technologies. Should have good expertise in Oracle Data Integrator (ODI) and data warehousing, with 6 to 9 years of relevant experience. Experience in designing, implementing, and maintaining ODI load plans and processes. Experience in ETL development, PL/SQL, and support. Coordinate with the Team Lead to ensure implementation as per stated requirements. Good to have knowledge of Oracle MFT, Oracle Database, Oracle SOA Suite, BPEL, XML, WSDL/XSD, and Adapters. Support and manage already-developed ODI applications, perform testing on DEV/UAT/SIT/PROD environments, and work on tickets assigned to you. Ready to work in shifts and provide on-call support depending on project and production releases. Should be able to work as an independent team member, capable of applying judgment to plan and execute tasks. Timely completion of quality deliverables, good communication skills, professional conduct, and the ability to engage business and cross-functional teams. Ensure correctness and completeness of data loading (full load and incremental load). Optimize execution time of interfaces by implementing best practices.

Posted 1 month ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Kolkata

Work from Office

Use Talend Open Studio to design, implement, and manage data integration solutions. Develop ETL processes to ensure data is accurately extracted, transformed, and loaded into various systems for analysis.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies