
175 ETL Development Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 12.0 years

10 - 20 Lacs

Chennai

Hybrid


Hi [Candidate Name],

We are hiring for a Data Engineering role with a leading organization working on cutting-edge cloud and data solutions. If you're an experienced professional looking for your next challenge, this could be a great fit!

Key Skills Required:
- Strong experience in Data Engineering and Cloud Data Pipelines
- Proficiency in at least 3 of these languages: Java, Python, Spark, Scala, SQL
- Hands-on with tools like Google BigQuery, Apache Kafka, Airflow, GCP Pub/Sub
- Knowledge of Microservices architecture, REST APIs, and DevOps tools (Docker, GitHub Actions, Terraform)
- Exposure to relational databases: MySQL, PostgreSQL, SQL Server
- Prior experience in an onshore/offshore model is a plus

If this sounds like a match for your profile, reply with your updated resume or apply directly. Looking forward to connecting!

Best regards,
Mahesh Babu M
Senior Executive - Recruitment
maheshbabu.muthukannan@sacha.solutions
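For orientation, here is a minimal sketch of the kind of orchestration this stack implies, assuming Airflow 2.x with the Google provider package installed; the project, dataset, and topic names are illustrative placeholders, not details from the posting:

```python
# Hypothetical Airflow DAG touching the stack this posting lists
# (BigQuery, Pub/Sub). All names are placeholders for illustration.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.operators.pubsub import PubSubPublishMessageOperator

with DAG(
    dag_id="daily_events_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Aggregate raw events into a reporting table (illustrative SQL).
    load_events = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_events AS
                    SELECT event_date, COUNT(*) AS event_count
                    FROM raw.events
                    GROUP BY event_date
                """,
                "useLegacySql": False,
            }
        },
    )

    # Notify downstream consumers via Pub/Sub once the table is refreshed.
    notify = PubSubPublishMessageOperator(
        task_id="notify_consumers",
        project_id="my-gcp-project",
        topic="daily-events-refreshed",
        messages=[{"data": b"daily_events refreshed"}],
    )

    load_events >> notify
```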

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Hyderabad

Work from Office


The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, have an analytical mindset, and bring a background in relational modeling within a hybrid architecture. The candidate will help drive the business toward specific technical initiatives, working closely with the Solutions Management, Delivery, and Product Engineering teams, and will join a team of developers across the US, India & Costa Rica.

Responsibilities:

ETL Development: The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data processes. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML.

Implementations & Onboarding: Will work with the team to onboard new clients onto the ZMP/CDP+ platform. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach to development and will be able to document processes and workflows.

Incremental Change Requests: The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach to implementing and executing each request. This requires the engineer to have a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.

Change Data Management: The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests are presented and approved. Prior to introducing a change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to do peer-to-peer code reviews and solution reviews before production code deployment.

Collaboration & Process Improvement: The engineer will be asked to participate in knowledge-share sessions, engaging with peers to discuss solutions, best practices, overall approach, and process. The candidate will look for opportunities to streamline processes with an eye toward building a repeatable model that reduces implementation duration.

Job Requirements:

The CDP ETL & Database Engineer will be well versed in the following areas:
- Relational data modeling
- ETL and FTP concepts
- Advanced analytics using SQL functions
- Cloud technologies: AWS, Snowflake
- Able to decipher requirements, provide recommendations, and implement solutions within predefined timeframes
- The ability to work independently while also contributing in a team setting
- Confidently communicating status, raising exceptions, and voicing concerns to their direct manager
- Participating in internal client project status meetings with the Solution/Delivery management teams
- When required, collaborating with the Business Solutions Analyst (BSA) to solidify requirements
- Ability to work in a fast-paced, agile environment, with a sense of urgency when escalated issues arise
- Strong communication and interpersonal skills; ability to multitask and prioritize workload based on client demand
- Familiarity with Jira for workflow management and time allocation
- Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives, etc.

Required Skills:
- ETL: Talend (preferred, not required); DMExpress (nice to have); Informatica (nice to have)
- Databases: hands-on experience with Snowflake (required); MySQL/PostgreSQL (nice to have); familiarity with NoSQL DB methodologies (nice to have)
- Programming languages (can demonstrate knowledge of any of the following): PL/SQL; JavaScript (strong plus); Python (strong plus); Scala (nice to have)
- AWS: knowledge of S3, EMR (concepts), EC2 (concepts), Systems Manager / Parameter Store
- Understands JSON data structures and key-value pairs
- Working knowledge of code repositories such as Git, WinCVS, SVN
- Workflow management tools such as Apache Airflow, Kafka, Automic/Appworx
- Jira

Minimum Qualifications:
- Bachelor's degree or equivalent
- 2-4 years' experience
- Excellent verbal & written communication skills
- Self-starter, highly motivated, analytical mindset
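As a hedged illustration of the Snowflake and JSON skills listed above, the sketch below flattens a semi-structured key-value payload using the snowflake-connector-python package; the account, credentials, table, and column names are assumptions for the example only:

```python
# Hypothetical example: querying a JSON VARIANT column in Snowflake.
# Connection details and table/column names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="CDP",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Flatten key-value pairs stored in a VARIANT column.
    cur.execute("""
        SELECT r.payload:customer_id::STRING AS customer_id,
               f.key                          AS attribute,
               f.value::STRING                AS value
        FROM raw_events r,
             LATERAL FLATTEN(input => r.payload:attributes) f
        LIMIT 10
    """)
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```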

Posted 2 weeks ago

Apply

5.0 - 10.0 years

2 - 6 Lacs

Gurugram

Work from Office


Skills:

Primary Skills:
- Enhancements, new development, defect resolution, and production support of ETL development using AWS native services
- Integration of data sets using AWS services such as Glue and Lambda functions
- Utilization of AWS SNS to send emails and alerts
- Authoring ETL processes using Python and PySpark
- ETL process monitoring using CloudWatch events
- Connecting to different data sources like S3 and validating data using Athena
- Experience in CI/CD using GitHub Actions
- Proficiency in Agile methodology
- Extensive working experience with advanced SQL and a complex understanding of SQL

Competencies / Experience:
- Deep technical skills in AWS Glue (Crawler, Data Catalog): 5 years
- Hands-on experience with Python and PySpark: 3 years
- PL/SQL experience: 3 years
- CloudFormation and Terraform: 2 years
- CI/CD with GitHub Actions: 1 year
- Experience with BI systems (Power BI, Tableau): 1 year
- Good understanding of AWS services like S3, SNS, Secrets Manager, Athena, and Lambda: 2 years
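A minimal sketch of the kind of Glue job this stack describes, assuming it runs inside an AWS Glue job environment (the awsglue modules are only available there); the bucket, database, and topic ARN are placeholders:

```python
# Illustrative Glue-style PySpark job: read via the Glue Data Catalog,
# transform, write Parquet that Athena can query, alert via SNS on failure.
import sys

import boto3
from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session

try:
    # Source table previously registered by a Glue crawler (placeholder names).
    orders = glue_context.create_dynamic_frame.from_catalog(
        database="sales", table_name="raw_orders"
    ).toDF()

    daily = orders.groupBy("order_date").agg(F.sum("amount").alias("revenue"))

    # Write Parquet to S3 so Athena can query it directly.
    daily.write.mode("overwrite").parquet("s3://my-bucket/curated/daily_revenue/")
except Exception as exc:
    # Alerting via SNS, as the posting describes.
    boto3.client("sns").publish(
        TopicArn="arn:aws:sns:us-east-1:123456789012:etl-alerts",
        Subject="Glue job failed",
        Message=str(exc),
    )
    raise
```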

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 18 Lacs

Navi Mumbai, Bengaluru, Mumbai (All Areas)

Work from Office


Greetings! This is regarding a job opportunity for an ETL Developer with Datamatics Global Services Ltd.

Position: ETL Developer
Website: https://www.datamatics.com/
Job Location: Mumbai/Bangalore

Job Description:
- 5 years' experience, with a minimum of 3 years of experience in Talend & DataStage development
- Expertise in designing and implementing Talend & DataStage ETL jobs
- Strong analytical and problem-solving skills
- Design, develop, and maintain Talend integration solutions
- Collaborate with business stakeholders and IT teams to gather requirements and recommend solutions
- Create and maintain technical documentation
- Perform unit testing and troubleshoot issues

Posted 2 weeks ago

Apply

5.0 - 9.0 years

10 - 15 Lacs

Hyderabad

Hybrid


Talend Developer

Job Summary: We are looking for an experienced Talend ETL Developer with strong SQL skills to design, develop, and maintain robust data integration solutions. The ideal candidate will be proficient in Talend Data Integration tools and have a solid foundation in SQL for data manipulation, transformation, and reporting.

Key Responsibilities:
- Design, develop, and deploy Talend ETL jobs for data extraction, transformation, and loading from various sources.
- Write complex SQL queries to support ETL workflows, data quality checks, and reporting requirements.
- Work with relational databases like MySQL, PostgreSQL, Oracle, or SQL Server.
- Collaborate with business analysts and data architects to understand data requirements and translate them into technical solutions.
- Optimize ETL performance, handle large datasets, and ensure data accuracy and integrity.
- Implement error handling, logging, and exception handling in ETL processes.
- Maintain and troubleshoot existing Talend jobs and SQL scripts.
- Participate in code reviews, unit testing, and deployment planning.

Required Skills:
- Strong experience with Talend Open Studio or Talend Data Integration (DI).
- Proficient in writing and optimizing complex SQL queries and stored procedures.
- Good understanding of data warehousing concepts, ETL best practices, and data modeling.
- Experience with version control tools like Git.
- Familiarity with scheduling tools (like Control-M, AutoSys, or the Talend scheduler).
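To illustrate the SQL-driven data quality checks mentioned above, here is a self-contained sketch; sqlite3 stands in for the MySQL/PostgreSQL/Oracle targets a Talend job would actually hit, and the table and checks are invented for the example:

```python
# Illustrative data-quality checks expressed as SQL run from Python.
# sqlite3 is used only to keep the sketch runnable and self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, email TEXT, created_at TEXT);
    INSERT INTO customers VALUES (1, 'a@example.com', '2024-01-01'),
                                 (2, NULL,            '2024-01-02'),
                                 (2, 'b@example.com', '2024-01-03');
""")

# Each check should return 0 rows when the data is clean.
checks = {
    "null_emails": "SELECT COUNT(*) FROM customers WHERE email IS NULL",
    "duplicate_ids": """
        SELECT COUNT(*) FROM (
            SELECT id FROM customers GROUP BY id HAVING COUNT(*) > 1
        )
    """,
}

for name, sql in checks.items():
    (count,) = conn.execute(sql).fetchone()
    status = "OK" if count == 0 else f"FAILED ({count} rows)"
    print(f"{name}: {status}")
```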

Posted 2 weeks ago

Apply

6.0 - 11.0 years

14 - 24 Lacs

Bengaluru

Work from Office


ETL Developer (Informatica PowerCenter with big data experience)

This is an on-site opportunity in Jeddah (Saudi Arabia).

Job Summary: We are seeking a skilled and experienced Senior ETL Developer with strong expertise in Informatica PowerCenter and enterprise data warehousing concepts. The ideal candidate will design, develop, and maintain scalable ETL pipelines, ensuring data integrity and optimal performance. You will work closely with data architects, analysts, and business stakeholders to support data integration and analytics initiatives.

Key Responsibilities:
- Design, develop, test, and maintain ETL workflows using Informatica PowerCenter (or Informatica Intelligent Cloud Services).
- Build and optimize data pipelines and integration processes for data ingestion, transformation, and loading.
- Collaborate with data architects and business analysts to understand data requirements and build solutions accordingly.
- Work with relational databases (SQL Server, Oracle, etc.) to extract and load data using Informatica.
- Perform data profiling, data quality checks, and data validation.
- Optimize ETL jobs for performance, scalability, and fault tolerance.
- Support production ETL jobs, troubleshoot issues, and perform root cause analysis.
- Develop and maintain documentation for ETL processes and data flow.
- Assist in migrating legacy ETL processes to modern data platforms (if required).

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Pune, Bengaluru

Hybrid


POSITION SUMMARY: We are looking for an experienced ETL (SSIS) Developer to join our Data Management team. This role will work with cross-functional teams to understand their data needs and transform raw data from SQL Server, Oracle, and AWS S3 into our data warehouse.

JOB FUNCTION AND RESPONSIBILITIES: To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. Examples below:
- Design, develop, and optimize ETL workflows to generate valuable data and reports.
- Integrate data from SQL databases, Oracle databases, flat files, and AWS S3.
- Troubleshoot and debug existing SSIS packages to improve performance and reliability.
- Performance-tune SQL queries and views.
- Work closely with onshore and offshore data team resources.
- Collaborate with business and sales teams to understand their data and automation needs.
- Create and maintain comprehensive documentation for all systems and jobs.

QUALIFICATION: To perform this job successfully, an individual must have the following education and/or experience:
- Bachelor's degree is needed.
- 5+ years of ETL (SSIS) developer experience integrating with various databases and flat files is a must.
- Expert knowledge of SQL is a must.
- Should be fluent in spoken English and able to communicate well with business stakeholders.
- Experience in Agile project management methodology is a plus.
- Experience connecting to AWS services.
- AI/ML experience and knowledge of Python or R for data analysis is nice to have.

WORK SCHEDULE OR TRAVEL REQUIREMENTS: General work schedule (for shift roles). Around 3 hours of overlap between onshore and offshore work hours is expected: 1:30 PM IST to 10:30 PM IST.
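As a rough illustration of one integration named above (flat files from S3 into the warehouse), here is a hedged Python sketch; in practice SSIS would handle this in a Data Flow task, and the bucket, connection string, and table names are assumptions for the example:

```python
# Hypothetical staging load: CSV from S3 into a SQL Server staging table.
# All names and credentials are placeholders.
import boto3
import pandas as pd
import sqlalchemy

# Download the flat file from S3.
boto3.client("s3").download_file("my-bucket", "inbound/orders.csv", "/tmp/orders.csv")

df = pd.read_csv("/tmp/orders.csv", parse_dates=["order_date"])

# Light cleansing before loading to the staging table.
df["amount"] = df["amount"].fillna(0).round(2)

engine = sqlalchemy.create_engine(
    "mssql+pyodbc://etl_user:***@warehouse-server/DataWarehouse"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)
df.to_sql("stg_orders", engine, schema="etl", if_exists="replace", index=False)
```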

Posted 2 weeks ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Hyderabad

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Building Tool
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor and evaluate team performance to ensure alignment with project goals.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Building Tool.
- Strong understanding of data modeling and architecture principles.
- Experience with data integration techniques and tools.
- Familiarity with cloud-based data platforms and services.
- Ability to troubleshoot and resolve data-related issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Building Tool.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

12.0 - 15.0 years

9 - 13 Lacs

Hyderabad

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Building Tool
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities and understanding.
- Monitor and evaluate team performance to ensure alignment with project goals.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Building Tool.
- Strong understanding of data architecture principles and best practices.
- Experience with data integration techniques and tools.
- Familiarity with cloud-based data platforms and services.
- Ability to troubleshoot and resolve data-related issues efficiently.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Building Tool.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

12.0 - 15.0 years

9 - 13 Lacs

Hyderabad

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Building Tool
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. Your role will require you to navigate complex data environments, providing insights and recommendations that drive effective data management and governance practices.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities and foster a culture of continuous improvement.
- Monitor and evaluate the performance of data platform components, making recommendations for enhancements and optimizations.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Building Tool.
- Strong understanding of data architecture principles and best practices.
- Experience with data integration techniques and tools.
- Familiarity with cloud-based data platforms and services.
- Ability to analyze and troubleshoot data-related issues effectively.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Building Tool.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 - 18 Lacs

Hyderabad

Remote


Role: Senior Data Engineer - Azure/Snowflake
Duration: 6+ months
Location: Remote
Working Hours: 12:30 PM IST - 9:30 PM IST (3 AM - 12 PM EST)

Job Summary: We are seeking a Senior Data Engineer with advanced hands-on experience in Snowflake and Azure to support the development and optimization of enterprise-grade data pipelines. This role is ideal for someone who enjoys deep technical work and solving complex data engineering challenges in a modern cloud environment.

Key Responsibilities:
- Build and enhance scalable data pipelines using Azure Data Factory, Snowflake, and Azure Data Lake
- Develop and maintain ELT processes to ingest and transform data from various structured and semi-structured sources
- Write optimized and reusable SQL for complex data transformations in Snowflake
- Collaborate closely with analytics teams to ensure clean, reliable data delivery
- Monitor and troubleshoot pipeline performance, data quality, and reliability
- Participate in code reviews and contribute to best practices around data engineering standards and governance

Qualifications:
- 5+ years of data engineering experience in enterprise environments
- Deep hands-on experience with Snowflake, Azure Data Factory, Azure Blob/Data Lake, and SQL
- Proficient in scripting for data workflows (Python or similar)
- Strong grasp of data warehousing concepts and ELT development best practices
- Experience with version control tools (e.g., Git) and CI/CD processes for data pipelines
- Detail-oriented with strong problem-solving skills and the ability to work independently
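A minimal sketch of the ELT pattern this role describes, assuming an external Snowflake stage backed by Azure Data Lake; the stage, warehouse, table, and column names are placeholders:

```python
# Illustrative ELT steps in Snowflake driven from Python:
# land files from an ADLS-backed stage, then upsert into a modeled table.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

# Ingest semi-structured files from the external stage (placeholder name).
cur.execute("""
    COPY INTO staging.customers_raw
    FROM @adls_stage/customers/
    FILE_FORMAT = (TYPE = 'JSON')
""")

# Incremental upsert into the modeled table: the "T" of ELT happens
# in-warehouse, as the posting describes.
cur.execute("""
    MERGE INTO core.customers t
    USING (
        SELECT payload:id::NUMBER AS id,
               payload:email::STRING AS email
        FROM staging.customers_raw
    ) s
    ON t.id = s.id
    WHEN MATCHED THEN UPDATE SET t.email = s.email
    WHEN NOT MATCHED THEN INSERT (id, email) VALUES (s.id, s.email)
""")
conn.close()
```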

Posted 2 weeks ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Chennai

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Building Tool
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor and evaluate team performance to ensure alignment with project goals.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Building Tool.
- Strong understanding of data modeling techniques.
- Experience with data integration and ETL processes.
- Familiarity with cloud-based data platforms and services.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Building Tool.
- This position is based at our Chennai office.
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

3.0 - 4.0 years

9 - 14 Lacs

Mumbai

Work from Office


Job Title: Alteryx Developer

About Us: Capco, a Wipro company, is a global technology and management consulting firm. It was awarded Consultancy of the Year in the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO: You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry, on projects that will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: Alteryx Developer
Location: Mumbai, India
Experience: 8+ years
Notice Period: Immediate joiners preferred

Job Summary: We are seeking a highly experienced Alteryx Developer to join our dynamic team in Mumbai. The ideal candidate will have 8+ years of experience in data analytics, ETL processes, and workflow automation using Alteryx. The role requires strong problem-solving skills, hands-on experience in data transformation, and the ability to work in a fast-paced environment. Immediate joiners will be given preference.

Key Responsibilities:
- Design, develop, and maintain scalable Alteryx workflows and analytical solutions.
- Collaborate with business stakeholders to understand data requirements and deliver efficient ETL processes.
- Optimize workflows for performance and maintainability.
- Integrate Alteryx with various data sources like SQL Server, Excel, APIs, and cloud platforms.
- Perform data cleansing, transformation, and validation.
- Document workflows, processes, and data pipelines.
- Work with cross-functional teams to ensure data quality and consistency.
- Provide production support and troubleshooting for existing workflows.

Required Skills and Qualifications:
- 8+ years of experience in data analytics or ETL development, with at least 3-4 years hands-on with Alteryx.
- Strong knowledge of Alteryx Designer, Server, and Gallery.
- Experience working with databases such as SQL Server, Oracle, and cloud data platforms.
- Proficient in writing complex SQL queries.
- Understanding of data governance, data quality, and best practices.
- Strong analytical and communication skills.
- Alteryx certification is a plus.
- Experience with reporting/visualization tools like Tableau or Power BI is a bonus.

Preferred Qualifications:
- Background in the financial services, banking, or consulting domain.
- Experience with scripting languages (Python, R) within Alteryx is a plus.
- Knowledge of capital markets or corporate banking will be an added advantage.

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Coimbatore

Work from Office


Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Google Cloud Machine Learning Services
Good-to-have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.

Roles & Responsibilities:
- Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage.
- Optimize and monitor data workflows for performance, scalability, and reliability.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions.
- Implement data security and governance measures, ensuring compliance with industry standards.
- Automate data workflows and processes for operational efficiency.
- Troubleshoot and resolve technical issues related to data pipelines and platforms.
- Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing.

Professional & Technical Skills:
a) Must have:
- Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
- Expertise in SQL and experience with data modeling and query optimization.
- Solid programming skills in Python for data processing and ETL development.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming.
- Strong understanding of data security, encryption, and IAM policies on GCP.
b) Good to have:
- Experience with Dialogflow or CCAI tools.
- Knowledge of machine learning pipelines and integration with AI/ML services on GCP.
- Certifications such as Google Professional Data Engineer or Google Cloud Architect.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services, with 3-5 years of overall experience.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.

Qualification: 15 years full time education
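For a concrete flavor of the pipeline work described above, here is a hedged sketch using Apache Beam's Python SDK (the programming model behind Dataflow); the project, topic, and table names are illustrative only:

```python
# Illustrative streaming pipeline: Pub/Sub -> transform -> BigQuery.
# Run on Dataflow by passing the appropriate runner/project flags.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # runner/project set via CLI flags

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "KeepValid" >> beam.Filter(lambda e: "user_id" in e)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="user_id:STRING,event_type:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```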

Posted 2 weeks ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Karnataka

Work from Office


Develop and manage ETL pipelines using Python. Responsible for transforming and loading data efficiently from source to destination systems, ensuring clean and accurate data.
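A self-contained sketch of this kind of Python ETL loop, with sqlite3 standing in for the destination system; the file, field, and table names are invented for the example:

```python
# Minimal extract-transform-load in plain Python: CSV in, database out.
import csv
import sqlite3

def extract(path):
    # Extract: stream rows from the source file.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Transform: cleanse and normalize, skipping rows missing a key field.
    for row in rows:
        if not row.get("customer_id"):
            continue
        row["email"] = (row.get("email") or "").strip().lower()
        yield (row["customer_id"], row["email"])

def load(records, conn):
    # Load: write the cleaned records to the destination table.
    conn.execute("CREATE TABLE IF NOT EXISTS customers (id TEXT, email TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    load(transform(extract("customers.csv")), conn)
```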

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office


Design and implement ETL solutions using IBM InfoSphere DataStage to integrate and process large datasets. You will develop, test, and optimize data pipelines to ensure smooth data transformation and loading. Expertise in IBM InfoSphere DataStage, ETL processes, and data integration is essential for this position.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

6 - 10 Lacs

Noida

Work from Office


- Hands-on experience with ETL testing tools
- Strong SQL skills
- Experience with file processing (CSV, XML, JSON) and data validation techniques
- Familiarity with scripting languages
- Knowledge of cloud platforms (AWS, Azure) is a plus
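To make the file-validation requirement concrete, here is an illustrative Python helper that checks CSV, JSON, and XML feeds for required fields; the paths, record element, and field names are assumptions:

```python
# Illustrative ETL-test helper: verify each feed parses and carries
# the required fields. The JSON feed is assumed to be a list of objects,
# the XML feed to contain <order> elements.
import csv
import json
import xml.etree.ElementTree as ET

REQUIRED = {"order_id", "amount"}

def validate_csv(path):
    with open(path, newline="") as f:
        header = set(next(csv.reader(f)))
    return REQUIRED <= header

def validate_json(path):
    with open(path) as f:
        records = json.load(f)
    return all(REQUIRED <= set(r) for r in records)

def validate_xml(path):
    root = ET.parse(path).getroot()
    return all(
        all(rec.find(field) is not None for field in REQUIRED)
        for rec in root.iter("order")
    )

for path, check in [("orders.csv", validate_csv),
                    ("orders.json", validate_json),
                    ("orders.xml", validate_xml)]:
    print(path, "OK" if check(path) else "MISSING FIELDS")
```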

Posted 2 weeks ago

Apply

5.0 - 9.0 years

14 - 19 Lacs

Chennai

Work from Office


Project Description: We are seeking a highly skilled Senior Power BI Developer with strong expertise in Power BI, SQL Server, and data modeling to join our Business Intelligence team. In this role, you will lead the design and development of interactive dashboards, robust data models, and data pipelines that empower business stakeholders to make informed decisions. You will work collaboratively with cross-functional teams and drive the standardization and optimization of our BI architecture.

Responsibilities:

Power BI Dashboard Development (UI Dashboards):
- Design, develop, and maintain visually compelling, interactive Power BI dashboards aligned with business needs.
- Collaborate with business stakeholders to gather requirements, develop mockups, and refine dashboard UX.
- Implement advanced Power BI features like bookmarks, drill-throughs, dynamic tooltips, and DAX calculations.
- Conduct regular UX/UI audits and performance tuning on reports.

Data Modeling in SQL Server & Dataverse:
- Build and manage scalable, efficient data models in Power BI, Dataverse, and SQL Server.
- Apply best practices in dimensional modeling (star/snowflake schema) to support analytical use cases.
- Ensure data consistency, accuracy, and alignment across multiple sources and business areas.
- Optimize models and queries for performance and load times.

Power BI Dataflows & ETL Pipelines:
- Develop and maintain reusable Power BI Dataflows for centralized data transformations.
- Create ETL processes using Power Query, integrating data from diverse sources including SQL Server, Excel, APIs, and Dataverse.
- Automate data refresh schedules and monitor dependencies across datasets and reports.
- Ensure efficient data pipeline architecture for reuse, scalability, and maintenance.

Skills (must have):
- Experience: 6+ years in Business Intelligence or Data Analytics with a strong focus on Power BI and SQL Server.
- Technical skills: expert-level Power BI development, including DAX, custom visuals, and report optimization; strong knowledge of SQL (T-SQL) and relational database design; experience with Dataverse and Power Platform integration; proficiency in Power Query, Dataflows, and ETL development.
- Modeling: proven experience in dimensional modeling, star/snowflake schema, and performance tuning.
- Data integration: skilled in connecting and transforming data from various sources, including APIs, Excel, and cloud data services.
- Collaboration: ability to work with stakeholders to define KPIs, business logic, and dashboard UX.

Nice to have: N/A
Other: Languages: English (C1 Advanced). Seniority: Senior.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

12 - 18 Lacs

Gurugram

Hybrid


About you. Key responsibilities include:
- Designing and developing robust ETL pipelines and data integration workflows that meet business and technical requirements.
- Leveraging cloud-native tools and technologies to build scalable and secure data integration solutions.
- Collaborating with data architects, analysts, and other stakeholders to understand integration needs and ensure alignment with the data strategy.
- Ensuring data quality, consistency, and integrity throughout the ETL process.
- Monitoring and maintaining ETL jobs, performing troubleshooting, and optimizing performance.
- Implementing best practices for version control, testing, deployment, and documentation of integration solutions.

Additional information. Main activities:
- Develop the interface for ETL in Ab Initio, with the flexibility to switch to any iPaaS tool as needed.
- Manage all technical and configuration aspects of the identified solution, demonstrating advanced competency in understanding functional requirements.
- Lead design and development activities to create the technical architecture of the interface.
- Collaborate closely with Solution Leads and Business Analysts to ensure a thorough understanding of requirements.

Required Skills:
1. Proficiency in Ab Initio development.
2. Knowledge of Java/J2EE.
3. Strong knowledge of Unix shell scripting.
4. Familiarity with SQL and its applications in data management.

Preferred Skills:
- Experience with data loading processes into Oracle SaaS applications using File-Based Data Import (FBDI).
- Understanding of SOAP and REST APIs.
- Knowledge of iPaaS tools and their application in cloud-based solutions.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Lead the migration of ETLs from an on-premises SQL Server based data warehouse to Azure Cloud, Databricks, and Snowflake
- Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, and Databricks (PySpark)
- Review and analyze existing on-premises ETL processes developed in SSIS and T-SQL
- Implement DevOps practices and CI/CD pipelines using GitHub Actions
- Collaborate with cross-functional teams to ensure seamless integration and data flow
- Optimize and troubleshoot data pipelines and workflows
- Ensure data security and compliance with industry standards
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- 6+ years of experience as a Cloud Data Engineer
- Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage) and Databricks
- Solid experience in ETL development using on-premises databases and ETL technologies
- Experience with Python or other scripting languages for data processing
- Experience with Agile methodologies
- Proficiency in DevOps and CI/CD practices using GitHub Actions
- Proven excellent problem-solving skills and ability to work independently
- Proven solid communication and collaboration skills
- Proven solid analytical skills and attention to detail
- Proven ability to adapt to new technologies and learn quickly

Preferred Qualifications:
- Certification in Azure or Databricks
- Experience with data modeling and database design
- Experience with development in Snowflake for data engineering and analytics workloads
- Knowledge of data governance and data quality best practices
- Familiarity with other cloud platforms (e.g., AWS, Google Cloud)
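As a hedged sketch of the Databricks (PySpark) side of such a migration, the example below reads raw files from ADLS Gen2, applies a transformation of the kind previously done in T-SQL, and writes a Delta table; it assumes a Databricks cluster, and the storage paths and table names are placeholders:

```python
# Illustrative migration step: raw ADLS Gen2 files -> curated Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_migration").getOrCreate()

raw = (
    spark.read.format("csv")
    .option("header", True)
    .load("abfss://raw@mydatalake.dfs.core.windows.net/claims/")
)

# Equivalent of the old SSIS/T-SQL step: standardize types and deduplicate.
curated = (
    raw.withColumn("claim_amount", F.col("claim_amount").cast("decimal(18,2)"))
       .dropDuplicates(["claim_id"])
)

curated.write.format("delta").mode("overwrite").saveAsTable("curated.claims")
```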

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Gurugram

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a highly skilled and experienced Senior Cloud Data Engineer to join our team for a Cloud Data Modernization project. The successful candidate will be responsible for migrating our on-premises Enterprise Data Warehouse (SQL Server) to a modern cloud-based data platform utilizing Azure Cloud data tools, Delta Lake, and Snowflake.

Primary Responsibilities:
- Lead the migration of ETLs from an on-premises SQL Server based data warehouse to Azure Cloud, Databricks, and Snowflake
- Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, and Databricks (PySpark)
- Review and analyze existing on-premises ETL processes developed in SSIS and T-SQL
- Implement DevOps practices and CI/CD pipelines using GitHub Actions
- Collaborate with cross-functional teams to ensure seamless integration and data flow
- Optimize and troubleshoot data pipelines and workflows
- Ensure data security and compliance with industry standards
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- 6+ years of experience as a Cloud Data Engineer
- Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage) and Databricks
- Solid experience in ETL development using on-premises databases and ETL technologies
- Experience with Python or other scripting languages for data processing
- Experience with Agile methodologies
- Proficiency in DevOps and CI/CD practices using GitHub Actions
- Proven excellent problem-solving skills and ability to work independently
- Solid communication and collaboration skills
- Solid analytical skills and attention to detail
- Ability to adapt to new technologies and learn quickly

Preferred Qualifications:
- Certification in Azure or Databricks
- Experience with data modeling and database design
- Experience with development in Snowflake for data engineering and analytics workloads
- Knowledge of data governance and data quality best practices
- Familiarity with other cloud platforms (e.g., AWS, Google Cloud)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

8 - 15 Lacs

Hyderabad

Hybrid


About the Company: DiLytics is a leading Information Technology (IT) services provider focused entirely on providing services in the Analytics, Business Intelligence, Data Warehousing, Data Integration, and Enterprise Performance Management areas. We have been growing for 12+ years and have offices in the US, Canada, and India. We are an employee-friendly company that offers an exciting, stress-free work culture and provides career paths where elements of job enrichment and the flexibility to move across roles are inherent.

Key Responsibilities:
- Manage a team of ETL developers, assign tasks, and ensure timely delivery of projects and PoCs.
- Provide technical leadership and groom a team of ETL developers.
- Design and develop complex mappings, process flows, and ETL scripts.
- Perform data extraction and transformation using SQL queries to create the data sets required for dashboards.
- Optimize ETL processes for efficiency, scalability, and performance tuning.
- Utilize appropriate ETL tools and technologies (e.g., ODI, ADF, SSIS, Alteryx, Talend).
- Stay up to date on the latest ETL trends and technologies.
- Exposure to designing and developing BI reports and dashboards using Power BI/Tableau and other tools to meet business analytics needs.

Skills Required:
- Bachelor's degree in computer science or a related field.
- Relevant experience of 4 to 8 years.
- Extensive experience in designing and implementing ETL processes.
- Experience in designing/developing ETL processes such as ETL control tables, error logging, auditing, data quality, etc.
- Expertise in data integration tool sets - Azure Data Factory, Oracle Data Integrator, SQL Server Integration Services, Talend, etc. - and PL/SQL.
- Exposure to one or more of these data visualization tools: OAC, Power BI, Tableau, OBIEE.
- Excellent written and verbal communication and interpersonal skills.
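An illustrative sketch of the ETL control-table pattern named above (run auditing and error logging); sqlite3 keeps it self-contained, and the job and table names are invented:

```python
# Minimal run-audit pattern: every ETL job writes a row to a control table,
# recording timing, status, row counts, and any error trace.
import sqlite3
import traceback
from datetime import datetime, timezone

conn = sqlite3.connect("etl_control.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS etl_run_log (
        job_name TEXT, started_at TEXT, finished_at TEXT,
        status TEXT, rows_loaded INTEGER, error TEXT
    )
""")

def run_job(job_name, fn):
    started = datetime.now(timezone.utc).isoformat()
    try:
        rows = fn()
        conn.execute(
            "INSERT INTO etl_run_log VALUES (?, ?, ?, 'SUCCESS', ?, NULL)",
            (job_name, started, datetime.now(timezone.utc).isoformat(), rows),
        )
    except Exception:
        conn.execute(
            "INSERT INTO etl_run_log VALUES (?, ?, ?, 'FAILED', 0, ?)",
            (job_name, started, datetime.now(timezone.utc).isoformat(),
             traceback.format_exc()),
        )
        raise
    finally:
        conn.commit()

run_job("load_customers", lambda: 42)  # stand-in for a real load step
```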

Posted 3 weeks ago

Apply

3.0 - 6.0 years

5 - 12 Lacs

Chennai

Remote


Job Description: We are seeking a skilled Talend Developer with 3 to 6 years of hands-on experience in designing, developing, and optimizing ETL pipelines. The ideal candidate will be proficient in working with Talend, AWS, APIs, and databases, and can join on immediate to 15 days' notice.

Key Responsibilities:
- Design, develop, and maintain ETL workflows that extract data from AWS S3, transform it per business rules, load it into APIs, and retrieve results.
- Analyze existing ETL workflows and identify areas for performance and design improvements.
- Build scalable, dynamic ETL pipelines from scratch with capacity for future enhancement.
- Collaborate with data engineering and data science teams to ensure data consistency and integrity.
- Conduct comprehensive unit testing of ETL pipelines and troubleshoot performance issues.
- Deploy Talend pipelines across different environments using best practices and context variables.
- Create clear and comprehensive documentation of ETL processes, pipelines, and methodologies.

Required Skills & Experience:
- Minimum 3 years of Talend development experience
- Strong expertise in Talend components for file, database, and API (GET & POST) integration
- Experience with AWS services and incorporating them into Talend workflows
- Proven experience in pipeline migration and multi-environment deployment
- Proficient in SQL and relational databases
- Working knowledge of Java or Python for automation and logic handling
- Familiarity with Git for version control and Nexus for artifact management
- Strong debugging and troubleshooting skills for ETL workflows
- Excellent attention to detail and an analytical mindset
- Effective communication and collaboration skills

Benefits:
- Competitive salary and performance bonuses
- Work on cutting-edge data engineering projects
- Collaborative work culture
- Learning and growth opportunities

How to Apply: Interested candidates meeting the criteria and ready to join within 15 days, please apply directly via Naukri or send your updated resume to hanifarsangeetha@sightspectrum.com
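For orientation, the workflow described above (extract from S3, transform, POST to an API, retrieve results) might look like this in plain Python rather than Talend components; the bucket, URL, and fields are assumptions for the sketch:

```python
# Illustrative S3 -> transform -> API workflow. All endpoints, buckets,
# and field names are placeholders, not details from the posting.
import csv
import io

import boto3
import requests

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="my-bucket", Key="inbound/leads.csv")
rows = csv.DictReader(io.StringIO(obj["Body"].read().decode("utf-8")))

for row in rows:
    # Transform per business rules before loading into the API (POST).
    payload = {"email": row["email"].strip().lower(), "score": int(row["score"])}
    resp = requests.post("https://api.example.com/v1/leads", json=payload, timeout=30)
    resp.raise_for_status()

# Retrieve processed results back from the API (GET integration).
results = requests.get(
    "https://api.example.com/v1/leads?status=processed", timeout=30
).json()
print(len(results), "processed leads")
```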

Posted 3 weeks ago

Apply

9.0 - 14.0 years

4 - 7 Lacs

Bengaluru

Work from Office


This role involves the development and application of engineering practice and knowledge in designing, managing, and improving processes for industrial operations, including procurement, supply chain, and facilities engineering, as well as maintenance of the facilities. Project and change management of industrial transformations are also included in this role.

Grade-specific focus on Industrial Operations Engineering:
- Fully competent in own area.
- Acts as a key contributor in a more complex/critical environment.
- Proactively acts to understand and anticipate client needs.
- Manages costs and profitability for a work area.
- Manages own agenda to meet agreed targets.
- Develops plans for projects in own area.
- Looks beyond the immediate problem to the wider implications.
- Acts as a facilitator and coach, and moves teams forward.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

14 - 18 Lacs

Bengaluru

Work from Office


The Solution Architect Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations.

Key Responsibilities:
- Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms.
- Data Integration & ETL Processes: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance.
- Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations.
- Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization.
- Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms).
- SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting purposes, with a focus on performance and scalability.
- Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations).
- Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights.
- Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth.
- Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs).
- Documentation: Create and maintain comprehensive technical documentation, including architecture diagrams, ETL process flows, and data dictionaries.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions.
- Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management).
- Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics.
- Extensive experience in ETL development, data integration, and data transformation processes.
- Strong knowledge of Python and SQL (advanced query writing, optimization, and troubleshooting).
- Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud).
- Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17).
- Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance.
- Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders.

Preferred Qualifications:
- Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift).
- Knowledge of machine learning workflows, leveraging Databricks for model training and deployment.
- Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
- Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies.
- Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus.

Key Competencies:
- Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering.
- Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability.
- Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders.
- Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
