Jobs
Interviews

23 ETL Developer Jobs

Set up a Job Alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

3.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer specializing in ETL, you should have a minimum of 7 to 8 years of relevant experience. The position is open Pan India, and immediate joiners are strongly preferred.

Mandatory skills (proficiency in all is a prerequisite): ETL development, Synapse, PySpark, ADF, SSIS, Databricks, SQL, Apache Airflow, and Azure & AWS.

The selection process involves three rounds: L1 with an external panel, L2 with an internal panel, and L3 with the client.

Experience expectations: ETL development 7+ years; PySpark 5+ years; SSIS 3 to 4+ years; Databricks 4+ years; SQL 6+ years; Apache Airflow 4+ years; Azure and AWS 3 to 4 years; Synapse 3 to 4 years.

Posted 4 days ago

Apply

4.0 - 7.0 years

4 - 7 Lacs

Bengaluru, Karnataka, India

On-site

Job Responsibilities:
- Design, build, test, and maintain cloud-based data structures such as data marts, data warehouses, and data pipelines.
- Design, build, test, and maintain cloud-based data pipelines to acquire, profile, explore, cleanse, consolidate, transform, and integrate data.
- Design and develop ETL processes for the Data Warehouse lifecycle (staging of data, ODS data integration, and data marts) and Data Security (data archival, data obfuscation, etc.).
- Build complex SQL queries on large datasets and performance-tune as needed.
- Write advanced PL/SQL and T-SQL programs based on requirements.
- Maintain ETL packages and supporting data objects for our growing BI infrastructure.
- Establish efficient and reliable data pipelines that support the organization's data-driven decision-making processes.
- Carry out monitoring, tuning, and database performance analysis.
- Knowledge of BI visualization tools (connecting to databases and providing datasets to prepare reports and dashboards as required).
- Be proactively involved in both the physical processing and storage aspects of data.
- Take part in the initial exploration and understanding of data characteristics to support downstream tasks.
- Collaborate with other technical professionals (BI report developers, data analysts, architects).
- Knowledge of Big Data technologies (including unstructured data) for processing and analysing large datasets.
- Leverage cloud technologies to build scalable, flexible, and cost-effective solutions.
- Communicate clearly and effectively with stakeholders.

Required Skillsets:
- 4-7 years of experience as a SQL, PL/SQL/T-SQL, ETL/SSIS developer, Data Engineer, or in a related role.
- Strong ETL experience using SSIS or an equivalent tool.
- Strong experience building data pipelines.
- Knowledge of data modelling and data warehouse concepts.
- Knowledge of BI tool functionality (Power BI/Tableau/SAP/OBIEE).
- Big Data technologies.
- Cloud programming (any).
- Programming experience (Python/Java) preferred.
- Demonstrated ability to write SQL queries and PL/SQL/T-SQL programs to retrieve and modify data.
- Know-how to troubleshoot potential issues, and experience with best practices around database operations.
- Ability to learn new tools and technologies and adapt to an evolving technology landscape.
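The Data Warehouse lifecycle the responsibilities above describe (stage raw data, then transform it into a mart) follows a pattern that can be sketched in a few lines. The sketch below is illustrative only: table names (`stg_orders`, `dm_daily_sales`) and figures are invented, and SQLite stands in for a real warehouse engine.

```python
import sqlite3

# In-memory database stands in for the warehouse; schema is a made-up example.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, order_date TEXT, amount REAL);
    CREATE TABLE dm_daily_sales (order_date TEXT PRIMARY KEY, total_amount REAL);
""")

# Extract + load raw rows into the staging table.
rows = [(1, "2024-01-01", 100.0), (2, "2024-01-01", 50.0), (3, "2024-01-02", 75.0)]
con.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", rows)

# Transform: aggregate staging rows into the daily-sales data mart.
con.execute("""
    INSERT INTO dm_daily_sales (order_date, total_amount)
    SELECT order_date, SUM(amount) FROM stg_orders GROUP BY order_date
""")
con.commit()

mart = con.execute("SELECT * FROM dm_daily_sales ORDER BY order_date").fetchall()
print(mart)  # [('2024-01-01', 150.0), ('2024-01-02', 75.0)]
```

In production the same split (staging table, transform query, mart table) would be expressed in SSIS packages or stored procedures rather than inline Python.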

Posted 2 weeks ago

Apply

4.0 - 7.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities:
- Design, build, test, and maintain cloud-based data structures such as data marts, data warehouses, and data pipelines.
- Design, build, test, and maintain cloud-based data pipelines to acquire, profile, explore, cleanse, consolidate, transform, and integrate data.
- Design and develop ETL processes for the Data Warehouse lifecycle (staging of data, ODS data integration, and data marts) and Data Security (data archival, data obfuscation, etc.).
- Build complex SQL queries on large datasets and performance-tune as needed.
- Write advanced PL/SQL and T-SQL programs based on requirements.
- Maintain ETL packages and supporting data objects for our growing BI infrastructure.
- Establish efficient and reliable data pipelines that support the organization's data-driven decision-making processes.
- Carry out monitoring, tuning, and database performance analysis.
- Knowledge of BI visualization tools (connecting to databases and providing datasets to prepare reports and dashboards as required).
- Be proactively involved in both the physical processing and storage aspects of data.
- Take part in the initial exploration and understanding of data characteristics to support downstream tasks.
- Collaborate with other technical professionals (BI report developers, data analysts, architects).
- Knowledge of Big Data technologies (including unstructured data) for processing and analysing large datasets.
- Leverage cloud technologies to build scalable, flexible, and cost-effective solutions.
- Communicate clearly and effectively with stakeholders.

Required Skillsets:
- 4-7 years of experience as a SQL, PL/SQL/T-SQL, ETL/SSIS developer, Data Engineer, or in a related role.
- Strong ETL experience using SSIS or an equivalent tool.
- Strong experience building data pipelines.
- Knowledge of data modelling and data warehouse concepts.
- Knowledge of BI tool functionality (Power BI/Tableau/SAP/OBIEE).
- Big Data technologies.
- Cloud programming (any).
- Programming experience (Python/Java) preferred.
- Demonstrated ability to write SQL queries and PL/SQL/T-SQL programs to retrieve and modify data.
- Know-how to troubleshoot potential issues, and experience with best practices around database operations.
- Ability to learn new tools and technologies and adapt to an evolving technology landscape.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As an ETL Developer at our client, an EU subsidiary of a global financial bank, you will play a crucial role in designing and building interfaces and integrating data from various internal and external sources into the new Enterprise Data Warehouse environment. Your primary responsibility will be developing ETL solutions using Microsoft and Azure technologies while adhering to industry ETL standards, architecture, and best practices. You will act as a technical expert throughout the software development lifecycle: designing, coding, unit testing, supporting, and debugging data warehouse software components. Your expertise in cloud and ETL engineering will be instrumental in solving problems and designing effective approaches. Additionally, you will troubleshoot and debug ETL pipelines, optimize query performance, and create unit tests. Collaborating with the Development Lead, DWH Architect, QA Engineers, and business analysts, you will help plan, implement, and deliver efficient ETL strategies that align with end-user requirements. Your role will also involve creating technical documentation, reports, and dashboards in the BI portal while supporting internal audit processes.

Key Mandatory Skills:
- Proven work experience as an ETL Developer
- Advanced knowledge of relational databases and dimensional data warehouse modeling
- Expertise in the Microsoft data stack, with experience in Azure and Synapse Analytics
- Designing and implementing data transformation and ETL layers using tools like Data Factory and notebooks
- Experience with Power BI for report and dashboard creation
- Strong SQL knowledge for developing complex queries and working with stored procedures, views, indexes, etc.
- Familiarity with CI/CD tools and principles, preferably Azure DevOps or Bamboo
- Proficiency in at least one scripting language, with Python an advantage
- Experience with Git repositories and version-control tools such as GitHub, Azure DevOps, or Bitbucket
- Experience working in Agile projects, preferably using JIRA
- Excellent problem-solving skills, communication abilities, and an understanding of data governance concepts

Nice-to-Have Skills:
- Microsoft Fabric
- Snowflake
- Background in SSIS / SSAS / SSRS
- Azure DevTest Labs, ARM templates
- Azure Purview
- Banking or finance industry experience

Your ability to work independently, collaborate effectively in a team environment, and communicate complex information clearly will be essential for success in this role. If you have a passion for data engineering, a keen eye for detail, and a proactive approach to problem-solving, we encourage you to apply.
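ETL layers for a warehouse like the one described above typically load incrementally rather than in full each run. A common technique is a watermark table that records the last processed timestamp; the sketch below illustrates it under invented table names (`src_events`, `tgt_events`, `etl_watermark`), with SQLite standing in for the warehouse.

```python
import sqlite3

# Hypothetical watermark-based incremental load: only rows newer than the
# last recorded timestamp move from source to target.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src_events (id INTEGER, loaded_at TEXT);
    CREATE TABLE tgt_events (id INTEGER, loaded_at TEXT);
    CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_ts TEXT);
""")
con.executemany("INSERT INTO src_events VALUES (?, ?)",
                [(1, "2024-01-01T00:00"), (2, "2024-01-02T00:00"), (3, "2024-01-03T00:00")])
con.execute("INSERT INTO etl_watermark VALUES ('tgt_events', '2024-01-01T00:00')")

def incremental_load(con):
    (last_ts,) = con.execute(
        "SELECT last_ts FROM etl_watermark WHERE table_name = 'tgt_events'").fetchone()
    # ISO-8601 strings compare correctly as text, so a plain > works here.
    con.execute("INSERT INTO tgt_events SELECT id, loaded_at FROM src_events "
                "WHERE loaded_at > ?", (last_ts,))
    con.execute("UPDATE etl_watermark SET last_ts = "
                "(SELECT MAX(loaded_at) FROM tgt_events) WHERE table_name = 'tgt_events'")
    con.commit()

incremental_load(con)
loaded = [r[0] for r in con.execute("SELECT id FROM tgt_events ORDER BY id")]
print(loaded)  # [2, 3]
```

In Azure Data Factory the same idea appears as a "delta" copy activity driven by a stored watermark; the mechanics are identical.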

Posted 2 weeks ago

Apply

7.0 - 10.0 years

7 - 16 Lacs

Gurgaon, Haryana, India

On-site

Insight Direct India is seeking a proactive Cloud Data Engineer III to join our team. In this role, you'll be instrumental in building and maintaining robust data pipelines, enabling faster, more data-informed decision-making for our customers' enterprise business analytics. We'll count on you to collaborate with stakeholders, understand their strategic objectives, and identify opportunities to leverage data and enhance data quality. You'll also design, develop, and maintain large-scale data solutions on the Microsoft data platform.

As a Cloud Data Engineer III, you will:
- Build and maintain data pipelines to enable faster, better, data-informed decision-making through customer enterprise business analytics.
- Collaborate with stakeholders to understand their strategic objectives and identify opportunities to leverage data and improve data quality.
- Design, develop, and maintain large-scale data solutions on the Microsoft data platform.
- Implement ETL pipelines using Azure Data Factory, dbt, and various SQL engines including Spark SQL, Databricks SQL, and Snowflake SQL.

What We're Looking For:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: 1+ years of experience in data engineering, with 1+ years of hands-on experience using Spark SQL to model and transform data.
- 1+ years of experience working with dbt and a good understanding of dbt concepts.
- 1+ years of experience building Power BI models and visualizations, with a solid understanding of DAX and Power BI visualization techniques.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

Skill: Data Engineer - Power BI. Band in Infosys: 5. Role: Technology Lead. Qualification: B.E/B.Tech.

Job Description: 6 to 10 years of relevant experience, able to manage delivery, coach team members, and lead best practices and procedures. Power BI, SSRS, Power BI Report Builder, AAS (SSAS Tabular Model), and MSBI, especially SQL Server development, SSIS development, and ETL development. Well experienced in creating data models, building Power BI reports on top of them, and publishing them to the Power BI service. Experience in creating workspaces.

Work Location: Pune, Hyderabad, Bhubaneswar

Posted 3 weeks ago

Apply

4.0 - 6.0 years

1 - 6 Lacs

Nagpur, Thane, Pune

Work from Office

Job Title: Analyst. Work Location: Thane, MH / Nagpur, MH / Pune, MH. Skill Required: Data Migration. Experience Range: 4-6 years. Job Description: Data Migration. Essential Skills: Data Migration.

Posted 1 month ago

Apply

3.0 - 6.0 years

15 - 20 Lacs

Bengaluru

Hybrid

Description:
Role: Data Engineer/ETL Developer - Talend/Power BI

Job Description:
1. Study, analyze, and understand business requirements in the context of business intelligence, and provide end-to-end solutions.
2. Design and implement ETL pipelines with data quality and integrity across platforms like Talend Enterprise and Informatica.
3. Load data from heterogeneous sources such as Oracle, MS SQL, file systems, FTP services, REST APIs, etc.
4. Design and map data models to turn raw data into meaningful insights, and build a data catalog.
5. Develop strong data documentation covering algorithms, parameters, and models.
6. Analyze historical and current data to support better decision-making.
7. Make essential technical changes to improve existing business intelligence systems.
8. Optimize ETL processes for improved performance; monitor ETL jobs and troubleshoot issues.
9. Lead and oversee team deliverables; ensure development best practices are followed.
10. Participate in or lead requirements gathering and analysis.

Required Skillset and Experience:
1. Overall up to 3 years of working experience, preferably in SQL and ETL (Talend).
2. Must have 1+ years of experience in Talend Enterprise/Open Studio and related tools like Talend API, Talend Data Catalog, TMC, TAC, etc.
3. Must have an understanding of database design and data modeling.
4. Hands-on experience in a programming language (Java, Python, etc.).

Secondary Skillset (good to have):
1. Experience with a BI tool like MS Power BI.
2. Ability to use Power BI to build interactive and visually appealing dashboards and reports.

Required Personal & Interpersonal Skills:
• Strong analytical skills
• Good communication skills, both written and verbal
• Highly motivated and result-oriented
• Self-driven, independent work ethic that drives internal and external accountability
• Ability to interpret instructions for executives and technical resources
• Advanced problem-solving skills for complex distributed applications
• Experience working in a multicultural environment
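"ETL pipelines with data quality and integrity" usually means automated checks run against each batch before it is published. The function below is a minimal, made-up example of such a gate: the check names, row shape, and rules are assumptions for illustration, not anything a specific tool like Talend mandates.

```python
# Minimal data-quality gate of the kind an ETL job might run after loading.
def run_quality_checks(rows):
    """Return a dict of check name -> passed, for a list of (id, email) rows."""
    ids = [r[0] for r in rows]
    return {
        "not_empty": len(rows) > 0,                      # batch delivered any data
        "no_null_ids": all(i is not None for i in ids),  # key column populated
        "ids_unique": len(ids) == len(set(ids)),         # no duplicate keys
        "emails_present": all(r[1] for r in rows),       # required field non-blank
    }

batch = [(1, "a@example.com"), (2, "b@example.com"), (2, "")]
results = run_quality_checks(batch)
print(results)
# {'not_empty': True, 'no_null_ids': True, 'ids_unique': False, 'emails_present': False}
```

A failing check would typically fail the job or route the batch to a quarantine area rather than load it.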

Posted 1 month ago

Apply

3.0 - 6.0 years

17 - 18 Lacs

Bengaluru

Hybrid

Hi all, we are looking for an ETL Developer cum Java Support Engineer.
Experience: 3-6 years
Notice period: Immediate - 15 days
Location: Bengaluru

Job Summary: We are seeking a versatile professional who can seamlessly blend ETL development expertise with Java-based application support. This hybrid role involves designing and maintaining ETL pipelines while also managing Java support tickets, troubleshooting issues, and ensuring smooth system operations.

Key Responsibilities:

ETL Development:
- Design, develop, and maintain ETL workflows to extract, transform, and load data from various sources.
- Optimize ETL processes for performance, scalability, and reliability.
- Collaborate with data analysts and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and compliance with governance standards.
- Document ETL processes and maintain metadata.

Java Support:
- Monitor and resolve Java application support tickets within defined SLAs.
- Debug and troubleshoot Java-based backend issues in production and staging environments.
- Collaborate with development teams to implement bug fixes and enhancements.
- Perform root cause analysis and provide long-term solutions.
- Maintain logs, reports, and documentation for support activities.

Required Skills:
- Proficiency in ETL tools (e.g., Informatica, Talend, SSIS).
- Strong SQL skills and experience with relational databases (Oracle, MySQL, PostgreSQL).
- Solid understanding of Java and related frameworks (Spring, Hibernate).
- Familiarity with version control systems (Git) and ticketing tools (JIRA, ServiceNow).
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience with cloud platforms (AWS, Azure) and data lakes is a plus.
- Knowledge of data warehousing concepts and data modeling.

Posted 1 month ago

Apply

3.0 - 6.0 years

12 - 13 Lacs

Bengaluru

Hybrid

Hi all, we are looking for an ETL Developer.
Experience: 3-6 years
Notice period: Immediate - 15 days
Location: Bengaluru

Core Technical Expertise:
- Data warehousing & migration: deep expertise in ETL tools like Informatica PowerCenter, relational databases, data modeling, data cleansing, SQL optimization, and performance tuning.
- Programming & scripting: strong SQL programming skills, shell scripting (Unix), debugging, and handling large datasets.
- Toolset: experience with JIRA, Confluence, and Git; working knowledge of scheduling tools and integration of multiple data sources.
- Bonus skills: familiarity with Talend Enterprise and Azure/cloud/Big Data technologies.

Posted 1 month ago

Apply

5.0 - 8.0 years

13 - 20 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

Job Description:
- Develop and implement ETL processes using Python and SQL to extract, transform, and load data from various sources into our data warehouse.
- Optimize and maintain existing ETL workflows and data pipelines to improve performance and scalability.
- Design, develop, and maintain efficient, reusable, and reliable Python code, and support Python version upgrade activities.
- Collaborate with cross-functional teams to understand data requirements and ensure data integrity and quality.
- Monitor and troubleshoot data processing systems to ensure timely and accurate data delivery.
- Develop and maintain documentation related to ETL processes, data models, and workflows.
- Participate in code reviews and provide constructive feedback to team members.
- Stay up to date with industry trends and emerging technologies to continuously improve our data engineering practices.

Skills & Qualifications:
- Bachelor's degree in IT, computer science, computer engineering, or similar.
- Proven experience as a Data Engineer or ETL Developer, with a focus on Python and SQL.
- Minimum 5 years of experience in ETL.
- Proficiency in programming languages such as Python for data engineering tasks; able to support Python version upgrade activities.
- Strong understanding of ETL concepts and data warehousing principles.
- Proficiency in writing complex SQL queries and optimizing database performance.
- Familiarity with cloud platforms such as Azure or OCI is a plus.
- Demonstrated experience designing and delivering data platforms for Business Intelligence and Data Warehousing.
- Experience with version control systems such as Git.
- Familiarity with Agile methodology and Agile working environments.
- Ability to work independently with POs, BAs, and Architects.
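Since this role names Python and SQL explicitly, here is a minimal extract-transform-load sketch in exactly that stack. Everything in it is illustrative: the CSV payload, the `fact_amounts` table, and the reject-row policy are assumptions, and SQLite stands in for the warehouse.

```python
import csv
import io
import sqlite3

# Invented source payload; in practice this would come from a file or API.
RAW = "id,amount\n1,10.5\n2,not_a_number\n3,7.0\n"

def extract(text):
    """Parse CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(records):
    """Keep only rows whose amount parses as a float (basic data quality)."""
    clean = []
    for rec in records:
        try:
            clean.append((int(rec["id"]), float(rec["amount"])))
        except ValueError:
            continue  # a real pipeline would route this row to a reject table
    return clean

def load(con, rows):
    """Append cleaned rows into the target table."""
    con.execute("CREATE TABLE IF NOT EXISTS fact_amounts (id INTEGER, amount REAL)")
    con.executemany("INSERT INTO fact_amounts VALUES (?, ?)", rows)
    con.commit()

con = sqlite3.connect(":memory:")
load(con, transform(extract(RAW)))
total = con.execute("SELECT COUNT(*), SUM(amount) FROM fact_amounts").fetchone()
print(total)  # (2, 17.5)
```

Keeping extract, transform, and load as separate functions is what makes the code-review and unit-testing duties above practical: each stage can be tested in isolation.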

Posted 1 month ago

Apply

5.0 - 7.0 years

14 - 18 Lacs

Pune

Work from Office

5+ years of ETL experience working with large-scale MS SQL Server data warehouse databases using SSIS, with reports built in MS SSRS. Good experience managing a data warehouse as part of data management principles, with techno-functional knowledge (data governance, data architecture, data profiling, data analysis, storing and managing data, data quality validation, ETL through SSIS and T-SQL, SSRS report development).

- Design, develop, and maintain ETL processes using SSIS or another ETL tool to extract data from various sources, transform it according to business rules, and load it into target databases or data warehouses.
- Develop and optimize SQL queries, stored procedures, and functions.
- Experience in both forward engineering (requirements to SQL logic) and reverse engineering (converting SQL logic into a business requirement document).
- Proficient in troubleshooting and resolving production issues.
- Skilled in documenting data flows in Visio, business requirements in Word, and source-to-target mapping documents in Excel; should have worked on documenting ETL processes, source-to-target mappings, etc.
- Strong communication skills.
- Good data warehousing concepts.
- Hands-on SSIS experience.
- Strong hands-on experience with stored procedures.
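Source-to-target mapping work of the kind described above is usually verified by reconciliation: compare row counts and a content checksum between source and target after a load. The sketch below illustrates the idea with made-up in-memory "tables"; real pipelines would compute the same figures with SQL against both databases.

```python
import hashlib

# Invented source and target extracts; each row is (key, attribute).
source = [(1, "Alice"), (2, "Bob"), (3, "Carol")]
target = [(1, "Alice"), (2, "Bob"), (3, "Carol")]

def table_checksum(rows):
    """Order-independent fingerprint of a table's contents."""
    payload = "|".join(f"{k}:{v}" for k, v in sorted(rows))
    return hashlib.md5(payload.encode()).hexdigest()

counts_match = len(source) == len(target)
checksums_match = table_checksum(source) == table_checksum(target)
print(counts_match, checksums_match)  # True True
```

A count match with a checksum mismatch is the classic signal that rows arrived but a column transformation in the mapping is wrong.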

Posted 1 month ago

Apply

3.0 - 6.0 years

8 - 27 Lacs

Bengaluru, Karnataka, India

On-site

We are hiring for the role: ETL and GoDAP
Experience: 3-6 years
Location: Bangalore
Notice Period: Immediate - 15 days

Skills:
1. CI/CD pipeline implementation for ETL and GoDAP. The end-to-end workflow is currently not fully automated; next year's focus is to establish a standardized process both on-prem and for the cloud journey.
2. ETL pipeline establishment, optimization, and performance tuning.
3. Unit test framework setup, integrated into the CI/CD process to identify and fix incidents well in advance.
4. Test coverage tools integration.
5. Real-time monitoring and logging tools setup so that failures can be diagnosed quickly and efficiently.
6. Handling environment parity (dev and prod) through Docker/Kubernetes for local development and testing.
7. Resource optimization focused on process standardization and cost reduction.
8. Data workflow orchestration.
9. Manage and optimize Docker containers for scalable and efficient application deployments.
10. Design and implement CI/CD pipelines to automate build, test, and deployment processes.
11. Develop and maintain workflows using GitHub Actions to support automation and improve efficiency.
12. Plan and execute seamless on-premises-to-cloud migrations, ensuring security and scalability.
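Items 3 and 10 above (a unit test framework wired into CI/CD) come down to testing transform logic before deployment. The example below shows the shape of such a test; `normalize_country` is a hypothetical transform invented for illustration, and in a real CI stage these assertions would live in a pytest suite run by the pipeline.

```python
# A hypothetical ETL transform: standardize free-form country strings.
def normalize_country(raw):
    mapping = {"in": "India", "ind": "India",
               "us": "United States", "usa": "United States"}
    key = raw.strip().lower()
    # Fall back to title-casing anything not in the alias map.
    return mapping.get(key, raw.strip().title())

def test_normalize_country():
    assert normalize_country(" IN ") == "India"        # alias, with whitespace
    assert normalize_country("usa") == "United States" # another alias
    assert normalize_country("germany") == "Germany"   # fallback path

test_normalize_country()
print("all transform tests passed")
```

Because the transform is a pure function, the test needs no database, which is what makes it cheap enough to run on every commit.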

Posted 1 month ago

Apply

0.0 years

0 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Job Title: Data Engineer

Job Summary: Data Engineers will be responsible for the design, development, testing, maintenance, and support of data assets, including Azure Data Lake and data warehouse development, modeling, package creation, SQL script creation, stored procedure development, and integration services support, among other responsibilities. Candidates must have at least 3-5 years of hands-on Azure experience as a Data Engineer, must be expert in SQL, and must have extensive expertise building data pipelines. The candidate will be accountable for meeting deliverable commitments, including schedule and quality compliance, and must be able to plan and schedule their own work activities and coordinate with cross-functional team members to meet project goals.

Basic understanding of:
- Scheduling and workflow management, with working experience in ADF, Informatica, Airflow, or similar
- Enterprise data modelling and semantic modelling, with working experience in ERwin, ER/Studio, PowerDesigner, or similar
- Logical/physical modelling on Big Data sets or a modern data warehouse, with working experience in ERwin, ER/Studio, PowerDesigner, or similar
- Agile process (Scrum cadences, roles, deliverables), with basic understanding of Azure DevOps, JIRA, or similar
- Architecture and data modelling for a data lake on the cloud, with working experience in Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)
- Build and release management, with working experience in Azure DevOps, AWS CodeCommit, or similar

Strong in:
- Writing code in a programming language, with working experience in Python, PySpark, Scala, or similar
- Big Data frameworks, with working experience in Spark, Hadoop, or Hive (including derivatives like PySpark (preferred), Spark Scala, or Spark SQL) or similar
- Data warehouse concepts and development using SQL on single platforms (SQL Server, Oracle, or similar) and parallel platforms (Azure SQL Data Warehouse or Snowflake)
- Code management, with working experience in GitHub, Azure DevOps, or similar
- End-to-end architecture and ETL processes, with working experience in an ETL tool or similar
- Reading data formats, with working experience in JSON, XML, or similar
- Data integration processes (batch and real time), with working experience in Informatica PowerCenter and/or Cloud, Microsoft SSIS, MuleSoft, DataStage, Sqoop, or similar
- Writing requirement, functional, and technical documentation: integration design documents, architecture documentation, data testing plans, or similar
- SQL, with working experience in SQL code, stored procedures, functions, or views
- Databases, with working experience in MS SQL, Oracle, or similar
- Analytical problem-solving, with working experience resolving complex problems
- Communication (reading and writing English), collaboration, and presentation skills, with working experience as a team player

Good to have:
- Stream processing, with working experience in Databricks Streaming, Azure Stream Analytics, HDInsight, Kinesis Data Analytics, or similar
- Analytical warehouses, with working experience in SQL Data Warehouse, Amazon Athena, AWS Redshift, BigQuery, or similar
- Real-time stores, with working experience in Azure Cosmos DB, Amazon DynamoDB, Cloud Bigtable, or similar
- Batch ingestion, with working experience in Data Factory, Amazon Kinesis, Lambda, Cloud Pub/Sub, or similar
- Storage, with working experience in Azure Data Lake Storage Gen1/Gen2, Amazon S3, Cloud Storage, or similar
- Batch data processing, with working experience in Azure Databricks, HDInsight, Amazon EMR, AWS Glue, or similar
- Orchestration, with working experience in Data Factory, HDInsight, Data Pipeline, Cloud Composer, or similar

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Bengaluru

Hybrid

Job Title: ODI + BI Developer (QlikView/Tableau + SQL)
Experience Required: 5+ years (relevant)
Location: Hyderabad / Bangalore
Notice Period: Immediate to 15 days

Job Summary: We are seeking a skilled and experienced ODI + BI Developer with a strong foundation in ODI (Oracle Data Integrator), SQL, and at least one BI tool (QlikView or Tableau). The ideal candidate will have a proven track record in data integration, reporting, and analytics solutions, with expertise in handling large-scale data environments and business intelligence platforms.

Key Responsibilities:
- Design, develop, and maintain ODI mappings, packages, load plans, and scenarios.
- Utilize ODI Knowledge Modules (KMs), particularly those related to Big Data connectors and Oracle.
- Work with SQL and PL/SQL to create stored procedures, write complex queries, and perform performance tuning.
- Develop interactive dashboards and reports using QlikView and/or Tableau.
- Understand business requirements and translate them into BI solutions using visualization tools.
- Collaborate with business users to define functional and user requirements for BI/reporting needs.
- Troubleshoot data issues and implement solutions to ensure data quality and integrity.
- Follow best practices across the software development lifecycle: analysis, design, development, testing, and deployment.

Required Skills:
- 5+ years of experience with Oracle Data Integrator (ODI): mappings, load plans, KMs.
- QlikView and/or Tableau.
- SQL / PL-SQL: stored procedures, performance tuning.
- Hands-on experience with BI and reporting systems, including user requirement gathering and data modeling.
- Strong problem-solving and analytical skills.
- Good understanding of data warehouse concepts and ETL processes.
- Ability to work independently in a fast-paced environment.

Preferred Qualifications:
- Experience working with Big Data environments via ODI
- Knowledge of data security models and ODI topology configurations
- Familiarity with CI/CD practices in BI projects

Interested candidates can share their resume at subashini.gopalan@kiya.ai

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 9 Lacs

Hyderabad

Work from Office

Skills:
- ADF, ETL, REST API
- Microsoft Power Platform / Snowflake certification is a plus
- Power BI / Talend and integration
- Power Apps
- SAP S/4HANA ABAP, OData, REST and SOAP

Job Location: Hyderabad

Posted 1 month ago

Apply

5.0 - 7.0 years

10 - 20 Lacs

Chennai

Work from Office

Key Responsibilities:
- Design, develop, and implement ETL solutions using SSIS to integrate data for different applications.
- Develop easy-to-read, easy-to-maintain, clean solutions, and maintain them in source control.
- Provide support for testing, deployments, and investigation/resolution of production issues.
- Provide upgrades or maintenance to existing solutions in the production environment.
- Work with business analysts in requirements gathering to develop applications that solve business requirements.
- Ensure that all developed solutions meet the business needs and maintain data integrity.
- Work with the team on data modeling, database design, and creating efficient SQL for fast application performance.
- Analyze performance optimization opportunities in stored procedures and SSIS packages, and do performance tuning for optimum results.
- Create and maintain technical documentation for all developed applications.

Work location: Chennai

Required Qualifications:
- Minimum 5 years of ETL development experience using SQL Server Integration Services (SSIS).
- Minimum 5 years of experience working with an RDBMS such as SQL Server.
- Strong knowledge of data warehousing concepts and the ability to design databases using relational or dimensional modeling.
- Ability to develop custom database objects, stored procedures, functions, and T-SQL.
- Experience in creating SQL Server Agent jobs for process scheduling.
- Ability to troubleshoot slow-performing processes and provide necessary solutions.
- Basic technical experience and business knowledge of various SDLC methodologies, including waterfall, iterative, and agile, is preferred.
- Experience with task management tools such as JIRA.
- Good communication and interpersonal skills; ability to clearly document all technical solutions in writing.
- Open to working with or being trained on new technologies.

Technical and Professional Requirements: SSIS, C#, SQL Server
Preferred Skills: SSAS, SOAP/REST API

Interested candidates can apply to kinnera259@gmail.com

Regards,
HR Manager

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Design, develop, and deploy ETL workflows and mappings using Informatica PowerCenter. Extract data from various source systems and transform/load it into target systems. Troubleshoot ETL job failures and resolve data issues promptly. Optimize and tune complex SQL queries.

Required Candidate Profile: Maintain detailed documentation of ETL design, mapping logic, and processes. Ensure data quality and integrity through validation and testing. Experience with Informatica PowerCenter and strong SQL knowledge.

Perks and Benefits

Posted 2 months ago

Apply

4.0 - 6.0 years

18 - 20 Lacs

Pune, Chennai, Bengaluru

Work from Office

We are hiring experienced ETL Developers (Ab Initio) for a leading MNC, with positions open in Pune, Chennai, and Bangalore. The ideal candidate should have 5+ years of hands-on experience in ETL development, with strong proficiency in Ab Initio, Unix, and SQL. Exposure to Hadoop and scripting languages like Shell or Python is a plus. This is a work-from-office role and requires candidates to be available for a face-to-face interview. Applicants should be able to join within 15 days to 1 month. Strong development background and the ability to work in a structured, fast-paced environment are essential.

Posted 2 months ago

Apply

5.0 - 8.0 years

9 - 18 Lacs

Hyderabad

Hybrid

We are hiring an ETL QA Developer for one of our Big 4 clients. Interested candidates, kindly share your resume at k.arpitha@dynpro.in or reach out on 7975510903 (WhatsApp only).

Experience: 5-9 years. Immediate joiners only.

Tools/Technology Skills:
- Demonstrated understanding of and working experience with SQL databases and ETL testing.
- Able to write queries to validate table mappings and structures.
- Able to perform schema validations.
- Good understanding of SCD types.
- Strong knowledge of database methodology.
- In-depth understanding of data warehousing/business intelligence concepts.
- Working experience with cloud (Azure) based services.
- Working experience in testing BI reports.
- Able to write queries to validate data quality during migration projects.
- Understanding of any of the peripheral technologies utilized in SDC, including PeopleSoft, SAP, and Aderant.
- Working understanding of tools like UFT and TFS.
- Experience with Microsoft tools is highly desired.
- Understands enterprise-wide networks and software implementations.
- Must have previous experience creating complex SQL queries for data validation.
- Must have testing experience in an Enterprise Data Warehouse (EDW).
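The "good understanding of SCD types" requirement above usually means Type 2 in practice: when a tracked attribute changes, the current dimension row is expired and a new versioned row is inserted. The sketch below illustrates the mechanics with an invented `dim_customer` schema and dates, using SQLite in place of a warehouse; a tester would validate exactly this expire-and-insert behavior with SQL queries.

```python
import sqlite3

# Hypothetical Type 2 dimension: validity window plus a current-row flag.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")
con.execute("INSERT INTO dim_customer VALUES (1, 'Pune', '2023-01-01', '9999-12-31', 1)")

def scd2_update(con, customer_id, new_city, effective):
    """Expire the current row and insert a new version when the city changes."""
    row = con.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if row and row[0] != new_city:
        con.execute("UPDATE dim_customer SET valid_to=?, is_current=0 "
                    "WHERE customer_id=? AND is_current=1", (effective, customer_id))
        con.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
                    (customer_id, new_city, effective))
        con.commit()

scd2_update(con, 1, "Hyderabad", "2024-06-01")
history = con.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall()
print(history)  # [('Pune', 0), ('Hyderabad', 1)]
```

Typical validation queries for this pattern check that each customer has exactly one `is_current = 1` row and that validity windows do not overlap.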

Posted 2 months ago

Apply

5 - 10 years

9 - 19 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Job Title: Senior Software Engineer - ETL Developer
Main location: Hyderabad / Bangalore / Chennai
Employment Type: Full Time
Experience: 5 to 10 yrs
Role & responsibilities: We are looking for a Senior ETL Developer who has:
• ETL Development & Implementation: Strong experience designing, developing, and deploying ETL solutions using Informatica Cloud Services (ICS), Informatica PowerCenter, and other data integration tools.
• Data Integration & Optimization: Proficient in extracting, transforming, and loading (ETL) data from multiple sources, optimizing performance, and ensuring data quality.
• Stakeholder Collaboration: Skilled at working with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data solutions with business needs.
• Scripting & Data Handling: Experience with SQL, PL/SQL, and scripting languages (e.g., Python, Shell) for data manipulation, transformation, and automation.
• Tool Proficiency: Familiarity with Informatica Cloud, version control systems (e.g., Git), JIRA, Confluence, and the Microsoft Office Suite.
• Agile Methodologies: Knowledge of Agile frameworks (Scrum, Kanban), with experience managing backlogs, writing user stories, and participating in sprint planning.
• Testing & Validation: Involvement in ETL testing, data validation, unit testing, and integration testing to ensure accuracy, consistency, and completeness of data.
• Problem-Solving Skills: Strong analytical mindset to troubleshoot, debug, and optimize ETL workflows, data pipelines, and integration solutions effectively.
• Communication & Documentation: Excellent written and verbal communication skills to document ETL processes, create technical design documents, and present data integration strategies to stakeholders.
Together, as owners, let's turn meaningful insights into action.
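The extract-transform-load cycle this role centers on can be illustrated in miniature. The sketch below uses Python's csv and sqlite3 modules with a hypothetical `orders` table; a real Informatica or PowerCenter mapping would express the same steps declaratively:

```python
import csv
import io
import sqlite3

def run_etl(csv_text, conn):
    """Minimal ETL: extract rows from CSV, clean/normalise them, load into a table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, region TEXT)")
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        # Transform: reject unparseable amounts, trim and upper-case the region code.
        try:
            amount = float(rec["amount"])
        except ValueError:
            continue
        rows.append((int(rec["id"]), amount, rec["region"].strip().upper()))
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)  # number of rows successfully loaded
```

The rejected-row count (input rows minus the return value) is what the data-quality checks mentioned above would report on.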
Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.

Posted 2 months ago

Apply

6 - 11 years

50 - 55 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Must have BODS technical and data migration expertise. The role involves ERP, SAP, and Syniti tools, client-facing delivery, and mentoring juniors. Process: online test, technical interview, training. Key requirements:
• BODS technical expertise (SAP BODS - BusinessObjects Data Services)
• Data migration experience
• ERP experience (preferably SAP)
• Syniti tools and methodology (as used in solution delivery)
• Client-facing consulting experience
• Strong communication skills (written and verbal English)
• Analytical and data quality skills
• Experience in a full ERP implementation lifecycle
• Ability to mentor junior team members
Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 2 months ago

Apply

5 - 10 years

5 - 15 Lacs

Mumbai Suburban, Navi Mumbai, Mumbai (All Areas)

Work from Office

Solid SQL skills, experience writing automation scripts in Python or shell, and hands-on experience with an ETL tool such as SSIS or Informatica. Role & responsibilities: • Develop, design, and maintain interactive and visually compelling reports, dashboards, and analytics solutions using tools such as Power BI and AWS QuickSight. • Collaborate with business stakeholders, data engineers, and analysts to gather requirements, understand data sources, and ensure alignment of reporting solutions with business needs. • Write simple to complex SQL queries to extract, transform, and manipulate data from various sources, ensuring accuracy, efficiency, and performance. • Identify and troubleshoot data quality and integration issues, proposing effective solutions to enhance data accuracy and reliability within reporting solutions. • Stay up to date with industry trends and emerging technologies related to reporting, analytics, and data visualization to continually improve and innovate reporting practices. • Work closely with the development team to integrate reporting solutions into existing applications or systems. • Perform data analysis to identify trends, patterns, and insights, providing valuable information to business stakeholders. • Collaborate with the team to document data models, report specifications, and technical processes. • Participate actively in team meetings, discussions, and knowledge-sharing sessions, contributing to the growth of the team's capabilities. Preferred candidate profile: Strong proficiency in SQL with the ability to write complex queries and optimize query performance. Extensive experience with data visualization tools such as Tableau and Power BI; familiarity with AWS QuickSight is a plus. Solid understanding of data warehousing concepts, data modeling, and ETL processes. Exceptional problem-solving skills and the ability to find innovative solutions to technical challenges.
• English B2 level or higher
• Basic to intermediate Python and machine learning knowledge is a plus
• Knowledge of AI is a plus
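The troubleshooting of data-quality issues this listing asks for often starts with a duplicate-key check. A minimal sketch, assuming a hypothetical `sales` table, using Python's sqlite3 to run the standard GROUP BY / HAVING pattern:

```python
import sqlite3

def detect_duplicates(conn, table, key_cols):
    """Return business keys that occur more than once, with their counts."""
    keys = ", ".join(key_cols)
    sql = (f"SELECT {keys}, COUNT(*) AS n FROM {table} "
           f"GROUP BY {keys} HAVING COUNT(*) > 1")
    return conn.execute(sql).fetchall()
```

An empty result means the keys are unique; anything returned is a candidate data-quality defect to chase back through the pipeline.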

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
