
1015 ETL Process Jobs - Page 7

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

2 - 5 Lacs

Bulandshahr

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl (see the Python sketch after this listing)
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days
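Where the skills above call for Python scripting against SQL warehouses, here is a minimal, hedged sketch of what such an ETL step looks like; Python's built-in sqlite3 stands in for a columnar warehouse such as Snowflake or Redshift, and the tables and columns are invented for illustration:

```python
# Minimal extract-transform-load sketch. sqlite3 (standard library) stands in
# for a columnar warehouse; raw_orders/region_sales are hypothetical tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO raw_orders VALUES
        (1, 'north', 120.0), (2, 'north', 80.0), (3, 'south', NULL);
""")

# Transform: drop rows with missing amounts, then aggregate per region.
rows = conn.execute("""
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS order_count
    FROM raw_orders
    WHERE amount IS NOT NULL
    GROUP BY region
""").fetchall()

# Load the aggregates into the reporting table.
conn.execute("CREATE TABLE region_sales (region TEXT, total_amount REAL, order_count INTEGER)")
conn.executemany("INSERT INTO region_sales VALUES (?, ?, ?)", rows)
conn.commit()

print(conn.execute("SELECT * FROM region_sales ORDER BY region").fetchall())
# -> [('north', 200.0, 2)]
```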

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Muzaffarpur

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Gautam Buddha Nagar

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Sonipat

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Meerut

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Hapur

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Gurugram

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Faridabad

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Ghaziabad

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Greater Noida

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Noida

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, or BigQuery)
- Experience in scripting languages like Python, R, Spark, or Perl
- Very good knowledge of ETL concepts
- Experience with ETL tools like SSIS, Talend, or Pentaho is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

8.0 - 10.0 years

10 - 15 Lacs

Noida

Work from Office

Key Responsibilities:
- Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions.
- SQL-Driven Validation: Use advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources (see the sketch after this listing).
- ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems.
- Data Quality Assurance: Implement and monitor data quality checks; identify data discrepancies, anomalies, and inconsistencies; and work with development and business teams to resolve issues.
- Performance Testing (Data Focus): Contribute to performance testing of data pipelines and database operations, ensuring optimal query and data-load performance.
- Test Data Management: Create and manage robust test data sets for the various testing phases, including positive, negative, and edge-case scenarios.
- Defect Management: Identify, document, track, and re-test data defects, collaborating closely with development and data engineering teams for timely resolution.
- Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports; provide regular status updates to stakeholders.
- Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows and business requirements and to ensure data quality standards are met.
- Process Improvement: Proactively identify opportunities to improve data testing methodologies and tools.
- Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams.

Required Skills & Experience:
- Experience: 8-10 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
- SQL Expertise:
  - Advanced proficiency in SQL: ability to write complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis.
  - Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery).
  - Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types.
- Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and the various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
- Domain Expertise:
  - Strong understanding of and proven experience in the Risk and Finance IT domain: familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting).
  - Knowledge of financial products, regulations, and risk management concepts.
- Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
- Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.

Mandatory Competencies:
- Database - PostgreSQL - PostgreSQL
- Database - Sql Server - SQL Packages
- ETL - ETL - Tester
- QA/QE - QA Automation - ETL Testing
- Beh - Communication
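To make the SQL-driven validation concrete, here is a minimal reconciliation sketch; Python's built-in sqlite3 stands in for SQL Server or PostgreSQL, and the trade tables, columns, and values are hypothetical:

```python
# Source-to-target reconciliation sketch: completeness via EXCEPT, accuracy
# via aggregate comparison. src_trades/tgt_trades are hypothetical tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_trades (trade_id INTEGER, notional REAL);
    CREATE TABLE tgt_trades (trade_id INTEGER, notional REAL);
    INSERT INTO src_trades VALUES (1, 1000.0), (2, 2500.0), (3, 400.0);
    INSERT INTO tgt_trades VALUES (1, 1000.0), (2, 2400.0);
""")

# Completeness check: rows present in the source but missing or altered in the target.
mismatches = conn.execute("""
    SELECT trade_id, notional FROM src_trades
    EXCEPT
    SELECT trade_id, notional FROM tgt_trades
""").fetchall()

# Accuracy check: reconcile row counts and total notional between the two sides.
summary = conn.execute("""
    SELECT (SELECT COUNT(*) FROM src_trades) AS src_rows,
           (SELECT COUNT(*) FROM tgt_trades) AS tgt_rows,
           (SELECT SUM(notional) FROM src_trades) -
           (SELECT SUM(notional) FROM tgt_trades) AS notional_diff
""").fetchone()

print("missing/altered rows:", mismatches)           # -> [(2, 2500.0), (3, 400.0)]
print("src_rows, tgt_rows, notional_diff:", summary)  # -> (3, 2, 500.0)
```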

Posted 1 week ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Noida

Work from Office

Key Responsibilities:
- Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions.
- SQL-Driven Validation: Use advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources.
- ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems.
- Data Quality Assurance: Implement and monitor data quality checks; identify data discrepancies, anomalies, and inconsistencies; and work with development and business teams to resolve issues.
- Performance Testing (Data Focus): Contribute to performance testing of data pipelines and database operations, ensuring optimal query and data-load performance.
- Test Data Management: Create and manage robust test data sets for the various testing phases, including positive, negative, and edge-case scenarios.
- Defect Management: Identify, document, track, and re-test data defects, collaborating closely with development and data engineering teams for timely resolution.
- Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports; provide regular status updates to stakeholders.
- Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows and business requirements and to ensure data quality standards are met.
- Process Improvement: Proactively identify opportunities to improve data testing methodologies and tools.
- Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams.

Required Skills & Experience:
- Experience: 5-8 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
- SQL Expertise:
  - Advanced proficiency in SQL: ability to write complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis.
  - Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery).
  - Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types.
- Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and the various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
- Domain Expertise:
  - Strong understanding of and proven experience in the Risk and Finance IT domain: familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting).
  - Knowledge of financial products, regulations, and risk management concepts.
- Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
- Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.

Mandatory Competencies:
- QA/QE - QA Automation - ETL Testing
- ETL - ETL - Tester
- Beh - Communication and collaboration
- Database - Sql Server - SQL Packages
- Database - PostgreSQL - PostgreSQL

Posted 1 week ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Noida

Work from Office

Key Responsibilities:
- Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions.
- SQL-Driven Validation: Use advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources.
- ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems.
- Data Quality Assurance: Implement and monitor data quality checks; identify data discrepancies, anomalies, and inconsistencies; and work with development and business teams to resolve issues.
- Performance Testing (Data Focus): Contribute to performance testing of data pipelines and database operations, ensuring optimal query and data-load performance.
- Test Data Management: Create and manage robust test data sets for the various testing phases, including positive, negative, and edge-case scenarios.
- Defect Management: Identify, document, track, and re-test data defects, collaborating closely with development and data engineering teams for timely resolution.
- Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports; provide regular status updates to stakeholders.
- Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows and business requirements and to ensure data quality standards are met.
- Process Improvement: Proactively identify opportunities to improve data testing methodologies and tools.
- Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams.

Required Skills & Experience:
- Experience: 5-8 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
- SQL Expertise:
  - Advanced proficiency in SQL: ability to write complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis.
  - Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery).
  - Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types.
- Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and the various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
- Domain Expertise:
  - Strong understanding of and proven experience in the Risk and Finance IT domain: familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting).
  - Knowledge of financial products, regulations, and risk management concepts.
- Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
- Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.

Mandatory Competencies:
- QA/QE - QA Automation - ETL Testing
- ETL - ETL - Tester
- Beh - Communication and collaboration
- Database - Sql Server - SQL Packages
- Database - PostgreSQL - PostgreSQL

Posted 1 week ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Noida

Work from Office

Key Responsibilities:
- Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions.
- SQL-Driven Validation: Use advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources.
- ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems.
- Data Quality Assurance: Implement and monitor data quality checks; identify data discrepancies, anomalies, and inconsistencies; and work with development and business teams to resolve issues.
- Performance Testing (Data Focus): Contribute to performance testing of data pipelines and database operations, ensuring optimal query and data-load performance.
- Test Data Management: Create and manage robust test data sets for the various testing phases, including positive, negative, and edge-case scenarios.
- Defect Management: Identify, document, track, and re-test data defects, collaborating closely with development and data engineering teams for timely resolution.
- Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports; provide regular status updates to stakeholders.
- Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows and business requirements and to ensure data quality standards are met.
- Process Improvement: Proactively identify opportunities to improve data testing methodologies and tools.
- Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams.

Required Skills & Experience:
- Experience: 5-8 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
- SQL Expertise:
  - Advanced proficiency in SQL: ability to write complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis.
  - Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery).
  - Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types.
- Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and the various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
- Domain Expertise:
  - Strong understanding of and proven experience in the Risk and Finance IT domain: familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting).
  - Knowledge of financial products, regulations, and risk management concepts.
- Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
- Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.

Mandatory Competencies:
- QA/QE - QA Automation - ETL Testing
- ETL - ETL - Tester
- Beh - Communication and collaboration
- Database - Sql Server - SQL Packages
- Database - PostgreSQL - PostgreSQL

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

Role Description:
As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and for troubleshooting any issues that arise.

Roles & Responsibilities:
- Designing, developing, and maintaining large-scale big data platforms using technologies like Hadoop, Spark, and Kafka (see the sketch after this listing)
- Creating and managing data warehouses, data lakes, and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data

Technical Skills / Requirements:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
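As a rough illustration of the Spark-based pipeline work this role describes, here is a minimal PySpark sketch; it assumes a local PySpark installation (`pip install pyspark`), and the data, column names, and output path are hypothetical:

```python
# Minimal PySpark ETL sketch: filter raw events and aggregate per user.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# In a real platform this would come from HDFS/S3/Kafka; inline rows for illustration.
events = spark.createDataFrame(
    [("u1", "click", 3), ("u1", "view", 7), ("u2", "click", 5)],
    ["user_id", "event_type", "count"],
)

clicks = (
    events.filter(F.col("event_type") == "click")
          .groupBy("user_id")
          .agg(F.sum("count").alias("total_clicks"))
)

clicks.show()
# A real pipeline would persist to a lake or warehouse, e.g.:
# clicks.write.mode("overwrite").parquet("s3a://bucket/marts/user_clicks/")
```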

Posted 1 week ago

Apply

4.0 - 6.0 years

7 - 12 Lacs

Gurugram

Work from Office

Role Description:
As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and for troubleshooting any issues that arise.

Roles & Responsibilities:
- Designing, developing, and maintaining large-scale big data platforms using technologies like Hadoop, Spark, and Kafka
- Creating and managing data warehouses, data lakes, and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data

Technical Skills / Requirements:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

4.0 - 7.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Project description:
Luxoft, a DXC Technology company, is an established firm focusing on consulting and implementation of complex projects in the financial industry. At the interface between technology and business, we convince with our know-how, well-founded methodology, and pleasure in success. As a reliable partner to our renowned customers, we support them in planning, designing, and implementing the desired innovations. Together with the customer, we deliver top performance! For one of our clients in the insurance segment, we are searching for a Tableau and SQL Developer.

Responsibilities:
- Work on CISO multi-functional Tableau reports that interface with several applications
- Report to a senior developer who has been working on the project for a few years
- Write complex SQL queries involving multiple joins (inner, outer, cross), subqueries, and CTEs (see the sketch after this listing)
- Apply strong Tableau Desktop development skills

Skills (must have):
- Proficiency in writing complex SQL queries involving multiple joins (inner, outer, cross), subqueries, and CTEs
- Expertise in developing SQL stored procedures, functions, and views
- Experience in developing ETL processes that extract and transform data using SQL (aggregations, filtering, data cleansing) and load it into a database
- Familiarity with Microsoft SQL Server
- Familiarity with data modeling concepts (star schema)
- Tableau Desktop development skills:
  - Expertise in developing complex, interactive Tableau dashboards incorporating filters, parameters, and actions
  - Experience in building user-friendly dashboards with menus, tooltips, and drill-down and drill-through capabilities
  - Ability to create calculated fields, custom aggregations, table calculations, and LOD expressions
  - Knowledge of optimizing Tableau dashboards for performance
  - Understanding of user access and user-group creation and management in Tableau

Nice to have:
- Insurance domain experience
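For a flavour of the "complex SQL" this role calls for, here is a small sketch combining a CTE with a window function; Python's built-in sqlite3 stands in for Microsoft SQL Server, and the claims table is hypothetical:

```python
# CTE + window function sketch: rank claims within each policy by amount
# and keep the largest one. The claims table is invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (claim_id INTEGER, policy_id INTEGER, amount REAL);
    INSERT INTO claims VALUES (1, 10, 500.0), (2, 10, 900.0), (3, 20, 300.0);
""")

rows = conn.execute("""
    WITH ranked AS (
        SELECT policy_id, claim_id, amount,
               ROW_NUMBER() OVER (PARTITION BY policy_id ORDER BY amount DESC) AS rn
        FROM claims
    )
    SELECT policy_id, claim_id, amount FROM ranked WHERE rn = 1
""").fetchall()

print(rows)  # -> [(10, 2, 900.0), (20, 3, 300.0)]
```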

Posted 1 week ago

Apply

4.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Role Description:
As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and for troubleshooting any issues that arise.

Roles & Responsibilities:
- Designing, developing, and maintaining large-scale big data platforms using technologies like Hadoop, Spark, and Kafka
- Creating and managing data warehouses, data lakes, and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data

Technical Skills:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

Role Description:
As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and for troubleshooting any issues that arise.

Roles & Responsibilities:
- Designing, developing, and maintaining large-scale big data platforms using technologies like Hadoop, Spark, and Kafka
- Creating and managing data warehouses, data lakes, and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data

Technical Skills:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

4.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Role Description:
As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources into target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient, reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Technical Skills:
- Proficiency in programming languages such as Python for writing ETL scripts (see the sketch after this listing)
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
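Here is a small sketch of the filtering, aggregation, and joining transforms named above; it uses pandas as one plausible choice (the listing names Python but no specific library), with invented data and column names:

```python
# Filter, aggregate, and join: the three transform steps in one tiny pipeline.
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "customer_id": [10, 10, 20, 30],
    "amount": [50.0, 75.0, 20.0, None],
})
customers = pd.DataFrame({"customer_id": [10, 20], "segment": ["retail", "corporate"]})

clean = orders.dropna(subset=["amount"])                               # filtering
totals = clean.groupby("customer_id", as_index=False)["amount"].sum()  # aggregation
enriched = totals.merge(customers, on="customer_id", how="inner")      # joining

print(enriched)
#    customer_id  amount    segment
# 0           10   125.0     retail
# 1           20    20.0  corporate
```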

Posted 1 week ago

Apply

8.0 - 10.0 years

10 - 15 Lacs

Noida

Work from Office

Key Responsibilities:
- Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions.
- SQL-Driven Validation: Use advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources.
- ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems.
- Data Quality Assurance: Implement and monitor data quality checks; identify data discrepancies, anomalies, and inconsistencies; and work with development and business teams to resolve issues.
- Performance Testing (Data Focus): Contribute to performance testing of data pipelines and database operations, ensuring optimal query and data-load performance.
- Test Data Management: Create and manage robust test data sets for the various testing phases, including positive, negative, and edge-case scenarios.
- Defect Management: Identify, document, track, and re-test data defects, collaborating closely with development and data engineering teams for timely resolution.
- Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports; provide regular status updates to stakeholders.
- Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows and business requirements and to ensure data quality standards are met.
- Process Improvement: Proactively identify opportunities to improve data testing methodologies and tools.
- Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams.

Required Skills & Experience:
- Experience: 8-10 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
- SQL Expertise:
  - Advanced proficiency in SQL: ability to write complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis.
  - Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery).
  - Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types.
- Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and the various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
- Domain Expertise:
  - Strong understanding of and proven experience in the Risk and Finance IT domain: familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting).
  - Knowledge of financial products, regulations, and risk management concepts.
- Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
- Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.

Mandatory Competencies:
- ETL - ETL - Tester
- QA/QE - QA Automation - ETL Testing
- Database - PostgreSQL - PostgreSQL
- Beh - Communication
- Database - Sql Server - SQL Packages

Posted 1 week ago

Apply

4.0 - 8.0 years

15 - 30 Lacs

Hyderabad

Remote

About the Role:
We are seeking a ServiceNow ITOM Consultant with a strong foundation in ServiceNow Discovery, ETL/transform maps, and CMDB/CSDM concepts. The ideal candidate will bring a combination of technical acumen and infrastructure knowledge to support ITOM implementations and CMDB enrichment efforts.

Key Responsibilities:
- Discovery & CMDB Implementation: Configure and manage ServiceNow Discovery jobs (agent-based and agentless); contribute to CMDB population, normalization, and reconciliation efforts; align data models with CSDM (Common Service Data Model) practices.
- ETL & Transform Development: Develop and manage transform maps and data imports into the CMDB; troubleshoot data quality issues and perform data clean-up as needed.
- Scripting & Configuration: Write basic scripts (JavaScript, Glide API) to enhance data processing and platform automation; support development of business rules, UI policies, and scheduled jobs related to ITOM.
- Infrastructure Awareness: Understand basic IT infrastructure concepts (e.g., servers, network devices, cloud components); communicate effectively with infrastructure teams during discovery and integrations.
- Collaboration & Contribution: Support senior consultants and architects in ITOM and CMDB strategy execution; contribute actively to documentation, testing, and knowledge sharing across the team.

Requirements:
- 4+ years of experience with ServiceNow, preferably on ITOM and CMDB-related projects
- Hands-on experience with ServiceNow Discovery, including credential and probe troubleshooting
- Good understanding of and practical experience with ETL processes and transform maps
- Basic proficiency in JavaScript and ServiceNow scripting
- Exposure to the CSDM framework and CMDB health practices
- Solid grasp of IT infrastructure fundamentals (e.g., OS, network layers, cloud components)
- Strong communication and collaboration skills

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Hyderabad

Work from Office

As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and for troubleshooting any issues that arise.

Roles & Responsibilities:
- Designing, developing, and maintaining large-scale big data platforms using technologies like Hadoop, Spark, and Kafka
- Creating and managing data warehouses, data lakes, and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data

Technical Skills / Requirements:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

7.0 - 9.0 years

6 - 10 Lacs

Chennai

Work from Office

As a Technical Lead - Cloud Data Platform (AWS) at Incedo, you will be responsible for designing, deploying, and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists, and business analysts to understand business requirements and design scalable, reliable, and cost-effective solutions that meet them.

Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Amazon Web Services (AWS) (see the sketch after this listing)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues

Technical Skills / Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, AWS Redshift, and AWS Lambda
- Experience in building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Openness to new ideas and willingness to learn and develop new skills
- Ability to work well under pressure and manage multiple tasks and priorities

Qualifications:
- 7-9 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
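As a hedged sketch of driving the AWS Glue side of such a platform from Python with boto3: it assumes AWS credentials are already configured, and the Glue job name "nightly_sales_etl" is hypothetical.

```python
# Start an AWS Glue ETL job and poll until it reaches a terminal state.
import time

import boto3

glue = boto3.client("glue")

run_id = glue.start_job_run(JobName="nightly_sales_etl")["JobRunId"]
while True:
    state = glue.get_job_run(JobName="nightly_sales_etl", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)  # Glue jobs typically run for minutes, so poll slowly.

print("Glue job finished with state:", state)
```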

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
