Posted: 22 hours ago
Work from Office | Full Time
We are seeking an experienced ETL Developer with strong Pentaho Data Integration (PDI) expertise to support new solution implementations and modernization of existing data workflows. The role involves re-engineering current processes built using Shell scripts, Java, or other legacy automation tools into scalable Pentaho jobs. The candidate will also contribute to cloud migration efforts to Azure, ensuring enterprise-grade performance, security, and maintainability.
Key Responsibilities:
Design, develop, and maintain ETL workflows in Pentaho (PDI) based on existing processes in Shell scripts, Java, or other automation tools.
Implement new, efficient, and scalable data integration pipelines in Pentaho to meet evolving business requirements.
Analyze and reverse-engineer current data workflows to build equivalent solutions in Pentaho.
Support migration of existing on-prem or custom data solutions to Azure Cloud, integrating with services such as Azure Blob Storage, Azure Data Factory (ADF), Azure SQL, and Key Vault.
Work with various source and target systems such as Oracle, PostgreSQL, SQL Server, CSV, JSON, XML, and APIs.
Develop parameterized, modular, and reusable Pentaho transformations and jobs.
Perform data validation, reconciliation, error handling, and logging within the ETL framework.
Optimize Pentaho jobs for performance and monitor scheduled job execution.
Ensure data quality and governance, aligning with enterprise and compliance standards (e.g., GDPR, HIPAA).
Collaborate with business analysts, architects, and data engineers to deliver solutions aligned with functional needs.
Document ETL design, data flow, and operations for ongoing support and enhancements.
Participate in Agile ceremonies, provide estimates, and track tasks using tools like JIRA.
Technical Skills:
Experience in rewriting/refactoring legacy scripts into ETL jobs using visual tools like Pentaho.
Strong background in data processing workflows implemented in Shell scripts, Java, or similar tools.
Hands-on experience with Azure Cloud Services relevant to data migration:
- Azure Data Factory
- Azure Blob Storage
- Azure SQL / Synapse
- Azure Key Vault / Managed Identity
Proficient in SQL, stored procedures, and performance tuning.
Experience with data validation, audit logging, and data quality frameworks.
Knowledge of file-based, API-based, and database-based integration techniques.
Version control using Git/GitLab, and awareness of CI/CD practices for ETL deployments.
Familiarity with Agile/Scrum/SAFe methodologies, and use of JIRA/Confluence.
Familiarity with Apache Hop and Power BI is a plus.
Experience in data archival, purging, and retention policy implementation.
PureSoftware Pvt Ltd