
2 PentahoDI Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be responsible for developing, deploying, monitoring, and maintaining ETL jobs as well as all data engineering and pipeline activities. Your role will involve a good understanding of DB activities and providing support in DB solutions, along with proven expertise in SQL queries.

Your key responsibilities will include designing and constructing various enterprise procedure constructs using any ETL tool, preferably PentahoDI. You will be expected to provide accurate work estimates, manage efforts across multiple lines of work, design and develop exception handling and data cleansing/standardization procedures, gather requirements from various stakeholders related to ETL automation, and design and create data extraction, transformation, and load functions. Moreover, you will be involved in data modeling of complex, large data sets, conducting tests, validating data flows, preparing ETL processes according to business requirements, and incorporating all business requirements into design specifications.

As for qualifications and experience, you should hold a B.E./B.Tech/MCA degree with at least 10 years of experience in designing and developing large-scale enterprise ETL solutions. Prior experience in any ETL tool, primarily PentahoDI, and a good understanding of databases along with expertise in writing SQL queries are essential.

In terms of skills and knowledge, you should have experience in full-lifecycle software development and production support for DWH systems, as well as data analysis, modeling, and design specific to a DWH/BI environment. Exposure to developing ETL packages and jobs using Spoon, scheduling Pentaho ETL jobs in crontab, and familiarity with Hadoop, Hive, Pig, SQL scripting, data loading tools like Flume and Sqoop, workflow schedulers like Oozie, and migrating existing dataflows into Big Data platforms are required. Experience in any open-source BI tool or database will be considered advantageous.
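The crontab scheduling mentioned above is typically done with Kitchen, Pentaho Data Integration's command-line job runner. A minimal sketch of such an entry follows; the install path, job file, and log path are illustrative assumptions, not taken from the posting:

```shell
# Illustrative crontab entry: run a Pentaho DI job nightly at 02:00
# via kitchen.sh (PDI's command-line job runner).
# /opt/pentaho, daily_load.kjb, and the log path are assumed examples.
0 2 * * * /opt/pentaho/data-integration/kitchen.sh \
    -file=/opt/etl/jobs/daily_load.kjb \
    -level=Basic >> /var/log/etl/daily_load.log 2>&1
```

The `-level` flag controls Kitchen's logging verbosity; redirecting stdout and stderr to a log file keeps failed nightly runs diagnosable.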
Joining us will provide you with impactful work where you will play a pivotal role in safeguarding Tanla's assets, data, and reputation in the industry. You will have tremendous growth opportunities as part of a rapidly growing company in the telecom and CPaaS space, with chances for professional development. Moreover, you will have the opportunity to work in an innovative environment alongside a world-class team, where innovation is celebrated. Tanla is an equal opportunity employer that champions diversity and is committed to creating an inclusive environment for all employees.

Posted 1 week ago


10.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Overview: Responsible for developing, deploying, monitoring, and maintaining ETL jobs and all data engineering and pipeline activities. Good knowledge of DB activities and support in DB solutions. Proven expertise in SQL queries.

Responsibilities and Duties
- Design and construction of various enterprise procedure constructs using any ETL tool, preferably PentahoDI
- Provide accurate work estimates and manage efforts across multiple lines of work
- Design and develop exception handling and data cleansing/standardization procedures
- Requirement gathering from various stakeholders related to ETL automation
- Designing and creating the data extraction, transformation, and load functions
- Data modelling of complex, large data sets
- Perform tests, validate all data flows, and prepare all ETL processes according to business requirements, incorporating all business requirements into all design specifications
- Knowledge and experience in the latest data engineering and pipeline solutions via PySpark/Apache Iceberg/Apache Airflow
- Good knowledge of DB activities and providing solutions in DB maintenance activities such as installation/backup/purging/data retention

Qualification and Experience
- B.E./B.Tech/MCA
- 10 years of experience in design and development of large-scale enterprise ETL solutions
- Experience in any ETL tool, primarily PentahoDI
- Good knowledge of and experience in any database and in writing SQL queries

Knowledge and Skills
- Experience in full-lifecycle software development and production support for DWH systems
- Experience in data analysis, modelling (logical and physical data models), and design specific to a DWH/BI environment (normalized and multi-dimensional modelling)
- Exposure to development of ETL packages and jobs using Spoon
- Exposure to scheduling Pentaho ETL jobs in crontab (i.e. Kitchen)
- Exposure to Hadoop, Hive, and Pig
- Experience in SQL scripting for relational databases such as MySQL, PostgreSQL, etc.
- Experience in data loading tools like Flume and Sqoop
- Knowledge of workflow/schedulers like Oozie
- Knowledge of migrating existing dataflows into Big Data platforms
- Experience in any open-source BI tool will be an added advantage
- Experience in any database will also be an added advantage

Why join us
- Impactful Work: Play a pivotal role in safeguarding Tanla's assets, data, and reputation in the industry.
- Tremendous Growth Opportunities: Be part of a rapidly growing company in the telecom and CPaaS space, with opportunities for professional development.
- Innovative Environment: Work alongside a world-class team in a challenging and fun environment, where innovation is celebrated.

Tanla is an equal opportunity employer. We champion diversity and are committed to creating an inclusive environment for all employees. www.tanla.com
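The exception-handling and data cleansing/standardization procedures described in the duties above can be sketched in plain Python. The field names and cleansing rules here are illustrative assumptions, not requirements from the posting; a real PentahoDI job would express the same pattern with validation steps and an error hop:

```python
def standardize_record(record):
    """Cleanse and standardize one raw record before loading.

    Illustrative rules only: trim whitespace, normalize case,
    and reject rows missing a required key (customer_id is an
    assumed example field, not from the posting).
    """
    if not record.get("customer_id"):
        # Exception handling: signal a bad row so the caller can
        # route it to an error stream instead of failing the load.
        raise ValueError("missing customer_id")
    return {
        "customer_id": record["customer_id"].strip(),
        "city": (record.get("city") or "").strip().title(),
        "email": (record.get("email") or "").strip().lower(),
    }


def cleanse(rows):
    """Split raw rows into (clean, rejected) lists without raising,
    so one malformed record never aborts the whole batch."""
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append(standardize_record(row))
        except ValueError as exc:
            rejected.append({"row": row, "error": str(exc)})
    return clean, rejected
```

Keeping rejected rows alongside their error messages mirrors the usual ETL practice of writing reject files for later review rather than silently dropping data.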

Posted 2 weeks ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
