6 - 10 years
3 - 8 Lacs
Posted: 1 week ago
Work from Office
Full Time
Roles & Responsibilities

Skills: Oracle Warehouse Builder (OWB), Oracle Workflow Builder, Oracle TBSS

Environment:
- Oracle Warehouse Builder 9i (Client Version 9.0.2.62.3 / Repository Version 9.0.2.0.0)
- Oracle Warehouse Builder 4
- Oracle Workflow Builder 2.6.2
- Oracle Database 10g (TNS for IBM/AIX RISC System/6000, Version 10.2.0.5.0 - Production)
- Oracle Enterprise Manager 10gR1 (monitoring jobs and tablespace utilization)

Requirements:
- More than 5 years' experience with Oracle Warehouse Builder (OWB) and Oracle Workflow Builder.
- Expert knowledge of Oracle PL/SQL, from individual code objects to entire data marts.
- Scheduling with Oracle TBSS (creating and running DBMS_SCHEDULER jobs) and trigger-based scheduling for file sources driven by control files.
- Extensive knowledge of fetching mainframe COBOL files (ASCII and EBCDIC formats) to the landing area, then formatting and loading them (with error handling) into Oracle tables using SQL*Loader and external tables.
- Extensive knowledge of Oracle Forms 6 integration with OWB 4.
- Extensive knowledge of the full Change/Incident/Problem management life cycle using ServiceNow.
- Work closely with business owner teams and functional/data analysts throughout the development/BAU process.
- Work closely with AIX support and DBA support teams on access privileges, storage issues, etc.
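The DBMS_SCHEDULER scheduling requirement above typically boils down to calls like the following. This is a minimal illustrative sketch only; the job name, schedule, and the LOAD_SALES_MART procedure are assumptions, not from the posting.

```sql
-- Illustrative PL/SQL: create and run a daily DBMS_SCHEDULER job.
-- Job name, repeat interval, and ETL_OWNER.LOAD_SALES_MART are hypothetical.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'DAILY_MART_LOAD',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'ETL_OWNER.LOAD_SALES_MART',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=2',
    enabled         => TRUE,
    comments        => 'Nightly data mart load');

  -- Run on demand as well, e.g. to rerun after a failed batch.
  DBMS_SCHEDULER.RUN_JOB('DAILY_MART_LOAD');
END;
/
```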
- Work closely with the Batch Operations and MFT teams on file transfer issues.

Migration of Oracle to the Hadoop ecosystem:
- Must have working experience with Hadoop ecosystem components such as HDFS, MapReduce, and YARN.
- Must have working knowledge of Scala and Spark DataFrames to convert existing code to Hadoop data lakes.
- Must have design and development experience with data pipeline solutions from different source systems (files, Oracle) to data lakes.
- Must have been involved in creating/designing Hive tables and loading and analyzing data with Hive queries.
- Must have knowledge of Hive partitions, dynamic partitions, and buckets.
- Must have knowledge of CA Workload Automation DE 12.2 for creating and scheduling jobs.
- Use Denodo for data virtualization to provide the required data access to end users.
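The Hive partitioning and bucketing requirement above can be sketched as follows. Table and column names (txn, txn_stage, txn_date) are hypothetical examples, not from the posting.

```sql
-- Illustrative HiveQL: a partitioned, bucketed table plus a dynamic-partition load.
-- All object names here are assumed for the example.
CREATE TABLE txn (
  txn_id BIGINT,
  amount DECIMAL(12,2)
)
PARTITIONED BY (txn_date STRING)
CLUSTERED BY (txn_id) INTO 8 BUCKETS
STORED AS ORC;

-- Enable dynamic partitioning so partition values come from the data itself.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

INSERT OVERWRITE TABLE txn PARTITION (txn_date)
SELECT txn_id, amount, txn_date
FROM txn_stage;
```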
GEETHA TECHNOLOGY SOLUTIONS PRIVATE LIMITED
Pune, Maharashtra, India