Posted: 1 month ago
Work from Office
Full Time
We are seeking a skilled Data Engineer with extensive experience in the Hadoop ecosystem and a strong background in integrating and managing data from FLEXCUBE core banking systems. The ideal candidate will play a key role in designing, implementing, and optimizing our data pipelines, ensuring seamless data flow and analysis.

Career Level: IC3

Key Responsibilities:
- Data Integration: Lead the integration of data from FLEXCUBE core banking systems into our Hadoop-based data infrastructure. Develop and maintain efficient data ingestion processes.
- Hadoop Ecosystem: Design, build, and optimize data pipelines within the Hadoop ecosystem, including HDFS, Sqoop, Unix shell scripting, Python, and Spark.
- Data Modelling: Create and maintain data models, schemas, and structures to support data analysis and reporting requirements.
- ETL Processes: Develop Extract, Transform, Load (ETL) processes to cleanse, enrich, and transform data for downstream consumption.
- Data Quality: Implement data quality checks and monitoring processes to ensure the accuracy, completeness, and consistency of data.
- Performance Optimization: Optimize data processing and query performance within the Hadoop ecosystem.
- Data Security: Ensure data security and compliance with data privacy regulations during data handling and processing.
- Documentation: Maintain thorough documentation of data pipelines, transformations, and data flow processes.
- Collaboration: Collaborate with cross-functional teams, including FLEXCUBE consultants, data scientists, analysts, and business stakeholders, to understand data requirements and deliver actionable insights.

Mastery of the FLEXCUBE 14.7 backend tables and data model is essential.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5 to 8 years of proven experience designing and implementing data solutions within the Hadoop ecosystem.
- Strong expertise in Hadoop components such as HDFS, Sqoop, Unix shell scripting, Python, and Spark.
- Experience with FLEXCUBE integration and data extraction.
- Proficiency in SQL and database systems.
- Knowledge of data modelling and ETL processes.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
- Banking or financial industry experience is a plus.
- Certifications in Hadoop or related technologies are beneficial.

Additional Information:
This role offers an exciting opportunity to work on cutting-edge data projects and contribute to data-driven decision-making in the financial sector. The candidate should be prepared to work in a dynamic and collaborative environment. Candidates with a strong background in the Hadoop ecosystem and experience with FLEXCUBE integration are encouraged to apply. We are committed to fostering professional growth and providing opportunities for skill development.
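As an illustration of the data-quality responsibilities described above, here is a minimal Python sketch of completeness, uniqueness, and validity checks on rows extracted from a core banking system. The column names (account_no, branch, balance) and sample data are hypothetical stand-ins for this example, not the actual FLEXCUBE 14.7 backend schema.

```python
from decimal import Decimal, InvalidOperation

# Hypothetical extract from a FLEXCUBE-style account table; real backend
# table and column names differ and come from the 14.7 data model.
SAMPLE_ROWS = [
    {"account_no": "ACC001", "branch": "001", "balance": "1500.00"},
    {"account_no": "ACC002", "branch": "001", "balance": "250.75"},
    {"account_no": None,     "branch": "002", "balance": "10.00"},  # missing key
    {"account_no": "ACC002", "branch": "003", "balance": "oops"},   # duplicate key, bad value
]

def run_quality_checks(rows):
    """Return counts of completeness, uniqueness, and validity violations."""
    missing_key = sum(1 for r in rows if not r.get("account_no"))
    seen, duplicates, invalid_balance = set(), 0, 0
    for r in rows:
        key = r.get("account_no")
        if key:
            if key in seen:          # uniqueness: account_no seen before
                duplicates += 1
            seen.add(key)
        try:                         # validity: balance must parse as a decimal
            Decimal(r.get("balance") or "")
        except InvalidOperation:
            invalid_balance += 1
    return {
        "rows": len(rows),
        "missing_account_no": missing_key,
        "duplicate_account_no": duplicates,
        "invalid_balance": invalid_balance,
    }
```

In a production pipeline, checks like these would typically run as a Spark job against the full ingested dataset, with the resulting counts fed into a monitoring process rather than printed once.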
Oracle