Data Engineer II

6 - 11 years

7.0 - 11.0 Lacs P.A.

Bengaluru

Posted: 2 months ago | Platform: Naukri

Skills Required

SAP, Data modeling, Business analytics, Agile, Unit testing, ABAP, Analytics, SQL, Python

Work Mode

Work from Office

Job Type

Full Time

Job Description

Standardize data and reporting across businesses and teams to enable consistency and quality of key business data.
Design and develop an enterprise data warehouse platform that conforms to consistent methodologies, standards, and industry best practices.
Design, develop, test, and document data integrations, and assist with deployment, validation, and hypercare.
Maintain and support the SAP Datasphere analytical model and Azure integrations such as ADF and Databricks.
Develop SQL views on standard tables to monitor scheduled remote tables and remote queries, and create task chains that send email alerts for any errors in data-loading activities.
Create SQL views to handle complex business logic, with strong hands-on SQL.
Develop multiple analytical models to make reporting available to SAC and Power BI.
Develop task chains to run dataflow updates in sequential order and to improve monitoring.
Create multiple dataflows to move data from MSSQL tables to persistent tables in Datasphere, incorporating the diverse data-cleansing logic applied (see the first sketch after this description).
Develop procedures in the DB Explorer of the Datasphere space to convert ABAP program logic into persistent tables, facilitating streamlined reporting processes.
Create and manage task chains for efficient data loading and monitoring using the DWF task chain monitor.
Perform meticulous unit testing of dataflows and thorough data reconciliations with source systems to guarantee data accuracy (see the reconciliation sketch after this description).
Set up connections between non-SAP sources and Datasphere.
Integrate TPM data using Azure as a source, with replication flows to move data into Datasphere.
Develop replication flows for delta-enabled CDS views and move the data into Datasphere.
Create models and dimensions in SAC.
Create functional and technical design documents for Datasphere requirements.
Transport Datasphere artifacts from one landscape to another using import/export folder creation.

Your Profile:
4-year Bachelor's degree or equivalent in IT, Computer Science, Science, Engineering, Statistics, Programming, Business Analytics, Mathematics, or a related field.
At least 6 years of enterprise BI and analytics technical experience implementing data warehouses and data marts in an Agile environment.
5 years of recent experience with data integration, SAP Datasphere, ETL data replication, and data warehouse automation tools such as Microsoft Azure Data Factory, Databricks, and BEx.
5 years of recent experience in data processing using SQL, PySpark, and Python.
Strong working knowledge of data modeling and data transformation in Datasphere.
Extensive experience with multiple remote tables and their semantic usage (relational dataset, fact, dimension, text, and hierarchy tables).
Hands-on knowledge of creating remote tables to consume data from S4 using CDS views and S4 tables.
Experience with dynamic filters on remote tables to optimize data retrieval from S4 CDS views and S4 tables, ensuring efficient and targeted data access.
Experience analyzing business requirements to identify and decide which data should be stored in Datasphere.
Experience creating complex data models using graphical views (relational dataset/fact/dimension) based on the functional design document, leveraging S4 and non-SAP data sources.
Experience working in a DevOps/CI-CD framework.
High accountability with a demonstrated ability to deliver.
Strong communication skills, including design documentation.
Strong collaboration skills, working with architecture, design, and development teams.
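To illustrate the MSSQL-to-persistent-table dataflow with cleansing logic mentioned above, here is a minimal PySpark sketch of that general pattern. The JDBC URL, credentials, and the dbo.sales_orders / analytics.sales_orders_clean table and column names are hypothetical placeholders; in practice the dataflows are built in Datasphere rather than hand-written Spark.

```python
# Illustrative only: a minimal sketch of an MSSQL-to-target dataflow with
# basic cleansing. Connection details, tables, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mssql_to_persistent_sketch").getOrCreate()

# Read a source table from MSSQL over JDBC (URL and credentials are placeholders).
source_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<host>:1433;databaseName=<db>")
    .option("dbtable", "dbo.sales_orders")   # hypothetical source table
    .option("user", "<user>")
    .option("password", "<password>")
    .load()
)

# Apply simple, generic cleansing rules before persisting.
cleansed_df = (
    source_df
    .withColumn("customer_id", F.trim(F.col("customer_id")))  # strip stray whitespace
    .filter(F.col("order_date").isNotNull())                   # drop rows missing a key date
    .dropDuplicates(["order_id"])                              # de-duplicate on business key
)

# Persist to the analytics layer; the actual target (Datasphere persistent
# table, Delta table, etc.) depends on the landscape.
cleansed_df.write.mode("overwrite").saveAsTable("analytics.sales_orders_clean")
```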
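The unit-testing item above mentions reconciling dataflow output against the source system. Below is a minimal sketch of such a count-and-control-total check, assuming hypothetical staging and analytics tables and a hypothetical net_amount measure column.

```python
# Illustrative only: a minimal source-vs-target reconciliation check.
# Table names and the measure column are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("reconciliation_sketch").getOrCreate()

source_df = spark.table("staging.sales_orders")           # hypothetical source extract
target_df = spark.table("analytics.sales_orders_clean")   # hypothetical persistent target

def summarize(df):
    """Row count plus a control total on a key measure for quick comparison."""
    row = df.agg(
        F.count("*").alias("row_count"),
        F.sum("net_amount").alias("net_amount_total"),     # hypothetical measure column
    ).first()
    return row["row_count"], row["net_amount_total"]

src_count, src_total = summarize(source_df)
tgt_count, tgt_total = summarize(target_df)

# Flag any mismatch so it can be surfaced by a unit test or an alert.
if src_count != tgt_count or src_total != tgt_total:
    raise AssertionError(
        f"Reconciliation failed: source ({src_count}, {src_total}) "
        f"vs target ({tgt_count}, {tgt_total})"
    )
print("Reconciliation passed: counts and control totals match.")
```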

Agriculture & Food Processing
Decatur
