
Data Engineer

Experience: 4 - 8 years

Salary: 10 - 18 Lacs

Posted: 1 week ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

If interested, please fill in the application link: https://forms.office.com/r/Zc8wDfEGEH

Responsibilities:

• Deliver projects integrating data flows within and across technology systems.
• Lead data modeling sessions with end-user groups, project stakeholders, and technology teams to produce logical and physical data models.
• Design end-to-end job flows that span systems, including quality checks and controls.
• Create technology delivery plans to implement system changes.
• Perform data analysis, data profiling, and data sourcing in relational and Big Data environments.
• Convert functional requirements into logical and physical data models.
• Assist in ETL development and testing, and troubleshoot ETL issues.
• Troubleshoot data issues and work with data providers on resolution; provide L3 support when needed.
• Design and develop ETL workflows using modern coding and testing standards.
• Participate in agile ceremonies and actively drive towards team goals.
• Collaborate with a global team of technologists.
• Lead with ideas and innovation.
• Manage communication and partner with end users to design solutions.

Required Skills

Must have:

• Total experience of 4-10 years (minimum 5 years of relevant experience).
• 5 years of project experience with Python/shell scripting in data engineering, including building and optimizing data pipelines, architectures, and data sets with large data volumes.
• 3+ years of experience in PySpark scripting, including the Spark architecture framework.
• 3-5 years of strong database development experience (Snowflake/SQL Server/Oracle/Sybase/DB2): schema design, complex procedures and data scripts, SQL query authoring, and performance optimization.
• Strong understanding of the Unix environment and batch scripting languages (shell/Python).
• Strong knowledge of the Big Data/Hadoop platform.
• Strong engineering skills, with the ability to understand existing system designs and enhance or migrate them.
• Strong logical data modeling skills within the Financial Services domain.
• Experience in data integration and data conversions.
• Strong collaboration and communication skills.
• Strong organizational and planning skills.
• Strong analytical, profiling, and troubleshooting skills.

Good to have:

• Experience with ETL tools (e.g., Informatica, Azure Data Factory) and pipelines across disparate sources.
• Experience working with Databricks.
• Familiarity with standard Agile and DevOps methodology and tools (Jenkins, Sonar, Jira).
• Good understanding of developing ETL processes using Informatica or other ETL tools.
• Experience with source code management solutions (e.g., Git).
• Knowledge of the investment management business.
• Experience with job scheduling tools (e.g., Autosys).
• Experience with data visualization software (e.g., Tableau).
• Experience with data modeling tools (e.g., PowerDesigner).
• Basic familiarity with metadata stores used to maintain a repository of Critical Data Elements (e.g., Collibra).
• Familiarity with XML or other markup languages.

Mandatory skill sets: ETL, Python/shell scripting, building pipelines, PySpark, databases, SQL

Preferred skill sets: Informatica, Hadoop, Databricks, Collibra
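The responsibilities above mention designing job flows with built-in quality checks and controls. As a purely illustrative sketch of that idea in Python (the posting's mandatory scripting language), the function and rules below are hypothetical and not part of the role:

```python
# Hypothetical data-quality gate for one pipeline stage: rows that fail
# any rule are quarantined instead of being loaded downstream.

def quality_check(rows, required_fields=("id", "amount")):
    """Split rows into (clean, rejected) using simple, illustrative rules."""
    clean, rejected, seen_ids = [], [], set()
    for row in rows:
        # Rule 1: all required fields must be present and non-null.
        if any(row.get(f) is None for f in required_fields):
            rejected.append(row)
            continue
        # Rule 2: the primary key must be unique within the batch.
        if row["id"] in seen_ids:
            rejected.append(row)
            continue
        seen_ids.add(row["id"])
        clean.append(row)
    return clean, rejected

batch = [
    {"id": 1, "amount": 100.0},
    {"id": 1, "amount": 100.0},   # duplicate key -> rejected
    {"id": 2, "amount": None},    # null amount -> rejected
    {"id": 3, "amount": 42.5},
]
clean, rejected = quality_check(batch)
print(len(clean), len(rejected))  # 2 2
```

In a real pipeline of the kind described, the same pattern would typically be expressed with PySpark DataFrame filters so the checks run distributed over large volumes.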


PwC India

Business Consulting and Services

Kolkata, West Bengal

10,001 Employees

1625 Jobs

Key People

• Santhosh Rao, Partner
• Ruchi Bhattacharya, Partner, Head of Consulting
