At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Years of Experience:
Candidates with 8-12 years of hands-on experience
Position Requirements
Must Have:
- Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, ETL data pipelines, and big data modeling techniques using Python/Java
- Proven experience architecting and implementing large-scale data solutions on Snowflake, including data ingestion, transformation, and optimization
- Proficiency in Azure Databricks, including Spark architecture and optimization
- Experience migrating data from relational databases (RDBMS) to the Snowflake cloud data warehouse, and optimizing Snowflake features such as data sharing, events, and lakehouse patterns
- Hands-on expertise with Dremio for data lake query acceleration, data virtualization, and managing diverse data formats (e.g., JSON, XML, CSV)
- Experience handling large and complex sets of XML, JSON, and CSV data from various sources and databases
- Rich experience working with Azure ADLS, Databricks, Data Flows, HDInsight, and Azure Analysis Services
- Experience loading from disparate data sets and translating complex functional and technical requirements into detailed designs
- Knowledge of data security, access controls, and governance within cloud-native data platforms like Snowflake and Dremio.
- Exposure to AWS, Azure, or GCP cloud data storage and management technologies such as S3, Blob Storage/ADLS, and Google Cloud Storage
- Good understanding of data quality processes, methods, and the project lifecycle
- Experience validating ETL processes and writing SQL queries
- Strong knowledge of DWH/ODS and ETL concepts and data modeling principles
- Clear understanding of the DW lifecycle, with contributions to technical design documents and test plans
- Good analytical & problem-solving skills
- Good communication and presentation skills
- Experience in leading technical teams, guiding and mentoring team members
Good To Have Skills
- Knowledge of Snowflake Cortex, Snowflake AI features, and Sigma
- Familiarity with data visualization tools such as Tableau or Qlik.
- Experience with programming languages such as Python, Java, or PySpark.
- Certifications related to Snowflake, Databricks, or cloud platforms.
- Experience in pre-sales engagements and proposal development.
- Data Modeling and Data Architecture
- Cloud - AWS, GCP, Oracle Cloud
- Business Intelligence - MicroStrategy, Business Objects, Cognos
Additional Skills
- Excellent communication skills, client presence, and presentation skills
- People and leadership skills
- Ability to engage with business and technical teams to drive consensus
Professional And Educational Background
- BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA