AWS Snowflake + DBT - Senior Manager

12 - 16 years

0 Lacs


Work Mode

On-site

Job Type

Full Time

Job Description

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

At PwC - AC, as an AWS Architect / Manager, the candidate will interact with the Offshore Manager / Onsite Business Analyst to understand the requirements, and will be responsible for the end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake or Data Hub in AWS. The candidate should have strong experience in AWS cloud technology, strong planning and organization skills, and the ability to work as a cloud architect / lead on an agile team delivering automated cloud solutions. The candidate will routinely monitor systems to ensure that all business goals are met as per the business requirements.

Years of Experience:

Candidates with 12-16 years of hands-on experience

Position Requirements

Must Have:
  • Experience in architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions
  • Strong expertise in the end-to-end implementation of Cloud data engineering solutions like Enterprise Data Lake, Data hub in AWS
  • Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modeling techniques using Python / Java
  • Design scalable data architectures with Snowflake, integrating cloud technologies (AWS, Azure, GCP) and ETL/ELT tools such as DBT
  • Guide teams in proper data modeling (star, snowflake schemas), transformation, security, and performance optimization
  • Experience in loading from disparate data sets and translating complex functional and technical requirements into detailed designs
  • Should be familiar with deploying Snowflake features such as data sharing, events, and lakehouse patterns
  • Should have experience with data security and the design of data access controls
  • Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modeling)
  • Good to have experience with AWS, Azure, or GCP data storage and management technologies such as S3, Blob Storage / ADLS, and Google Cloud Storage
  • Proficient in Lambda and Kappa Architectures
  • Strong AWS hands-on expertise with a programming background preferably Python/Scala
  • Good knowledge of Big Data frameworks and related technologies - Experience in Hadoop and Spark is mandatory
  • Strong experience in AWS compute services like EMR, Glue, and SageMaker, and storage services like S3, Redshift, and DynamoDB
  • Good experience with any one of the AWS streaming services such as Kinesis, SQS, or MSK
  • Troubleshooting and performance tuning experience in the Spark framework - Spark Core, Spark SQL, and Spark Streaming
  • Experience in one of the workflow tools like Airflow, NiFi, or Luigi
  • Good knowledge of application DevOps tools (Git, CI/CD frameworks) - experience in Jenkins or GitLab, with rich experience in source code management tools such as AWS CodePipeline, CodeBuild, and CodeCommit
  • Experience with AWS CloudWatch, AWS CloudTrail, AWS Config, and AWS Config Rules
  • Strong understanding of Cloud data migration processes, methods and project lifecycle
  • Business / domain knowledge in one of the following domains - Financial Services / Healthcare / Consumer Markets / Industrial Products / Telecommunications, Media and Technology / Deal Advisory - along with technical expertise
  • Experience in leading technical teams, guiding and mentoring team members
  • Good analytical & problem-solving skills
  • Good communication and presentation skills
  • Good understanding of Data Modeling and Data Architecture
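The list above repeatedly calls for dimensional modeling with star and snowflake schemas. As a minimal sketch of what that means in practice (purely illustrative; the table and field names here are hypothetical, not from the posting), the following plain-Python example normalizes flat order records into a product dimension and a fact table keyed by surrogate IDs:

```python
# Hypothetical star-schema illustration: split flat order rows into
# a dimension table (one row per distinct product) and a fact table
# that references it by surrogate key.

flat_orders = [
    {"order_id": 1, "product": "widget", "category": "tools", "amount": 9.5},
    {"order_id": 2, "product": "gadget", "category": "toys",  "amount": 4.0},
    {"order_id": 3, "product": "widget", "category": "tools", "amount": 9.5},
]

def to_star_schema(rows):
    """Return (dim_product, fact_sales) built from flat rows."""
    dim_lookup = {}   # (product, category) -> surrogate key
    fact_sales = []
    for row in rows:
        key = (row["product"], row["category"])
        # Assign a new surrogate key the first time a product appears
        product_id = dim_lookup.setdefault(key, len(dim_lookup) + 1)
        fact_sales.append({"order_id": row["order_id"],
                           "product_id": product_id,
                           "amount": row["amount"]})
    # Materialize the lookup dict as a dimension table
    dim_product = [{"product_id": pid, "product": p, "category": c}
                   for (p, c), pid in dim_lookup.items()]
    return dim_product, fact_sales

dim, fact = to_star_schema(flat_orders)
# Repeated products collapse into one dimension row; facts reference it by key.
```

In a real engagement the same separation would be expressed as DBT models materialized as Snowflake tables, with the surrogate keys generated in SQL rather than Python.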

Desired Knowledge / Skills

  • Experience in building stream-processing systems, using solutions such as Storm or Spark Streaming
  • Experience in Big Data ML toolkits, such as Mahout, SparkML, or H2O
  • Knowledge in Python
  • Certification in AWS Architecture desirable
  • Worked in Offshore / Onsite Engagements
  • Experience in AWS services like Step Functions and Lambda
  • Good Project Management skills with consulting experience in Complex Program Delivery
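The stream-processing experience asked for above (Storm, Spark Streaming, Kinesis) centers on windowed computation over event streams. As a toy sketch of the core idea only (plain Python, not any of those engines; the function name and event shape are hypothetical), here is a tumbling-window count over timestamped events:

```python
# Toy tumbling-window aggregation: count events in fixed, non-overlapping
# windows of `window_seconds`, keyed by each window's start time.
# This illustrates the windowing concept behind Spark Streaming et al.;
# it is not an API from any of those frameworks.

from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """events: iterable of (timestamp_seconds, payload) pairs."""
    counts = defaultdict(int)
    for ts, _payload in events:
        # Bucket each event into the window containing its timestamp
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

stream = [(0, "a"), (3, "b"), (5, "c"), (9, "d"), (12, "e")]
counts = tumbling_window_counts(stream, window_seconds=5)
# Windows: [0,5) holds 2 events, [5,10) holds 2, [10,15) holds 1.
```

Real streaming engines add the hard parts this sketch omits: out-of-order events, watermarks, and fault-tolerant state.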

Professional And Educational Background

  • BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA
