Analytics Consultant - Power BI Developer

Experience: 2 - 7 years

Salary: 7 - 17 Lacs

Posted: 5 hours ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

About this role:

Wells Fargo is seeking an Analytics Consultant.

In this role, you will:

  • Consult with business line and enterprise functions on less complex research
  • Use functional knowledge to assist in developing non-model quantitative tools that support strategic decision making
  • Perform analysis of findings and trends using statistical analysis and document the process
  • Present recommendations to increase revenue, reduce expense, maximize operational efficiency, quality, and compliance
  • Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency
  • Participate in all group technology efforts including design and implementation of database structures, analytics software, storage, and processing
  • Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff
  • Understand compliance and risk management requirements for supported area
  • Ensure adherence to data management or data governance regulations and policies
  • Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals
  • Collaborate and consult with more experienced consultants and with partners in technology and other business groups

Required Qualifications:

  • 2+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:

  • Hands-on proficiency in Business Intelligence (BI) tools, particularly Microsoft Power BI, Microsoft Fabric, Power Automate, and the Power Platform.
  • Hands-on proficiency in one or more of the programming languages used for analytics and data science, such as SAS, Python, PySpark, Spark SQL, or Scala (see the illustrative sketch after this list).
  • Strong hands-on knowledge of SQL and experience with database management systems (e.g., Teradata, PostgreSQL, MySQL, or NoSQL databases).
  • Familiarity with data warehousing and big data technologies (e.g., Hadoop, Spark, Snowflake, Redshift).
  • Experience with ELT/ETL tools and data integration techniques.
  • Experience optimizing code for performance and cost.
  • Comfortable with using code and agile process management tools like GitHub and JIRA.
  • Exposure to developing solutions for high-volume, low-latency applications, with the ability to operate in a fast-paced, highly collaborative environment.
  • Willingness to provide production support for data assets and products as required.
  • Knowledge of data modelling and data warehousing best practices.
  • Understanding of data governance, data quality, and data security principles.
  • Strong problem-solving and communication skills.
  • Ability to work in a collaborative team environment.
  • Knowledge of cloud platforms (e.g., Azure and/or Google Cloud) is a plus.
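For illustration only (not part of the posting's requirements): a minimal sketch of the kind of PySpark and Spark SQL analytics code referenced above. The table and column names (transactions, region, amount, customer_id) are hypothetical placeholders.

    # Illustrative only: the same aggregation via the DataFrame API and Spark SQL.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("analytics-sketch").getOrCreate()

    # Hypothetical source: a Parquet extract of transaction records.
    df = spark.read.parquet("/data/transactions")

    # DataFrame API version.
    summary = (
        df.groupBy("region")
          .agg(F.sum("amount").alias("total_amount"),
               F.countDistinct("customer_id").alias("customers"))
          .orderBy(F.desc("total_amount"))
    )

    # Spark SQL version of the same query.
    df.createOrReplaceTempView("transactions")
    summary_sql = spark.sql("""
        SELECT region,
               SUM(amount)                 AS total_amount,
               COUNT(DISTINCT customer_id) AS customers
        FROM transactions
        GROUP BY region
        ORDER BY total_amount DESC
    """)

    summary.show()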

Job Expectations:

  • Design, develop, and maintain ETL (Extract, Transform, Load) processes and data pipelines to move and transform data from various sources into a centralized data repository (a minimal sketch follows this list).
  • Design, implement, and optimize data warehouses and data lakes to ensure scalability, performance, and data consistency.
  • Create and manage data models to support business requirements, ensuring data accuracy, integrity, and accessibility.
  • Integrate data from diverse sources, including databases, APIs, third-party services, and streaming data, and ensure data quality and consistency.
  • Cleanse, transform, and enrich raw data to make it suitable for analysis and reporting.
  • Implement and enforce data security measures to protect sensitive information and ensure compliance with data privacy regulations (e.g., GDPR, HIPAA).
  • Independently build, operate, maintain, enhance, publish, and sunset BI products (owning the end-to-end life cycle) for enterprise stakeholders in BI tools such as Tableau and Power BI, while keeping all required documentation and artefacts up to date, such as SOPs, previous versions, and secondary quality reviews.
  • Continuously monitor and optimize data pipelines and databases for improved performance and efficiency.
  • Develop and implement automated testing procedures to validate data quality and pipeline reliability.
  • Maintain thorough documentation of data processes, schemas, and data lineage to support data governance efforts.
  • Collaborate with wider team such as data scientists, analysts, software engineers, and other stakeholders to understand their data requirements and provide data solutions that meet their needs.
  • Utilize version control systems to manage code and configurations related to data pipelines.
  • Diagnose and resolve data-related issues and provide technical support as needed.
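For illustration only: a minimal sketch of the extract-transform-load pattern described above, assuming PySpark and a Parquet-based repository. Source paths, column names, and the data-quality rule are hypothetical placeholders, not details from the posting.

    # Minimal ETL sketch: extract raw records, cleanse and enrich, load to a warehouse path.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Extract: read raw records from a hypothetical landing zone.
    raw = spark.read.option("header", True).csv("/landing/customers/")

    # Transform: drop duplicates, fix types, enrich with a load timestamp,
    # and apply a basic data-quality gate.
    clean = (
        raw.dropDuplicates(["customer_id"])
           .withColumn("signup_date", F.to_date("signup_date", "yyyy-MM-dd"))
           .withColumn("load_ts", F.current_timestamp())
           .filter(F.col("customer_id").isNotNull())
    )

    # Load: write to a partitioned table in the centralized repository.
    (clean.write
          .mode("overwrite")
          .partitionBy("signup_date")
          .parquet("/warehouse/dim_customer/"))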

Working Hours:

Wells Fargo

Banking and Financial Services

San Francisco
