Posted: 13 hours ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Key Responsibilities:

  • Design and implement scalable data architectures to support data storage, processing, and analytics.
  • Design and implement data schemas within Snowflake to effectively support analytics and reporting needs.
  • Establish and enforce data access roles and policies (a minimal sketch follows this list).
  • Develop strategies to make data AI-ready, including data cleansing, transformation, and enrichment processes.
  • Provide guidance and support for analytical development and modelling to enhance data visualization and reporting capabilities.
  • Conduct performance tuning and optimization of data models to improve query efficiency and response times.
  • Develop, maintain, and optimize ETL (Extract, Transform, Load) processes for the Pacific Data Analytics Platform to ensure efficient data integration from various sources (both internal and external datasets).
  • Manage and optimize database and data warehouse systems such as Snowflake, ensuring high availability and performance.
  • Analyze and tune database performance, identifying bottlenecks and implementing improvements to enhance query performance.
  • Ensure data integrity, consistency, and accuracy through rigorous data quality checks and validations.
  • Work closely with data engineers, application engineers, analysts, and other stakeholders to understand data needs and provide appropriate solutions.
  • Leverage cloud technologies (mainly AWS) for data storage, processing, and analytics, ensuring cost-effectiveness and scalability.
  • Document data processes, architectures, and workflows while establishing best practices for data management and engineering.
  • Set up monitoring solutions to track data pipelines and database performance, ensuring timely maintenance and fault resolution.
  • Quickly analyze existing SQL code and improve it to enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code.
  • Implement data security measures and ensure compliance with relevant regulations regarding data protection and privacy.
  • Provide guidance and mentorship to junior data engineers, fostering a culture of learning and continuous improvement.
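
As a rough, non-authoritative illustration of the schema design and access-policy responsibilities above, a minimal Snowflake sketch might look like this (the database, schema, and role names — analytics, reporting, analyst_ro — are all hypothetical):

    -- Hypothetical names throughout: analytics, reporting, analyst_ro.
    CREATE DATABASE IF NOT EXISTS analytics;
    CREATE SCHEMA IF NOT EXISTS analytics.reporting;

    -- Read-only role for analysts, matching the access-policy bullet above.
    CREATE ROLE IF NOT EXISTS analyst_ro;
    GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
    GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst_ro;
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst_ro;
    -- The FUTURE grant keeps the policy applied as new tables are added.
    GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.reporting TO ROLE analyst_ro;

Granting on FUTURE TABLES is what lets a policy like this stay enforced as the schema evolves, rather than being re-applied by hand for every new table.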

Key Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field, with at least 10 years of software development experience.
  • Expert knowledge of databases such as Oracle, PostgreSQL, and SQL Server (preferably cloud-hosted), with strong SQL programming experience.
  • Competence in data preparation and/or ETL tools such as SnapLogic, Azure Data Factory, AWS Glue, or SSIS (preferably strong working experience in one or more) to build and maintain data pipelines and flows.
  • Programming experience in Python and shell scripting (bash/zsh, grep/sed/awk, etc.).
  • Deep knowledge of databases and stored procedures, including optimization over very large data volumes.
  • In-depth knowledge of ingestion techniques, data cleansing, de-duplication, and partitioning.
  • Experience building the infrastructure required for data ingestion and analytics.
  • Solid understanding of data normalization and denormalization, database exception handling, transactions, query profiling, performance counters, debugging, and database and query optimization techniques.
  • Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions (see the sketch after this list).
  • Experience understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting.
  • Good understanding of data models, data architecture, and naming conventions.
  • Knowledge of data visualization tools (e.g., Tableau, Power BI) is a plus.
  • Exposure to source control tools such as Git and Azure DevOps.
  • Understanding of Agile methodologies (Scrum, Kanban).
  • Preferably, experience with NoSQL databases, including migrating data into other types of databases with real-time replication.
  • Experience with CI/CD automation tools.
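
As a non-authoritative sketch of the SQL security techniques listed above, the following T-SQL shows Transparent Data Encryption (TDE) plus a column-level permission grant; the certificate, database, table, and user names (TdeCert, SalesDb, dbo.Customers, reporting_user) are all hypothetical:

    -- Hypothetical names: TdeCert, SalesDb, dbo.Customers, reporting_user.
    USE master;
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';
    CREATE CERTIFICATE TdeCert WITH SUBJECT = 'TDE certificate';

    USE SalesDb;
    -- TDE encrypts the database's data and log files at rest.
    CREATE DATABASE ENCRYPTION KEY
        WITH ALGORITHM = AES_256
        ENCRYPTION BY SERVER CERTIFICATE TdeCert;
    ALTER DATABASE SalesDb SET ENCRYPTION ON;

    -- Column-level grant: the user may read only the listed columns.
    GRANT SELECT ON dbo.Customers (CustomerId, City) TO reporting_user;

TDE protects data at rest without application changes, while the column-level grant narrows what an individual principal can query — two of the layered controls this role would be expected to apply.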

Personal Strengths:

  • Must have completed the SnowPro Advanced: Architect certification.
  • Very good communication skills.
  • Ability to easily fit into a distributed development team.
  • Ability to manage timelines of multiple initiatives.
  • Ability to articulate insights from the data and help business teams make decisions.
  • Able to work with ambiguous requirements, seek clarity amid uncertainty, and manage risks.
  • Ability to communicate complex concepts to non-data audiences.
