Senior Data Engineering Professional

Experience: 6 - 10 years

Salary: 8 - 13 Lacs

Posted: 6 days ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

The Role:

We are looking for a candidate to join our team who will be involved in the ongoing development of our Enterprise Data Warehouse (EDW). The Media and Marketing Data Engineer role includes loading and extracting data, including from external sources via APIs, storage buckets (S3, Blob Storage), and marketing-specific data integrations. The ideal candidate will be involved in all stages of the project lifecycle, from initial planning through to deployment in production. A key focus of the role will be data analysis, data modeling, and ensuring these aspects are successfully implemented in the production environment.

Your Contribution:

Be Yourself. Be Open. Stay Hungry and Humble. Collaborate. Challenge. Decide and just Do. These are the behaviors you'll need for success at Logitech. In this role you will:

  • Design, develop, document, and test ETL solutions using industry-standard tools.
  • Design physical and reporting data models for seamless cross-functional and cross-system data reporting.
  • Enhance point-of-sale datasets with additional data points to provide stakeholders with useful insights.
  • Ensure data integrity by rigorously validating and reconciling data obtained from third-party providers.
  • Collaborate with data providers and internal teams to address customer data discrepancies and enhance data quality.
  • Work closely with our D&I teams to deliver datasets optimized for consumption in reporting and visualization tools such as Tableau.
  • Collaborate with data architects, analysts, and business stakeholders to gather requirements and translate them into data solutions.
  • Participate in design discussions with enterprise architects and recommend design improvements.
  • Develop and maintain conceptual, logical, and physical data models with their corresponding metadata.
  • Work closely with cross-functional teams to integrate data solutions.
  • Create and maintain clear documentation for data processes, data models, and pipelines.
  • Integrate Snowflake with various data sources and third-party tools.
  • Manage code versioning and deployment of Snowflake objects using CI/CD practices.
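
The validation-and-reconciliation responsibility above can be illustrated with a short Python sketch. This is a minimal example only, not Logitech's actual pipeline; the function and field names (`reconcile`, `order_id`, `amount`) are hypothetical: match third-party records to internal ones by key, then flag anything missing or carrying a differing amount.

```python
from decimal import Decimal

def reconcile(internal_rows, provider_rows, key="order_id", amount="amount"):
    """Compare provider records against internal records by key.

    Hypothetical sketch: returns (missing_keys, amount_mismatches).
    A production pipeline would write these lists to a data-quality table.
    """
    internal = {row[key]: row for row in internal_rows}
    missing, mismatched = [], []
    for row in provider_rows:
        match = internal.get(row[key])
        if match is None:
            missing.append(row[key])          # provider row absent internally
        elif Decimal(str(match[amount])) != Decimal(str(row[amount])):
            mismatched.append(row[key])       # amounts disagree
    return missing, mismatched
```

In practice the inputs would come from staged provider files and an EDW query, and the discrepancy lists would feed the follow-up with data providers described above. Comparing via `Decimal` avoids float-equality pitfalls when amounts arrive as strings.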

Key Qualifications:

For consideration, you must bring the following:

  • A total of 6 to 10 years of experience in ETL design, development, and populating data warehouses. This includes experience with heterogeneous OLTP sources such as Oracle R12 ERP systems and other cloud technologies.
  • At least 3 years of hands-on experience with Pentaho Data Integration or similar ETL tools.
  • Practical experience working with cloud-based Data Warehouses such as Snowflake and Redshift.
  • Significant hands-on experience with Snowflake utilities, including SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, Snowflake AI/ML and stored procedures.
  • Experience with API-based integrations and marketing data.
  • Ability to design and develop complex data pipelines and ETL workflows in Snowflake using advanced SQL, UDFs, UDTFs, and stored procedures (JavaScript/SQL).
  • Comprehensive expertise in databases, data acquisition, ETL strategies, and the tools and technologies within Pentaho DI and Snowflake.
  • Experience working with complex SQL functions and transforming data in large datasets.
  • Demonstrated experience in designing complex ETL processes for extracting data from various sources, including XML files, JSON, RDBMS, and flat files.
  • Exposure to standard support ticket management tools.
  • A strong understanding of Business Intelligence and Data warehousing concepts and methodologies.
  • Extensive experience in data analysis and root cause analysis, along with proven problem-solving and analytical thinking capabilities.
  • A solid understanding of software engineering principles and proficiency in working with Unix/Linux/Windows operating systems, version control, and office software.
  • A deep understanding of data warehousing principles and cloud architecture, including SQL optimization techniques for building efficient and scalable data systems.
  • Familiarity with Snowflake’s unique features, such as its multi-cluster architecture and shareable data capabilities.
  • Excellent skills in writing and optimizing SQL queries to ensure high performance and data accuracy across all systems.
  • The ability to troubleshoot and resolve data quality issues promptly, maintaining data integrity and reliability.
  • Strong communication skills are essential for effective collaboration with both technical and non-technical teams to ensure a clear understanding of data engineering requirements.
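
The API-based integration work listed above typically amounts to paginated extraction into a staging area. A minimal sketch, assuming a hypothetical `fetch(offset, limit)` callable that wraps the real HTTP client and returns an empty list once the marketing API is exhausted:

```python
def extract_pages(fetch, page_size=100, max_pages=1000):
    """Drain a paginated API into a list of records.

    `fetch(offset, limit)` is a hypothetical stand-in for a real HTTP
    call; it returns a list of dicts, empty when no pages remain.
    """
    records, offset = [], 0
    for _ in range(max_pages):     # hard cap guards against runaway pagination
        batch = fetch(offset, page_size)
        if not batch:
            break
        records.extend(batch)
        offset += len(batch)
    return records
```

Injecting the fetch callable keeps the pagination logic testable without network access; a real implementation would land the extracted records in an S3 or Blob Storage stage for Snowpipe to ingest.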

In addition:

  • Exposure to an Oracle ERP environment.
  • Basic understanding of reporting tools such as OBIEE and Tableau.
  • Exposure to marketing data platforms such as Adverity and Fivetran.
  • Exposure to customer data platforms.

Education:

  • BS/BTech/MCA/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise.
  • Fluency in English.
