Staff Enterprise Data Engineer

Experience

8 - 13 years

Salary

9 - 13 Lacs

Posted: 4 days ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

  • Create, maintain, and optimize data pipelines from development to production for specific use cases.
  • Use modern tools, data services, techniques, and well-architected frameworks to automate the most common, repetitive, and tedious data preparation and integration tasks, minimizing manual, error-prone processes and improving productivity.
  • Assist with optimizing the data management infrastructure to drive automation in data integration and management.
  • Implement and use modern data preparation, integration, and AI-enabled metadata management tools and techniques, including MLOps frameworks and infrastructure automation.
  • Track data consumption patterns.
  • Perform intelligent sampling and caching.
  • Monitor schema changes.
  • Recommend, and where possible automate, existing and future integration flows.
  • Collaborate closely with data science teams and business (data) analysts to refine their data requirements for various data and analytics initiatives, along with their data consumption requirements.
  • Build, model, and curate the data lake/warehouse and other data consumption methods.
  • Perform other duties as assigned.
  • Comply with all policies and standards.

Education Qualifications

  • Bachelor's Degree in Computer Science/Engineering or equivalent experience preferred

Experience Qualifications

  • Typically 8+ years of experience implementing Data & BI projects in a large-scale enterprise data lake/warehouse environment required
  • Typically 8+ years of experience in ETL/ELT architecture, with hands-on experience developing ETL/ELT jobs using tools such as Informatica PowerCenter, Informatica Cloud (IICS), and AWS Glue, required
  • Must have working experience with SFDC and PeopleSoft in the Sales, Marketing, Finance, or Support domains
  • Knowledge of or experience working with cloud data warehouses such as Snowflake and AWS Redshift, and a good understanding of ANSI SQL
  • Proven hands-on experience using Informatica components such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager, as well as Python
  • Hands-on experience with performance tuning of databases such as Oracle and Postgres, and strong skills in SQL, PL/SQL, ANSI SQL, Unix shell scripting, and Perl scripting

Skills and Abilities

  • Knowledge of end-to-end SDLC process in EDW, Data Lake, BI & MLOps projects (Advanced proficiency)
  • Dimensional modeling and data warehouse/ODS concepts, such as star schemas, snowflake schemas, and normalized data models (Advanced proficiency)
  • Ability to coordinate effectively with on-site and offshore resources through Managed Service Providers & IT Teams (Intermediate proficiency)
  • Knowledge of reporting tools such as Tableau is desired (Intermediate proficiency)
  • Excellent verbal and written communication skills (Advanced proficiency)

Company

Trinet Group

Industry

Human Resources

Location

San Leandro