Senior Data Engineer

3 - 8 years

25 - 30 Lacs

Posted: 6 hours ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

  • Design, implement, and maintain a scalable Data Lake on GCP to centralize structured and unstructured data from various sources (databases, APIs, cloud storage).
  • Utilize GCP services including BigQuery, Dataform, Cloud Functions, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security.
  • Collaborate closely with data analytics and data science teams to ensure data is properly prepared for consumption by downstream systems (e.g. DOMO, Looker, Databricks).
  • Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards.
  • Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability.
  • Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development.

Requirements

  • Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g. Mathematics, Statistics, Engineering).
  • 3+ years of experience using GCP Data Lake and Storage Services. Certifications in GCP are preferred (e.g. Professional Cloud Developer, Professional Cloud Database Engineer).
  • Advanced proficiency with SQL, with experience in writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows.
  • Strong programming skills in Python; additional experience in languages such as Java or Scala is a plus. Proven ability to build scalable data pipelines, automate workflows, and integrate APIs for efficient data ingestion.
  • Proficient in Git and CI/CD practices, with experience automating testing and deployment of data systems.
  • Experience with Looker Enterprise, including developing and maintaining LookML models to enable self-service analytics and data exploration.
  • Strong data modeling skills, with experience designing scalable, maintainable models that support analytics, reporting, and business intelligence use cases across diverse teams.
  • Expertise in infrastructure automation using Terraform, with experience scripting in Python and Java to provision and deploy cloud resources efficiently.
  • Strong communication and collaboration skills, with a proven ability to work cross-functionally with teams such as data science, analytics, product, and business leadership to understand and meet their data needs.


Company

Five9

Cloud Computing / Software as a Service (SaaS)

San Ramon
