Full Stack Data Engineer

Experience: 5 years

Salary: 0 Lacs

Posted: 1 day ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

  • Work on a Central Engineering Portfolio Team based in Chennai to deliver curated data products, performing an end-to-end full-stack data engineering role.
  • Work effectively with fellow data engineers, product owners, data champions, and other technical experts.
  • Demonstrate technical knowledge and communication skills, with the ability to advocate for well-designed solutions.
  • Develop exceptional analytical data products using both streaming and batch ingestion patterns on Google Cloud Platform, applying solid data warehouse principles.
  • Be the Subject Matter Expert in Data Engineering, with a focus on GCP-native services and other well-integrated third-party technologies.

Responsibilities

  • Experience working with architects to evaluate and productionize data pipelines for data ingestion, curation, and consumption.
  • Experience working with stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management.
  • In-depth understanding of Google’s product technology (or another cloud platform) and its underlying architectures, especially Gen AI / Ford LLM.
  • Experience working with dbt/Dataform.
  • Experience with Dataplex or other data catalogs.
  • Experience with development ecosystems such as Tekton, Git, and Jenkins for CI/CD pipelines.
  • Experience working with Agile and Lean methodologies.
  • Experience with performance tuning of SQL queries.
  • Experience creating and executing detailed test plans.
  • GCP Professional Data Engineer certification.
  • Master’s degree in Computer Science or a related field.

Qualifications

  • 5+ years of SQL development and analytics/data product development experience.
  • 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
  • Experience working with GCP-native (or equivalent) services such as BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Build, etc.
  • Experience working with Airflow for scheduling and orchestration of data pipelines.
  • Experience working with Terraform to provision Infrastructure as Code.
  • 2+ years of professional development experience in Java or Python.
  • Bachelor’s degree in Computer Science or a related scientific field.
  • Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple sources to build analytical domains and reusable data products.
