Data Engineer

Experience: 6 years
Posted: 9 hours ago | Platform: LinkedIn
Work Mode: Remote
Job Type: Contractual

Job Description

Hi,

Hope you are doing well. This is Rahul from Aveto Consulting. We have a vacancy for a Data Engineer (Remote); please let me know your interest.



Job title: Data Engineer – Digital & Analytics Platform (GA4 + Cloud + BI Enablement)

Location: Remote

Duration: 6-month contract

Only immediate joiners (able to start within 20 days)

Years of experience: 6 to 8 years


Overview


We are seeking a Data Engineer who combines engineering precision with analytical curiosity. The Data Engineer will design and automate pipelines that integrate digital behavior data (GA4 / clickstream) with transactional and operational data into a unified analytics layer.

The role requires a strong understanding of BigQuery, Databricks, and ELT automation frameworks, with a focus on ensuring data correctness, metric consistency, and cross-system reliability.



Key Responsibilities


Design and maintain batch or federated data pipelines from cloud sources such as BigQuery and Databricks/Snowflake, ensuring scalable and cost-efficient data movement.

Develop and validate GA4 event, session, and user models — reconstructing accurate sessionization logic and resolving schema variations across properties.

Implement automated validation and reconciliation checks between GA4 event data and transactional sources to ensure 90–95% metric alignment.

Collaborate with BI Analysts to co-define and operationalize shared metric definitions (CVR, AOV, funnel conversion, engagement) in the data model.

Build curated and published data layers in Databricks Delta Lake or equivalent for BI and advanced analytics consumption.

Implement schema drift detection, completeness monitoring, and freshness alerts using tools such as dbt tests, Great Expectations, or Soda.

Partner with downstream visualization teams (Looker / Power BI) to validate dashboards against modeled datasets.

Document lineage, transformations, and data contracts in Unity Catalog or equivalent governance systems.
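One responsibility above is automated reconciliation between GA4 event data and transactional sources to the 90–95% alignment target. As a minimal illustrative sketch of such a check (the function name, thresholds, and sample totals are hypothetical, not from this posting):

```python
def metric_alignment(ga4_value: float, transactional_value: float) -> float:
    """Alignment of two totals: the smaller expressed as a fraction of the larger."""
    larger = max(ga4_value, transactional_value)
    if larger == 0:
        return 1.0  # both systems report zero: treat as fully aligned
    return min(ga4_value, transactional_value) / larger

# Hypothetical daily purchase counts: GA4 typically undercounts relative to
# the order system (consent blocking, ad blockers, late-arriving hits).
alignment = metric_alignment(ga4_value=4620, transactional_value=5000)
assert 0.90 <= alignment <= 1.0, f"purchase counts diverged: {alignment:.1%}"
```

In practice a check like this would run per metric, per day, inside a dbt test, Great Expectations suite, or Soda check, alerting whenever the ratio leaves the agreed band.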


Required Skills


6+ years in data engineering / ELT automation on modern cloud platforms (GCP, BigQuery/Databricks)

Deep understanding of GA4 BigQuery export schema, nested JSON handling, and event parameter unnesting.

Advanced SQL and PySpark; strong understanding of data modeling and normalization.

Hands-on experience with BigQuery, GCS, dbt, or Airflow / Databricks Jobs.

Familiarity with data quality testing, observability, and pipeline monitoring practices.

Strong communication and collaboration skills, working with analysts and business teams.
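On the GA4 export schema point above: the export stores event parameters as a repeated key/value record, with each value held in one of several typed fields (string_value, int_value, double_value, and so on). A minimal sketch of the "unnesting" step in Python; in BigQuery itself this would be an UNNEST in SQL, and the sample event below is hypothetical:

```python
def unnest_event_params(event_params: list) -> dict:
    """Flatten GA4-style event_params into a plain {key: value} dict.

    Each param carries exactly one populated typed field; we pick
    whichever is non-null. (float_value is omitted for brevity.)
    """
    flat = {}
    for param in event_params:
        value = param["value"]
        flat[param["key"]] = next(
            v
            for v in (
                value.get("string_value"),
                value.get("int_value"),
                value.get("double_value"),
            )
            if v is not None
        )
    return flat

# Hypothetical page_view event in the GA4 export shape:
event_params = [
    {"key": "page_location", "value": {"string_value": "https://example.com/"}},
    {"key": "engagement_time_msec", "value": {"int_value": 1200}},
]
flat = unnest_event_params(event_params)
assert flat["engagement_time_msec"] == 1200
```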

Preferred Experience

Exposure to Looker / Power BI data modeling and semantic layers.

Experience with schema versioning and metric governance.

Knowledge of digital analytics attribution and UTM parameter tracking.

Prior work in multi-brand or multi-property GA4 setups.
