Remote
Contractual
We are looking for a Data Engineer who combines engineering precision with analytical curiosity. The Data Engineer will design and automate pipelines that integrate digital behavior data (GA4 / clickstream) with transactional and operational data into a unified analytics layer.
The role requires a strong understanding of BigQuery, Databricks, and ELT automation frameworks, with a focus on ensuring data correctness, metric consistency, and cross-system reliability.
Key Responsibilities
Design and maintain batch or federated data pipelines from cloud sources such as BigQuery and Databricks/Snowflake, ensuring scalable and cost-efficient data movement.
Develop and validate GA4 event, session, and user models — reconstructing accurate sessionization logic and resolving schema variations across properties.
Implement automated validation and reconciliation checks between GA4 event data and transactional sources to ensure 90–95% metric alignment.
Collaborate with BI Analysts to co-define and operationalize shared metric definitions (CVR, AOV, funnel conversion, engagement) in the data model.
Build curated and published data layers in Databricks Delta Lake or equivalent for BI and advanced analytics consumption.
Implement schema drift detection, completeness monitoring, and freshness alerts using tools such as dbt tests, Great Expectations, or Soda.
Partner with downstream visualization teams (Looker / Power BI) to validate dashboards against modeled datasets.
Document lineage, transformations, and data contracts in Unity Catalog or equivalent governance systems.
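The sessionization and unnesting work described above can be sketched in plain Python. This is a minimal illustration, not the production PySpark/SQL pipeline; the field names (`event_params`, `user_pseudo_id`, `event_timestamp`) mirror the GA4 BigQuery export convention, and the 30-minute inactivity window matches GA4's default, but both are assumptions about the target setup:

```python
from datetime import datetime, timedelta

SESSION_GAP = timedelta(minutes=30)  # GA4's default session inactivity window

def unnest_params(event):
    """Flatten GA4-style event_params (a list of key/value records) into a dict."""
    out = {}
    for p in event.get("event_params", []):
        # The GA4 export populates exactly one of these typed value fields.
        for field in ("string_value", "int_value", "double_value"):
            if p["value"].get(field) is not None:
                out[p["key"]] = p["value"][field]
                break
    return out

def sessionize(events):
    """Assign per-user session numbers using a 30-minute inactivity gap."""
    last_seen = {}  # user -> timestamp of their previous event
    counters = {}   # user -> current session number
    out = []
    for e in sorted(events, key=lambda e: (e["user_pseudo_id"], e["event_timestamp"])):
        uid, ts = e["user_pseudo_id"], e["event_timestamp"]
        if uid not in last_seen or ts - last_seen[uid] > SESSION_GAP:
            counters[uid] = counters.get(uid, 0) + 1  # gap exceeded: new session
        last_seen[uid] = ts
        out.append({**e, "session_number": counters[uid]})
    return out
```

In a warehouse this same logic is typically expressed with a window function (`LAG` over `event_timestamp` partitioned by user), but the gap rule is identical.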
Required Skills & Experience
6+ years in data engineering / ELT automation on modern cloud platforms (GCP, BigQuery, Databricks).
Deep understanding of GA4 BigQuery export schema, nested JSON handling, and event parameter unnesting.
Advanced SQL and PySpark; strong understanding of data modeling and normalization.
Hands-on experience with BigQuery, GCS, and dbt, and with orchestration via Airflow or Databricks Jobs.
Familiarity with data quality testing, observability, and pipeline monitoring practices.
Strong communication and collaboration skills with analysts and business teams.
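A reconciliation check of the kind called for above (comparing a GA4-derived total against the transactional source of truth, and flagging anything outside the 90–95% alignment band) might look like the following sketch; the 0.90 threshold and the idea of comparing scalar totals are illustrative simplifications:

```python
def alignment_ratio(ga4_total, source_total):
    """Return GA4-to-source alignment as a fraction (1.0 = perfect match)."""
    if source_total == 0:
        return 1.0 if ga4_total == 0 else 0.0
    return 1 - abs(ga4_total - source_total) / source_total

def check_alignment(ga4_total, source_total, min_ratio=0.90):
    """Pass/fail check against a minimum alignment ratio (default 90%)."""
    ratio = alignment_ratio(ga4_total, source_total)
    return ratio >= min_ratio, round(ratio, 4)
```

In practice this comparison would run per metric, per day, per property, with the failures routed to the same alerting channel as the freshness and schema checks.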
Preferred Experience
Exposure to Looker / Power BI data modeling and semantic layers.
Experience with schema versioning and metric governance.
Knowledge of digital analytics attribution and UTM parameter tracking.
Prior work in multi-brand or multi-property GA4 setups.
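The UTM tracking knowledge mentioned above boils down to reliably extracting campaign parameters from landing-page URLs. A minimal sketch using only the Python standard library (the five UTM keys are the standard convention; everything else here is illustrative):

```python
from urllib.parse import urlparse, parse_qs

# The five conventional UTM campaign parameters.
UTM_KEYS = ("utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content")

def extract_utm(url):
    """Pull standard UTM parameters from a URL, ignoring everything else."""
    qs = parse_qs(urlparse(url).query)
    return {k: qs[k][0] for k in UTM_KEYS if k in qs}
```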