Job Title: Data Engineer
Location: Remote
Experience: 4-7 years

Job Summary:
The Data Engineer is responsible for designing and building data models and pipelines on GCP (BigQuery), Azure, and Snowflake, applying SQL and Python expertise and following coding best practices.

Key Responsibilities:
- Design and build data models and pipelines on GCP BigQuery, Azure, and Snowflake.
- Design, build, and maintain robust, scalable ETL/ELT pipelines using BigQuery and GCP-native services (e.g., Dataflow, Cloud Functions, Pub/Sub).
- Develop and execute efficient SQL queries (DML) for data transformation, aggregation, and reporting, and to extract insights from data.
- Collaborate with stakeholders to understand data requirements and provide solutions.
- Ensure data quality and integrity by following data modeling principles and coding best practices.

Requirements:
- 4-7 years of experience in data engineering or a related field.
- Expert-level knowledge of SQL and PL/SQL (primarily DML), with strong hands-on experience in Google BigQuery.
- Hands-on experience with GCP data engineering services (e.g., Dataflow, Cloud Storage, Pub/Sub, Composer/Airflow).
- Experience with the Azure and Snowflake tech stacks.
- Strong expertise in Airflow, including authoring DAGs.
- Very strong Python skills.
- Familiarity with DBT (optional).
- Experience applying coding best practices across the GCP tech stack.

Key Skills:
- GCP BigQuery
- Azure
- Snowflake
- SQL/PL-SQL expertise
- Python expertise
- Airflow
- Data modelling (star/snowflake schemas, normalized/denormalized)
- DBT experience (optional)
- GCP tech stack
- Coding best practices
- ETL
- Git & GitHub