Job Description
As a Lead Data Engineer, you will lead cloud modernization initiatives, develop scalable data pipelines, and enable real-time data processing for enterprise-level systems. Your expertise in Google Cloud Platform (GCP) and BigQuery will be crucial in transforming legacy infrastructure into a robust, cloud-native data ecosystem.

Your key responsibilities will include analyzing legacy on-premises and hybrid cloud data warehouse environments, leading the migration of large-scale datasets to Google BigQuery, and designing data migration strategies that ensure data quality, integrity, and performance. You will also integrate data from a variety of structured and unstructured sources, build real-time streaming pipelines for large-scale ingestion and processing of IoT and telemetry data, and modernize legacy SSIS packages into cloud-native ETL pipelines.

To excel in this role, you should have at least 5 years of experience in Data Engineering with a strong focus on cloud and big data technologies, including a minimum of 2 years of hands-on experience with GCP, specifically BigQuery. Experience migrating on-premises data systems to the cloud, development with Apache Airflow, Python, and Apache Spark, and expertise in streaming data ingestion will be highly valuable. Strong SQL development skills and a solid understanding of cloud architecture, data modeling, and data warehouse design are also essential.

Preferred qualifications include a GCP Professional Data Engineer certification, experience with modern data stack tools such as dbt, Kafka, or Terraform, and exposure to ML pipelines, analytics engineering, or DataOps/DevOps methodologies.
Joining us will give you the opportunity to work with cutting-edge technologies in a fast-paced, collaborative environment, lead cloud transformation initiatives at scale, and benefit from competitive compensation, remote flexibility, and growth opportunities.