Job Summary
We are seeking an experienced GCP Data Engineer to join the Data Team at Equinix as a key member, leading end-to-end development of complex data engineering use cases and driving the evolution of Equinix's Data Lake platform. You will design and build enterprise-scale data infrastructure and analytics solutions on Google Cloud Platform while providing technical mentorship to the data engineering team. The ideal candidate combines deep technical expertise in cloud-native data technologies with proven leadership skills and a passion for building robust, scalable data platforms that drive strategic business insights.
Responsibilities
- Participate in the design and implementation of enterprise-scale data pipelines and platform architecture for end-to-end data products
- Develop fault-tolerant, petabyte-scale data processing systems using advanced GCP services
- Evaluate and recommend new technologies, tools, and architectural approaches
- Build complex ETL/ELT workflows using Cloud Dataflow, Cloud Composer, Dataform/dbt, BigQuery, and custom solutions
- Implement sophisticated real-time streaming architectures with Cloud Pub/Sub, Kafka, and Apache Beam
- Design and optimize data processing workflows for performance, cost, and reliability
- Develop advanced data transformation logic and custom processing functions
- Implement data architectures and microservices-based data platforms
- Design multi-environment deployment strategies and automated scaling solutions
- Establish monitoring, alerting, and observability frameworks across the data platform
- Lead code reviews and establish quality assurance processes
- Provide technical guidance on complex problem-solving and optimization techniques
- Facilitate knowledge-sharing sessions and technical training programs
- Collaborate with hiring managers on technical interviews and team expansion
- Implement comprehensive data governance frameworks and compliance controls
- Design fine-grained access controls and data classification systems
- Establish automated data quality monitoring and lineage tracking
- Ensure GDPR, SOX, and other regulatory compliance across data workflows
- Implement privacy-preserving techniques and data anonymization strategies
- Translate complex business requirements into scalable technical solutions
- Coordinate cross-functional initiatives involving multiple engineering teams
Qualifications
Technical Expertise
- GCP Services: 5+ years of hands-on experience with BigQuery (advanced SQL, scripting, ML integration), Cloud Dataflow, Cloud Composer, Cloud Storage, Pub/Sub, and Vertex AI
- Programming: Expert-level Python and Java programming; proficiency in Python or Scala for Spark development
- Advanced Technologies: Deep experience with Apache Beam, Airflow, Kubernetes, Docker, and distributed computing
- Infrastructure: Extensive experience with Terraform, CI/CD pipelines, and cloud infrastructure management
- Database Systems: Advanced knowledge of relational and NoSQL databases, data modeling, and performance optimization
Professional Experience
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 5+ years of data engineering experience, including 4+ years specifically on Google Cloud Platform
- Proven track record of delivering complex, enterprise-scale data solutions
- Experience with data governance, compliance, and security frameworks
- Excellent written and verbal communication skills, with the ability to present to executives
- Experience working on cross-functional projects and managing stakeholder relationships
- Proven ability to translate business requirements into technical solutions
Preferred Qualifications
- Google Cloud Professional Data Engineer certification
- Experience with MLOps, feature stores, and machine learning pipeline development
- Knowledge of data science workflows and statistical analysis
- Experience with BI tools (Looker, Tableau) and advanced analytics platforms