Senior Associate Software Engineer

2 years

6 - 10 Lacs

Posted: 15 hours ago | Platform: GlassDoor


Work Mode: On-site

Job Type: Part Time

Job Description

Who are we?

Equinix is the world’s digital infrastructure company®, shortening the path to connectivity to enable the innovations that enrich our work, life and planet.

A place where bold ideas are welcomed, human connection is valued, and everyone has the opportunity to shape their future.

Help us challenge assumptions, uncover bias, and remove barriers—because progress starts with fresh ideas. You’ll find belonging, purpose, and a team that welcomes you—because when you feel valued, you’re empowered to do your best work.

Job Summary

We are seeking a skilled GCP Data Engineer to join the Data Team at Equinix as a key member, responsible for end-to-end development of data engineering use cases and of the Equinix Data Lake platform and tools. You will design, build, and maintain scalable data infrastructure and analytics solutions on Google Cloud Platform. The ideal candidate has strong expertise in cloud-native data technologies and a passion for building robust, efficient data pipelines that drive business insights.


Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines using Cloud Dataflow, Cloud Composer (Apache Airflow), Dataform/dbt and Cloud Functions

  • Build real-time streaming data pipelines using Cloud Pub/Sub, Kafka, and Dataflow (see the streaming pipeline sketch after this list)

  • Implement automated data quality checks and monitoring across all data workflows

  • Optimize pipeline performance and cost efficiency through proper resource allocation and scheduling

  • Architect and implement data lake and data warehouse solutions using Dataproc, BigQuery, Cloud Storage, and Cloud SQL

  • Design optimal data models, partitioning strategies, and clustering for analytical workloads

  • Manage data lifecycle policies and implement automated archival and retention strategies

  • Ensure data security, encryption, and access control across all storage layers

  • Build and optimize BigQuery datasets for analytics and reporting use cases

  • Create and maintain dimensional models and fact tables for business intelligence

  • Implement data marts and aggregation layers for improved query performance

  • Support self-service analytics through proper data cataloging and documentation

  • Apply good working knowledge of Dataplex and Analytics Hub

  • Integrate data from various sources including databases, APIs, SaaS applications, and file systems

  • Implement change data capture (CDC) solutions for real-time data synchronization

  • Work with third-party data providers and external data feeds

  • Implement comprehensive monitoring and alerting using Cloud Monitoring and Cloud Logging

  • Troubleshoot data pipeline issues and implement robust error handling mechanisms

  • Maintain data lineage documentation and impact analysis capabilities
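To make the streaming-pipeline responsibility above concrete, here is a minimal Apache Beam (Dataflow) sketch that reads JSON messages from a Pub/Sub subscription and appends them to a BigQuery table. It is illustrative only: the project, subscription, and table names are hypothetical placeholders, and a production pipeline would add schema validation, dead-lettering, and windowed aggregations.

```python
# Minimal sketch of a streaming Pub/Sub -> BigQuery pipeline with Apache Beam.
# PROJECT, SUBSCRIPTION, and TABLE are hypothetical placeholders, not values from this posting.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

PROJECT = "example-project"
SUBSCRIPTION = f"projects/{PROJECT}/subscriptions/orders-sub"
TABLE = f"{PROJECT}:analytics.orders"


def parse_message(payload: bytes) -> dict:
    # Decode a Pub/Sub message payload into a BigQuery-ready row.
    return json.loads(payload.decode("utf-8"))


def run() -> None:
    options = PipelineOptions(streaming=True)  # run as an unbounded (streaming) pipeline
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
            | "ParseJson" >> beam.Map(parse_message)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```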



Qualifications
Technical Skills

  • GCP Services: 2+ years of hands-on experience with BigQuery, Cloud Dataflow, Cloud Composer, Cloud Storage, Cloud Pub/Sub, Dataform/dbt, and Cloud Functions

  • Programming: Proficiency in Python/Java and SQL; experience with Spark/Beam development

  • Data Technologies: Strong understanding of Apache Beam, Apache Airflow, and distributed computing concepts (a Composer-style DAG sketch follows this list)

  • Database Systems: Experience with both RDBMS (Cloud SQL, PostgreSQL, MySQL) and NoSQL (Bigtable, Firestore) databases

  • Infrastructure as Code: Experience with Terraform, Cloud Deployment Manager, or similar tools
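As a rough illustration of the Composer/Airflow orchestration referenced above, the sketch below defines a daily ELT DAG that loads newline-delimited JSON from Cloud Storage into a raw BigQuery table and then rebuilds an aggregated mart. The bucket, dataset, and table names are hypothetical, and the transform query is only a placeholder.

```python
# Minimal Cloud Composer (Airflow 2) DAG sketch; bucket, dataset, and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw files from GCS into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-landing-bucket",
        source_objects=["orders/{{ ds }}/*.json"],
        destination_project_dataset_table="analytics.raw_orders",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Rebuild a small aggregation mart on top of the staging table.
    build_mart = BigQueryInsertJobOperator(
        task_id="build_orders_mart",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE analytics.orders_mart AS "
                    "SELECT order_date, region, SUM(amount) AS total_amount "
                    "FROM analytics.raw_orders GROUP BY order_date, region"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_mart
```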



Professional Experience

  • Bachelor's degree in Computer Science, Engineering, or a related field

  • 2+ years of experience in data engineering or related roles

  • 2+ years of specific experience with Google Cloud Platform

  • Experience with version control systems (Git) and CI/CD pipelines

  • Knowledge of data modelling techniques and dimensional modelling (see the fact-table sketch after this list)
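As a small example of dimensional modelling combined with the partitioning and clustering practices listed under Responsibilities, the sketch below creates a date-partitioned, clustered fact table for a star schema using the BigQuery Python client. The project, dataset, table, and column names are hypothetical placeholders.

```python
# Minimal sketch: create a partitioned, clustered fact table for a star schema.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("order_date", "DATE", mode="REQUIRED"),  # partition column
    bigquery.SchemaField("customer_id", "STRING"),                # FK to a customer dimension
    bigquery.SchemaField("region", "STRING"),                     # FK to a region dimension
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("example-project.analytics.fact_orders", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_date",
)
table.clustering_fields = ["customer_id", "region"]  # most common filter columns first

client.create_table(table, exists_ok=True)
```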



Preferred Qualifications

  • Google Cloud Professional Data Engineer certification

  • Experience with containerization (Docker, Kubernetes/GKE)

  • Experience with data governance and compliance frameworks (GDPR, HIPAA, SOX)

  • Familiarity with business intelligence tools (Looker, Tableau, Power BI)

  • Experience with streaming technologies beyond GCP (Kafka, Spark Streaming)

Equinix is committed to ensuring that our employment process is open to all individuals, including those with a disability. If you are a qualified candidate and need assistance or an accommodation, please let us know by completing this form.

Equinix is an Equal Employment Opportunity and, in the U.S., an Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to unlawful consideration of race, color, religion, creed, national or ethnic origin, ancestry, place of birth, citizenship, sex, pregnancy / childbirth or related medical conditions, sexual orientation, gender identity or expression, marital or domestic partnership status, age, veteran or military status, physical or mental disability, medical condition, genetic information, political / organizational affiliation, status as a victim or family member of a victim of crime or abuse, or any other status protected by applicable law.

Equinix

Technology, Information and Internet

Redwood City, California
