Data Engineer - GCP

Experience: 4 years

Salary: 0 Lacs

Posted: 6 days ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Job Summary: Data Engineer


Key Responsibilities:

1. Data Pipeline Development:

  • Design, build, and maintain scalable and efficient data pipelines to ingest, process, and transform large datasets from web and mobile applications.
  • Implement ETL (Extract, Transform, Load) processes to integrate data from multiple sources, including Firebase (Firestore, Firebase Analytics); a minimal sketch of such a pipeline follows this list.
  • Optimize data workflows for performance, reliability, and cost-efficiency in cloud environments.
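
To make the scope of this responsibility concrete, here is a minimal batch ETL sketch that reads documents from a Firestore collection and appends them to a BigQuery table. The collection, project, dataset, table, and field names are hypothetical illustrations, not details from this posting.

```python
# Minimal batch ETL sketch: Firestore -> BigQuery.
# All resource names (the "events" collection, analytics.app_events table) are hypothetical.
from google.cloud import bigquery, firestore


def extract_events(collection: str = "events") -> list[dict]:
    """Read raw documents from a Firestore collection."""
    db = firestore.Client()
    return [doc.to_dict() | {"doc_id": doc.id} for doc in db.collection(collection).stream()]


def transform(rows: list[dict]) -> list[dict]:
    """Keep only the fields the warehouse table expects."""
    return [
        {"doc_id": r["doc_id"], "user_id": r.get("user_id"), "event_type": r.get("event_type")}
        for r in rows
    ]


def load_to_bigquery(rows: list[dict], table_id: str = "my-project.analytics.app_events") -> None:
    """Append the transformed rows to a BigQuery table."""
    client = bigquery.Client()
    job = client.load_table_from_json(rows, table_id)
    job.result()  # Wait for the load job to finish.


if __name__ == "__main__":
    load_to_bigquery(transform(extract_events()))
```

In practice a job like this would be parameterized and scheduled by an orchestrator rather than run ad hoc.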

2. Data Infrastructure Management:

  • Develop and manage data storage solutions (databases, data warehouses, data lakes) to support back-end and analytical needs.
  • Configure and maintain cloud-based data infrastructure, ensuring scalability, security, and high availability.
  • Automate infrastructure tasks such as schema migrations, data partitioning, and backup management for seamless operations.
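
As one example of the schema and partition management mentioned above, the sketch below provisions a date-partitioned, clustered BigQuery table from code; the dataset, table, and schema are assumptions made for illustration.

```python
# Sketch: provision a date-partitioned, clustered BigQuery table.
# Dataset/table names and the schema are illustrative, not from the posting.
from google.cloud import bigquery

client = bigquery.Client()

table = bigquery.Table(
    "my-project.analytics.app_events",
    schema=[
        bigquery.SchemaField("doc_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("user_id", "STRING"),
        bigquery.SchemaField("event_type", "STRING"),
        bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    ],
)
# Partition by day on the event timestamp and cluster by user for cheaper scans.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["user_id"]

client.create_table(table, exists_ok=True)
```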

3. Data Integration & API Development:

  • Build and maintain APIs and data access layers to enable seamless data integration with React-based and cloud-native applications (see the sketch after this list).
  • Consolidate and unify data from disparate sources for analytics and product functionalities.
  • Ensure data consistency, integrity, and synchronization across systems, supporting both batch and real-time data processing.
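
A data access layer of the kind described here could be a small read-only API in front of the warehouse. The sketch below uses FastAPI and a parameterized BigQuery query purely as an illustration; the framework choice, endpoint path, and table name are assumptions rather than requirements of the role.

```python
# Sketch of a read-only data access endpoint a React client could call.
# FastAPI, the query, and the table name are all illustrative assumptions.
from fastapi import FastAPI
from google.cloud import bigquery

app = FastAPI()
bq = bigquery.Client()


@app.get("/metrics/daily-events")
def daily_events(days: int = 7) -> list[dict]:
    """Return event counts per day for the last `days` days."""
    query = """
        SELECT DATE(event_ts) AS day, COUNT(*) AS events
        FROM `my-project.analytics.app_events`
        WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL @days DAY)
        GROUP BY day
        ORDER BY day
    """
    job = bq.query(
        query,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("days", "INT64", days)]
        ),
    )
    return [{"day": row["day"].isoformat(), "events": row["events"]} for row in job.result()]
```

A React client would then fetch /metrics/daily-events like any other JSON endpoint.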

4. Collaboration & Communication:

  • Partner with data scientists, software developers, and product managers to understand data requirements and deliver effective data solutions.
  • Provide regular updates on data infrastructure health, data availability, and pipeline performance.
  • Collaborate with engineering teams to integrate pipelines with production environments, including Firebase-hosted systems.

5. Data Quality & Monitoring:

  • Implement robust data quality checks and validation mechanisms (a sketch of one such check follows this list).
  • Monitor and troubleshoot pipeline performance and data latency using cloud monitoring tools.
  • Maintain documentation, metadata, and data lineage for compliance and traceability.
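
One common shape for the validation mechanisms listed above is a post-load check that fails loudly when row counts, null rates, or freshness breach a threshold; the table name and thresholds in this sketch are hypothetical.

```python
# Sketch: simple data quality checks run after each pipeline load.
# The table name and thresholds are hypothetical.
from google.cloud import bigquery

TABLE = "my-project.analytics.app_events"


def check_quality(max_null_rate: float = 0.01) -> None:
    client = bigquery.Client()
    row = next(
        iter(
            client.query(
                f"""
                SELECT
                  COUNT(*) AS total_rows,
                  COUNTIF(user_id IS NULL) AS null_users,
                  TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(event_ts), HOUR) AS hours_stale
                FROM `{TABLE}`
                """
            ).result()
        )
    )
    if row["total_rows"] == 0:
        raise RuntimeError("Quality check failed: table is empty")
    if row["null_users"] / row["total_rows"] > max_null_rate:
        raise RuntimeError("Quality check failed: too many NULL user_id values")
    if row["hours_stale"] > 24:
        raise RuntimeError("Quality check failed: data is more than 24 hours old")


if __name__ == "__main__":
    check_quality()
```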


Required Skills:

  • Minimum 4 years of professional experience in Data Engineering.
  • Hands-on experience with at least one public cloud platform: Google Cloud Platform (preferred), AWS, or Azure.
  • Strong knowledge of ETL development, SQL/NoSQL databases, data modeling, and pipeline orchestration.
  • Experience with Firebase, Firestore, and React-based data integrations is a strong plus.
  • Proficiency in Python, SQL, and cloud-native data tools (e.g., Dataflow, BigQuery, Airflow); a minimal orchestration sketch follows this list.
  • Familiarity with CI/CD, API development, and infrastructure automation.
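
Because the posting names Airflow as an example orchestration tool, a minimal DAG that wires hypothetical extract/transform/load callables into an hourly run might look like the sketch below; the dag_id, schedule, and imported module are assumptions.

```python
# Minimal Airflow DAG sketch chaining hypothetical ETL callables.
# The dag_id, schedule, and my_pipeline module are assumptions for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

from my_pipeline import extract_events, load_to_bigquery, transform  # hypothetical module


def run_etl() -> None:
    load_to_bigquery(transform(extract_events()))


with DAG(
    dag_id="firebase_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    PythonOperator(task_id="run_etl", python_callable=run_etl)
```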


Preferred Qualifications:

  • GCP certification (Data Engineer or Architect).
  • Experience with real-time data streaming (e.g., Pub/Sub, Kafka); a minimal streaming sketch follows this list.
  • Strong analytical and problem-solving skills with a focus on performance optimization.
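
For the real-time streaming experience mentioned above, a minimal Pub/Sub streaming-pull subscriber is sketched below; the project and subscription names are placeholders, not details from this posting.

```python
# Sketch: minimal Pub/Sub streaming-pull subscriber.
# The project and subscription names are hypothetical placeholders.
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "app-events-sub")


def handle(message) -> None:
    """Process one event, then acknowledge it so it is not redelivered."""
    print(message.data.decode("utf-8"))
    message.ack()


future = subscriber.subscribe(subscription, callback=handle)
try:
    future.result()  # Block and process messages until interrupted.
except KeyboardInterrupt:
    future.cancel()
```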


Qualifications:

  • Strong analytical and problem-solving skills.
  • Desire and ability to rapidly learn a wide variety of new technical skills.
  • Self-motivated, takes initiative, assumes ownership.
  • Enthusiastic, professional, with a focus on customer success.
  • Passion for solving client challenges and commitment to client delight.
