Data Engineer - GCP, BigQuery, Dataflow, Pub/Sub, Stardog, Composer

3 - 8 years

6 - 10 Lacs

Posted: 4 days ago | Platform: Naukri

Apply

Work Mode

Work from Office

Job Type

Full Time

Job Description


Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

About The Role

We're seeking a talented Data Engineer with hands-on experience in the Google Cloud data ecosystem and a proven track record of working with vector databases and with knowledge graphs such as Neo4j and Stardog. You'll be instrumental in designing, building, and maintaining our data infrastructure and pipelines, enabling critical insights and supporting data-driven initiatives across the organization.
Responsibilities

  • Data Pipeline Development: Design, build, and optimize robust and scalable data pipelines to ingest, transform, and load data from various sources into our data warehouse and knowledge graphs.

  • Cloud Data Stack Expertise: Implement and manage data solutions using Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, Spanner, and Dataproc, as well as Azure cloud services (a minimal Composer DAG sketch follows this list).

  • Knowledge Graph Engineering: Develop and maintain data models, ingest data, and create efficient queries within Neo4j and/or Stardog. Leverage your expertise to build and expand our enterprise knowledge graph.

  • Data Quality & Governance: Implement best practices for data quality, data validation, and data governance, ensuring data accuracy, consistency, and reliability.

  • Performance Optimization: Continuously monitor and optimize the performance of data pipelines and database queries, identifying and resolving bottlenecks.

  • Collaboration: Work closely with data scientists, analysts, and other engineering teams to understand data requirements and deliver effective data solutions.

  • Documentation: Create and maintain comprehensive documentation for data pipelines, data models, and knowledge graph schemas.
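To make the pipeline work above concrete, here is a minimal sketch of a Cloud Composer (Apache Airflow) DAG that loads newline-delimited JSON from Cloud Storage into BigQuery. It assumes an Airflow 2.x environment with the Google provider package installed; the DAG, bucket, dataset, and table names are hypothetical placeholders, and a production pipeline would add schema management, data-quality checks, and alerting.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_orders_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Append one day's newline-delimited JSON from GCS to a BigQuery table;
    # the bucket, dataset, and table are placeholders, not real resources.
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders_to_bq",
        bucket="example-raw-bucket",
        source_objects=["orders/{{ ds }}/*.json"],
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="example-project.analytics.orders",
        write_disposition="WRITE_APPEND",
    )

Dropped into a Composer environment's dags/ folder, a file like this is picked up automatically and scheduled once per day.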

Required Qualifications

Education: Bachelor's degree in Computer Science, Engineering, or a related quantitative field.

Experience:

  • 3+ years of professional experience as a Data Engineer or in a similar role.

  • Strong hands-on experience with Google Cloud Platform (GCP) data services, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Composer (Apache Airflow).

  • Demonstrable experience working with Knowledge Graph technologies, specifically Neo4j and/or Stardog. This includes data modeling, query language proficiency (Cypher for Neo4j, SPARQL for Stardog), and integration (see the query sketch after this list).

  • Proficiency in Python for data manipulation and pipeline orchestration.

  • Experience with SQL and data warehousing concepts.

  • Familiarity with data modeling techniques for both relational and graph databases.

  • Experience with version control systems (e.g., Git).
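As an illustration of the query-language proficiency listed above, the sketch below runs a parameterized Cypher statement through the official Neo4j Python driver and a SPARQL query against a Stardog database's SPARQL query endpoint over the standard SPARQL 1.1 HTTP protocol. The connection details, credentials, database name, and graph model are all hypothetical.

import requests
from neo4j import GraphDatabase

# Cypher against Neo4j: upsert a customer and an order and relate them.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
cypher = """
MERGE (c:Customer {id: $customer_id})
MERGE (o:Order {id: $order_id})
MERGE (c)-[:PLACED]->(o)
"""
with driver.session() as session:
    session.run(cypher, customer_id="C-1001", order_id="O-9001")
driver.close()

# SPARQL against a Stardog database's /query endpoint (database name and
# credentials are placeholders); results come back as SPARQL JSON.
sparql = """
SELECT ?customer ?order
WHERE { ?customer <http://example.com/placed> ?order }
LIMIT 10
"""
response = requests.post(
    "http://localhost:5820/example_db/query",
    data={"query": sparql},
    headers={"Accept": "application/sparql-results+json"},
    auth=("admin", "admin"),
)
print(response.json()["results"]["bindings"])

The Stardog call sticks to the plain SPARQL HTTP protocol here; a dedicated Stardog client library could be used instead.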

Preferred Qualifications

  • Experience with other GCP services like Data Catalog, Looker, or Vertex AI.

  • Knowledge of streaming data technologies (e.g., Kafka, Google Cloud Dataflow streaming); a streaming pipeline sketch follows this list.

  • Experience with other graph databases or semantic web technologies.

  • Familiarity with data governance tools and principles.

  • Certifications in Google Cloud Platform data engineering.
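For the streaming technologies mentioned above, a Dataflow-style streaming job might look like the following minimal Apache Beam sketch, which reads JSON messages from a Pub/Sub subscription and appends them to a BigQuery table. The project, subscription, and table names are hypothetical, and a real job would add windowing, error handling, and a dead-letter path.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Pub/Sub delivers raw bytes; the subscription path is a placeholder.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda raw: json.loads(raw.decode("utf-8")))
            # Append parsed rows to an existing BigQuery table (placeholder name).
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()

Run as-is this uses Beam's local DirectRunner; pointing it at Dataflow is a matter of passing the usual runner and project pipeline options.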
Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

    UPS Supply Chain Solutions (UPS SCS)

    Logistics and Supply Chain

    Atlanta
