
Data Engineer

Experience: 1 - 2 years

Salary: 7 - 9 Lacs


Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Title: Data Engineer

Experience: 1 - 2 years

About Tredence

Tredence focuses on last-mile delivery, turning powerful insights into profitable actions by uniting its strengths in business analytics, data science, and software engineering. The largest companies across industries engage with us to deploy their prediction and optimization solutions at scale. Headquartered in the San Francisco Bay Area, we serve clients in the US, Canada, Europe, and Southeast Asia.

Tredence is an equal opportunity employer. We celebrate and support diversity and are committed to creating an inclusive environment for all employees.
Visit our website for more details: https://www.tredence.com

Role Overview


As a Data Engineer at Tredence, you will work on ingesting, processing, and modeling large-scale data, implementing scalable data pipelines, and applying foundational data warehousing principles. This role also includes direct collaboration with cross-functional teams and client stakeholders.

Key Responsibilities

  • Develop robust and scalable data pipelines using PySpark on cloud platforms like Azure Databricks or GCP Dataflow (a brief sketch follows this list).
  • Write optimized SQL queries for data transformation, analysis, and validation.
  • Implement and support data warehouse models and principles, including:
    • Fact and Dimension modeling
    • Star and Snowflake schemas
    • Slowly Changing Dimensions (SCD)
    • Change Data Capture (CDC)
    • Medallion Architecture
  • Monitor, troubleshoot, and improve pipeline performance and data quality.
  • Work with teams across analytics, business, and IT functions to deliver data-driven solutions.
  • Communicate technical updates and contribute to sprint-level delivery.
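For context on the responsibilities above, here is a minimal, illustrative PySpark job laid out in the medallion (bronze/silver/gold) style; the paths, columns, and business rules are hypothetical and are not taken from this job description.

    # Illustrative medallion-style PySpark job; all paths and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

    # Bronze: ingest raw JSON as-is and tag each record with its load time.
    bronze = (
        spark.read.json("/mnt/landing/orders/")  # hypothetical landing path
        .withColumn("_ingested_at", F.current_timestamp())
    )

    # Silver: enforce types, drop duplicates, and keep only valid rows.
    silver = (
        bronze
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("amount", F.col("amount").cast("double"))
        .dropDuplicates(["order_id"])
        .filter(F.col("amount") > 0)
    )

    # Gold: a small fact-style aggregate ready for reporting.
    gold = silver.groupBy("customer_id").agg(
        F.count("order_id").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )

    gold.write.mode("overwrite").parquet("/mnt/warehouse/gold/customer_orders/")

On Databricks the same flow would typically write Delta tables rather than plain Parquet; only the storage layer changes.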

Mandatory Skills

  • Strong hands-on experience with SQL and Python (see the short example after this list)
  • Working knowledge of PySpark for data transformation
  • Exposure to at least one cloud platform: Azure or GCP
  • Good understanding of data engineering and warehousing fundamentals
  • Excellent debugging and problem-solving skills
  • Strong written and verbal communication skills
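As a rough illustration of combining SQL and Python as listed above, the snippet below runs a small, self-contained Spark SQL validation check; the toy data and the rule are hypothetical.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("sql_validation_demo").getOrCreate()

    # Toy orders data, just enough to demonstrate a SQL-based validation query.
    orders = spark.createDataFrame(
        [(1, "A", 120.0), (2, "B", None), (3, "A", -5.0)],
        ["order_id", "customer_id", "amount"],
    )
    orders.createOrReplaceTempView("orders")

    # Flag rows that would break downstream aggregates (missing or non-positive amounts).
    invalid = spark.sql("""
        SELECT order_id, customer_id, amount
        FROM orders
        WHERE amount IS NULL OR amount <= 0
    """)
    invalid.show()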

Preferred Skills

  • Experience working with Databricks Community Edition or the enterprise version
  • Familiarity with data orchestration tools like Airflow or Azure Data Factory (a minimal DAG sketch follows this list)
  • Exposure to CI/CD processes and version control (e.g., Git)
  • Understanding of Agile/Scrum methodology and collaborative development
  • Basic knowledge of handling structured and semi-structured data (JSON, Parquet, etc.)
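For candidates who have not used an orchestrator, here is a minimal Airflow 2.x DAG sketch; the DAG id, schedule, and task bodies are hypothetical placeholders rather than anything specific to this role.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull raw files from the landing zone.
        print("extracting raw files")

    def transform():
        # Placeholder: trigger the PySpark job that builds the warehouse tables.
        print("running transformation job")

    with DAG(
        dag_id="orders_daily",            # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)

        extract_task >> transform_task    # run the transform only after extraction succeeds

Azure Data Factory expresses the same dependency as a pipeline of linked activities rather than Python code.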

Tredence

Business Consulting and Services

San Jose, California
