Data Engineer

Experience: 3 years

Posted: 3 weeks ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Silverpush is at the forefront of AI-powered video advertising, delivering sophisticated video ad solutions that empower brands to achieve impactful campaigns within a privacy-centric environment. Operating across 30+ countries, we specialize in creating contextually relevant advertising experiences that drive genuine engagement and conversion. Silverpush's commitment to innovation and technological advancement enables us to navigate the evolving digital landscape, providing our partners with the tools necessary to connect with audiences on a global scale. We are dedicated to fostering a culture of creativity and excellence, driving the future of ad tech with integrity and foresight.



Job Title: Data Engineer

Location: Gurgaon

Experience Level: 2–3 Years

Job Summary

We are seeking a highly motivated Data Engineer with 2–3 years of experience in building robust data pipelines, integrating APIs, and managing cloud-based data infrastructure. You will play a crucial role in designing and maintaining our data architecture to enable real-time analytics and business insights. This role requires proficiency in Python and SQL, familiarity with Airbyte, Airflow, and cloud platforms (preferably GCP), and a strong understanding of ETL/ELT workflows.


Key Responsibilities

● Design, build, and maintain scalable and reliable data pipelines using Python.

● Develop and manage ETL/ELT workflows using Airbyte and Apache Airflow (a minimal pipeline sketch follows this list).

● Write and optimize complex SQL queries for data extraction, transformation, and reporting.

● Integrate data from various third-party APIs and internal sources.

● Collaborate with data analysts, data scientists, and product teams to deliver high-quality datasets.

● Manage and optimize BigQuery data storage and schema design for performance and cost efficiency.

● Monitor data quality and implement validation checks to ensure data integrity.

● Maintain documentation for data pipelines, workflows, and data models.
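
For illustration only, here is a minimal sketch of the kind of daily pipeline these responsibilities describe, assuming Airflow 2.x and the google-cloud-bigquery client; the endpoint and table names (api.example.internal, analytics.campaign_daily) are hypothetical:

from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def extract_and_load():
    # Pull rows from a hypothetical internal REST endpoint.
    rows = requests.get("https://api.example.internal/campaigns", timeout=30).json()
    # Append them to a hypothetical BigQuery table; credentials come from the runtime environment.
    errors = bigquery.Client().insert_rows_json("analytics.campaign_daily", rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")


with DAG(
    dag_id="campaign_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
):
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)

In practice, an Airbyte sync can typically be triggered from a DAG like this rather than hand-written extract code; the extract step is inlined here only to keep the sketch self-contained.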


Required Skills & Qualifications

● Programming: Strong proficiency in Python for data pipeline development and automation.

● SQL: Expertise in writing efficient queries for data manipulation and aggregation.

● ETL Tools: Hands-on experience with Airbyte and Apache Airflow.

● Data Warehousing/Databases: Experience with PostgreSQL, MongoDB, and Google BigQuery or similar cloud-based data warehouses (see the query sketch after this list).

● Data Integration: Familiarity with integrating data via RESTful APIs.

● Visualization: Experience using Google Data Studio; familiarity with Power BI is a plus.

● Version Control: Proficient in Git for code versioning and collaboration.
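
Again for illustration only, a sketch of the kind of aggregation query and cost-conscious BigQuery access pattern the points above refer to, using the BigQuery Python client; the table and column names (analytics.ad_events, event_date, campaign_id, clicked) are hypothetical:

from google.cloud import bigquery

client = bigquery.Client()

# Filtering on the (assumed) date partition column keeps scanned bytes, and therefore cost, down.
query = """
    SELECT campaign_id,
           COUNT(*) AS impressions,
           COUNTIF(clicked) AS clicks
    FROM `analytics.ad_events`
    WHERE event_date BETWEEN @start AND @end
    GROUP BY campaign_id
    ORDER BY impressions DESC
"""

job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
            bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
        ]
    ),
)

for row in job.result():
    print(row.campaign_id, row.impressions, row.clicks)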


Preferred Skills

● Cloud Platforms: Working knowledge of Google Cloud Platform (GCP) and AWS.

● Data Modeling: Understanding of dimensional modeling and performance optimization techniques.


Nice to Have

● Experience in streaming data architecture (e.g., using Kafka or Pub/Sub).

● Exposure to Terraform or Infrastructure as Code (IaC) tools.

● Familiarity with data governance and compliance standards.
