Posted: 1 hour ago
Work from Office
Full Time
Job Title: Data Engineer
Location: Gurgaon
Experience Level: 3+ Years
Job Summary
We are seeking a highly motivated Data Engineer with 3+ years of experience in building robust data pipelines, integrating APIs, and managing cloud-based data infrastructure. You will play a crucial role in designing and maintaining our data architecture to enable real-time analytics and business insights. This role requires proficiency in Python and SQL, familiarity with Airbyte, Airflow, and cloud platforms (preferably GCP), and a strong understanding of ETL/ELT workflows.
Key Responsibilities
Design, build, and maintain scalable and reliable data pipelines using Python.
Develop and manage ETL/ELT workflows using Airbyte and Apache Airflow (an illustrative sketch follows this list).
Write and optimize complex SQL queries for data extraction, transformation, and reporting.
Integrate data from various third-party APIs and internal sources.
Collaborate with data analysts, data scientists, and product teams to deliver high-quality datasets.
Manage and optimize BigQuery data storage and schema design for performance and cost efficiency.
Monitor data quality and implement validation checks to ensure data integrity.
Maintain documentation for data pipelines, workflows, and data models.
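For candidates gauging the day-to-day work, the snippet below is a minimal, illustrative sketch only, not part of the posting's requirements. It shows the kind of pipeline described above: an Airflow DAG (assuming Airflow 2.4+) with a single Python task that pulls one day's records from a hypothetical REST endpoint (EXAMPLE_API_URL) and appends them to a hypothetical BigQuery table (analytics.example_events).

```python
# Illustrative sketch only: endpoint and table names are placeholders.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery

EXAMPLE_API_URL = "https://api.example.com/v1/events"  # placeholder endpoint
EXAMPLE_TABLE = "analytics.example_events"             # placeholder BigQuery table


def extract_and_load(**context):
    """Fetch one day's records from the API and append them to BigQuery."""
    response = requests.get(
        EXAMPLE_API_URL, params={"date": context["ds"]}, timeout=30
    )
    response.raise_for_status()
    rows = response.json()  # assumed to be a list of JSON-serializable dicts

    client = bigquery.Client()
    errors = client.insert_rows_json(EXAMPLE_TABLE, rows)  # streaming insert
    if errors:
        raise RuntimeError(f"BigQuery insert errors: {errors}")


with DAG(
    dag_id="example_api_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ argument; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```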
Required Skills & Qualifications
Programming: Strong proficiency in Python for data pipeline development and automation.
SQL: Expertise in writing efficient queries for data manipulation and aggregation.
ETL Tools: Hands-on experience with Airbyte and Apache Airflow.
Data Warehousing/Databases: Experience with PostgreSQL, MongoDB, and Google BigQuery or similar cloud-based data warehouses.
Data Integration: Familiarity with integrating data via RESTful APIs.
Visualization: Experience using Google Data Studio; familiarity with Power BI is a plus.
Version Control: Proficient in Git for code versioning and collaboration.
Preferred Skills
Cloud Platforms: Working knowledge of Google Cloud Platform (GCP) and AWS.
Data Modeling: Understanding of dimensional modeling and performance optimization techniques.
Nice to Have
Experience in streaming data architecture (e.g., using Kafka or Pub/Sub).
Exposure to Terraform or Infrastructure as Code (IaC) tools.
Familiarity with data governance and compliance standards.
Silveredge Technologies