Experience: 2 - 7 years

Salary: 4 - 9 Lacs

Posted: 1 day ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Location: Sector 63, Noida (WFO)
Timings: Mon - Fri; 10:30 AM to 7:30 PM

About the Role

We are seeking a skilled Data Engineer with hands-on experience in building and maintaining scalable data pipelines and analytics solutions. The ideal candidate will be highly proficient in PySpark and Data Build Tool (DBT), with strong experience in managing large datasets, data warehousing, and modern data stack technologies.

Key Responsibilities

  • Design, build, and maintain ETL/ELT pipelines for efficient data ingestion, transformation, and storage.
  • Collaborate with data analysts, data scientists, and stakeholders to understand data requirements and deliver actionable solutions.
  • Implement and maintain data models using DBT for analytics and reporting.
  • Optimize performance of large-scale distributed data processing jobs using PySpark.
  • Ensure data quality, consistency, and integrity through validation and monitoring.
  • Work with cloud-based data platforms (AWS, GCP, or Azure) to manage data storage and pipelines.
  • Troubleshoot and resolve data-related technical issues.
  • Participate in code reviews, documentation, and best practice enforcement for data engineering workflows.
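As a miniature illustration of the data-quality responsibility listed above (all names and rows here are invented; in this role the equivalent checks would typically run in PySpark or as DBT tests):

```python
# Hypothetical sketch of a row-level validation step, matching the
# "data quality, consistency, and integrity" responsibility. Names
# and sample data are illustrative only.

def validate_rows(rows, required_fields):
    """Split rows into (valid, rejected) based on required non-empty fields."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if not row.get(f)]
        (rejected if missing else valid).append(row)
    return valid, rejected

raw = [
    {"order_id": "A1", "amount": 120.0},
    {"order_id": "",   "amount": 45.5},   # missing order_id -> rejected
    {"order_id": "A3", "amount": None},   # missing amount   -> rejected
]
valid, rejected = validate_rows(raw, ["order_id", "amount"])
print(len(valid), len(rejected))  # 1 valid row, 2 rejected
```

Rejected rows would normally be routed to a quarantine table and surfaced through monitoring rather than silently dropped.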

Required Skills and Qualifications

  • Minimum 2 years of experience in Data Engineering or a related role.
  • Experience writing code in Python.
  • Proficiency in PySpark for distributed data processing.
  • Expertise in Data Build Tool (DBT) for data modeling and transformation.
  • Strong knowledge of SQL and relational databases.
  • Familiarity with data warehousing solutions such as Snowflake, BigQuery, or Redshift.
  • Experience with cloud platforms like AWS, GCP, or Azure.
  • Understanding of data pipelines, ETL/ELT processes, and data architecture principles.
  • Knowledge of version control systems such as Git.
  • Strong problem-solving skills and attention to detail.
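The ELT pattern named in the requirements (load raw data first, then transform with SQL inside the warehouse, which is the workflow DBT formalizes) can be sketched in miniature with the standard library's sqlite3. Table and column names are invented for illustration; a real pipeline would target Snowflake, BigQuery, or Redshift:

```python
import sqlite3

# Miniature ELT: load raw rows, then transform with SQL in the database.
# All table/column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)],
)

# Transform step: an aggregate model, analogous to a DBT model's SELECT.
conn.execute("""
    CREATE TABLE customer_totals AS
    SELECT customer, SUM(amount) AS total
    FROM raw_orders
    GROUP BY customer
    ORDER BY customer
""")
for customer, total in conn.execute("SELECT * FROM customer_totals"):
    print(customer, total)  # acme 150.0, globex 75.0
```

In DBT the transform step would live in its own versioned .sql model file with tests and documentation attached, rather than being issued inline.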

Preferred Skills

  • Experience with orchestration tools like Airflow or Prefect.
  • Knowledge of streaming data pipelines such as Kafka or Kinesis.
  • Familiarity with containerization using Docker or Kubernetes.
  • Understanding of data governance, security, and compliance practices.

Perks and benefits of working at Algoscale:

  • Opportunity to collaborate with leading companies across the globe.
  • Opportunity to work with the latest and trending technologies.
  • Competitive salary and performance-based bonuses.
  • Comprehensive group health insurance.
  • Flexible working hours and remote work options (for some positions only).
  • Generous vacation and paid time off.
  • Professional learning and development programs and certifications.

Algoscale Technologies, Inc.

Information Technology