Posted: 21 hours ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

This role is for one of Weekday's clients.

Min Experience: 3 years
Location: Bangalore
Job Type: Full-time

We are seeking a highly skilled and motivated Data Engineer to join our data team. In this role, you will design, build, and maintain scalable data infrastructure that powers data-driven decision-making across the organization.

Key Responsibilities

  • Manage and optimize relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB), including performance tuning and schema evolution.
  • Leverage cloud platforms (AWS, Azure, GCP) for data storage, processing, and analytics, optimizing cost, performance, and scalability using cloud-native services.
  • Design, develop, and maintain robust, fault-tolerant data pipelines using modern orchestration tools (Apache Airflow, Apache Flink, Dagster).
  • Implement and manage real-time data streaming solutions (Apache Kafka, Kinesis, Pub/Sub).
  • Design data models that support efficient querying and integrate with BI tools such as Metabase, Power BI, Looker, or QuickSight.
  • Collaborate with Data Scientists, Analysts, and business teams to translate requirements into technical solutions.
  • Apply DataOps principles to improve automation, collaboration, and reliability of data processes.
  • Work with containerization (Docker) and orchestration (Kubernetes) for deploying and managing data services.
  • Implement data governance practices, including data lineage, metadata management, and quality frameworks.
  • Stay updated on emerging data engineering tools and best practices, recommending adoption where relevant.

Skills & Qualifications

Essential

  • Strong proficiency in Python, SQL, and MongoDB.
  • Hands-on experience with relational databases (PostgreSQL, MySQL) and NoSQL databases.
  • Solid understanding of data modeling, data warehousing principles, and ETL/ELT methodologies.
  • Expertise with cloud platforms (AWS, Azure, GCP), including services for data storage (S3, ADLS Gen2, GCS), data warehouses (Redshift, Snowflake, BigQuery), and managed data processing services (AWS Glue, Azure Data Factory, Google Dataflow).
  • Experience in building and monitoring data pipelines with automated quality checks.
  • Familiarity with BI tools (Metabase, Power BI, Looker, QuickSight).
  • Strong analytical and troubleshooting skills for complex pipeline issues.

Preferred

  • Experience with Data Lake, Data Lakehouse, or Data Mesh architectures.
  • Hands-on expertise with Apache Spark, Kafka, and stream processing (Flink, Spark Streaming).
  • Familiarity with workflow orchestration tools (Airflow, Dagster).
  • Understanding of DataOps/MLOps principles and data observability tools.
  • Strong communication and presentation skills.

Core Skills

Data Engineering | Python | SQL | PostgreSQL | MongoDB | Kafka | Apache Spark | Data Modeling | ETL/ELT | Cloud Platforms (AWS/Azure/GCP) | Data Warehousing | Airflow | DataOps
