Senior Data Engineer

5 years


Posted: 7 hours ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description

This is a remote position.

Shuru is a self-managed technology team specializing in accelerating visions through product, technology, and AI leadership. With a focus on bespoke execution, we deliver impactful solutions that are scalable and designed for success. At Shuru, we deliver mobile solutions that meet and exceed customer expectations. Our collaborative and fast-paced environment encourages creativity and innovation.


We are seeking a highly skilled Senior Data Engineer to join our engineering team. This role requires a hands-on professional who will design, build, and maintain robust data pipelines that power our product, analytics, and machine learning initiatives. You'll work closely with cross-functional teams to ensure data quality, reliability, and accessibility across our platform.


Responsibilities:

  • Design and implement scalable data pipelines for ingestion, validation, cleanup, and normalization
  • Build and maintain ETL/ELT processes to support both batch and real-time data processing
  • Develop data quality frameworks and monitoring systems to ensure data integrity
  • Optimize pipeline performance and troubleshoot data flow issues
  • Deploy and manage data infrastructure on AWS and GCP platforms
  • Implement Infrastructure as Code (IaC) using Terraform for reproducible deployments
  • Design and maintain data lakes, data warehouses, and streaming architectures
  • Ensure security, compliance, and cost optimization across cloud resources
  • Implement event-driven architectures using technologies like Kafka, Kinesis, or Pub/Sub

Requirements

Technical Skills

  • 5+ years of experience in data engineering or a related field
  • Hands-on experience with data pipeline development and maintenance
  • Strong proficiency in Python, SQL, and at least one other programming language (Java, Scala, Go)
  • Extensive experience with AWS services (S3, Redshift, EMR, Kinesis, Lambda, Glue)
  • Solid experience with GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
  • Proven experience with Infrastructure as Code using Terraform
  • Strong background in streaming technologies (Apache Kafka, Amazon Kinesis, Google Pub/Sub)

Data Technologies
  • Experience with data processing frameworks (Apache Spark, Apache Beam, Airflow)
  • Proficiency with data warehousing solutions (Redshift, BigQuery, Snowflake)
  • Knowledge of NoSQL databases (MongoDB, Cassandra, DynamoDB)
  • Familiarity with containerization (Docker, Kubernetes)
  • Experience with monitoring and observability tools (CloudWatch, Stackdriver, Datadog)
Benefits

  • Competitive salary and benefits package.
  • Opportunity to work with a team of experienced product and tech leaders.
  • A flexible work environment with remote working options.
  • Continuous learning and development opportunities.
  • Chance to make a significant impact on diverse and innovative projects.

