Developer – Data Engineering & Cloud Analytics

Experience: 4-6 years

Salary: 0 Lacs

Posted: 1 day ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description


Role Overview:

The Developer will be responsible for building, maintaining, and optimizing large-scale data pipelines and analytics solutions using Snowflake, Kafka, Splunk, Airflow, AWS, Apache Iceberg, and Presto. The candidate will bring hands-on development skills and collaborate with architects, analysts, and DevOps teams to deliver reliable, efficient, and scalable data services.

Key Responsibilities

• Develop and maintain ETL/ELT pipelines using Airflow, orchestrating data movement across Snowflake, AWS, Iceberg, and Kafka systems.

• Implement and optimize real-time data ingestion and streaming solutions using Apache Kafka, ensuring high throughput and fault tolerance.

• Integrate Apache Iceberg and Presto for interactive analytics on large-scale data lakes.

• Write SQL, Python, and/or Scala code for complex data transformations, metric calculations, and business logic deployment.

• Collaborate with data architects to evolve data models and ensure alignment with enterprise best practices.

• Utilize Splunk for operational monitoring, log analysis, and incident troubleshooting within data workflows.

• Deploy and manage infrastructure on AWS (S3, EC2, Glue, Lambda, IAM), focusing on automation, scalability, and security.

• Document pipelines, produce clear runbooks, and share technical knowledge with team members.
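The first responsibility above (dependency-ordered ETL/ELT orchestration) can be sketched in plain Python. This is a toy illustration of the pattern an Airflow DAG expresses, not Airflow code; all function names and the in-memory "warehouse" are illustrative.

```python
# Toy extract -> transform -> load pipeline, mirroring the kind of
# dependency-ordered tasks an Airflow DAG would schedule.
# Names and the in-memory "warehouse" are illustrative only.

def extract() -> list[dict]:
    # In a real pipeline this would read from Kafka or S3.
    return [{"user": "a", "amount": 10}, {"user": "b", "amount": -5}]

def transform(rows: list[dict]) -> list[dict]:
    # Business logic: drop invalid rows and add a derived column.
    return [{**r, "amount_usd": r["amount"] * 1.0}
            for r in rows if r["amount"] > 0]

def load(rows: list[dict], warehouse: list[dict]) -> int:
    # Load into an in-memory stand-in for a Snowflake table.
    warehouse.extend(rows)
    return len(rows)

def run_pipeline() -> list[dict]:
    # Tasks run in dependency order (extract >> transform >> load),
    # the same ordering an Airflow DAG declares between operators.
    warehouse: list[dict] = []
    load(transform(extract()), warehouse)
    return warehouse
```

In Airflow proper, each step would become a task and the `>>` dependencies would let the scheduler retry or backfill steps independently.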

Required Skills & Experience

• 4 to 6 years of hands-on development experience with modern data stack components: Snowflake, Apache Kafka, Airflow, and AWS.

• Strong working knowledge of scalable SQL (preferably Snowflake, Presto) and scripting (Python/Scala).

• Experience implementing data lake solutions with Apache Iceberg.

• Familiarity with Splunk for monitoring and event management.

• Proven history of building, deploying, and troubleshooting ETL/ELT data flows and real-time streaming jobs.

• Knowledge of IAM, networking, and security concepts on AWS.
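The real-time streaming experience listed above typically centers on one fault-tolerance pattern: advance the committed offset only after a record is processed, so a crash replays records instead of losing them (at-least-once delivery). A plain-Python stand-in for that consumer loop, with an in-memory list as the "topic" (this is not the real Kafka client API):

```python
# Minimal at-least-once consumer loop against an in-memory "topic".
# Illustrative stand-in only, not a real Kafka consumer API.

def consume(topic: list[str], committed_offset: int, sink: list[str]) -> int:
    """Process records from committed_offset onward. The offset is
    advanced only AFTER a record is handled, so a crash mid-loop
    replays that record on restart (at-least-once semantics)."""
    offset = committed_offset
    while offset < len(topic):
        record = topic[offset]
        sink.append(record.upper())   # "process" the record
        offset += 1                   # commit the new offset last
    return offset

topic = ["order-1", "order-2", "order-3"]
sink: list[str] = []
# Resume from a previously committed offset of 1: only the
# uncommitted records order-2 and order-3 are processed.
new_offset = consume(topic, committed_offset=1, sink=sink)
```

Because replays are possible under this scheme, downstream sinks (e.g. Snowflake loads) are usually made idempotent via merge/upsert keys.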

Preferred Qualifications

• Bachelor’s degree in Computer Science, Engineering, or a related field.

• Experience in cloud-native data warehousing, cost optimization, and compliance.

• Certifications in AWS, Snowflake, or other relevant technologies.

This role is ideal for candidates who enjoy end-to-end ownership of cloud-native analytics platforms and working with cutting-edge data streaming and lakehouse technologies in production environments.
