Experience: 6-10 Years
Location: Bangalore
Employment Type: Full Time, Permanent
Department: Engineering - Software & QA
Role Category: Software Development
Industry Type: Management Consulting

Role Overview
We are looking for a passionate and experienced Data Engineer with expertise in AWS cloud services to join our dynamic team. In this role, you'll design and implement advanced data pipelines and analytics solutions that empower business decision-making. You will also get the unique opportunity to work closely with AWS Professional Services, leveraging the latest AWS technologies in Big Data and Machine Learning to address complex enterprise challenges.

Key Responsibilities
- Collaborate with Product Owners and AWS Architects to analyze requirements and deliver scalable data solutions
- Design and build robust data transformation pipelines using best practices
- Implement data models and architectures aligned with business needs and technical design
- Explore and evaluate new AWS analytics capabilities, develop prototypes, and contribute to proof-of-value (POV) initiatives
- Participate in architectural and design discussions with cross-functional teams

Required Qualifications & Skills

Cloud & Data Engineering Expertise
- 3+ years of hands-on experience developing ETL/ELT pipelines handling structured, semi-structured, and unstructured data
- Proficiency in real-time, streaming, batch, and on-demand data processing
- Deep understanding of data warehousing, data lakes, and lakehouse architecture on AWS

AWS Technologies
- Proven experience with AWS services including AWS Glue, Redshift, Athena, S3, Kinesis, Lambda, EMR, and Lake Formation
- Exposure to DynamoDB is a plus
- Familiarity with MSK (Managed Streaming for Apache Kafka) and Apache Spark

Programming & Data Modeling
- Proficiency in Python, PySpark, and SQL
- Additional programming experience in Java or Scala preferred
- Understanding of dimensional data modeling is an advantage

Streaming & Orchestration Tools
- Experience with streaming platforms such as Kafka (on-premises or cloud) and Kinesis
- Hands-on experience with workflow orchestration tools such as Apache Airflow or Control-M

Other Requirements
- Strong communication and interpersonal skills to collaborate with cross-functional teams and stakeholders
- Excellent attention to detail, with the ability to manage multiple tasks and prioritize effectively
- Exposure to Agile/Scrum methodologies

Education
UG: Any Graduate
PG: Any Postgraduate

Key Skills
- AWS (Glue, Lambda, Kinesis, Redshift, S3, Athena, EMR)
- Python, PySpark, SQL
- Data Warehousing & Lakehouse Concepts
- Data Architecture & Modeling
- Kafka / MSK, Apache Airflow, Control-M
- Machine Learning (exposure preferred)
- Cloud Data Engineering
- NoSQL (e.g., DynamoDB)
- Analytics & ETL Development

Why Join Us
If you're passionate about solving real-world problems with cloud and data technologies and thrive in a collaborative, high-energy environment, this is the right opportunity for you. You'll work with a talented team and cutting-edge AWS solutions, contributing to impactful, enterprise-scale projects.