AWS Data and API Gateway Pipeline Engineer

6 - 10 years

22.5 - 25.0 Lacs P.A.

Pune, Bengaluru, Noida

Posted: 2 months ago | Platform: Naukri


Skills Required

AWS development, AWS developer, API gateway engineer, PySpark, API engineer, AWS engineer, API developer, SQL, Python

Work Mode

Work from Office

Job Type

Full Time

Job Description

Key Responsibilities:

Data Pipeline Design & Development
- Design and develop scalable, resilient, and secure ETL/ELT data pipelines using AWS services.
- Build and optimize data workflows leveraging AWS Glue, EMR, Lambda, and Step Functions.
- Implement batch and real-time data ingestion using Kafka or Amazon Kinesis Data Streams.
- Ensure efficient data movement across S3, Redshift, DynamoDB, RDS, and Snowflake.

Cloud Data Engineering & Storage
- Architect and manage data lakes and data warehouses using Amazon S3, Redshift, and Athena.
- Optimize data storage and retrieval using columnar formats such as Parquet, ORC, and Avro.
- Implement data partitioning, indexing, and query performance tuning.
- Work with NoSQL databases (DynamoDB, MongoDB) and relational databases (PostgreSQL, MySQL, Aurora).

Infrastructure as Code (IaC) & Automation
- Deploy and manage AWS data infrastructure using Terraform, AWS CloudFormation, or the AWS CDK.
- Implement CI/CD pipelines for automated data pipeline deployments using GitHub Actions, Jenkins, or AWS CodePipeline.
- Automate data workflows and job orchestration using Apache Airflow, AWS Step Functions, or Amazon MWAA.

Performance Optimization & Monitoring
- Optimize Spark, Hive, and Presto queries for performance and cost efficiency.
- Implement auto-scaling strategies for Amazon EMR clusters.
- Set up monitoring, logging, and alerting with AWS CloudWatch, CloudTrail, and Prometheus/Grafana.

Security, Compliance & Governance
- Implement IAM policies, encryption with AWS KMS, and role-based access controls.
- Ensure compliance with GDPR, HIPAA, and industry data governance standards.
- Monitor data pipelines for security vulnerabilities and unauthorized access.

Collaboration & Stakeholder Engagement
- Work closely with data analysts, data scientists, and business teams to understand data needs.
- Document data pipeline designs, architecture decisions, and best practices.
- Mentor and guide junior data engineers on AWS best practices and optimization techniques.
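To illustrate the data partitioning work described above, here is a minimal, self-contained Python sketch (function and field names are hypothetical) that groups records into Hive-style year=/month=/day= partition prefixes — the S3 key layout that Athena and Glue can prune at query time:

```python
from collections import defaultdict

def partition_records(records, date_key="event_date"):
    """Group records into Hive-style partition prefixes (year=/month=/day=),
    the S3 key layout that query engines can prune at scan time."""
    partitions = defaultdict(list)
    for rec in records:
        year, month, day = rec[date_key].split("-")
        prefix = f"year={year}/month={month}/day={day}"
        partitions[prefix].append(rec)
    return dict(partitions)

records = [
    {"event_date": "2024-05-01", "user": "a"},
    {"event_date": "2024-05-01", "user": "b"},
    {"event_date": "2024-05-02", "user": "c"},
]
parts = partition_records(records)  # two partition prefixes, one per date
```

In a real pipeline, each prefix would become an S3 object path and the file bodies would typically be written as Parquet rather than kept in memory; this sketch only shows the grouping step.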
Required Qualifications:

Experience:
- 6+ years in data engineering with a focus on AWS cloud technologies.

Technical Skills:
- Expertise in AWS Glue, Lambda, EMR, Redshift, Kinesis, and Step Functions.
- Proficiency in SQL, Python, and PySpark for data transformations.
- Strong understanding of ETL/ELT best practices and data warehousing concepts.
- Experience with Apache Airflow or Step Functions for orchestration.
- Familiarity with Kafka, Kinesis, or other streaming platforms.
- Knowledge of Terraform, CloudFormation, and DevOps practices on AWS.

Preferred Skills:
- Experience with AWS Lake Formation and the Data Catalog for metadata management.
- Knowledge of Databricks, Snowflake, or BigQuery for data analytics.
- AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect certification is a plus.

Soft Skills:
- Strong problem-solving and analytical thinking.
- Excellent communication and collaboration abilities.
- Ability to work independently and in agile teams.
- A proactive approach to identifying and addressing challenges in data workflows.

Locations: Bengaluru, Pune, Noida, Hyderabad

Staffing and Recruitment
San Francisco
