
Sr (AWS) Data Engineer / Architect

6 - 11 years

18 - 33 Lacs

Posted: 3 months ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Senior Data Engineer/Architect with at least 6 years of experience in designing, developing, and optimizing data pipelines, data lakes, and cloud-based data architectures. Skilled in implementing scalable data solutions using Databricks SQL and AWS services, ensuring data quality, security, and performance. Proven ability to collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver high-impact data solutions that drive business insights and operational excellence.

Key Qualifications

  • Cloud Data Architecture & Engineering: Expertise in designing and implementing cloud-based data architectures using AWS services such as S3, Glue, Redshift, Athena, Lambda, and EC2. Experience in setting up data lakes, data warehouses, and ETL pipelines optimized for performance and cost efficiency.
  • Databricks Expertise: Strong proficiency in using Databricks SQL for data processing, transformation, and analysis. Skilled in developing and optimizing Spark-based ETL jobs and ensuring seamless integration of Databricks with AWS cloud services.
  • Data Pipeline Development: Experience in building and maintaining scalable, fault-tolerant data pipelines using tools such as Apache Spark, Airflow, and AWS Glue. Ability to ingest, transform, and aggregate large volumes of structured and unstructured data efficiently.
  • SQL & Data Modeling: Expertise in SQL programming for data extraction, transformation, and loading (ETL). Experienced in designing and optimizing data models, including dimensional modeling, star schemas, and OLAP solutions, to enhance query performance.
  • Data Governance & Security: Proficient in implementing data governance frameworks, managing data quality, ensuring compliance with data privacy regulations, and configuring IAM roles, policies, and VPCs to protect sensitive data in AWS environments.
  • Collaboration & Stakeholder Management: Skilled at partnering with business teams, data analysts, and data scientists to gather requirements, translate them into scalable data solutions, and continuously optimize data workflows to meet evolving business needs.
  • Performance Optimization: Proven ability to optimize ETL pipelines and SQL queries, ensuring efficient data processing and reduced latency. Expertise in implementing partitioning, indexing, caching, and other optimization techniques in AWS and Databricks environments.

Technical Skills

  • Cloud & Data Platforms: AWS (S3, Glue, Redshift, Lambda, Athena, EMR), Databricks, Apache Spark
  • SQL & Scripting: Databricks SQL, Python, PySpark, SQL, Scala
  • Data Engineering Tools: Apache Airflow, AWS Glue, Delta Lake
  • Data Modeling: Star Schema, Snowflake Schema, Dimensional Modeling
  • Security & Governance: IAM, VPCs, Encryption, Data Privacy Regulations
  • CI/CD & Automation: Terraform, AWS CloudFormation, Git, Jenkins

Certifications (preferred but not mandatory)

  • AWS Certified Data Analytics - Specialty
  • Databricks Certified Data Engineer Professional
  • AWS Certified Solutions Architect - Associate

Knowledge Foundry

Technology / Data Analytics

Tech City

50-100 Employees

9 Jobs

Key People

  • John Smith, CEO
  • Jane Doe, CTO
