AWS Engineer

Experience: 5 - 10 years

Salary: 15 - 30 Lacs

Posted: 2 weeks ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

We are primarily looking for a Data Engineer with expertise in building and processing data pipelines using Databricks, PySpark, and SQL on cloud platforms such as AWS.

Must have: AWS, Databricks

Good to have: PySpark, Snowflake, Talend

Requirements:

The candidate must have experience working on projects involving the areas listed below.

Other ideal qualifications include experience in:

Processing data pipelines using Databricks and Spark SQL on Hadoop distributions such as AWS EMR, Databricks, Cloudera, etc.

Strong proficiency in large-scale data operations using Databricks, and overall comfort working in Python (see the PySpark sketch after this list)

Familiarity with AWS compute, storage, and IAM concepts

Experience working with an S3 data lake as the storage tier

Any ETL background (Talend, AWS Glue, etc.) is a plus but not required

Cloud data warehouse experience (Snowflake, etc.) is a huge plus

Carefully evaluates alternative risks and solutions before taking action.

Optimizes the use of all available resources

Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit
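As a rough illustration of the pipeline and S3 data lake work described in the list above, the sketch below reads raw events from S3 with PySpark on Databricks, aggregates them with both the DataFrame API and Spark SQL, and writes a curated layer back to the lake. It is a minimal sketch, not a prescribed design: the bucket paths, table, and column names (example-data-lake, events, event_ts, event_type) are hypothetical placeholders, and a cluster with read/write access to those S3 locations is assumed.

```python
# Minimal PySpark sketch: read raw events from the S3 data lake, aggregate,
# and write a curated layer back to S3. All paths and column names are
# hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3-events-pipeline").getOrCreate()

# Read raw events landed in the lake (Parquet assumed as the landing format).
raw = spark.read.parquet("s3://example-data-lake/raw/events/")

# Clean and aggregate with the DataFrame API.
daily = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

# The same aggregation expressed in Spark SQL via a temporary view.
raw.createOrReplaceTempView("events")
daily_sql = spark.sql("""
    SELECT to_date(event_ts) AS event_date,
           event_type,
           COUNT(*) AS event_count
    FROM events
    WHERE event_type IS NOT NULL
    GROUP BY to_date(event_ts), event_type
""")

# Persist the curated layer, partitioned by date for downstream queries.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-data-lake/curated/daily_event_counts/"
)
```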

Skills

Hands-on experience with Databricks, Spark SQL, and the AWS cloud platform, especially S3, EMR, Databricks, Cloudera, etc.

Experience with shell scripting

Exceptionally strong analytical and problem-solving skills

Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses

Strong experience with relational databases and data access methods, especially SQL

Excellent collaboration and cross-functional leadership skills

Excellent communication skills, both written and verbal

Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment

Ability to leverage data assets to respond to complex questions that require timely answers

Working knowledge of migrating relational and dimensional databases to the AWS cloud platform (as sketched below)
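As a rough sketch of the database-migration work mentioned above, the example below lifts one relational table into the S3 data lake with Spark's JDBC reader. It assumes a hypothetical PostgreSQL source (example-host, public.orders), placeholder credentials and partition bounds, and a PostgreSQL JDBC driver already available on the cluster.

```python
# Hypothetical sketch of lifting a relational table into the S3 data lake with
# Spark's JDBC reader. Host, table, credentials, and partition bounds are
# placeholders; the PostgreSQL JDBC driver is assumed to be on the cluster.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rdbms-to-s3-migration").getOrCreate()

orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://example-host:5432/sales")
    .option("dbtable", "public.orders")
    .option("user", "etl_user")
    .option("password", "********")         # in practice, read from a secrets store
    .option("numPartitions", 8)             # parallelise the extract
    .option("partitionColumn", "order_id")  # numeric key used to split the read
    .option("lowerBound", 1)
    .option("upperBound", 10000000)
    .load()
)

# Land the extract in the lake as Parquet, partitioned by a date column
# (order_date is assumed to exist on the source table).
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-data-lake/migrated/orders/"
)
```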

Skills

Mandatory Skills: Apache Spark, Databricks, Java, Python, Scala, Spark SQL.
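To illustrate the Spark SQL portion of the mandatory skills (and the dimensional-model retrieval listed earlier), here is a small hypothetical fact/dimension query. The tables sales_fact, store_dim, and calendar_dim are placeholders assumed to be registered in the workspace metastore.

```python
# Hypothetical fact/dimension retrieval in Spark SQL. The tables and columns
# are placeholders assumed to be registered in the workspace metastore.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dimensional-query").getOrCreate()

revenue_by_region = spark.sql("""
    SELECT d.region,
           c.fiscal_quarter,
           SUM(f.net_amount) AS total_revenue
    FROM sales_fact f
    JOIN store_dim d ON f.store_key = d.store_key
    JOIN calendar_dim c ON f.date_key = c.date_key
    GROUP BY d.region, c.fiscal_quarter
    ORDER BY d.region, c.fiscal_quarter
""")

revenue_by_region.show(truncate=False)
```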

LTIMindtree

Information Technology and Services

Bangalore
