Posted: 1 week ago
Work from Office
Full Time
Role & responsibilities

Job Description: Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions such as AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.

Requirements:
• Candidate must be experienced working in projects involving the areas below; other ideal qualifications include experience in these areas as well.
• Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions such as AWS EMR, Databricks, Cloudera, etc.
• Should be very proficient in large-scale data operations using Databricks and overall very comfortable using Python
• Familiarity with AWS compute, storage, and IAM concepts
• Experience working with S3 Data Lake as the storage tier
• Any ETL background (Talend, AWS Glue, etc.) is a plus but not required
• Cloud warehouse experience (Snowflake, etc.) is a huge plus
• Carefully evaluates alternative risks and solutions before taking action
• Optimizes the use of all available resources
• Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit

Skills:
• Hands-on experience with Databricks, Spark SQL, and the AWS Cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
• Experience with shell scripting
• Exceptionally strong analytical and problem-solving skills
• Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses
• Strong experience with relational databases and data access methods, especially SQL
• Excellent collaboration and cross-functional leadership skills
• Excellent communication skills, both written and verbal
• Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment
• Ability to leverage data assets to respond to complex questions that require timely answers
• Working knowledge of migrating relational and dimensional databases to the AWS Cloud platform

Mandatory Skills: Apache Spark, Databricks, Java, Python, Scala, Spark SQL.

Note: Need only immediate joiners / candidates serving their notice period. Interested candidates can apply.

Regards,
HR Manager
INR 10.0 - 20.0 Lacs P.A.
Mumbai, Hyderabad, Bengaluru