Posted: 2 days ago
Work from Office
Full Time
At least 3 years of experience in PySpark with GCP (Airflow, Dataproc, BigQuery), with the ability to configure data pipelines; an illustrative sketch follows this requirements list.
Strong experience in writing complex SQL queries to perform data analysis on databases such as SQL Server, Oracle, Hive, etc.
Possess the following technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools).
Ability to work independently on specialized assignments within the context of project deliverables
Take ownership of providing solutions and tools that iteratively increase engineering efficiencies.
Designs should help embed standard processes, systems, and operational models into the business-as-usual (BAU) approach for end-to-end execution of data pipelines.
Proven problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge.
Communicate openly and honestly. Advanced oral, written, and visual communication and presentation skills; the ability to communicate effectively at a global level is paramount.
Ability to deliver materials of the highest quality to management against tight deadlines.
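For illustration only (not part of the original requirements): a minimal sketch of the kind of pipeline configuration described above, assuming the Airflow Google provider package is installed and Airflow 2.4+ is used. All project, region, cluster, and bucket names are hypothetical placeholders.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PROJECT_ID = "example-project"      # placeholder, not from the posting
REGION = "us-central1"              # placeholder
CLUSTER_NAME = "example-cluster"    # placeholder

# Dataproc job spec: submits a PySpark script (assumed to read source data
# and write results to BigQuery) that lives in a GCS bucket.
PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/transform.py"},
}

with DAG(
    dag_id="daily_pyspark_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ scheduling syntax
    catchup=False,
) as dag:
    run_transform = DataprocSubmitJobOperator(
        task_id="run_pyspark_transform",
        job=PYSPARK_JOB,
        region=REGION,
        project_id=PROJECT_ID,
    )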
IDESLABS PRIVATE LIMITED