Data Engineering (GCP + PySpark) Professional

Experience: 4 - 8 years

Salary: 17 Lacs

Posted: 1 week ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Candidate Specification

  • Minimum 3+ years of experience in Data Engineering (GCP + PySpark)
  • Expertise in SQL programming and database management systems
  • Experience building ETL pipelines in object-oriented/object-functional scripting languages with Apache Spark
  • Experience managing and maintaining pipelines on Google Cloud Platform (GCP), with relevant services (e.g. Dataflow, Dataproc, BigQuery, stored procedures, Cloud Composer)
  • Experience with orchestration tools such as Airflow/Cloud Composer
  • Experience with BI tools and visualization platforms (e.g. Tableau) is a plus

Contact Person: Sheena Rakesh

Email: sheena@gojobs.biz

Golden Opportunities
