PySpark GCP Engineer

5 - 10 years

25 - 35 Lacs

Hyderabad, Pune, Bengaluru

Posted: 1 day ago | Platform: Naukri


Skills Required

PySpark, SQL, Big Data, BigQuery, Dataproc, Hadoop Platform, Airflow, Hive, Spark, GCP, Dataflow, Python

Work Mode

Work from Office

Job Type

Full Time

Job Description

Data Engineer/Lead

Required Minimum Qualifications

  • Bachelor's degree in Computer Science, CIS, or a related field
  • 5-10 years of IT experience in software engineering or a related field
  • Experience on projects implementing the software development life cycle (SDLC)

Primary Skills: PySpark, SQL, GCP ecosystem (BigQuery, Cloud Composer, Dataproc)

Responsibilities

  • Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks leveraging open-source tools and data processing frameworks.
  • Hands-on experience with technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, the Hadoop platform, Hive, and Airflow.
  • Experience with GCP Cloud Composer, BigQuery, and Dataproc.
  • Offer system support as part of a support rotation with other team members.
  • Operationalize open-source data-analytics tools for enterprise use.
  • Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification.
  • Understand and follow the company development lifecycle to develop, deploy, and deliver.

Datametica

IT Services and IT Consulting

New York NY

1001-5000 Employees

24 Jobs

Key People

  • Shashi Vangapandu, Founder & CEO
  • Hitesh Kumar, CTO
