Google Data Engineer

8 - 15 years

7 - 8 Lacs

Posted: 1 week ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

Google Cloud Data Engineer
Job Description: We are seeking a highly skilled Data Engineer with extensive experience in Google Cloud Platform (GCP) data services and big data technologies. The ideal candidate will design, implement, and optimize scalable data solutions while ensuring high performance, reliability, and security.

Responsibilities:
  • Design, develop, and maintain scalable data pipelines and architectures using GCP data services.
  • Implement and optimize solutions using BigQuery, Dataproc, Composer, Pub/Sub, Dataflow, GCS, and Bigtable.
  • Work with GCP databases such as Bigtable, Spanner, Cloud SQL, and AlloyDB, ensuring performance, security, and availability.
  • Develop and manage data processing workflows using Apache Spark, Hadoop, Hive, Kafka, and other big data technologies.
  • Ensure data governance and security using Dataplex, Data Catalog, and other GCP governance tooling.
  • Collaborate with DevOps teams to build CI/CD pipelines for data workloads using Cloud Build, Artifact Registry, and Terraform.
  • Optimize query performance and data storage across structured and unstructured datasets.
  • Design and implement streaming data solutions using Pub/Sub, Kafka, or equivalent technologies.

Required Skills & Qualifications:
  • 8-15 years of experience.
  • Strong expertise in GCP Dataflow, Pub/Sub, Cloud Composer, Cloud Workflows, BigQuery, Cloud Run, and Cloud Build.
  • Proficiency in Python and Java, with hands-on experience in data processing and ETL pipelines.
  • In-depth knowledge of relational databases (SQL, MySQL, PostgreSQL, Oracle) and NoSQL databases (MongoDB, ScyllaDB, Cassandra, DynamoDB).
  • Experience with big data platforms such as Cloudera, Hortonworks, MapR, Azure HDInsight, and IBM Open Platform.
  • Strong understanding of AWS data services such as Redshift, RDS, Athena, and SQS/Kinesis.
  • Deep understanding of data formats such as Avro, ORC, and Parquet.
  • Experience handling large-scale data migrations and implementing data lake architectures.
  • Expertise in data modeling, data warehousing, and distributed data processing frameworks.
  • GCP Professional Data Engineer certification or equivalent.

Good to Have:
  • Experience in BigQuery, Presto, or equivalent.
  • Exposure to Hadoop, Spark, Oozie, and HBase.
  • Understanding of cloud database migration strategies.
  • Knowledge of GCP data governance and security best practices.
Skills:
  • Apache Kafka
  • Apache Spark
  • Data Migration
  • Google Cloud Platform
  • Python
  • Hadoop Ecosystem (HDFS)

CGI

Information Technology and Consulting

Montreal