Posted: 8 hours ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Experience Level:

Location:

Must-Have Skillset

  • SQL (4+ years)
  • Python or PySpark (4+ years)
  • GCP services (3+ years): BigQuery, Dataflow or Dataproc, Pub/Sub, Scheduled Queries, Cloud Functions, Monitoring dashboards
  • Apache Kafka
  • Terraform scripting (2+ years)
  • Airflow / Astronomer / Cloud Composer (2+ years)

Good-to-Have Skillset

  • Cloud Data Fusion
  • Tekton / CI-CD pipelines
  • GCP Cloud Secret Manager

Soft Skills

Strong all-round communication skills (written and presentation, one-to-one and in groups), along with excellent analytical and problem-solving abilities.

Additional Notes:

When hiring for a Data Engineer with GCP experience, it is important to look for candidates with a strong understanding of both data warehousing and data processing technologies.

Roles & Responsibilities

  • Design, develop, and maintain scalable data pipelines and architectures in Google Cloud Platform (GCP) using Python and SQL.
  • Lead the implementation of data integration solutions utilizing GCP services such as BigQuery, Dataflow, and Pub/Sub.
  • Collaborate with cross-functional teams to gather requirements and translate them into technical specifications for data-related projects.
  • Implement monitoring and optimization processes to ensure data quality and performance across data workflows.
  • Utilize Apache Kafka for real-time data processing and ensure data availability and reliability.
  • Develop and manage CI/CD pipelines using Terraform and Tekton for automated deployment.
  • Mentor and guide junior data engineers in best practices for data engineering and GCP technologies.
  • Create and maintain documentation for data engineering processes, architecture, and workflows.
  • Conduct regular performance testing and troubleshooting of data systems to identify and resolve issues swiftly.
  • Stay updated with the latest trends and advancements in data engineering and GCP services to drive continuous improvement.
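As a purely illustrative sketch of the first responsibility above (building data pipelines with Python and SQL), a minimal batch ETL step might look like the following. It uses Python's built-in sqlite3 as a local stand-in for a warehouse such as BigQuery, and all table and column names are invented for the example:

```python
import sqlite3

# In-memory database as a stand-in for a cloud warehouse such as BigQuery.
conn = sqlite3.connect(":memory:")

# Extract: a raw events table (hypothetical schema).
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [("u1", 10.0, "ok"), ("u1", 5.5, "ok"), ("u2", 3.0, "failed")],
)

# Transform + Load: aggregate valid events per user into a reporting table --
# the kind of step a Dataflow job or a BigQuery scheduled query would run at scale.
conn.execute(
    """
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM raw_events
    WHERE status = 'ok'
    GROUP BY user_id
    """
)

rows = conn.execute(
    "SELECT user_id, total FROM user_totals ORDER BY user_id"
).fetchall()
print(rows)  # [('u1', 15.5)] -- u2's only event failed, so it is filtered out
```

In production the same pattern is typically expressed as an Airflow DAG task or a Dataflow transform rather than ad-hoc SQL against a local database.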

Our ideal candidate

  • Data Engineer with 5-30 years of experience
  • Advanced proficiency in Google Cloud Platform (GCP)
  • Ability to design, develop, and manage scalable data processing solutions in a cloud environment
  • Expertise in GCP services like BigQuery, Cloud Dataflow, and Cloud Storage
  • Efficient data handling and analysis
  • Proficient knowledge of Python as the primary programming language for data manipulation
  • Building data pipelines and automating data workflows
  • Strong skills in data engineering concepts including ETL processes, data modeling, and architecture
  • Managing complex datasets to ensure data integrity and accessibility across platforms
  • Experience in leveraging GCP for data lakes and warehouses
  • Implementing best practices for data governance and security
  • Optimizing performance
  • Solid understanding of data analytics frameworks and tools
  • Deriving meaningful insights from large datasets
  • Master of Technology (M.Tech) in Data Science or Big Data
  • Bachelor’s degree in Engineering (B.E. or B.Tech) with a specialization in Computer Science and Engineering
  • Preferred certifications: Google Cloud Professional Data Engineer; Microsoft Certified: Azure Data Engineer Associate


naveena@intellistaff.in
