GCP Data Engineer with AI Development knowledge

6 - 11 years

18 - 30 Lacs

Hyderabad, Pune, Bengaluru

Posted: 1 day ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Location:

  • Any city in India

Skillset

  • In your role as an Engineer, you will work with change initiatives across the organization to help them design solutions that meet our architecture principles and drive the Bank towards its desired target state.
  • You will work closely with the data modelers to implement various data ingestion and transformation patterns for the feeds coming in from core banking platforms to the warehousing system.
  • You will also design job streams in Control-M or Apache Airflow according to the requirements (see the DAG sketch after this list).
  • You will work closely with cloud engineers to design and develop the next generation of data distribution solutions, leveraging GCP capabilities.
  • Work with business and technology stakeholders at all levels to understand interfacing application requirements and prioritization, and to gain the required business sign-offs.
  • Actively manage testing activities, including scoping, defining test strategies, driving defect management, running status calls, meeting business stakeholders' expectations, and gaining sign-offs.
  • Perform detailed technology analyses to highlight weaknesses and make recommendations for improvement.
  • Perform unit testing, support UAT, and handle periodic production-release activities and paperwork, post-implementation checkouts, SDLC documentation, etc.
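
Since the role involves designing Airflow job streams, below is a minimal sketch of what such a DAG might look like. It is an illustration only, assuming Airflow 2.4+; the DAG name, schedule, and task callables are hypothetical placeholders for the actual feed ingestion and transformation logic.

```python
# Minimal Airflow DAG sketch for a daily core-banking feed: extract,
# transform, load. All names and the schedule are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_feed(**context):
    # Placeholder: pull the daily feed file from the core banking platform.
    print("extracting feed for", context["ds"])


def transform_feed(**context):
    # Placeholder: apply the agreed transformation pattern for this feed.
    print("transforming feed for", context["ds"])


def load_warehouse(**context):
    # Placeholder: load the transformed data into the warehouse table.
    print("loading warehouse for", context["ds"])


with DAG(
    dag_id="core_banking_daily_feed",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # daily at 02:00; "schedule" requires Airflow 2.4+
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract = PythonOperator(task_id="extract_feed", python_callable=extract_feed)
    transform = PythonOperator(task_id="transform_feed", python_callable=transform_feed)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)

    extract >> transform >> load  # linear job stream
```

In Cloud Composer (managed Airflow on GCP), the same DAG file would simply be dropped into the environment's DAGs bucket.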

Domain or platform knowledge / experience:

  • 5+ years of experience in the following areas:
  • Strong programming skills in Python, SQL, and PL/SQL.
  • Hands-on experience with PySpark for large-scale distributed data processing.
  • Solid understanding of Apache Airflow (DAG design, scheduling, orchestration).
  • Experience working with Google Cloud Platform, especially:
      • BigQuery
      • PostgreSQL
      • Cloud Storage
      • Dataproc
      • Cloud Composer
      • GKE
  • Knowledge of DevOps configuration management tools (TeamCity, Jenkins, uDeploy, Kubernetes, Maven, etc.).
  • Building scalable data processing solutions using PySpark running on Dataproc/Dataflow (a minimal sketch follows this list).
  • Stakeholder influencing and communication skills.
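
To make the PySpark-on-GCP expectation concrete, here is a minimal batch-job sketch intended to run on Dataproc: it reads a BigQuery table via the spark-bigquery connector (bundled with Dataproc images), aggregates it, and writes partitioned Parquet to Cloud Storage. The project, table, bucket, and column names are all hypothetical placeholders.

```python
# Minimal PySpark sketch of a Dataproc batch job: read from BigQuery,
# aggregate, write Parquet to Cloud Storage. All names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-balance-aggregation")  # hypothetical job name
    .getOrCreate()
)

# The spark-bigquery connector takes a fully qualified
# project.dataset.table reference.
txns = (
    spark.read.format("bigquery")
    .option("table", "my-project.banking.transactions")  # placeholder table
    .load()
)

# Aggregate net movement per account per day.
daily_balances = (
    txns.groupBy("account_id", F.to_date("txn_ts").alias("txn_date"))
    .agg(F.sum("amount").alias("net_movement"))
)

# Partitioned Parquet output to a staging bucket for downstream loads.
(
    daily_balances.write.mode("overwrite")
    .partitionBy("txn_date")
    .parquet("gs://my-staging-bucket/daily_balances/")  # placeholder bucket
)

spark.stop()
```

On Dataproc, a job like this would typically be submitted with `gcloud dataproc jobs submit pyspark`.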

Additional Requirement:

  • GCP Data Engineer with AI development knowledge.

Good to have

  • Knowledge of Terraform or other Infrastructure-as-Code tooling.
  • Understanding of GitHub Actions workflows.
  • Experience with data quality frameworks (a simple example follows).
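
The posting does not name a specific data quality framework, so here is a minimal sketch of hand-rolled checks in PySpark; the column names and the checks themselves are hypothetical.

```python
# Minimal sketch of hand-rolled data quality checks in PySpark, since no
# specific framework is named in the posting. Column names are hypothetical.
from pyspark.sql import DataFrame
from pyspark.sql import functions as F


def run_quality_checks(df: DataFrame) -> dict:
    """Return a mapping of check name -> True (pass) / False (fail)."""
    row_count = df.count()
    return {
        # Feed must not be empty.
        "non_empty_feed": row_count > 0,
        # Primary key must never be null.
        "account_id_not_null": df.filter(F.col("account_id").isNull()).count() == 0,
        # Amounts must be present on every row.
        "amount_not_null": df.filter(F.col("amount").isNull()).count() == 0,
        # No duplicate transaction identifiers.
        "txn_id_unique": df.select("txn_id").distinct().count() == row_count,
    }


def assert_quality(df: DataFrame) -> None:
    # Fail the pipeline loudly if any check does not pass.
    failures = [name for name, ok in run_quality_checks(df).items() if not ok]
    if failures:
        raise ValueError(f"Data quality checks failed: {failures}")
```

Purpose-built frameworks such as Great Expectations cover the same ground with richer validation rules and reporting.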


Tech Mahindra

Information Technology & Services

Noida
