Experience: 5 years

Salary: 7 - 9 Lacs

Posted: 6 hours ago | Platform: Glassdoor


Work Mode

On-site

Job Type

Part Time

Job Description

As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP.

You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate it into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of Current State Receivables and Originations data in our data warehouse, performing impact analysis related to Ford Credit North America's modernization and providing implementation solutions.

Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for Ford Credit.
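The merging of legacy-platform history with newly ingested records described above can be sketched in plain Python. This is an illustrative sketch only: the field names (`account_id`, `balance`, `source`) and the "newest platform wins" conflict rule are hypothetical assumptions, since the posting does not specify schemas or merge semantics.

```python
# Sketch: merge historical records from a legacy platform with records
# ingested from a new platform, keyed by account. When both platforms
# report the same account_id, the new-platform record wins.
# All field names here are hypothetical, chosen for illustration.

def merge_platforms(legacy_rows, new_rows):
    """Return one record per account_id; new-platform rows win on conflict."""
    merged = {}
    for row in legacy_rows:
        merged[row["account_id"]] = {**row, "source": "legacy"}
    for row in new_rows:  # applied second, so they overwrite legacy entries
        merged[row["account_id"]] = {**row, "source": "new"}
    return sorted(merged.values(), key=lambda r: r["account_id"])

legacy = [{"account_id": 1, "balance": 100}, {"account_id": 2, "balance": 250}]
new = [{"account_id": 2, "balance": 275}, {"account_id": 3, "balance": 90}]

result = merge_platforms(legacy, new)
```

In a real GCP deployment this kind of keyed merge would typically run as a `MERGE` statement in BigQuery or a `GroupByKey`/`CoGroupByKey` step in a Dataflow (Apache Beam) pipeline rather than in-process Python.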

Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas and who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and 3rd party technologies for deployment on Google Cloud Platform.


  • GCP certified Professional Data Engineer
  • Successfully designed and implemented data warehouses and ETL processes for over five years, delivering high-quality data solutions.
  • 5+ years of complex SQL development experience
  • 2+ years of experience with programming languages such as Python or Java, and frameworks such as Apache Beam.
  • Experienced cloud engineer with 3+ years of GCP expertise, specializing in managing cloud infrastructure and applications to production-scale solutions.

  • In-depth understanding of GCP’s underlying architecture and hands-on experience with crucial GCP services, especially those related to batch and real-time data processing: Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage services such as Cloud Storage

  • Experience with DevOps tools such as Tekton, GitHub, Terraform, and Docker.
  • Expert in designing, optimizing, and troubleshooting complex data pipelines.
  • Experience developing and deploying microservices architectures leveraging container orchestration frameworks
  • Experience in designing pipelines and architectures for data processing.
  • Passion and self-motivation to develop/experiment/implement state-of-the-art data engineering methods/techniques.
  • Self-directed, able to work independently with minimal supervision, and adaptable in ambiguous environments.
  • Evidence of a proactive problem-solving mindset and willingness to take the initiative.
  • Strong prioritization, collaboration & coordination skills, and ability to simplify and communicate complex ideas with cross-functional teams and all levels of management.
  • Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity.
  • Master’s degree in Computer Science, Software Engineering, Information Systems, Data Engineering, or a related field.
  • Data engineering or development experience gained in a regulated financial environment.
  • Experience in coaching and mentoring Data Engineers
  • Experience with project management tools such as Atlassian JIRA
  • Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
  • Experience with data security, governance, and compliance best practices in the cloud.
  • Experience using data science concepts on production datasets to generate insights
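Several of the skills above center on complex SQL in warehouse pipelines. As a minimal, hedged sketch of the kind of transformation involved — a latest-record-per-key deduplication via a window function — the following uses SQLite to stand in for BigQuery; the table and column names (`receivables`, `account_id`, `amount`, `load_ts`) are hypothetical:

```python
# Sketch: keep only the most recent record per account using ROW_NUMBER(),
# a common deduplication pattern in warehouse ETL. SQLite stands in for
# BigQuery here; table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE receivables (account_id INTEGER, amount REAL, load_ts TEXT);
    INSERT INTO receivables VALUES
        (1, 100.0, '2024-01-01'),
        (1, 120.0, '2024-02-01'),
        (2, 250.0, '2024-01-15');
""")
rows = conn.execute("""
    SELECT account_id, amount FROM (
        SELECT account_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY account_id ORDER BY load_ts DESC
               ) AS rn
        FROM receivables
    ) WHERE rn = 1
    ORDER BY account_id
""").fetchall()
```

The same `ROW_NUMBER() OVER (PARTITION BY … ORDER BY …)` pattern works unchanged in BigQuery Standard SQL, which is presumably where pipelines like those described here would run.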

