Senior Consultant - Business Consulting Risk - FSRM

4 - 8 years

13 - 18 Lacs

Posted: 2 days ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description


 Requisition Id 1650678 
As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to grow. At EY, we don't just focus on who you are now, but on who you can become. We believe that it's your career and it's yours to build, which means potential here is limitless, and we'll provide you with motivating and fulfilling experiences throughout your career to help you on the path to becoming your best professional self.

The opportunity:
 GCP Data Engineer 
Our FSRM team is a fast-moving, high-growth area with huge potential. It offers variety, challenge, responsibility and the opportunity to realize your leadership potential. Our Financial Services Risk Management (FSRM) practice brings together professionals with risk management, regulatory, quantitative, and technology backgrounds. This breadth of experience enables the practice to coordinate the delivery of a broad array of risk management services to capital market participants throughout the world in a well-integrated manner.

About the Role:
We are looking for a skilled and passionate GCP Data Engineer with strong hands-on experience in building scalable data pipelines and working with core GCP services. The ideal candidate will have a deep understanding of data engineering principles and be proficient in Python scripting and orchestration tools such as Airflow.

Key Responsibilities:
  • Design, develop, and maintain robust data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, CloudSQL, Cloud Composer (Airflow).
  • Build and manage Airflow DAGs for orchestrating complex workflows.
  • Write efficient and reusable Python scripts for data processing and automation.
  • Develop solutions to move and transform data across GCP projects securely and efficiently.
  • Collaborate with data scientists, analysts, and other engineers to ensure data availability and quality.
  • Optimize data workflows for performance, scalability, and cost-efficiency.
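As a minimal, illustrative sketch of the kind of work described above (not taken from this posting — the bucket, dataset, and table names below are hypothetical placeholders), a Cloud Composer (Airflow) DAG that loads daily files from Cloud Storage into BigQuery might look like:

```python
# Illustrative Cloud Composer / Airflow DAG sketch.
# Assumes the apache-airflow-providers-google package is installed;
# all resource names are placeholders, not real project resources.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="gcs_to_bigquery_daily",  # hypothetical pipeline name
    default_args=default_args,
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load one day's newline-delimited JSON files from a landing bucket
    # into a date-partitioned BigQuery table ({{ ds }} is Airflow's
    # templated execution date).
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",            # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],  # daily partition prefix
        destination_project_dataset_table="analytics.events${{ ds_nodash }}",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )
```

In practice a production DAG would add upstream sensors (e.g. for Pub/Sub-exported files landing in GCS), data-quality checks, and alerting, but the structure — a `DAG` context with templated, retry-aware operators — stays the same.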

Required Skills & Qualifications:
  • Minimum 4 years of experience in data engineering, with at least 2 years on GCP.
  • Strong expertise in BigQuery, Dataflow, Pub/Sub, CloudSQL, Composer/Airflow.
  • Proficient in Python for scripting and automation.
  • Experience in designing and implementing data pipelines across GCP projects.
  • Familiarity with CI/CD practices and version control (e.g., Git).
  • Excellent problem-solving and communication skills.

Nice to Have:
  • Experience with Terraform or other IaC tools.
  • Knowledge of data governance and security best practices on GCP.
  • Exposure to real-time data processing and streaming architectures.


EY | Professional Services | London