GCP Data Engineer

5 - 10 years

15 - 30 Lacs

Posted: 2 months ago | Platform: Naukri


Work Mode

Remote

Job Type

Full Time

Job Description

Role & Responsibilities

As a Data Engineer with a focus on pipeline migration from SAS to Google Cloud Platform (GCP) technologies, you will tackle intricate problems and create value for our business by designing and deploying reliable, scalable solutions tailored to the company's data landscape. You will be responsible for developing custom-built data pipelines on the GCP stack and ensuring seamless migration of existing SAS pipelines.

Responsibilities:

- Design, develop, and implement data pipelines on the GCP stack, with a focus on migrating existing pipelines from SAS to GCP technologies.
- Develop modular and reusable code to support complex ingestion frameworks, simplifying the process of loading data into data lakes or data warehouses from multiple sources (a minimal example follows this section).
- Collaborate with analysts and business process owners to translate business requirements into technical solutions.
- Use scripting languages (Python, SQL, PySpark) to extract, manipulate, and process data effectively.
- Apply expertise in GCP technologies, including BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, and Vertex AI, to enhance data warehousing solutions.
- Maintain high standards of development practice, including technical design, solution development, systems configuration, testing, documentation, issue identification, and resolution, writing clean, modular, and sustainable code.
- Understand and implement CI/CD processes using tools such as Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker (an infrastructure-as-code sketch also follows this section).
- Participate in data quality and validation processes to ensure data integrity and reliability.
- Optimize the performance of data pipelines and storage solutions, addressing bottlenecks.
- Collaborate with security teams to ensure compliance with industry standards for data security and governance.
- Communicate technical solutions to engineering teams and business stakeholders.

Required Skills & Qualifications:

- 5-13 years of experience in software development, data engineering, business intelligence, or a related field, with a proven track record of manipulating, processing, and extracting value from large datasets.
- Extensive experience with GCP technologies in the data warehousing space, including BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, and Vertex AI.
- Proficiency in Python, SQL, and PySpark for data manipulation and pipeline creation.
- Experience with SAS, SQL Server, and SSIS is a significant advantage, particularly for transitioning legacy systems to modern GCP solutions.
- Ability to develop reusable, modular code for complex ingestion frameworks and multi-use pipelines.
- Understanding of CI/CD processes and tools such as Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
- Proven experience in migrating data pipelines from SAS to GCP technologies.
- Strong problem-solving abilities and a proactive approach to identifying and implementing solutions.
- Familiarity with industry best practices for data security, data governance, and compliance in cloud environments.
- Bachelor's degree in Computer Science, Information Technology, or a related technical field, or equivalent practical experience.
- GCP Certified Data Engineer (preferred).
- Excellent verbal and written communication skills, with the ability to advocate for technical solutions to a diverse audience, including engineering teams and business stakeholders.
- Willingness to work in the afternoon shift from 3 PM to 12 AM IST.
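
As a rough illustration of the kind of modular ingestion code described above, the following is a minimal PySpark sketch that loads CSV files from Google Cloud Storage into BigQuery on Dataproc. The bucket, dataset, and table names are placeholders rather than details from this posting, and the spark-bigquery connector is assumed to be available on the cluster.

from pyspark.sql import SparkSession

def ingest_csv_to_bigquery(source_uri: str, target_table: str, temp_bucket: str) -> None:
    # Load CSV files from GCS and append them to a BigQuery table.
    spark = SparkSession.builder.appName("gcs-to-bigquery-ingest").getOrCreate()

    # Read the raw files; header/inferSchema keep the example short, but a
    # production pipeline would declare an explicit schema.
    df = (
        spark.read
        .option("header", True)
        .option("inferSchema", True)
        .csv(source_uri)
    )

    # Write to BigQuery through the spark-bigquery connector, staging the
    # load files in a temporary GCS bucket.
    (
        df.write
        .format("bigquery")
        .option("table", target_table)
        .option("temporaryGcsBucket", temp_bucket)
        .mode("append")
        .save()
    )

if __name__ == "__main__":
    # Placeholder names for illustration only.
    ingest_csv_to_bigquery(
        source_uri="gs://example-landing-bucket/events/*.csv",
        target_table="example_dataset.events",
        temp_bucket="example-temp-bucket",
    )

A job like this would typically be submitted with the Cloud SDK (for example, gcloud dataproc jobs submit pyspark) and scheduled through GCP Workflows or Cloud Scheduler, matching the orchestration tools listed above.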
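
On the CI/CD side, the posting mentions Pulumi for infrastructure as code. The following is a minimal, hypothetical Pulumi (Python) sketch that provisions a GCS landing bucket such a pipeline might read from; the resource name and settings are placeholders, and the pulumi_gcp provider is assumed to be installed with a GCP project already configured.

import pulumi
from pulumi_gcp import storage

# Landing bucket for raw files awaiting ingestion (name is a placeholder).
landing_bucket = storage.Bucket(
    "raw-landing-bucket",
    location="US",
    uniform_bucket_level_access=True,
)

# Export the generated bucket name so pipeline configuration can reference it.
pulumi.export("landing_bucket_name", landing_bucket.name)

In a setup like the one described, a Cloud Build trigger on the GitHub repository could run pulumi up (along with the pipeline's tests) on each merge.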

TELUS International

Telecommunications / Customer Experience

Edmonton
