Posted: 1 day ago
Remote | Full Time
We are seeking a mid-level GCP Data Engineer with 4+ years of experience in ETL, Data Warehousing, and Data Engineering. The ideal candidate will have hands-on experience with GCP tools, solid data analysis skills, and a strong understanding of Data Warehousing principles.
Roles & Responsibilities:
* Analyze the different source systems, profile data, and understand, document, and fix Data Quality issues
* Gather requirements and business process knowledge in order to transform the data in a way that is geared towards the needs of end users
* Write complex SQL to extract and format source data for ETL/data pipelines (a minimal sketch follows this list)
* Create design documents, Source to Target Mapping documents and any supporting documents needed for deployment/migration
* Design, Develop and Test ETL/Data pipelines
* Design & build metadata-based frameworks needed for data pipelines
* Write Unit Test cases, execute Unit Testing, and document Unit Test results
* Deploy ETL/Data pipelines
* Use DevOps tools to version, push/pull code and deploy across environments
* Support the team during troubleshooting and debugging of defects and bug fixes, business requests, environment migrations, and other ad hoc requests
* Provide production support, enhancements, and bug fixes
* Work with business and technology stakeholders to communicate EDW incidents/problems and manage their expectations
* Leverage ITIL concepts to work around incidents, manage problems, and document knowledge
* Perform data cleaning, transformation, and validation to ensure accuracy and consistency across various data sources
* Stay current on industry best practices and emerging technologies in data analysis and cloud computing, particularly within the GCP ecosystem
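As a rough illustration of the SQL-driven extraction work described above, here is a minimal sketch using the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical placeholders chosen for the example, not part of the actual role.

```python
# Minimal sketch: run an extraction/formatting query with the
# google-cloud-bigquery client. All table and column names below
# are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

sql = """
    SELECT
        c.customer_id,
        UPPER(TRIM(c.customer_name)) AS customer_name,
        DATE(o.order_ts)             AS order_date,
        SUM(o.amount)                AS total_amount
    FROM `my-project.staging.customers` AS c
    JOIN `my-project.staging.orders`    AS o
      ON o.customer_id = c.customer_id
    WHERE o.order_ts >= TIMESTAMP('2024-01-01')
    GROUP BY c.customer_id, customer_name, order_date
"""

rows = client.query(sql).result()  # blocks until the query finishes
for row in rows:
    print(row["customer_id"], row["total_amount"])
```

In a real pipeline the query text and table names would typically be parameterized and driven from the metadata framework rather than hard-coded.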
Qualifications:
* 4+ years of experience in ETL & Data Warehousing
* Should have excellent leadership & communication skills
* Should have experience developing Data Engineering solutions with Airflow and GCP services such as BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, etc. (a minimal DAG sketch follows this list)
* Should have built solution automations in any of the above ETL tools
* Should have executed at least 2 GCP Cloud Data Warehousing projects
* Should have worked on at least 2 projects using Agile/SAFe methodology
* Should have mid-level experience in PySpark and Teradata
* Should have working experience with DevOps tools such as GitHub, Jenkins, Cloud Native, etc., with semi-structured data formats such as JSON, Parquet, and/or XML files, and should have written complex SQL queries for data analysis and extraction
* Should have an in-depth understanding of Data Warehousing, Data Analysis, Data Profiling, Data Quality & Data Mapping
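To picture the Airflow-plus-GCP stack named above, here is a minimal DAG sketch that loads Parquet files from Cloud Storage into BigQuery. The DAG id, bucket, dataset, and table names are hypothetical placeholders; a real pipeline in this role would be shaped by the actual source systems.

```python
# Minimal sketch of an Airflow DAG loading GCS data into BigQuery.
# DAG id, bucket, dataset, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="load_orders_to_bq",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # schedule_interval on older Airflow
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="my-landing-bucket",                    # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.parquet"],  # templated GCS path
        source_format="PARQUET",
        destination_project_dataset_table="my-project.warehouse.orders",
        write_disposition="WRITE_APPEND",
    )
```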
Education: B.Tech./B.E. in Computer Science or a related field.
Certifications: Google Cloud Professional Data Engineer Certification.
Job Types: Full-time, Permanent
Pay: ₹600,000.00 - ₹1,800,000.00 per year
Work Location: In person
NexGen TechSoft