Posted: 1 day ago
Platform: On-site
Part Time
Role Proficiency:
Act creatively to develop applications by selecting appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions. Account for others' developmental activities; assist the Project Manager in day-to-day project execution.
Outcomes:
Measures of Outcomes:
Outputs Expected:
- Code
- Documentation
- Configure
- Test
- Domain relevance
- Manage Project
- Manage Defects
- Estimate
- Manage knowledge
- Release
- Design
- Interface with Customer
- Manage Team
Certifications:
Skill Examples:
Knowledge Examples:
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in software development and data engineering and will be certified in GCP Data Engineering and Cloud Architecture.

Key Responsibilities:
- Develop, optimize, and maintain scalable data pipelines and workflows.
- Design and implement solutions using cloud-based data warehouses such as GCP BigQuery, Snowflake, and Databricks.
- Collaborate with product and business teams to understand data requirements and deliver actionable insights.
- Work in an Agile environment, contributing to sprint planning, task prioritization, and iterative delivery.
- Write clean, maintainable, and efficient code; knowledge of Python is an advantage.
- Ensure data security, integrity, and reliability across all systems.

Required Qualifications:
- Strong background in data engineering, including ETL processes and data pipeline construction.
- Basics of RESTful APIs (GET, POST, PUT, DELETE methods) and an understanding of how APIs communicate and exchange data (JSON or XML formats).
- Skills in organizing and transforming data fetched from APIs using Python libraries such as pandas and json (see the sketch after this list).
- Understanding of how to avoid vulnerabilities when working with APIs, such as not hardcoding sensitive information.
- Experience using tools like Postman or Python libraries to test and debug API calls before integrating them into a system.
- Familiarity with Python libraries for API interactions, such as requests, http.client, urllib, and database connectors.
- Skills to handle API response errors and exceptions effectively.
- Knowledge of common authentication methods such as OAuth, API keys, or Basic Auth for securing API calls.
- Hands-on experience with cloud-based data warehouses like BigQuery, Snowflake, and Databricks.
- Proven ability to work in Agile frameworks and deliver high-quality results in fast-paced environments.
- Excellent communication skills to liaise effectively with product and business teams.
- Familiarity with data visualization tools is advantageous.
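For illustration only, here is a minimal sketch of the API-to-warehouse workflow described above, assuming the requests, pandas, and google-cloud-bigquery packages are installed and that GCP credentials are available via GOOGLE_APPLICATION_CREDENTIALS. The endpoint URL, the EXAMPLE_API_KEY environment variable, and the table ID are hypothetical placeholders, not part of this posting.

import os

import pandas as pd
import requests
from google.cloud import bigquery

API_URL = "https://api.example.com/v1/orders"   # hypothetical endpoint
TABLE_ID = "my-project.analytics.orders"        # hypothetical BigQuery table


def fetch_orders() -> pd.DataFrame:
    # Keep the API key out of source control: read it from the environment.
    headers = {"Authorization": f"Bearer {os.environ['EXAMPLE_API_KEY']}"}
    try:
        resp = requests.get(API_URL, headers=headers, timeout=30)
        resp.raise_for_status()                 # surface 4xx/5xx responses
    except requests.RequestException as exc:
        raise RuntimeError(f"API call failed: {exc}") from exc
    # Assumes the endpoint returns a JSON array of records;
    # json_normalize flattens it into a tabular DataFrame.
    return pd.json_normalize(resp.json())


def load_to_bigquery(df: pd.DataFrame) -> None:
    client = bigquery.Client()
    job = client.load_table_from_dataframe(df, TABLE_ID)
    job.result()                                # wait for the load job to finish


if __name__ == "__main__":
    frame = fetch_orders()
    load_to_bigquery(frame)

Reading the key from an environment variable rather than hardcoding it, and raising on non-2xx responses, reflect the secure API handling and error-handling skills listed above.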
SQL Queries, Data Warehouse, GCP, BigQuery
UST Global
thiruvananthapuram
6.4 - 8.0 Lacs P.A.
trivandrum, kerala, india
Experience: Not specified
Salary: Not disclosed