Posted: 13 hours ago
On-site | Part Time
Role Proficiency:
This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
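As a rough, non-authoritative sketch of the ingest/wrangle/transform/join work described above, the PySpark snippet below is illustrative only; the paths, table layout, and column names are hypothetical and not taken from this posting.

from pyspark.sql import SparkSession, functions as F

# Hypothetical example: ingest raw orders and customer reference data,
# clean them, and join them into a single curated dataset.
spark = SparkSession.builder.appName("orders-curation").getOrCreate()

orders = spark.read.option("header", True).csv("/landing/orders/")    # hypothetical path
customers = spark.read.parquet("/landing/customers/")                 # hypothetical path

# Wrangle: type-cast, drop obviously bad rows, derive usable columns
orders_clean = (
    orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
)

# Transform and join into a curated aggregate per customer
curated = (
    orders_clean
    .join(customers, on="customer_id", how="left")
    .groupBy("customer_id", "country")
    .agg(F.sum("amount").alias("total_spend"), F.count("*").alias("order_count"))
)

curated.write.mode("overwrite").parquet("/curated/customer_spend/")   # hypothetical target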
Outputs Expected: Code, Documentation, Configure, Test, Domain Relevance, Manage Project, Manage Defects, Estimate, Manage Knowledge, Release, Design, Interface with Customer, Manage Team
Additional Comments:
Role Purpose:
The Cloud Engineer ensures the stability, performance, and scalability of the Data Transformation Platform hosted in Microsoft Azure. They will work as part of a team responsible both for the operational support needed to maintain platform uptime and for the delivery of new features and enhancements driven by stakeholder requirements. This includes designing, building, and optimising infrastructure and CI/CD workflows using tools such as Databricks, Terraform, and Azure DevOps. The Engineer will also work closely with developers to build pipelines for ingestions and transformations.

Key Accountabilities / Responsibilities:
· Working as part of a team that is responsible for the uptime of data transformation platforms, as well as any feature requests fed in by our stakeholders.
· Building scalable and reliable Azure DevOps pipelines for infrastructure deployments.
· Working in collaboration with data engineers, developers, and security teams.
· Maintaining up-to-date documentation for the platform, as well as thoroughly documenting and showcasing any new features built.
· Participating in the out-of-hours (OOH) support call-out rota.

Required Skills & Experience:
· Strong hands-on experience with Azure Cloud, Azure Databricks, and data integration workflows.
· Confident in Terraform for infrastructure as code (IaC) in Azure.
· Understanding of Azure networking (VNets, NSGs, VPNs, Private Endpoints, DNS).
· Experience building and managing Azure DevOps pipelines for code, infrastructure, and data workflows.
· Familiarity with monitoring, logging, and alerting using Azure Monitor, Log Analytics, or similar tools (see the illustrative sketch below).
· Comfortable with scripting (PowerShell, Bash, or Python).
· Understanding of cloud security best practices (IAM, RBAC, Key Vault, policies).
· Excellent communication, documentation, and collaboration skills.
· Previous experience in Data Platform teams preferred.
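As a minimal sketch of the monitoring and Python scripting skills listed above, the example below queries a Log Analytics workspace using the azure-identity and azure-monitor-query packages. The workspace ID, Kusto query, table columns, and alert threshold are hypothetical placeholders, not details from this posting.

from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

# Hypothetical example: count recent failed pipeline runs recorded in a
# Log Analytics workspace and flag them for follow-up.
credential = DefaultAzureCredential()
client = LogsQueryClient(credential)

workspace_id = "<log-analytics-workspace-id>"  # placeholder, not a real ID
kusto_query = """
AzureDiagnostics
| where TimeGenerated > ago(1h)
| where Category == "PipelineRuns" and status_s == "Failed"
| summarize failed_runs = count()
"""  # hypothetical query; real tables/columns depend on the diagnostic settings

response = client.query_workspace(workspace_id, kusto_query, timespan=timedelta(hours=1))

if response.status == LogsQueryStatus.SUCCESS:
    rows = response.tables[0].rows
    failed_runs = rows[0][0] if rows else 0
    if failed_runs > 0:
        print(f"ALERT: {failed_runs} failed pipeline runs in the last hour")
    else:
        print("No failed pipeline runs in the last hour")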
Azure Databricks, Terraform, Azure DevOps, Azure Cloud
UST Global
Location: Thiruvananthapuram, Kerala, India