Posted: 1 day ago
On-site
Part Time
Role Proficiency:
This role requires proficiency in data pipeline development, including coding and testing pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.
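For illustration only, below is a minimal PySpark sketch of the ingest, wrangle, transform, and join workflow this proficiency describes. The file paths, datasets (orders, customers), and column names are hypothetical placeholders, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-wrangle-join").getOrCreate()

# Ingest: raw CSV orders plus a Parquet customer reference (paths are placeholders).
orders = spark.read.option("header", True).csv("/raw/orders.csv")
customers = spark.read.parquet("/raw/customers/")

# Wrangle/transform: cast types, drop bad rows, derive a date column.
orders_clean = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
)

# Join to the reference data and aggregate, then publish to a curated zone.
daily = (
    orders_clean.join(customers, on="customer_id", how="left")
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_amount"))
)
daily.write.mode("overwrite").partitionBy("order_date").parquet("/curated/daily_sales/")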
Outcomes:
Measures of Outcomes:
Outputs Expected:
Code Development:
Documentation:
Configuration:
Testing:
Domain Relevance:
Project Management:
Defect Management:
Estimation:
Knowledge Management:
Release Management:
Design Contribution:
Customer Interface:
Team Management:
Certifications:
Skill Examples:
Knowledge Examples:
Additional Comments:
Role Summary: Data Engineer
The Data Engineer is responsible for developing and managing scalable data workflows using a wide range of Azure services, integrating data from different systems (Teradata, Sybase, SAP, Salesforce) to support both business analytics and real-time data solutions, and collaborating closely with cross-functional teams across sectors to build and optimize robust, secure, and cost-effective data solutions.
Key Responsibilities:
1. Develop and manage Azure Data Factory (ADF) pipelines to ingest data from legacy and cloud systems into the Data Lake.
2. Build and optimize Databricks notebooks for ingestion and curation, enabling semantic views for reporting and event streaming to messaging queues (an illustrative sketch follows the Technical Skills & Tools list below).
3. Optimize workflows for performance and cost-efficiency, including job scheduling, cluster sizing, and PySpark/SQL code tuning.
4. Develop and manage Unity Catalog objects, ensuring proper access control and data organization.
5. Ensure compliance with PEP data governance and security standards across all data workflows.
6. Participate in code reviews, documentation, and DevOps deployments using Azure DevOps.
7. Integrate with various systems such as Snowflake, Salesforce, Oracle, SAP, ASQL, PostgreSQL, messaging queues, ADLS, APIs, and Unity Catalog for data ingestion and publishing.
8. Contribute to solution design, job optimization, and data workflow architecture.
9. Support migration efforts, including the transition of Teradata ETL logic to Azure Databricks and Unity Catalog.
Technical Skills & Tools:
1. Cloud & Data Engineering: Azure Data Factory, Azure Databricks, Azure Logic Apps, Azure Functions, AKS, Event Hubs, Kafka, Power Automate, etc.
2. Data Sources: Teradata, Sybase, SAP, Salesforce, Oracle, PostgreSQL, Snowflake, SharePoint
3. Languages & Frameworks: PySpark, SQL, Python
4. Data Governance: Unity Catalog, AAD integration, access control
5. DevOps & CI/CD: Azure DevOps, ARM templates
6. Visualization & Reporting: Exposure to Power BI, Web Apps
7. Streaming & Real-Time: Event Hubs, Kafka, Autoloader, etc.
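For illustration only, a minimal sketch of the kind of incremental ingestion named in responsibility 2: Databricks Auto Loader reading from a landing zone into a Unity Catalog table. It assumes a Databricks runtime (where spark is predefined) with Unity Catalog enabled; the catalog, schema, storage account, and path names are hypothetical placeholders, not details from this posting.

from pyspark.sql import functions as F

# Assumes a Databricks cluster where `spark` (SparkSession) is predefined.
# All catalog/schema/path names below are hypothetical placeholders.
raw = (
    spark.readStream.format("cloudFiles")            # Auto Loader source
    .option("cloudFiles.format", "json")             # landing files are JSON
    .option("cloudFiles.schemaLocation",
            "/Volumes/main/bronze/_schemas/events")  # schema tracking location
    .load("abfss://landing@exampleacct.dfs.core.windows.net/events/")
)

# Light curation: stamp ingestion time for downstream auditing.
bronze = raw.withColumn("_ingested_at", F.current_timestamp())

# Write incrementally to a three-level Unity Catalog table name.
(
    bronze.writeStream
    .option("checkpointLocation", "/Volumes/main/bronze/_checkpoints/events")
    .trigger(availableNow=True)   # process available files, then stop (cost control)
    .toTable("main.bronze.events")
)

Running such a notebook as a scheduled, triggered job rather than a continuous stream is one common way to trade latency for cluster cost, which matches the cost-efficiency emphasis in responsibility 3.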
Data Governance, Azure Databricks, Azure Data Factory, Azure DevOps
 
UST Global