The IT Technical Engineer
Acts as a technical software expert accountable for software development, including the research, design, programming, and testing of existing and new software. Supports business and IT in adopting digital technologies faster and improving application and solution performance.
HOW YOU WILL CONTRIBUTE AND WHAT YOU WILL LEARN
- Develop ETL/ELT workflows for both batch and streaming data, supporting structured and unstructured data sources.
- Implement and support Lakehouse and Medallion architecture (Bronze/Silver/Gold) using Azure Data Lake Storage and Delta Lake.
- Optimize Spark jobs through effective partitioning, caching, cluster tuning, and cost-efficient resource utilization.
- Integrate Databricks with key Azure services, including Data Factory, Synapse, and Key Vault.
- Implement strong data governance practices using Unity Catalog, encryption, lineage, and data quality checks.
- Monitor, troubleshoot, and improve the performance of data pipelines, clusters, and job executions.
- Collaborate with business analysts, architects, and other business stakeholders to develop end-to-end analytics solutions.
- Support CI/CD automation, Git branching strategies, and DevOps best practices.
- Support change management activities and follow coding standards.
Qualifications
Must Have:
- 3.5+ years of hands-on experience with Azure Databricks, Spark, PySpark, SQL, and data engineering, including strong experience implementing Delta Lake, Unity Catalog, Lakehouse architecture, and Databricks Jobs/Pipelines.
- Proficiency with Azure services such as Data Lake Storage, Data Factory (including integration runtimes), and Key Vault.
- Strong understanding of ETL/ELT design, data modeling, medallion architecture, and large-scale data processing.
- Experience with performance tuning, cluster optimization, and troubleshooting distributed systems.
- Hands-on experience with Power BI.
- Experience with CI/CD pipelines, Azure DevOps, and Git version control.
Nice-To-Have:
- Exposure to Unity Catalog advanced features, Row-Level Security, and data governance frameworks.
- Knowledge of monitoring and observability tools, data quality frameworks, automation solutions, or other analytics/metadata platforms.
- Certifications such as Microsoft DP-203 (Azure Data Engineer) or Databricks Certified Data Engineer Associate/Professional.
- Experience integrating Databricks with external systems via APIs, connectors, and streaming services.