Core Purpose

To design, build, and maintain scalable data pipelines and platforms that enable secure, efficient, and reliable data processing. This role focuses on transforming raw data into structured, high-quality datasets for analytics, AI/ML, and business intelligence. The Data Engineer collaborates with stakeholders to deliver robust data solutions that drive innovation and operational excellence.
Key Responsibilities

Data Engineering and Pipeline Development
- Build and maintain ETL/ELT pipelines for batch and streaming data (a minimal sketch follows this list).
- Develop data ingestion frameworks integrating APIs, transactional systems, and streaming platforms.
- Optimize data workflows for performance and cost efficiency in cloud environments.
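For illustration only, a minimal sketch of the batch ETL work described above, assuming a JSON REST source and a Parquet target. The endpoint, column names, and output path are hypothetical; a production pipeline would typically run under an orchestrator (e.g., Airflow or Databricks Workflows) with retries and incremental loads.

```python
"""Minimal batch ETL sketch: extract from a REST API, transform, load to Parquet.
All endpoints, columns, and paths below are hypothetical placeholders."""
import requests
import pandas as pd

API_URL = "https://example.com/api/orders"   # hypothetical source endpoint
OUTPUT_PATH = "curated/orders.parquet"       # hypothetical curated-zone path

def extract() -> list[dict]:
    # Pull raw records from the source system.
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    return response.json()

def transform(records: list[dict]) -> pd.DataFrame:
    # Normalize types, drop bad rows, and enforce a minimal schema.
    df = pd.DataFrame(records)
    df["order_ts"] = pd.to_datetime(df["order_ts"], utc=True)
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    return df.dropna(subset=["order_id"]).drop_duplicates("order_id")

def load(df: pd.DataFrame) -> None:
    # Write a structured, analytics-ready dataset.
    df.to_parquet(OUTPUT_PATH, index=False)

if __name__ == "__main__":
    load(transform(extract()))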
Collaboration and Knowledge Sharing
- Work closely with product managers, analysts, and engineering teams to understand data requirements and deliver optimized pipelines.
- Document pipeline workflows, data lineage, and operational runbooks for easy maintenance and onboarding.
- Participate in code reviews and knowledge-sharing sessions to improve team efficiency and technical excellence.
Solution Security, Scalability, and Reliability
- Implement automated monitoring and alerting for data pipelines to ensure reliability and minimize downtime (illustrated in the sketch after this list).
- Develop CI/CD processes for data workflows to enable rapid deployment and version control.
- Monitor and optimize database performance, query efficiency, and cost in cloud environments.
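Again for illustration, a minimal sketch of pipeline monitoring and alerting, assuming a generic incoming-webhook alert channel (e.g., Teams or Slack). The webhook URL and pipeline callable are hypothetical stand-ins for managed tooling such as Azure Monitor or Databricks job alerts.

```python
"""Minimal monitoring/alerting sketch for a pipeline run.
The webhook URL and pipeline callable are hypothetical placeholders."""
import logging
import time
import requests

ALERT_WEBHOOK = "https://example.com/hooks/data-alerts"  # hypothetical ops channel

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("pipeline")

def alert(message: str) -> None:
    # Push a failure notification to an ops channel via incoming webhook.
    requests.post(ALERT_WEBHOOK, json={"text": message}, timeout=10)

def run_with_monitoring(pipeline, name: str) -> None:
    # Wrap a pipeline run with timing, structured logs, and failure alerts.
    start = time.monotonic()
    try:
        pipeline()
        log.info("%s succeeded in %.1fs", name, time.monotonic() - start)
    except Exception as exc:
        log.exception("%s failed", name)
        alert(f"Pipeline {name} failed: {exc}")
        raise

if __name__ == "__main__":
    run_with_monitoring(lambda: None, "orders_daily")  # stand-in for a real job
```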
Skills & Qualifications Education & Experience - Bachelor’s degree in Computer Science, Information Systems, Data Science or related field
- 3-5 years of hands-on experience in data-related roles.
- Strong experience building and maintaining ETL/ELT pipelines for batch and streaming data.
- Practical experience with data integration from APIs, transactional systems, and streaming platforms.
- Experience with cloud-based data platforms (Azure, Databricks, or Microsoft Fabric preferred). Cloud platform or architecture certifications are a plus.
Technical Fluency
- Proficiency in Python and SQL.
- Knowledge of data quality frameworks, metadata management, and master data management (see the sketch after this list).
- Familiarity with prompt engineering and agent-based tools (Copilot Studio, Azure AI Foundry, or similar).
- Knowledge of data visualization tools (Power BI, Tableau, or similar).
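As a final illustration, a minimal sketch of the declarative checks that data quality frameworks formalize. The column names and rules are hypothetical, and a real deployment would more likely use a framework such as Great Expectations or dbt tests.

```python
"""Minimal data-quality check sketch; column names and rules are hypothetical.
A real deployment would typically use a framework such as Great Expectations."""
import pandas as pd

def check_quality(df: pd.DataFrame) -> list[str]:
    # Return human-readable failures instead of raising, so every check runs.
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("order_id contains duplicates")
    if df["amount"].lt(0).any():
        failures.append("amount contains negative values")
    if df["order_ts"].isna().any():
        failures.append("order_ts contains nulls")
    return failures

if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1, 1],
        "amount": [10.0, -5.0],
        "order_ts": [pd.Timestamp("2024-01-01"), pd.NaT],
    })
    print(check_quality(sample))  # expect all three checks to fail
```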
Core Competencies
- Manages Complexity: Makes sense of complex, high-quantity, and sometimes contradictory information to solve problems effectively. For example, asks questions to encourage others to think differently and enrich their analyses of complex situations; accurately defines the key elements of complex, ambiguous situations.
- Plans and Aligns: Plans and prioritizes work to meet commitments aligned with organizational goals. For example, outlines clear plans that put actions in a logical sequence and convey time frames; aligns own work with relevant workgroups and takes steps to reduce bottlenecks and speed up the work.
- Ensures Accountability: Holds self and others accountable for meeting commitments. For example, tracks performance and strives to remain effective, learning from both successes and failures; readily takes on challenges or difficult tasks and has a reputation for delivering on commitments.
- Tech Savvy: Anticipates and adopts innovations in business-building digital and technology applications. For example, ensures the team has adequate resources to invest in technology advancements and the training to use them well; deploys new technologies to enhance the effectiveness of the group and the business.