About Company
Papigen is a fast-growing global technology services company, delivering innovative digital solutions through deep industry experience and cutting-edge expertise. We specialize in technology transformation, enterprise modernization, and dynamic areas like Cloud, Big Data, Java, React, DevOps, and more. Our client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently.
About The Role
We are seeking a highly skilled BI Developer to support our data integration and reporting initiatives. The organization leverages Microsoft Azure Data Factory (ADF) to orchestrate, automate, and manage data flows across multiple internal and external data sources, and DBT for transformation. You will play a key role in designing, developing, and maintaining ADF pipelines to ensure reliable, scalable, and secure data integration, while building impactful Power BI reports for business stakeholders.
Key Responsibilities
This role encompasses end-to-end data integration, transformation, and reporting activities. Responsibilities are organized below with examples of daily tasks, expected deliverables, and reporting relationships to set clear expectations.
Core responsibilities
- Analyze business and technical requirements for data integration and reporting; translate stakeholder needs into data specifications and acceptance criteria. 
- Design and develop ADF pipelines for ETL/ELT processes, including parameterization, triggers, error handling, retries, and CI/CD integration. 
- Integrate on-premises and cloud-based data sources (e.g., Oracle, File System, SAP, Blob Storage, REST APIs) by configuring linked services, datasets, and integration runtimes. 
- Implement transformation logic using ADF mapping data flows or DBT, create reusable models, and ensure transformations are testable and version-controlled. 
- Build and maintain Power BI data models and reports: design semantic models, DAX measures, performance-optimized reports, and deliver dashboards aligned to business KPIs. 
- Monitor, troubleshoot, and optimize pipeline and report performance; implement proactive alerts, automate recovery where possible, and triage incidents promptly. 
- Collaborate with data engineers, analysts, data architects, product owners, and business stakeholders to ensure data quality, lineage, and usability. 
- Work with infrastructure and security teams to ensure proper ADF configuration, networking, deployment pipelines, and adherence to data security and privacy standards. 
- Document pipeline architecture, data flow diagrams, runbooks, and operational procedures to support handoffs and on-call rotations. 
 
Daily tasks (examples)
- Attend daily stand-ups and requirement refinement sessions with product owners and stakeholders. 
- Review pipeline runs and monitoring dashboards; investigate and resolve failures or performance issues. 
- Develop or update ADF pipelines and DBT models; write unit tests and verify transformation results in lower environments. 
- Create or refine Power BI visuals and measures, validate report data with business users, and apply performance optimizations. 
- Prepare and update documentation: data mappings, deployment notes, and operational runbooks. 
 
Expected deliverables
- Production-ready ADF pipelines with automated deployments and clear parameterization. 
- DBT models and transformation artifacts with tests and version control. 
- Power BI reports and dashboards delivering agreed KPIs, with documented data lineage and refresh schedules. 
- Monitoring dashboards, alerting rules, post-incident reports, and operational runbooks. 
- Technical design documents, data mapping spreadsheets, and architecture diagrams reviewed with stakeholders. 
 
Reporting lines & collaboration
This role typically reports to the Data Engineering Manager or BI Lead. You will work closely with Data Architects, Infrastructure/Platform teams, QA engineers, Data Stewards, Product Owners, and Business Analysts. Expect to provide regular status updates to the BI Lead and participate in monthly roadmap and prioritization reviews.
Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field 
- Minimum 8 years of experience in data engineering, BI, or ETL development 
- Proven hands-on expertise in Azure Data Factory and Power BI 
- Experience with DBT is a plus 
- Excellent documentation and collaboration skills 
 
Skills: sql, azure services, dbt, azure data factory, power bi, data governance