Define relationships, hierarchies, calculation groups, KPIs, and reusable business logic in DAX
Select and tune storage modes (Import, DirectQuery, Direct Lake), plus Hybrid tables and Incremental Refresh when needed
Optimize with aggregations, partitioning, column sizing, and the Performance Analyzer
Build models on Lakehouse/Warehouse (incl. SQL analytics endpoint), Dataflows Gen2, and external sources; configure gateways for on‑prem data
Implement Row‑Level Security (RLS) and Object‑Level Security (OLS); manage Build permissions, endorsements (certified/promoted), and sensitivity labels in partnership with data governance
Use Power BI Projects (PBIP), Git integration and deployment pipelines for versioning and multi‑environment releases; serialize models with TMDL and use external tools where appropriate
Monitor and right‑size Fabric capacity using the Capacity Metrics app; collaborate with admins to keep models and refreshes healthy
Document definitions, lineage, and refresh behaviors; evangelize reuse of the semantic layer across workspaces
Requirements
6+ years of experience building production ETL/ELT pipelines
Strong Power BI modeling experience, including dimensional/star schema design, with strong DAX and Power Query (M)
Hands‑on experience with Microsoft Fabric (Lakehouse/Warehouse, SQL analytics endpoint, OneLake concepts)
Proven use of Direct Lake as well as Import and DirectQuery, with a clear understanding of when and why to use each
RLS/OLS design and implementation experience
Experience with PBIP, Git, and deployment pipelines for CI/CD; TMDL familiarity (preview features acceptable)