Bennett Thrasher is a premier provider of professional tax, assurance, and consulting services to businesses and high-net-worth individuals. The firm is seeking a Senior Data Warehouse Engineer to lead multi-client engagements, design and implement dimensional data warehouses, and deliver governed Power BI semantic models in a client-facing role.
Responsibilities:
- Design, implement, and maintain star/snowflake schemas (grain definition, conformed dimensions, SCD Types 1/2)
- Define reference architectures for Fabric OneLake, Lakehouse ↔ Warehouse patterns, medallion zoning, and security/RLS
- Create technical design docs, estimates, and roadmaps; facilitate design reviews
- Build Dataflows Gen2 for standardized ingestion and PySpark notebooks for scalable transformations
- Develop metadata-driven, reusable load patterns (parameterization, reusable templates, CDC/incremental)
- Orchestrate workloads with Fabric Pipelines including scheduling, retries, alerting, and SLA monitoring
- Author performant T-SQL (staging, MERGE/UPSERT, window functions, partitioning, indexing)
- Publish Power BI semantic models aligned to dimensional designs; implement RLS and incremental refresh
- Partner with analysts to define certified metrics, calculation groups, and dataset governance
- Embed automated data quality checks, reconciliation, lineage, and documentation/data contracts
- Optimize cost/performance (partitioning, caching, refresh strategies) and enforce RBAC/PII controls
- Implement Git-based CI/CD (Azure DevOps/GitHub) and promote Dev/Test/Prod release automation
- Translate business processes into scalable data solutions; present progress and risks to team/client stakeholders
- Mentor team members, lead code reviews, and contribute to internal accelerators/playbooks and best practices
- Lead discovery sessions, current-state assessments, and target-state Fabric designs across multiple clients
- Develop reusable templates for Dataflows Gen2, PySpark, and Pipelines; create training materials and deliver enablement sessions for client teams
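To illustrate the SCD Type 2 dimension maintenance named above, here is a minimal sketch in plain Python (in practice this would be a T-SQL MERGE statement or a PySpark notebook; the table shape, column names, and flag conventions are hypothetical, chosen only for illustration):

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, load_date):
    """SCD Type 2 sketch: expire changed rows and insert new versions.

    dimension: list of dicts, each carrying 'is_current', 'effective_from',
               and 'effective_to' (hypothetical shape)
    incoming:  list of dicts from a staging layer
    key:       business key column name
    tracked:   columns whose changes trigger a new version
    """
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        existing = current.get(rec[key])
        if existing and all(existing[c] == rec[c] for c in tracked):
            continue  # no change in tracked attributes: nothing to do
        if existing:
            # Type 2 semantics: close out the old version rather than
            # overwriting it, preserving history for point-in-time queries
            existing["is_current"] = False
            existing["effective_to"] = load_date
        dimension.append({
            **rec,
            "is_current": True,
            "effective_from": load_date,
            "effective_to": None,
        })
    return dimension
```

The same close-out-then-insert logic maps directly onto a T-SQL MERGE (or MERGE plus a follow-up INSERT, since a single MERGE emits one action per matched row).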
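The metadata-driven load pattern mentioned in the responsibilities can be sketched as follows: each source table is described by a config entry, and the loader derives its extract query and watermark handling from that metadata instead of hard-coded SQL. This is a simplified Python sketch, not the Fabric Pipelines implementation itself; the table names, columns, and config fields are hypothetical:

```python
# Hypothetical metadata table driving a reusable ingestion pipeline.
TABLE_CONFIG = [
    {"source": "sales.orders",  "watermark_col": "modified_at", "mode": "incremental"},
    {"source": "sales.regions", "watermark_col": None,          "mode": "full"},
]

def build_extract_query(cfg, last_watermark=None):
    """Return the SQL one parameterized pipeline activity would run.

    Incremental entries filter on the watermark column; full-load entries
    (and first runs, where no watermark exists yet) pull the whole table.
    """
    if cfg["mode"] == "incremental" and last_watermark is not None:
        return (f"SELECT * FROM {cfg['source']} "
                f"WHERE {cfg['watermark_col']} > '{last_watermark}'")
    return f"SELECT * FROM {cfg['source']}"
```

One pipeline definition then loops over the config rows, so onboarding a new source table is a metadata change rather than new pipeline code; the same idea extends to CDC by storing the last processed change sequence instead of a timestamp.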
Requirements:
- Bachelor's degree in Information Systems, Computer Science, Engineering, or related field (or equivalent experience)
- 6–10+ years in data warehousing/analytics with deep dimensional modeling (Kimball) and production T-SQL expertise
- Hands-on experience across Microsoft Fabric: OneLake, Lakehouse, Warehouse, Dataflows Gen2, Pipelines, Notebooks (PySpark), and Power BI semantic models
- Strong engineering practices: version control, code reviews, documentation, CI/CD
- Excellent client-facing communication, estimation, and stakeholder management skills; ability to work independently and collaboratively on multi-disciplinary teams