Workiva is seeking a Staff Data Engineer to lead the implementation of high-value, data-driven features within its product domain. The role involves designing and executing data solutions that power customer-facing applications, advanced analytics, and AI/ML features, while collaborating with Product Owners and Application Engineers.
Responsibilities:
- Drive Product Value: Architect and build high-performance data solutions that directly power customer-facing features, utilizing the internal data platform (Snowflake, dbt, Kafka)
- End-to-End Execution: Lead the design and delivery of complex data projects independently, from initial discovery with stakeholders to production deployment and monitoring
- Stakeholder Alignment: Partner with Product Managers and Business Leaders to translate customer needs into technical data requirements, ensuring "Data-as-a-Product" excellence
- Bridge the Gap: Work embedded with Application Engineering teams to advocate for upstream data quality and ensure application architectures support downstream data needs
- Influence the Platform: Act as the "Lead Customer" for the internal Data Platform team, identifying gaps in the platform's capabilities and contributing to its strategic roadmap based on product requirements
- Best Practices: Establish and evangelize standards for data modeling, observability, and performance within the product domain
- Scalable Delivery: Design resilient, production-grade pipelines using DLT and Snowpipe that handle enterprise-scale workloads with low latency
- Complex Transformations: Own the domain’s dbt layer, ensuring code is modular, tested, and optimized for high-performance serving in Snowflake
- Mentorship: Guide and elevate Senior and Mid-level engineers on the team through code reviews, design docs, and technical coaching
Requirements:
- 8+ years of experience in Data Engineering, with a strong emphasis on building solutions for customer-facing products or applications
- Independent Execution: Proven ability to lead large-scale projects from concept to completion with minimal supervision, navigating ambiguity across multiple teams
- Strategic Communication: Exceptional ability to evangelize data strategy to non-technical stakeholders and influence application engineers on data best practices
- Mastery of the Stack: Deep expertise in Snowflake, dbt, and Kafka for building real-world, high-value data products
- Software Mindset: Strong Python and SQL skills, with a focus on building reusable libraries, automation, and CI/CD-driven workflows
- Product Sense: Experience working in a SaaS product environment, understanding how data latency and accuracy impact the end-user experience
- Data Modeling: Advanced understanding of modeling for both analytical (OLAP) and application-support (high-concurrency) use cases
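As a concrete instance of the "Software Mindset" requirement above, the kind of reusable library code it describes might look like the following: a small retry decorator with exponential backoff that pipeline jobs can share instead of copy-pasting error handling. This is a generic, hypothetical sketch, not a Workiva utility.

```python
import functools
import time

# Hypothetical reusable-library sketch: retry a flaky operation (e.g. a
# network call in a pipeline task) with exponential backoff between attempts.

def retry(times: int = 3, base_delay: float = 0.1, exceptions=(Exception,)):
    """Decorator: re-run the wrapped function up to `times` attempts."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(times):
                try:
                    return fn(*args, **kwargs)
                except exceptions:
                    if attempt == times - 1:
                        raise  # out of attempts: surface the error
                    time.sleep(base_delay * (2 ** attempt))
        return wrapper
    return decorator

@retry(times=5, base_delay=0.5, exceptions=(ConnectionError,))
def load_batch(batch_id: str) -> None:
    """Example call site: a pipeline step that may hit transient failures."""
    ...
```

Packaging behavior like this once, with tests, is what lets a CI/CD-driven workflow enforce consistency across many pipelines.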