The Shade Store has handcrafted window treatments for 75 years, with a focus on providing an effortless experience for customers. The company is seeking an Analytics Engineer to design, build, and maintain its modern data stack, ensuring high-quality data modeling and supporting cross-functional teams with data-driven insights.
Responsibilities:
- Build, optimize, and maintain production-grade data models in dbt, following analytics engineering best practices (modularity, DRY principles, testing, documentation, CI/CD)
- Own Snowflake data transformations and warehouse objects, ensuring reliable, performant data that downstream teams trust
- Model and integrate data from diverse source systems including proprietary internal applications, Salesforce, and third-party platforms spanning payments, telephony, HR, expense management, and customer engagement
- Implement and manage data quality frameworks (dbt schema and data tests, audits, source freshness checks)
- Write clean, efficient SQL for complex data transformations across large datasets
- Develop and maintain standardized metrics, KPIs, and business definitions across departments to ensure a single source of truth
- Partner with stakeholders to translate ambiguous business questions into structured data requirements and well-designed semantic models
- Build Looker Explores, LookML views, and dashboards that support enterprise reporting and self-service analytics; own and evolve the governed semantic layer that anchors enterprise metrics
- Provide guidance and governance around KPI definitions, UTM/campaign taxonomy, product hierarchies, and financial and operational metrics
- Collaborate with Finance, Marketing, Operations, and other business units to gather requirements, model data, and align on metric standards
- Drive structured analytics engineering projects, from requirement gathering to deployment, using strong project management practices
- Communicate proactively with stakeholders to clarify requirements, share updates, and ensure alignment
- Serve as a subject-matter expert for data modeling, dbt, Snowflake, and Looker
- Enforce data documentation standards and maintain clear lineage in dbt and Looker
- Participate in code reviews and contribute to team development standards
- Help shape and evolve the data architecture supporting the enterprise data warehouse
- Support forecasting and advanced analytics initiatives by preparing, curating, and modeling the underlying datasets required for demand planning, financial modeling, and performance analysis
Requirements:
- Bachelor's degree in Computer Science, Data Science, Engineering, Analytics, Business, or equivalent practical experience
- 3+ years of experience in analytics engineering, data warehousing, BI development, or SQL-heavy analytics roles
- Strong command of dbt (data modeling, macros, tests, documentation, deployment)
- Expert-level SQL and strong understanding of analytic warehouse concepts (dimensional modeling, star/snowflake schemas, SCDs, semantic layers)
- Hands-on experience with Snowflake or a similar cloud data warehouse
- Intermediate to advanced experience building semantic models and dashboards in Looker / LookML
- Demonstrated experience partnering with cross-functional teams to define KPIs and business logic
- Experience with Git-based workflows, CI/CD, code review practices, and modern SDLC methodologies
- Strong communication skills with the ability to translate technical concepts into business language
- Working proficiency in Python for data pipeline development, automation, and lightweight analytical tasks
- Familiarity with ELT/ETL ingestion and orchestration tools (Airflow, Dagster, Prefect, Fivetran)
- Background in marketing analytics, e-commerce analytics, or retail/operational reporting
- Experience supporting data governance, data quality frameworks, or enterprise metric catalogs
- Hands-on experience with demand forecasting, financial modeling, or statistical analysis methods
- Experience with AI-powered coding and productivity tools (e.g., Claude Code, OpenAI Codex, GitHub Copilot) and familiarity with emerging AI integration patterns such as MCP (Model Context Protocol), agentic workflows, or AI-assisted data development