Alpaca is a US-headquartered self-clearing broker-dealer and brokerage infrastructure company seeking an Analytics Engineer to own and execute the vision for its data transformation layer. This role involves designing, building, and maintaining scalable data models while collaborating with various teams to deliver reliable data products.
Responsibilities:
- Own the Transformation Layer: Design, build, and maintain scalable data models using dbt and SQL to support diverse business needs, from monthly financial reporting to near-real-time operational metrics
- Set Technical Standards: Establish and enforce best practices for data modelling, development, testing, and monitoring to ensure data quality, integrity (up to cent-level precision), and discoverability
- Enable Stakeholders: Collaborate directly with finance, operations, customer success, and marketing teams to understand their requirements and deliver reliable data products
- Integrate and Deliver: Create repeatable patterns for integrating our data models with BI tools and reverse ETL processes, enabling consistent metric reporting across the business
- Ensure Quality: Champion high standards for development, including robust change management, source control, code reviews, and data monitoring as our products and data evolve
Requirements:
- 4+ years of experience in analytics engineering or data engineering with a strong focus on the 'T' (transformation) in ELT
- Proven track record of owning data products end-to-end, applying analytics and data engineering best practices to ensure data quality, scalability, and robust data models
- Comfortable working with ambiguity and collaborating with stakeholders to define requirements; able to take ownership with minimal oversight in a fast-paced environment
- Experience proactively identifying and implementing improvements to data warehouse performance and ETL efficiency
- Expert-level SQL and dbt skills for complex queries and data transformations
- Proficiency in Python for transformations that extend beyond SQL
- Hands-on experience with query optimization across OLTP and OLAP systems (e.g., Postgres, Iceberg)
- Proficiency with semantic layer modelling (e.g., Cube, dbt Semantic Layer)
- Experience owning CI/CD workflows and establishing team-wide standards for version control and code review (e.g., Git)
- Familiarity with cloud environments (GCP or AWS)
- Experience with data ingestion tools (e.g., Airbyte) and orchestration tools (e.g., Airflow)
- Domain experience in brokerage operations, or a passion for financial markets and modelling financial datasets