Define and evolve data architecture standards for analytics and reporting, including data modeling, naming conventions, schema design, and documentation practices across the organization
Own the data catalog and metadata strategy, partnering with stakeholders to define, name, and organize data assets across multiple domains and source systems
Collaborate closely with Principal Data Engineering leadership and application engineering teams to align on ELT patterns, Snowflake usage, schema evolution, and analytical data modeling practices
Contribute hands‑on through SQL and Python, developing reference data models, prototypes, templates, and example implementations that demonstrate architectural intent
Support and enable data analysts across a wide range of technical skill levels by establishing consistent data usage practices, modeling standards, and shared definitions
Partner with application engineers on schema design to support rapid application development and reliable integration between operational and analytical data systems
Support PostgreSQL (including AWS Aurora) and Snowflake data modeling and analytical access patterns in collaboration with platform and database stakeholders
Establish and promote data governance practices covering data quality, ownership, lifecycle management, and schema change management
Drive incremental delivery of data architecture improvements, aligning short‑term progress with a clear long‑term architectural vision
Design high-throughput ingestion pipelines (using tools such as InfluxDB, TimescaleDB, or Snowflake) capable of handling millions of data points per second from globally distributed battery sites
Ensure data can be seamlessly ingested from various industrial protocols such as Modbus, CAN bus, or DNP3, and translated into standardized cloud formats
Ensure data architectures comply with grid-specific regulations (such as NERC CIP), including mandates for on-site data storage to support grid resilience
Help set the vision and roadmap for the enterprise data strategy and communicate it across the company
Requirements
8+ years of experience in data engineering, data architecture, or analytics platform development, with demonstrated ownership of cross‑team data standards and models
Strong expertise in analytical data modeling, including dimensional modeling, semantic layers, and schema design for multi‑consumer analytics use cases
Deep working knowledge of SQL and experience collaborating on or authoring complex analytical queries and models
Proven experience designing or contributing to analytics platforms built on Snowflake or similar cloud data warehouses using ELT‑based architectures
Experience defining and operationalizing data catalogs, metadata, and shared definitions, including naming conventions, ownership models, and documentation practices
Experience designing or supporting data governance frameworks, including data quality, ownership, lifecycle management, and schema change management
Experience supporting application teams on schema design to enable rapid application development and reliable data integration
Familiarity with analytics engineering practices and tools (e.g., dbt or similar modeling frameworks)
Proficiency with Python for data analysis, modeling, validation, or prototyping (not limited to production pipeline code)
Experience partnering closely with data engineers on ELT patterns, schema evolution, and data quality practices
Experience working with PostgreSQL or compatible systems (including managed services such as AWS Aurora) and understanding how operational schemas interact with analytical models
Knowledge of AWS data services and cloud‑native data patterns strongly preferred
Demonstrated ability to work effectively with data analysts across a wide range of technical skill levels, including analysts with limited engineering backgrounds
Strong communication skills and a track record of cross‑functional collaboration with engineering, analytics, and business stakeholders
Experience working in environments with large, diverse analyst populations and high data consumption across teams preferred
Proven ability to deliver incremental architectural improvements while maintaining a clear long‑term vision for data consistency and scalability
Background in energy, finance, trading, or other data-intensive, operationally complex domains preferred
Tech Stack
AWS
Cloud
Postgres
Python
SQL
Benefits
Highly competitive total compensation
Flexible work-from-home or hybrid work arrangements
Unlimited vacation
Work from home stipend
Educational assistance
Parental leave
Highly engaging company culture with opportunities for in-person connection, learning, and growth