Arcadia, a global utility data and energy solutions platform, is seeking a visionary leader for its Data and Analytics engineering team. The role involves defining the long-term architecture of the Data Platform, driving innovation, and managing large-scale data operations to support enterprise clients' energy challenges.
Responsibilities:
- Define the organizational structure and build, lead, and scale a successful data and analytics engineering organization, including outstanding managers and individual contributors across Data Analytics and Data Engineering
- Drive technical architecture, decision-making, and execution of data and analytics engineering work. As a deep technical expert on the latest technologies and best practices, you quickly outline technical plans, align stakeholders, and drive rapid, high-quality engineering work
- Align engineering data investments with company business goals and long-term strategy, and restructure the organization as needed to support data-centric product development
- Establish and drive department-wide standards for data platform governance and development, ensuring we build products that give our customers the insight they need to run their businesses
- Own cross-organizational data platform development and establish a multi-year strategy around data infrastructure, enterprise data modeling, and processing capabilities. Ensure we are thoughtful about what data we collect, growing it where useful, and using it to its full potential
- Lead the optimization and evolution of our Snowflake-based data architecture to handle exponential data growth
- Own the enterprise unified data model and architecture that will power all of Arcadia’s applications and use cases
- Enable data platform development and usage across all of R&D, empowering other engineers, analysts, and product managers to contribute to and best utilize the data platform
- Proactively recognize and remove potential people, process, or technology roadblocks to facilitate delivery of products across departments
Requirements:
- You are an expert Data & Analytics leader with demonstrated experience processing and analyzing large-scale datasets (billions of records)
- Demonstrated strong judgment and decision-making without the complete picture, both in routine business and in high-pressure situations
- Deep expertise with Snowflake as a data platform, including performance optimization, cost management, and architecting for scale
- Hands-on experience with our modern data stack: dbt for transformation, Hex for analytics, and Fivetran/Airbyte for data ingestion
- You have built and led Data & Analytics teams at high-growth SaaS companies, specifically those dealing with high-volume data processing
- Experience with utility data, billing systems, or similar high-volume transactional data is highly valued
- A strong communicator who uses data narratives to explain the solutions they build
- You have strong process leadership skills, but know when process layers become too heavy
- 12+ years in the workforce with significant experience in data-intensive environments
- You are also a 'doer' and are not afraid to roll up your sleeves
- You have top-notch technical skills covering both data and quantitative techniques: data fluency, descriptive analytics, and predictive modeling
- SQL and Python are a must, with demonstrated ability to write optimized queries for large-scale data processing
- You are business-focused, with a strong tether to real-world needs and a focus on solving actual problems rather than areas of conceptual interest
- Eager to work in a fast-paced, growth setting. Flexible and fluid, you make things work and operate efficiently in a cross-functional environment
- You build and deliver data products. You are outcome driven with strong intuition for business; you pick the practical over the fancy
- Experience with data governance, security, and compliance in handling sensitive customer data
- Experience with real-time or near-real-time data processing systems
- Knowledge of cloud platforms (AWS, Azure, or GCP) and their data services
- Experience with orchestration tools (Airflow, Dagster, or similar)
- Background in energy, utilities, or sustainability sectors