Own enterprise data architecture across conceptual, logical, and physical models, including data standards, naming conventions, and best practices.
Design reusable architecture patterns (CDC ingestion, SCD handling, lakehouse/warehouse layers, medallion architecture) and architect scalable ETL/ELT + ingestion frameworks for end-to-end integration and transformations.
Build and optimize warehouse/data mart architectures for analytics and reporting, with a strong focus on performance tuning (partitioning, clustering, indexing, query optimization).
Ensure data governance, quality, lineage, metadata management, and secure access controls (RBAC, encryption, masking).
Drive architecture reviews and technical decisions, and create and maintain documentation (architecture diagrams, data flows, ERDs, integration specs) to ensure alignment with enterprise standards.
Collaborate with business stakeholders and with data engineering, cloud, and BI teams to translate requirements into validated solutions.
Operate in a client-facing role with end-to-end ownership of data architecture and delivery in a high-stakes, high-impact environment.
Requirements
8+ years of experience in data architecture and/or data engineering with strong architecture ownership
Strong knowledge of data modeling (3NF, star schema, snowflake schema, dimensional modeling)
Expertise in SQL and database design principles
Strong experience with data warehousing concepts and best practices
Hands-on experience with at least one modern cloud data platform (Warehouse/Lakehouse): BigQuery / Databricks / Snowflake / Redshift
Experience with ETL/ELT tools and orchestration frameworks
Knowledge of data governance tools (Collibra, Alation, Purview) is a plus
Exposure to API-based ingestion and streaming (Kafka / PubSub) is a plus
Strong communication and stakeholder management skills
Bachelor’s/Master’s degree in Computer Science / IT / Engineering or equivalent experience