Architect solutions for data ingestion, transformation and orchestration, data storage, and analytics, reporting, and ML/AI consumption for Danaher-wide data models
Build trusted, scalable, secure, compliant, and reusable data products in Snowflake that support AI/ML models, executive dashboards, and cross-OpCo analytics
Establish data engineering standards, best practices, and reference architectures
Enable self-service access through semantic layers, catalogues, and APIs
Drive continuous improvement by implementing automated data quality checks and scaling ingestion/onboarding for M&A integrations
Requirements
Minimum 15 years of progressive experience creating enterprise IT architecture and ecosystems
Minimum 10 years of Data & Analytics experience
Deep expertise with modern data technologies (Snowflake, BigQuery, Databricks, Microsoft Fabric/Azure Synapse, Airflow, Spark)
Hands-on experience with cloud data ecosystems (AWS, Azure)
Deep understanding of data modeling, ETL/ELT patterns, and orchestration frameworks
Effective communication skills and ability to influence cross-functional stakeholders
Ability to travel to Washington, DC and global Danaher sites, up to 10%