EP Wealth Advisors is a wealth management advisory firm seeking a Data Engineer to contribute to its first enterprise data platform initiative. The role involves designing, developing, and implementing a modern, cloud-based data environment powered by Snowflake, in close collaboration with internal teams and external partners.
Responsibilities:
- Collaborate closely with EP Wealth’s Snowflake implementation partner to design, configure, and deploy the firm’s first enterprise data platform
- Participate in architecture design sessions, technical planning, and milestone reviews with both internal stakeholders and external partners
- Develop and maintain efficient, secure, and scalable ETL/ELT data pipelines to ingest data into Snowflake
- Build and optimize Snowflake data models, schemas, and role-based access controls
- Define and enforce Snowflake best practices for performance optimization, cost management, and data security
- Support environment setup, data migration, and initial production rollout activities in coordination with the implementation partner
- Integrate data from multiple enterprise systems including CRM, portfolio management, financial planning, and custodial platforms
- Design data models and reusable ingestion frameworks that support analytics and reporting use cases
- Contribute to the development of a unified enterprise data model and metadata framework
- Collaborate with the implementation partner to align technical design with EP Wealth’s business strategy and compliance standards
- Implement data validation, monitoring, and auditing to ensure data accuracy, completeness, and reliability
- Collaborate with IT leadership to define and operationalize data governance standards
- Troubleshoot and resolve data pipeline issues while maintaining system performance and uptime
- Conduct query and warehouse optimization to ensure efficient Snowflake usage
- Work closely with data analysts and technology teams to deliver accessible, well-documented data sets
- Participate in design reviews, sprint planning, and architectural discussions with the implementation partner
- Identify opportunities to extend Snowflake functionality through features such as Snowpark, data sharing, or automation tools
- Stay informed on Snowflake’s evolving ecosystem and cloud data engineering best practices
Requirements:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related technical field
- 5+ years of data engineering experience; hands-on Snowflake experience strongly preferred
- Demonstrated success contributing to or co-delivering a cloud data platform implementation alongside external partners or vendors
- Experience building scalable data pipelines and transformations for analytics and operational reporting
- Advanced proficiency in SQL, with deep experience in Snowflake query optimization and data modeling
- Proficiency in Python or another scripting language for data processing and automation
- Experience with ETL/ELT tools and orchestration platforms (e.g., dbt, Airflow, Dagster)
- Familiarity with cloud infrastructure (AWS, GCP, or Azure) and Snowflake integrations
- Experience with enterprise data integration from CRM, portfolio management, and financial planning systems (e.g., Salesforce, Tamarac, Orion)
- Understanding of data governance, lineage, and observability tools
- Strong understanding of Snowflake architecture, data sharing, and multi-cluster warehouse capabilities
- Experience defining data security and access policies in regulated environments
- Ability to bridge technical and business needs through thoughtful data design and collaboration