Snowflake Data Architect / Administrator
Plymouth Rock Assurance
Woodbridge, NJ or Boston, MA (4 days/week onsite)
A face-to-face (F2F) interview is required
Overview
We are seeking a Snowflake Data Architect / Administrator to lead the design, implementation, and administration of a modern cloud data platform. This role will focus on Snowflake architecture, ETL modernization, and migration from legacy systems using DBT and Informatica.
Key Responsibilities
- Architect and manage scalable, secure, and high-performance data solutions on Snowflake
- Lead end-to-end migration from on-premises data platforms to Snowflake, including assessment, planning, and execution
- Design and oversee ETL/ELT pipelines using DBT and Informatica PowerCenter
- Define data models, architecture standards, and best practices for data warehousing and data lakes
- Administer Snowflake environment, including performance tuning, resource optimization, and cost management
- Implement and manage Snowflake features such as Snowpipe, Streams, Tasks, UDFs, and external tables (see the illustrative sketch after this list)
- Design and enforce data security, governance, and access control policies
- Oversee ingestion of structured and semi-structured data (CSV, JSON, Parquet) from cloud storage
- Provide technical leadership, mentor developers, and guide architectural decisions
- Collaborate with cross-functional teams in an Agile environment
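
For illustration only, the sketch below shows the kind of ingestion pattern the responsibilities above describe: an external stage feeding a raw table via Snowpipe, with a Stream and a scheduled Task flattening newly loaded JSON records. All object names (raw_stage, policy_raw, policy_pipe, policy_stream, policy_flat, policy_merge_task, transform_wh) and the bucket path are hypothetical placeholders, and storage-integration credentials and cloud event-notification setup are omitted.

    -- Hypothetical example: object names and the bucket path are placeholders.
    -- Storage-integration credentials and S3 event notifications are omitted.
    CREATE STAGE IF NOT EXISTS raw_stage
      URL = 's3://example-bucket/policies/'
      FILE_FORMAT = (TYPE = JSON);

    CREATE TABLE IF NOT EXISTS policy_raw (record VARIANT);
    CREATE TABLE IF NOT EXISTS policy_flat (policy_id STRING, premium NUMBER(12,2));

    -- Snowpipe: continuously loads new JSON files as they land in the stage
    CREATE PIPE IF NOT EXISTS policy_pipe AUTO_INGEST = TRUE AS
      COPY INTO policy_raw FROM @raw_stage;

    -- Stream: captures rows added to the raw table since the last consumption
    CREATE STREAM IF NOT EXISTS policy_stream ON TABLE policy_raw;

    -- Task: runs every 15 minutes, but only when the stream has new data,
    -- and flattens the semi-structured records into typed columns
    CREATE TASK IF NOT EXISTS policy_merge_task
      WAREHOUSE = transform_wh
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('policy_stream')
    AS
      INSERT INTO policy_flat
      SELECT record:policy_id::STRING, record:premium::NUMBER(12,2)
      FROM policy_stream;

    -- Tasks are created suspended; resume to start the schedule
    ALTER TASK policy_merge_task RESUME;

Because the Task inserts from the Stream, each run advances the Stream offset, so only records loaded since the previous run are processed.
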
Required Skills & Experience
- Strong expertise in Snowflake architecture and administration
- Hands-on experience with DBT (Data Build Tool) and Informatica PowerCenter (8.x/9.x/10.x)
- Proven experience in data platform migration to Snowflake
- Advanced knowledge of SQL, PL/SQL, and data warehousing concepts
- Experience with Snowflake utilities (SnowSQL, Snowpipe, Streams, Tasks, UDFs)
- Familiarity with AWS and/or Azure cloud platforms
- Experience in performance tuning, query optimization, and cost control
- Working knowledge of Python, Control-M, GitHub, and UNIX environments
- Experience working in Agile methodologies
Preferred Qualifications
- Experience converting legacy ETL workflows into DBT models (an illustrative example follows this list)
- Strong understanding of modern data architecture patterns (ELT, lakehouse, real-time pipelines)
- Prior experience in a lead or architect role
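
As a rough illustration of the DBT-conversion work mentioned above, the sketch below re-expresses a typical legacy Informatica filter-and-expression mapping as a dbt model. The model, source, and column names (fct_claims, legacy_dw.claims_raw, etc.) are hypothetical, and the source would still need to be declared in a dbt sources .yml file.

    -- models/marts/fct_claims.sql  (hypothetical names; illustrative only)
    -- Re-expresses a legacy filter + expression mapping as a dbt model.
    {{ config(materialized='table') }}

    select
        claim_id,
        policy_id,
        upper(status)                               as status,
        coalesce(paid_amount, 0)                    as paid_amount,
        datediff('day', reported_date, closed_date) as days_to_close
    from {{ source('legacy_dw', 'claims_raw') }}
    where status is not null

In dbt, lineage and run order come from the source()/ref() calls rather than from Informatica workflow links, which is what makes conversions like this maintainable at scale.
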