Per Scholas is a nonprofit organization dedicated to driving mobility and opportunity through tech training. They are seeking a Data Engineer to maintain and expand their enterprise data infrastructure, focusing on managing data flow into their Snowflake Data Lake and supporting reporting and visualization efforts.
Responsibilities:
- Data Pipeline Management
  - Set up and monitor automated data ingestion connectors using Fivetran
  - Troubleshoot sync errors and work with API providers to ensure continuous data flow
  - Write and optimize SQL transformations within Snowflake to prepare data for reporting
- Data Lake Maintenance
  - Maintain the structure of the Snowflake Data Lake (schemas, tables, and views) as defined in partnership with InterWorks
  - Perform regular data quality checks to ensure the 'source of truth' remains accurate
  - Monitor warehouse performance and assist in managing Snowflake credit consumption
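To make the data quality responsibility concrete, here is a minimal Python sketch of the kind of automated check this role might run. To stay self-contained it validates in-memory rows rather than querying Snowflake, and the field names and null-rate threshold are hypothetical examples:

```python
# Minimal sketch of an automated data-quality check. In practice the rows
# would come from a Snowflake query; here they are in-memory dicts so the
# example runs standalone. Field names and thresholds are hypothetical.

def check_quality(rows, required_fields, max_null_rate=0.05):
    """Return a list of human-readable problems found in `rows`."""
    problems = []
    if not rows:
        problems.append("table is empty")
        return problems
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        rate = nulls / len(rows)
        if rate > max_null_rate:
            problems.append(f"{field}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return problems

# Example: two of five learner records are missing an email address.
learners = [
    {"id": 1, "email": "a@example.org"},
    {"id": 2, "email": None},
    {"id": 3, "email": "c@example.org"},
    {"id": 4, "email": ""},
    {"id": 5, "email": "e@example.org"},
]
print(check_quality(learners, ["id", "email"]))  # ['email: null rate 40% exceeds 5%']
```

A check like this would typically be scheduled after each Fivetran sync, with failures surfaced to the team rather than silently logged.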
- Reporting & Visualization Support
  - Build and maintain 'Published Data Sources' in Tableau for use by various departments
  - Assist team members in building complex Tableau dashboards by providing the necessary back-end data structures
  - Translate business requests into technical requirements for new data pipelines
- Privacy & Compliance
  - Execute data masking and access control policies to ensure compliance with data privacy frameworks
  - Assist in maintaining documentation for data lineage and the organizational data dictionary
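As an illustration of the masking responsibility, here is a minimal Python sketch of role-based column masking. Snowflake itself enforces this with SQL masking policies; the Python version only demonstrates the rule. The role names and redaction format are hypothetical:

```python
# Minimal sketch of role-based data masking, the same idea a Snowflake
# masking policy expresses in SQL: privileged roles see the raw value,
# everyone else sees a redacted one. Role names here are hypothetical.

PRIVILEGED_ROLES = {"DATA_ENGINEER", "COMPLIANCE"}

def mask_email(value, role):
    """Show the full email only to privileged roles; redact it otherwise."""
    if role in PRIVILEGED_ROLES:
        return value
    local, _, domain = value.partition("@")
    return f"{local[0]}***@{domain}"

print(mask_email("learner@example.org", "ANALYST"))        # l***@example.org
print(mask_email("learner@example.org", "DATA_ENGINEER"))  # learner@example.org
```

In production this logic lives in the warehouse (so every client sees consistent masking), not in application code.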
Requirements:
- Excellent communication skills, with the ability to convey technical requirements to non-technical audiences
- A clear understanding of data modeling and best practices
- 2–4 years of professional experience in data engineering or a highly technical data analyst role
- Experience with CI/CD pipelines
- Snowflake: Hands-on experience querying and managing tables within a Snowflake environment
- Fivetran: Proficiency in setting up and managing SaaS connectors
- dbt: Experience writing and implementing data transformations with dbt (Data Build Tool)
- Tableau: Experience in data modeling for Tableau (calculated fields, joining data sets, and optimizing for performance)
- Technical Skills: Strong SQL skills are mandatory. Experience with Python for basic scripting is preferred
- Mission-Driven: A genuine interest in using technology to support Per Scholas' mission of closing the opportunity gap
- Familiarity or experience with OpenFlow data ingestion (nice to have)