AppFolio is a technology leader in the real estate industry, pioneering cloud and AI solutions. The Staff Data Science Engineer will be responsible for the operational backbone of the Business Data Platform, ensuring resilient and scalable systems that support analytics and business decision-making.
Responsibilities:
- Own the design, build, and maintenance of observability and testing solutions for AppFolio’s Business Data Platform, identifying reliability, scalability, and data quality issues and driving continuous improvement of the platform
- Build, monitor, and maintain CI/CD pipelines and orchestration tools to ensure timely and accurate data delivery
- Manage service accounts, roles, and permissions across Snowflake, dbt, and BI tools
- Own the definition of data pipeline SLOs and the platform’s performance against them
- Create and enforce data access controls, masking policies, and encryption standards (at rest and in transit)
- Partner with InfoSec and Compliance to ensure auditability and privacy frameworks are implemented and upheld across the Business Data Platform
- Maintain lineage and metadata documentation across key data domains
- Implement and manage infrastructure-as-code for data platform components
- Drive automation of routine operations (e.g., environment provisioning, credential rotation, usage monitoring)
- Partner with data engineers, data science engineers, data scientists, and analysts to improve platform usability and self-service capabilities
- Document and evangelize best practices for data access, job orchestration, and environment management
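As a rough illustration of the SLO ownership described above, the sketch below evaluates pipeline freshness against per-pipeline staleness budgets. The pipeline names and thresholds are hypothetical, not AppFolio's actual SLOs.

```python
from __future__ import annotations
from datetime import datetime, timedelta, timezone

# Hypothetical freshness SLOs: each pipeline's latest load must be no
# older than its agreed maximum staleness. Names/values are illustrative.
SLOS = {
    "fct_billing": timedelta(hours=2),
    "dim_property": timedelta(hours=24),
}

def evaluate_freshness(last_loaded: dict[str, datetime],
                       now: datetime | None = None) -> dict[str, bool]:
    """Return True per pipeline when its last load is inside the SLO window."""
    now = now or datetime.now(timezone.utc)
    return {
        name: (now - last_loaded[name]) <= max_staleness
        for name, max_staleness in SLOS.items()
    }
```

In practice a check like this would run on a schedule (e.g., from an Airflow DAG) and emit metrics or alerts rather than return a dict.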
Requirements:
- 6+ years in DataOps, Data Platform Engineering, DevOps, or related roles within modern cloud data environments
- Proficiency in data engineering tools and technologies: SQL, Python, Airflow, MuleSoft, Linux scripting, and dbt
- Strong experience with cloud technology, especially the AWS stack (S3, EC2, EKS), Docker, and Kubernetes
- Experience with cloud data warehouse technology, such as Snowflake, including data security and governance
- Advanced proficiency with Airflow
- Strong knowledge of CI/CD pipelines, GitOps workflows, Codespaces, and infrastructure-as-code (e.g., Terraform)
- Experience building self-service processes and tooling
- Familiarity with security standards and practices for data (IAM, encryption, audit logging)
- Familiarity with Snowflake RBAC (role-based access control)
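As a small illustration of the Snowflake RBAC familiarity the role calls for, the sketch below generates the GRANT statements for a simple read-only functional role, following the common pattern of granting USAGE on container objects plus privileges on current and future tables. The role, database, and schema names are hypothetical.

```python
from __future__ import annotations

def rbac_grants(functional_role: str, database: str, schema: str,
                privileges: tuple[str, ...] = ("SELECT",)) -> list[str]:
    """Build GRANT statements for a hypothetical functional role.

    Grants USAGE on the database and schema, then each privilege on
    both existing (ALL) and future tables in the schema.
    """
    fq_schema = f"{database}.{schema}"
    stmts = [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {functional_role}",
        f"GRANT USAGE ON SCHEMA {fq_schema} TO ROLE {functional_role}",
    ]
    for priv in privileges:
        stmts.append(
            f"GRANT {priv} ON ALL TABLES IN SCHEMA {fq_schema} "
            f"TO ROLE {functional_role}")
        stmts.append(
            f"GRANT {priv} ON FUTURE TABLES IN SCHEMA {fq_schema} "
            f"TO ROLE {functional_role}")
    return stmts
```

In a real deployment these grants would typically be managed declaratively (e.g., via Terraform) rather than generated ad hoc, so that role hierarchies stay auditable and reproducible.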