TopDog Law is a nationally scaling, fully integrated personal injury law firm built for impact, excellence, and growth. The Senior Data Engineer will help build and scale the data platform that powers analytics, reporting, and data-driven decision-making across the organization.
Responsibilities:
- Design, build, and maintain reliable data ingestion pipelines from internal systems and third-party data sources
- Implement scalable ELT workflows that process and deliver data across the organization
- Maintain transformation pipelines and ensure reliable delivery of analytics-ready datasets
- Manage and optimize the performance, reliability, and scalability of the company’s cloud data warehouse environment
- Maintain orchestration frameworks and scheduling systems that support data workflows
- Optimize data pipeline performance, compute utilization, and system efficiency
- Implement monitoring, alerting, and observability across data pipelines and platform components
- Ensure data freshness and system uptime meet defined service expectations
- Diagnose and resolve production issues including pipeline failures, data quality issues, and performance bottlenecks
- Maintain version-controlled data infrastructure and CI/CD workflows for data pipelines
- Implement testing and validation practices to ensure data quality and reliability
- Contribute to documentation of data pipelines, system architecture, and platform dependencies
- Partner with the Director of Data to implement data architecture and platform improvements
- Support analytics and BI teams by ensuring reliable and well-modeled datasets are available for reporting and analysis
- Contribute engineering input to platform improvements and technical roadmap initiatives
Requirements:
- 5+ years of experience building and maintaining production data pipelines
- Strong SQL skills and experience working with large datasets
- Experience with modern cloud data warehouses such as Snowflake or BigQuery
- Experience building transformation workflows using dbt or similar tools
- Experience working with orchestration tools such as Airflow
- Strong understanding of data pipeline reliability, performance optimization, and scalability
- Experience using version control systems such as Git in collaborative development environments
- Experience supporting data infrastructure used for machine learning workflows
- Experience building feature pipelines or supporting predictive modeling workloads
- Experience using Python for data processing and pipeline development
- Experience implementing monitoring and observability for data systems
- Experience working in high-growth or rapidly scaling environments
- Strong written and verbal communication skills
- Ability to think critically, prioritize effectively, and execute with speed