Premier Truck Rental (PTR) is a family-owned company that provides customized commercial fleet rentals nationwide. The Data Engineer plays a critical role in designing, building, and maintaining reliable data pipelines to support PTR’s enterprise-wide analytics platform while collaborating with the Business Intelligence team.
Responsibilities:
- Pipeline Development: Design, build, and maintain robust ELT pipelines using Fivetran, dbt, and Astronomer (Airflow) to ensure accurate and timely data delivery to the Snowflake data warehouse
- Data Operations: Support existing data processes in legacy systems, ensuring data reliability and service continuity during transition and decommissioning
- Data Integration: Develop automated workflows to integrate data from multiple sources into Snowflake, working with external APIs, databases, and enterprise systems
- Transformation & Modeling: Implement and maintain dbt models aligned with dimensional design standards and business requirements defined by the Lead Data Architect and BI analysts
- Quality & Observability: Partner with the architect and BI teams to establish robust data quality and observability practices using Metaplane, ensuring trust in all published data assets
- Monitoring & Maintenance: Manage data pipeline runs, monitor system health, and troubleshoot issues related to performance, scheduling, and data freshness
- Collaboration: Partner cross-functionally with BI analysts and data stakeholders to understand business needs and ensure that engineering work supports business priorities
- Documentation & Versioning: Maintain clear documentation of data pipelines, data mappings, and transformation logic, leveraging Git-based version control and CI/CD practices
- Continuous Improvement: Identify opportunities to automate manual processes, optimize data performance, and enhance system scalability
- Legacy Transition Support: Assist in incremental migration of data and processes from on-premises or legacy systems to the modern cloud data platform
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent work experience)
- 5+ years of experience in data engineering, ETL/ELT development, or backend data platform operations
- Practical experience with Snowflake, Fivetran, and dbt in building and deploying production data pipelines
- Familiarity with Astronomer (Apache Airflow) or similar workflow orchestration tools for data pipeline scheduling and monitoring
- Strong SQL proficiency for data transformation, modeling, and optimization
- Experience maintaining and enhancing data pipelines in hybrid or legacy environments
- Hands-on experience optimizing query performance and managing large datasets
- Understanding of modern data modeling techniques (star and snowflake schemas)
- Experience with version control (Git) and CI/CD practices for data workflows
- Strong problem-solving skills, detail orientation, and an ownership mindset in production data operations
- Effective communication and collaboration skills with technical and non-technical stakeholders
- Experience with Metaplane or other data observability tools
- Familiarity with BI tools such as Sigma, Power BI, or Tableau
- Exposure to Python scripting for automation or light transformation logic
- Experience supporting legacy data warehouses (e.g., SQL Server, PostgreSQL) during modernization efforts
- Knowledge of cloud environments (AWS, Azure, or GCP) and associated data services
- Understanding of data governance, metadata management, and data privacy compliance standards