Pavion is a global leader in providing innovative fire, security, and communication integration solutions. The Data Engineer will support the organization's data strategy by designing, building, and maintaining reliable data pipelines and core data infrastructure that deliver accurate, timely data across business functions.
Responsibilities:
- Design, build, and maintain scalable data pipelines (ETL/ELT) and data models that support operational KPIs, financial performance, customer metrics, and growth initiatives, including ingestion via Fivetran, transformation and modeling with dbt, and processing of structured and unstructured data from blob storage, RDBMS, and NoSQL sources
- Extract, transform, and load data from ERP, CRM, field service systems, and blob or database storage, implementing data quality checks to ensure accuracy, completeness, and consistency, leveraging dbt tests and documentation where appropriate
- Provide clean, well-structured, and documented datasets, including analytics-ready dbt models, that support BI tools such as Power BI
- Partner with analytics and business teams to ensure data is accessible, understandable, and fit for reporting and analysis
- Work closely with analytics, IT, and business stakeholders to understand data requirements and support reporting, analytics, and M&A needs
- Assist with data integration for acquisitions, including onboarding new source systems, ingesting historical or ad hoc data delivered via blob storage or flat files, and supporting post-acquisition reporting
- Support the data foundation required to track value creation initiatives and operational improvements
- Implement data quality rules and operate within governed data environments; support master data management processes and enforce standards, naming conventions, and metadata management
- Identify opportunities to improve data architecture, automate pipelines, and strengthen data reliability over time, using Fivetran-managed connectors, dbt best practices, and blob storage as part of the data architecture
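The data-quality responsibilities above (checks for accuracy, completeness, and consistency after loading from ERP, CRM, or flat files) can be illustrated with a minimal, hypothetical sketch in plain Python. The posting names dbt tests as the actual mechanism; the function names, fields, and rules below are stand-ins chosen only to show the idea.

```python
# Hypothetical row-level data-quality checks of the kind a pipeline might run
# after loading an ERP or CRM extract. All names and rules are illustrative,
# not part of the actual Pavion stack.

def check_completeness(rows, required_fields):
    """Return rows missing any required field (None or empty string)."""
    return [
        row for row in rows
        if any(row.get(field) in (None, "") for field in required_fields)
    ]

def check_uniqueness(rows, key_field):
    """Return key values that appear more than once (a primary-key violation)."""
    seen, dupes = set(), set()
    for row in rows:
        key = row.get(key_field)
        if key in seen:
            dupes.add(key)
        seen.add(key)
    return sorted(dupes)

if __name__ == "__main__":
    sample = [
        {"order_id": "A1", "customer": "Acme", "amount": 120.0},
        {"order_id": "A2", "customer": "", "amount": 75.5},
        {"order_id": "A1", "customer": "Beta", "amount": 10.0},
    ]
    print(check_completeness(sample, ["order_id", "customer", "amount"]))
    print(check_uniqueness(sample, "order_id"))
```

In a dbt project the same intent is typically expressed declaratively (e.g. `not_null` and `unique` tests in a model's schema file) rather than as imperative code.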
Requirements:
- Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field
- 3 to 7+ years of experience in data engineering, analytics engineering, or a similar technical role
- Strong SQL skills and experience building and maintaining relational data models, including dbt-based transformations
- Experience working with cloud blob or object storage such as Azure Blob Storage, Amazon S3, or Google Cloud Storage
- Experience implementing and maintaining enterprise metadata and data catalogs
- Familiarity with BI tools such as Power BI or Tableau from a backend or data modeling perspective
- Experience validating, monitoring, and troubleshooting data quality issues
- Strong problem-solving skills and attention to detail
- Comfort working in a fast-moving, results-oriented environment
- Experience designing and maintaining data pipelines across multiple source systems, including ELT tools such as Fivetran and transformation frameworks such as dbt
- Familiarity with field service, operations, or financial systems
- Experience supporting M&A data integration or post-acquisition reporting
- Knowledge of Python or similar tools for data processing and automation
- Exposure to cloud data platforms or modern data stacks, including dbt, Fivetran, and blob-based data lake patterns
- Experience managing and integrating NoSQL databases, including columnar and document stores
- Experience with data catalog and metadata management tools such as Collibra, Alation, AWS Glue Data Catalog, etc.