ActiveCampaign is a customer experience automation company that helps businesses connect and engage with their customers through email marketing, marketing automation, and CRM tools. The Senior Data Engineer will design, build, and maintain data pipelines, enhance data models, and drive a data-first culture across teams to ensure information is reliable and accessible.
Responsibilities:
- Develop and Maintain Scalable Data Infrastructure. You will design, build, and optimize data pipelines using tools like Airflow and Athena, ensuring efficient data ingestion, transformation, and analysis while maintaining integrity and accessibility.
- Enhance Data Models and Visualization. You will expand the company-wide data model to improve reporting accuracy and develop user-friendly dashboards in tools such as Tableau to provide meaningful insights across departments.
- Ensure Data Quality and Documentation. You will implement monitoring systems to measure data quality, annotate ingestion feeds with accuracy metrics, and write documentation that helps teams access and interpret data effectively.
- Drive a Data-First Mindset Across Teams. You will collaborate with sales, marketing, product management, and engineering to uncover insights, answer complex business questions, and advocate for best practices in data usage and decision-making.
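As an illustration of the data-quality monitoring described above, here is a minimal sketch of annotating an ingestion feed with a completeness metric, using only the Python standard library (all names and the record schema are hypothetical, not ActiveCampaign code):

```python
from dataclasses import dataclass


@dataclass
class FeedQuality:
    """Quality annotations for one ingestion feed (hypothetical schema)."""
    total_rows: int
    complete_rows: int  # rows with every required field present and non-empty

    @property
    def completeness(self) -> float:
        """Fraction of rows with no missing required fields."""
        return self.complete_rows / self.total_rows if self.total_rows else 0.0


def annotate_feed(rows: list[dict], required: list[str]) -> FeedQuality:
    """Count rows in which every required field is present and non-empty."""
    complete = sum(
        1 for row in rows
        if all(row.get(field) not in (None, "") for field in required)
    )
    return FeedQuality(total_rows=len(rows), complete_rows=complete)
```

In practice a metric like this would be emitted per feed on each pipeline run, so dashboards and alerts can track data quality over time.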
Requirements:
- At least five years of experience working with ETL processes, data pipelines, and data infrastructure, demonstrating your ability to manage complex data systems
- Proficient in Python (preferred) or another programming language such as Go, Perl, Ruby, or Java, using code to automate and streamline data processes
- Experience working with data platforms such as Athena, Snowflake, or Redshift and orchestration tools such as Luigi, allowing you to build and optimize effective data storage solutions
- Skilled in visualization tools such as Tableau, Power BI, or Google Data Studio, ensuring that you can present data in a clear and meaningful way
- Strong analytical and problem-solving skills, with experience working with structured and unstructured datasets to uncover insights and trends
- Excellent communicator, both in writing and in speech, able to translate technical concepts into understandable information for non-technical audiences
- Practical approach to data ownership, ensuring that stakeholders have the resources and guidance they need to access and interpret data effectively
Nice to have:
- Experience with build engineering, especially for languages such as Java, Go, and Rust
- Familiarity with open-source projects and culture, especially dependency management and library maintenance
- Experience with open-source data processing tools such as Kafka, Hadoop, Airflow, PrestoDB, and dbt
- Background in cloud-based data engineering, particularly in AWS or GCP environments
- Previous experience working with sales or marketing analytics
- Contributions to open-source projects and communities
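The ETL experience listed above can be illustrated with a minimal extract-transform-load sketch in plain Python, using only the standard library (the CSV layout, table name, and field names are hypothetical examples):

```python
import csv
import io
import sqlite3


def extract(csv_text: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))


def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalize emails and drop rows with no id."""
    return [
        (int(row["id"]), row["email"].strip().lower())
        for row in rows
        if row.get("id")
    ]


def load(records: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: upsert records into a contacts table and return its row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS contacts (id INTEGER PRIMARY KEY, email TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO contacts VALUES (?, ?)", records)
    return conn.execute("SELECT COUNT(*) FROM contacts").fetchone()[0]
```

Real pipelines wrap steps like these in an orchestrator such as Airflow or Luigi, which adds scheduling, retries, and dependency tracking; the extract/transform/load separation itself stays the same.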