ActBlue is a nonprofit organization dedicated to creating cutting-edge technology that fuels Democratic victories and enables progressive causes to thrive. They are seeking a Senior Data Engineer II to contribute to building and evolving the data products that power their platform, focusing on integrating internal data with user-facing applications and deploying machine learning pipelines.
Responsibilities:
- Design, build, and maintain scalable, reliable, and secure data pipelines using Python, with a focus on enabling data access and insight across Product, Engineering, and Analytics teams
- Develop reusable data services and frameworks that support high-quality data ingestion, transformation, and ML model deployment — accelerating analytics and experimentation across the organization
- Collaborate with data scientists and ML engineers to productionize machine learning workflows using SageMaker, Vertex AI, or other MLOps tools
- Implement monitoring, testing, and CI/CD automation for data pipelines and ML services
- Own and evolve real-time and batch data integrations between ActBlue’s core systems and user-facing applications — influencing design decisions and architecture across multiple teams including Product, Engineering, and Analytics
- Develop, optimize, and support reverse ETL workflows using tools like Hightouch
- Participate in code reviews, mentor junior engineers, and help foster a high-trust engineering culture
- Demonstrate technical leadership through writing documentation, establishing effective monitoring, and fostering clear and audience-oriented communication
Requirements:
- 5+ years of relevant professional experience in data engineering or backend development with a strong focus on Python
- Expertise in writing clean, modular, tested, and production-ready Python code
- Strong understanding of data architecture, distributed systems, and security best practices
- Experience deploying and supporting production ML workflows (e.g., SageMaker, Vertex AI, or equivalent)
- Familiarity with ELT tools such as Fivetran and data modeling frameworks like dbt
- Solid command of SQL and experience working with large analytical databases (e.g., Redshift, PostgreSQL)
- Experience with monitoring and observability using Datadog or similar tools
- A team player mentality: you keep the end user in mind, welcome feedback from your teammates, and know when and how to defend your own ideas respectfully
- Commitment to ActBlue's mission and values, including equity, accessibility, and civic engagement
Nice to have:
- Experience with ML platforms like SageMaker, Vertex AI, TensorFlow, or Modelbit
- Experience with real-time data systems or streaming platforms
- Experience contributing to internal platforms or tooling used across engineering teams
- Experience implementing robust testing frameworks for data workflows (e.g., Pytest, dbt tests)
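To illustrate the last requirement, below is a minimal sketch of the kind of data-workflow test that Pytest (or dbt tests) would encode. The `validate_rows` helper, the column names, and the contracts checked are hypothetical examples, not part of ActBlue's actual stack:

```python
# Illustrative data-quality check for a pipeline's output.
# `validate_rows`, `donation_id`, and `amount_cents` are hypothetical names.

def validate_rows(rows):
    """Return a list of error strings for rows violating basic contracts:
    unique donation_id and strictly positive amount_cents."""
    errors = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row.get("donation_id") in seen_ids:
            errors.append(f"row {i}: duplicate donation_id")
        seen_ids.add(row.get("donation_id"))
        amount = row.get("amount_cents")
        if amount is None or amount <= 0:
            errors.append(f"row {i}: non-positive amount")
    return errors


def test_pipeline_output_is_clean():
    # In a real suite this fixture would come from a staging table or
    # a pipeline run; here it is inlined for brevity.
    rows = [
        {"donation_id": "a1", "amount_cents": 500},
        {"donation_id": "a2", "amount_cents": 2500},
    ]
    assert validate_rows(rows) == []
```

In practice such assertions would run in CI against staged pipeline output, with the equivalent dbt checks (`unique`, `not_null`, accepted ranges) declared in the model's schema file.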