Climate Cabinet is seeking a Contract Data Engineer to enhance and maintain their vital data infrastructure and tooling. This role is pivotal in identifying and leveraging opportunities to support the election of pro-climate candidates and the passage of impactful pro-climate legislation.
Responsibilities:
- Enhance and maintain Climate Cabinet's vital data infrastructure and tooling
- Design and maintain robust backend systems, data pipelines, and cloud-based storage and processing to support large-scale electoral and civic data analysis
- Ensure the accuracy and completeness of data covering 500,000+ local offices, and regularly update Climate Cabinet's databases with external sources such as LegiScan and Open States for up-to-date legislative, demographic, and climate data
- Build out the internal toolset with applications that streamline operations, such as custom database interfaces for tracking electoral data and fundraising activities
- Develop defensible, well-reasoned methodologies for calculating useful metrics
- Analyze data and generate visualizations to support the Political, Policy, Comms, and Development teams, upholding high standards for analysis quality and accurate data-driven storytelling
- Build out AI-powered tools that connect Climate Cabinet datasets, giving the broader team self-serve capabilities for policy analysis and data visualization
- Collaborate closely with the Tech Team, managing work via Jira and GitHub
- Take the lead on specific projects agreed upon with the team, and provide code review and input on projects led by other team members
Requirements:
- Proficiency in Python for data analysis and manipulation
- Experience with SQL and database management (e.g., PostgreSQL, MySQL)
- Fluency with GitHub for version control
- Experience building and managing data systems via Google Cloud Platform (preferred) or AWS
- Experience orchestrating ETL (Extract, Transform, Load) pipelines on cloud infrastructure (using Apache Airflow, Cloud Run, Cloud Functions, etc.)
- Ability to work with APIs and integrate various data sources
- Knowledge of data visualization tools (e.g., Tableau, Power BI, D3.js)
- Familiarity with political data sources and electoral processes (e.g., LegiScan, Open States, BillTrack50) and proficiency in data analysis, particularly with political data
- Knowledge of AI tools (preferred)
- Comfort using AI in your workflow (GitHub Copilot, Claude Code, etc.)
- Knowledge of building AI applications (MCPs, LLM integrations, etc.)