Design, develop, and maintain scalable data pipelines and ETL/ELT workflows to support humanitarian operational and analytical needs.
Build and optimize data models, schemas, and cloud-based data architectures to support dashboards, machine learning workflows, and advanced analytics.
Respond promptly to database issues and carry out periodic maintenance. Monitor system performance through regular testing and troubleshooting, and integrate new features as needed.
Integrate heterogeneous humanitarian data sources.
Implement data quality, validation, and monitoring processes to ensure accuracy, reliability, and transparency.
Stay current with trends and developments in database technologies and maintenance practices.
Maintain documentation, metadata, and data governance standards in alignment with State Department policies.
Occasionally respond quickly to urgent data needs arising from rapid-onset disasters or time-sensitive decision-making requests.
Explore and analyze complex humanitarian datasets to identify trends, anomalies, risks, and opportunities.
Produce actionable insights that inform policy, resource allocation, program design, and crisis response.
Support predictive analytics or light machine-learning workflows where appropriate.
Communicate analytical findings clearly to technical and non-technical audiences.
Serve as a subject-matter resource for data best practices, including how to access, interpret, and apply available datasets.
Work closely with the team's business analyst and HA decision-makers to understand their data needs and translate them into analytical or technical solutions.
Provide training, informal guidance, and practical tools to staff who rely on data for decision-making.
Demonstrate flexibility by taking on adjacent responsibilities as needed in a small, fast-moving team environment to ensure project goals are met.
Engage in continuous learning about the humanitarian sector.
Requirements
Bachelor’s degree in Data Science, Computer Science, Statistics, Information Systems, or a related field
Hands-on experience in both:
- Data engineering (pipelines, ETL/ELT, SQL, cloud tools)
- Data analysis (exploratory analysis, dashboarding, basic data science methods)
Proficiency with SQL and modern programming/scripting languages (Python preferred).
Familiarity with cloud platforms (e.g., AWS, Databricks) and common data engineering tools.
Strong analytical thinking, problem-solving skills, and intellectual curiosity.
Ability to explain complex concepts to diverse stakeholders.
A team player who is also able to work independently.
Multi-tasking and time-management skills, with the ability to prioritize tasks.
Excellent interpersonal skills.
Ability to work well in a diverse team.
The candidate must be a U.S. citizen to qualify for the required U.S. government security clearance for this project.