Culmen International is seeking a Full Stack Engineer to join a small, high-impact team building data-intensive products at the intersection of modern web infrastructure and AI. The role spans the full development lifecycle, from API design to frontend integration, with a strong emphasis on data fluency and close collaboration.
Responsibilities:
- Dive deep into our existing codebase to build a thorough understanding of our architecture, data models, and service dependencies — becoming a reliable backup and steward of core systems
- Write clean, well-documented TypeScript and Node.js code across backend services, APIs, and frontend features, with a consistent eye toward maintainability and performance
- Manage your work through Jira — picking up tickets, contributing to sprint planning, and keeping tasks and statuses accurate so the team always has clear visibility into progress
- Collaborate via GitHub using pull requests, code reviews, and branching strategies that keep our codebase healthy and deployments predictable
- Maintain and expand technical documentation in Confluence, ensuring that institutional knowledge is captured, up to date, and accessible to the whole team
- Participate in regular code reviews — both giving and receiving feedback thoughtfully to raise the quality bar across the board
- Help triage, debug, and resolve production issues, taking ownership of problems through to resolution rather than passing them along
- Contribute to the design and implementation of new features end-to-end, from data modeling and API design to integration and testing
- Proactively flag technical debt, inconsistencies, or risks you encounter and work collaboratively to address them over time
- Engage in team ceremonies (standups, sprint reviews, retrospectives) and bring a constructive, communicative presence to the team's day-to-day work
Requirements:
- Experience with Node.js and TypeScript
- Strong fluency in SQL and data modeling
- Comfort thinking through schema design, query optimization, and data flow
- Experience with Google Cloud Platform (GCP) services like BigQuery, Cloud Run, Vertex AI, or Cloud Storage
- Familiarity with Claude Code or similar AI-assisted development tools
- Experience working with or building agentic AI systems
Nice to Have:
- Prior experience supporting federal government contracts or programs
- Exposure to geospatial data concepts, tools, or platforms
- Experience with data pipeline development
- Familiarity with Python as a secondary language
- Comfort working in a startup or small team environment
- Genuine curiosity about agentic AI implementations
- Interest in mobile data collection platforms, intelligence tooling, or data-as-a-service products