Culmen International is seeking a Full Stack Engineer with GCO and Federal Government experience to join a small, high-impact team building data-intensive products. The role spans the full development lifecycle, from backend service logic to frontend integration, with a particular focus on data modeling and schema design.
Responsibilities:
- Dive deep into our existing codebase to build a thorough understanding of our architecture, data models, and service dependencies — becoming a reliable backup and steward of core systems
- Write clean, well-documented TypeScript and Node.js code across backend services, APIs, and frontend features, with a consistent eye toward maintainability and performance
- Manage your work through Jira — picking up tickets, contributing to sprint planning, and keeping tasks and statuses accurate so the team always has clear visibility into progress
- Collaborate via GitHub using pull requests, code reviews, and branching strategies that keep our codebase healthy and deployments predictable
- Maintain and expand technical documentation in Confluence, ensuring that institutional knowledge is captured, up to date, and accessible to the whole team
- Participate in regular code reviews — both giving and receiving feedback thoughtfully to raise the quality bar across the board
- Help triage, debug, and resolve production issues, taking ownership of problems through to resolution rather than passing them along
- Contribute to the design and implementation of new features end-to-end, from data modeling and API design to integration and testing
- Proactively flag technical debt, inconsistencies, or risks you encounter and work collaboratively to address them over time
- Engage in team ceremonies — standups, sprint reviews, retrospectives — and bring a constructive, communicative presence to everything you do
Preferred Qualifications:
- Hands-on experience with Google Cloud Platform (GCP) — particularly services like BigQuery, Cloud Run, Vertex AI, or Cloud Storage
- Familiarity with Claude Code or similar AI-assisted development tools, and a comfort level integrating LLM-powered workflows into day-to-day engineering work
- Experience working with or building agentic AI systems — automated pipelines, multi-step LLM orchestration, or tool-use implementations
- Prior experience supporting federal government contracts or programs, including an understanding of the documentation, compliance, and delivery expectations that come with that environment
- Exposure to geospatial data concepts, tools, or platforms (e.g., H3, GeoJSON, PostGIS, Google Earth Engine)
- Experience with data pipeline development — ingestion, transformation, and delivery of structured datasets at scale
- Familiarity with Python as a secondary language for scripting, data processing, or ML-adjacent work
- Comfort working in a startup or small team environment where roles are fluid, priorities can shift, and self-direction is valued
- Experience with Anthropic's API or other LLM APIs (OpenAI, Gemini, etc.) in a production context
- An interest in — or prior exposure to — mobile data collection platforms, intelligence tooling, or data-as-a-service products
Requirements:
- GCO and Federal Government experience
- Strong fluency in SQL and data modeling
- Experience with Node.js and TypeScript