Nitrogen is a financial technology company revolutionizing client engagement for financial advisors and wealth management firms. The Senior Staff Data Engineer will lead the design and evolution of data systems, ensuring the delivery of robust data pipelines and APIs to enhance client experiences.
Responsibilities:
- Advances Nitrogen's data platform and service capabilities, applying expertise in building large-scale, modern data and API systems
- Owns the successful delivery of data flows end-to-end: ingesting raw data from external partners, standardizing it in our warehouse, and exposing it through APIs and downstream production systems
- Champions good data governance, industry best practices, and efficient processing techniques
- Delivers high-quality, production-grade features and reliably meets commitments
- Sets a high bar for technical productivity and efficiency through deep, consistent use of AI and agentic developer tools
- Provides architectural leadership across data pipelines, APIs, and services to ensure scalable designs and high-quality implementations
- Develops a deep understanding of what our domain-specific data means to our customers and their product experience
- Proactively identifies and addresses technical debt and developer experience gaps across data and service layers, advocating for AI-enhanced solutions where appropriate
- Mentors and elevates the technical skills of fellow engineers
- Demonstrates a continuous improvement mindset in both personal development and technical workflows
- Ensures technical contributions align with company objectives and expectations
Requirements:
- 10+ years of hands-on experience in data engineering, building reliable pipelines and production-grade workflows
- Expertise in writing optimized SQL and Python in Snowflake and dbt, with deep experience designing reliable, production-grade data workflows
- Experience building and operating CDC (change data capture) systems that sync data into and out of warehouses, as well as building or integrating with APIs and services
- Experience with tools like DMS, OpenFlow, Airflow, Kafka, Debezium, S3, or similar
- Ability to design scalable data and AI systems end to end, with experience across RDBMS, data warehouses, NoSQL, and graph databases
- Experience with modern AI architectures such as agentic workflows, RAG, and agent context and memory, and with tools like MCP, LangChain, and LangGraph
- Experience beyond core data tooling—whether in backend services, APIs, or full-stack systems
- Experience with languages such as Go, TypeScript, or similar is a strong plus
- Deep understanding of how data engines and pipelines work, with the ability to troubleshoot the most difficult issues
- Active use of agentic AI developer tools to significantly amplify productivity and impact
- A trusted technical authority whom others turn to for solving the most challenging problems and making high-impact decisions
- Experience with ML systems or ML engineering pipelines is a strong plus