Lob is transforming the way businesses use direct mail through innovative technology. The Staff Data Engineer will lead the design and buildout of a unified event tracking platform, collaborating with various teams to enhance Lob's data ecosystem and mentor fellow engineers.
Responsibilities:
- Collaborate with colleagues in the Data, Product, and Engineering teams to unify Lob’s foundational data ecosystem and enable the buildout of internal and external data products
- Create modular, reusable frameworks to enable other engineering teams to publish events to the unified platform
- Implement thorough monitoring and alerting to ensure the health of the data platform
- Apply software development best practices to ensure maintainability of the platform
- Partner with the Engineering Manager to set the Data team’s roadmap & define team processes
- Coach and mentor mid-level engineers on the Data team regarding technical best practices and problem-solving
- Create and maintain documentation for data products and systems
- Advise stakeholders on the constraints and assumptions of the data processed through the unified data platform
- Deprecate outdated legacy systems without negatively impacting core functionality for end users
- Monitor Cloud and SaaS spend for unusual spikes and seek out opportunities to save costs
- Participate in the team's on-call rotation (approximately one week in four), triaging and resolving alerts as needed
- Coordinate with other Staff+ engineers across the broader tech team to align decision-making and execute strategic initiatives
Requirements:
- Bachelor's or Master's degree in a quantitative field, or equivalent work experience
- At least 8 years (but preferably 10 or more) of combined experience in Data, ML, and/or Software Engineering
- Expertise in cloud data warehousing (Snowflake strongly preferred; Redshift a plus)
- Expertise in streaming data processing systems such as Kafka and Apache Flink
- Expertise in modern software development fundamentals, including APIs, version control, containerization, and CI/CD
- Expertise in a variety of database types, including transactional databases (PostgreSQL preferred) and document/vector databases (such as Elasticsearch), with the ability to select the right tool for a given job
- Familiarity with dbt (data build tool)
- Familiarity with pipeline orchestration engines (Apache Airflow or Prefect preferred)
- Familiarity with Change Data Capture (CDC) patterns and methods
- Excellent verbal, written, and visual communication skills
- Excellent organizational and project management skills
- Ability to be decisive & adaptable in the face of ambiguity
- Experience with AI-assisted coding tools such as Claude Code or Cursor
- Proficient with project management tools including Jira