Peerspace, the leading online marketplace for venue rentals, is seeking a Senior Data Engineer to build robust data infrastructure and services. The role involves modernizing legacy data pipelines, collaborating across teams, and applying software engineering best practices to the data ecosystem.
Responsibilities:
- Modernize & Decompose: Take the lead on breaking down large, monolithic data initiatives into manageable, modern services and pipelines that allow us to ship fast and scale efficiently
- Architecture & Design: Drive the strategy for new data storage patterns and the services required to serve data back to the core product as well as for internal analytics consumption
- Engineering Excellence: Apply software engineering best practices—such as CI/CD, unit testing, and version control—to our data ecosystem
- End-to-End Data Ownership: Act with a strong ownership mindset, digging into the "why" and "how" to find answers in a startup-style environment
- Collaborate & Mentor: As part of a small, nimble team, you will be a primary collaborator across the company, from Product and Engineering to Marketing and Finance, serving as a partner and mentor to bring the whole team along in building production-grade systems
Requirements:
- Deep understanding of software design patterns and a strong foundation in data engineering basics
- Opinionated about best practices for data orchestration and storage
- Ability to apply engineering rigor (modularity, testing, and maintainability) to data workflows
- Track record of balancing short-term business requirements with long-term technical health, knowing when to build for the future and when a tactical solution is the right call for the moment
- Highly proficient in Python and able to write complex, performant SQL
- Track record of breaking down high-level business requirements into technical roadmaps and executing them
- Proactive communicator who takes pride in owning a problem from discovery to resolution
- Comfortable with ambiguity; enjoy digging into the code yourself to find answers and partnering with others to implement the right solution
- Experience or a strong interest in deploying ML pipelines and working with LLM tooling
- Experience with dbt
- Experience with real-time or near real-time data processing (streaming)