PrizePicks is the fastest-growing sports company in North America, recognized as a leading platform for Daily Fantasy Sports. As a Data Platform Engineer, you will help build and maintain a modern data platform that supports data engineering, analytics, and machine learning, directly impacting key platform metrics.
Responsibilities:
- Design and build the data platform for batch and streaming use cases
- Build and maintain the platform on cutting-edge technologies, and enable data users by building data catalog and data lineage capabilities
- Contribute to designing and enforcing robust data security architectures and controls
- Build a platform for deploying low-latency services that pipe data for streaming and near-real-time use cases
- Power real-time decisions across the platform, from dynamic oddsmaking and risk analysis to smart deposit defaults
- Champion best practices for model deployment, monitoring, and CI/CD for data pipeline deployment
- Enable end-to-end observability for the batch and streaming data platforms and ensure 99.99% availability
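The streaming responsibilities above come down to computing rolling aggregates over timestamped events within a tight latency budget. As a minimal, self-contained sketch of that idea (pure Python with hypothetical names; a production job would run on Kafka/Flink as listed below), here is a sliding-window mean over an event stream:

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class Event:
    ts_ms: int    # event timestamp in milliseconds
    value: float  # e.g. a stake amount or an odds update


class SlidingWindowMean:
    """Rolling mean over the last `window_ms` of events.

    Hypothetical stand-in for the kind of low-latency aggregation a
    Kafka/Flink job would compute (e.g. for risk analysis); assumes
    events arrive in timestamp order.
    """

    def __init__(self, window_ms: int):
        self.window_ms = window_ms
        self.events: deque = deque()
        self.total = 0.0

    def add(self, event: Event) -> float:
        self.events.append(event)
        self.total += event.value
        # Evict events that have fallen out of the window.
        cutoff = event.ts_ms - self.window_ms
        while self.events and self.events[0].ts_ms < cutoff:
            self.total -= self.events.popleft().value
        return self.total / len(self.events)
```

Using a deque keeps both insertion and eviction O(1) per event, which is the property a sub-100 ms processing budget depends on.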
Requirements:
- 3+ years of experience in platform engineering, with a proven track record of deploying and maintaining scalable data platforms in high-traffic production environments
- Proficient in streaming architectures (Kafka/Flink/PubSub) and in building low-latency services for stream ingestion and processing that serve model inference in under 100 ms
- Proficient with containerization (Docker, Kubernetes) and cluster-level management
- Deep experience building platforms that manage the full data lifecycle, including setting up data exploration environments
- Expert-level coding skills in Python and Go
- Deep experience with cloud services, preferably GCP (BigQuery, Cloud Functions, GKE) or the AWS equivalents
- Excellent communication, stakeholder management, and problem-solving skills
- Extensive experience with big data technologies such as Spark, Flink, Kafka or Kinesis, Argo/Airflow, Polaris, OpenMetadata, Iceberg and lakehouse architectures, Redis, Elasticsearch, and relational databases
- Experience building REST APIs, managing packages, and authoring shared libraries
- A key contributor to projects through the entire development lifecycle, from concept to release
- Experience implementing infrastructure while enforcing deployment best practices for large-scale data platforms
- Background in Daily Fantasy Sports (DFS), oddsmaking, or high-frequency trading
- Experience building and scaling data platforms that successfully bridge batch historical data with real-time event streams
- Experience enabling self-service pipeline development and deployment for data teams
- Experience enabling AI agents for repetitive tasks and AI-assisted coding for faster, more iterative software development
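One requirement above calls for bridging batch historical data with real-time event streams. A common pattern is to overlay stream updates on a batch snapshot, so reads see historical state plus the latest events. A minimal sketch, assuming a snapshot dict and an ordered update feed (all names hypothetical; in practice the snapshot might come from BigQuery and the updates from Kafka):

```python
from typing import Dict, Iterable, Tuple


def merged_view(
    batch_snapshot: Dict[str, float],
    stream_updates: Iterable[Tuple[str, float]],
) -> Dict[str, float]:
    """Overlay real-time updates on a batch snapshot.

    Hypothetical sketch of the batch/stream bridge: the batch layer
    provides a periodic snapshot, and the speed layer applies newer
    event-stream updates on top, last write wins.
    """
    view = dict(batch_snapshot)        # start from the historical state
    for key, value in stream_updates:  # apply updates in arrival order
        view[key] = value
    return view
```

For example, a snapshot `{"user_1": 100.0, "user_2": 50.0}` merged with updates `[("user_2", 75.0), ("user_3", 10.0)]` yields a view where `user_2` reflects the stream value and `user_3` appears even though it was absent from the snapshot.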