Develops software using the KnowBe4 Software Development Lifecycle and Agile Methodologies
Designs, develops, and researches Machine Learning systems
Transforms data science prototypes by applying appropriate Machine Learning algorithms and tools
Performs statistical analysis and uses the results to improve models
Inference Engineering: Drive the deployment and optimization of both standard predictive models and LLM architectures, balancing trade-offs between low latency, high throughput, and cost-efficiency
Platform Hardening: Transition research prototypes into resilient, production-ready microservices that can handle massive traffic
Lifecycle Orchestration: Execute automated pipelines for data and model versioning, validation, and retraining
Observability: Implement advanced monitoring for model drift, data integrity, and system health to ensure production reliability
Collaborative Standards: Uphold clean code practices, thorough documentation, and participate in rigorous code reviews across the ML and Engineering teams
Requirements
BS or equivalent plus 3 years of experience, or
MS/Ph.D. or equivalent (no prior experience required)
Training in secure coding practices (preferred)
AI/ML and Core: Python (production-grade), PyTorch
Data and Features: Apache Spark for distributed processing; experience with Feature Stores or automated feature engineering is a plus
Infrastructure: AWS (SageMaker, Lambda), Docker, and Terraform/IaC for environment reproducibility
Specialized Tooling: Experience with custom, Python-based inference optimization; orchestration via lean, custom solutions built on AWS Lambda and MLflow
Additional: C# and JavaScript (beneficial)
Familiarity with secure coding practices
Tech Stack
Apache Spark
AWS
Docker
JavaScript
Microservices
Python
PyTorch
Terraform
Benefits
We offer company-wide bonuses based on monthly sales targets
Employee referral bonuses
Adoption assistance
Tuition reimbursement
Certification reimbursement
Certification completion bonuses
All in a modern, high-tech, and fun work environment