LeoLabs is a company focused on redefining global security, safety, and transparency in space through its global radar network and AI-enabled analytics platform. The company is seeking a motivated Data Engineer to join its Insights team to build and operate the data pipelines and analytics infrastructure that support customer insights and real-time space domain awareness capabilities.
Responsibilities:
- Play a key role in building and operating the data pipelines and analytics infrastructure that power customer insights, internal decision-making, and real-time space domain awareness capabilities
- Work closely with software engineers, radar and catalog teams, and data scientists to ensure reliable extraction, transformation, and loading (ETL) of mission-critical datasets
- Develop scalable batch and streaming data workflows that enable advanced analytics and support machine learning initiatives
- Transform large volumes of sensor and orbital data into actionable intelligence that enables users to safely operate and manage assets in low-Earth orbit
- Participate in operational support rotations during critical incidents
Requirements:
- B.S. or M.S. in Computer Science, Data Science, Engineering, Mathematics, Physics, or equivalent experience
- 0-2 years of experience in data engineering, software engineering, analytics engineering, or related technical roles
- Experience designing and building data pipelines or ETL/ELT workflows
- Hands-on experience with Databricks, Apache Spark, or other distributed data processing frameworks
- Proficiency in Python and SQL for data transformation and analysis
- Familiarity with data modeling concepts and modern data lake or warehouse architectures
- Experience working in cloud-native environments (AWS preferred)
- Understanding of software development best practices including version control, testing, and CI/CD
- Strong analytical mindset and ability to troubleshoot complex data issues
- Effective communication skills and ability to collaborate across distributed engineering teams
- Ability to participate in operational support rotations during critical incidents
Preferred Qualifications:
- Experience supporting data science or machine learning workflows, including feature engineering pipelines
- Familiarity with Delta Lake, Lakehouse architectures, or large-scale telemetry data processing
- Exposure to streaming data systems such as Kafka or Spark Structured Streaming
- Experience with workflow orchestration tools such as Airflow or Databricks Workflows
- Background in orbital mechanics, aerospace, physics, or applied mathematics
- Experience building analytics datasets or semantic models for BI tools
- Active U.S. security clearance or ability to obtain one