Contribute to the delivery of enterprise geospatial dataflows and analytics pipelines supporting mission-critical outcomes within secure and complex environments
Design, develop, and maintain geospatial data pipelines and workflows under senior guidance
Implement data ingestion, transformation, validation, and delivery for geospatial and spatiotemporal datasets
Develop and manage dataflows using tools such as Apache NiFi, FME, and SQL-based transformations
Support geospatial ETL processes, including schema management, data validation, and preparation
Troubleshoot dataflow issues and support integration with downstream systems, analytics, and reporting platforms
Ensure data quality, maintain documentation, and drive continuous improvement while collaborating across teams and mentoring junior members
Requirements
Experience in data engineering, data integration, or geospatial engineering roles with exposure to spatial data
Hands-on involvement in ETL/ELT pipelines, including data transformation and validation
Familiarity with data integration tools (e.g., Apache NiFi, FME) and SQL for data processing
Basic scripting or automation skills (e.g., Python, PowerShell)
Ability to follow engineering standards and work effectively within regulated environments
Must be an Australian citizen and hold an active TSPV clearance
Tech Stack
Apache NiFi
ETL
Python
SQL
Benefits
Accrue up to an extra 12 days of leave per year through our Life Days program.