Game Plan Tech is dedicated to empowering public sector organizations with best-in-class Google solutions. We are seeking a Data Scientist / Data Engineer to join a cross-functional product team supporting Government Logistics. In this role you will write code, deploy models, and engage directly with government stakeholders.
Responsibilities:
- Develop, validate, and deploy statistical and machine learning models to inform supply chain decisions for items managed by Government supply chains (Aviation, Land and Maritime, Troop Support, Energy, Distribution, Disposition Services)
- Design and build data pipelines that integrate authoritative Government systems (EBS / SAP, DSS, EProcurement, FedMall) with external feeds (GIDEP, FPDS-NG, commercial supplier data)
- Translate ambiguous mission questions from senior government leaders into well-scoped analytical problems with measurable outcomes
- Productionize models within government environments (GCP, AWS GovCloud, Azure Government, on-premises enclaves), meeting DoD RMF, STIG, and IL4/IL5/IL6 controls
- Document methodology, assumptions, and limitations so government stakeholders — including contracting officers, item managers, and inspectors general — can defend model-informed decisions
- Support contract deliverables: technical reports, monthly status reports, demonstrations to the COR and government PM, and contributions to white papers and re-compete proposals
Requirements:
- U.S. citizenship
- Active DoD Secret clearance at time of application. Inactive clearances within the two-year reinstatement window will be considered on a case-by-case basis
- Bachelor's or higher in a quantitative field — statistics, mathematics, computer science, operations research, industrial engineering, economics, physics, or a closely related discipline. Equivalent experience considered
- 3+ years (mid-level) or 6+ years (senior) building and shipping data products in a production environment
- Strong proficiency in Python (pandas, NumPy, scikit-learn, PyTorch or TensorFlow) and SQL
- Working knowledge of at least one of: Spark/PySpark, dbt, or Airflow
- Experience with at least one major cloud platform; AWS GovCloud or Azure Government strongly preferred
- Demonstrated experience moving a model from notebook to a monitored production service — including testing, CI/CD, and post-deployment performance tracking
- Experience working with messy, real-world enterprise data (ERP exports, transactional logs, hand-keyed records)
- Comfort working in a customer-facing role: explaining technical decisions to non-technical government stakeholders, taking direction from a COR/PM, and operating within the boundaries of the contract scope
Preferred Qualifications:
- Cloud certifications in architecture, data engineering, and/or machine learning
- Background working with government technology projects and programs
- Prior contractor experience supporting a Government Customer
- Familiarity with time-series and intermittent-demand forecasting methods (Croston, TSB, ETS, ARIMA, hierarchical/global deep models such as DeepAR or Temporal Fusion Transformers)
- Experience with operations research techniques: mixed-integer programming, network flow, stochastic optimization (Gurobi, CPLEX, OR-Tools, Pyomo)
- Working knowledge of SAP / ECC / S/4HANA data models, or DLA's Enterprise Business System (EBS)
- Experience operating under DoD RMF, ATO processes, and IL4/IL5/IL6 data handling
- Familiarity with federal data standards relevant to logistics: NSN/FLIS, NIIN, FSC, UID/IUID, WAWF/iRAPT, DLMS transactions
- Veterans and transitioning service members with a background in logistics, supply, or acquisition are strongly encouraged to apply
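For candidates unfamiliar with the intermittent-demand forecasting methods named above, here is a minimal sketch of Croston's method in plain Python. It is an illustration only, not part of the role's codebase; the function name and the default smoothing parameter are ours. Croston's method smooths the nonzero demand sizes and the intervals between demands separately, and forecasts their ratio:

```python
def croston_forecast(demand, alpha=0.1):
    """One-step-ahead forecast via Croston's method for intermittent demand.

    Exponentially smooths nonzero demand sizes (z) and inter-demand
    intervals (p) separately; the per-period forecast is z / p.
    """
    z = p = None   # smoothed demand size and smoothed interval
    q = 0          # periods elapsed since the last nonzero demand
    for d in demand:
        q += 1
        if d > 0:
            if z is None:
                z, p = d, q            # initialize at first nonzero demand
            else:
                z += alpha * (d - z)   # smooth the demand size
                p += alpha * (q - p)   # smooth the demand interval
            q = 0
    if z is None:
        return 0.0                     # no demand observed yet
    return z / p


# Sparse demand history: two nonzero observations, four periods apart
print(croston_forecast([0, 0, 3, 0, 0, 0, 2, 0], alpha=0.1))
```

In practice you would reach for a tested library implementation of Croston, TSB, or ETS rather than hand-rolling the update rules, but the two-track smoothing above is the core idea the listed methods build on.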