Contribute to data engineering efforts by supporting the integration and enrichment of DISN network topology data for advanced data correlation and analytics.
Participate in technical discussions with internal and external stakeholders to support solution design and implementation.
Develop, test, and deploy data pipelines and integration solutions across distributed systems and cloud environments, using Python, JavaScript, Java, and SQL.
Assist in requirements gathering and collaborate with stakeholders to design and implement data enrichment pipelines, integrating diverse data sources into Confluent (Kafka) and Elastic platforms.
Develop and maintain Kibana visualizations and dashboards to support operational insights.
Support Kafka-based integrations between Elasticsearch/Logstash and other systems.
Collaborate within Agile scrum teams, contribute to team deliverables, and share knowledge with peers.
Communicate and coordinate effectively with geographically distributed team members to achieve project objectives.
Troubleshoot and help resolve installation, infrastructure, and system issues; report and help mitigate technical risks.
Develop and maintain technical documentation, including DoD requirements, interface documents, and security compliance artifacts.
Ensure solutions comply with DoD security standards and guidelines, and support platform sustainment and reliability by addressing operational challenges as needed.
Requirements
Bachelor’s degree from an accredited college in a related discipline, or equivalent experience/combined education, with 4–8 years of professional experience; or 2–6 years of professional experience with a related master’s degree.
4+ years of experience in software engineering, data engineering, or business/data analysis, preferably within Agile/Scrum teams.
Hands-on software development experience with Python, Java, and SQL, plus working knowledge of JavaScript and HTML.
Experience with distributed version control using Git and hosting platforms such as Bitbucket.
Proficiency with data analytics and visualization tools, such as Power BI, Tableau, and the ELK stack (Elasticsearch, Logstash, Kibana).
Experience designing, developing, and optimizing ETL processes and data pipelines, including integration with event streaming platforms like Kafka.
Background in data modeling, unification, and analytics to support data-driven projects.
Experience implementing application and system integrations, including Kafka and Elastic platform integrations.
Understanding of networking and internet protocols, with experience supporting network-centric or data-driven environments.
Experience developing and deploying software on UNIX/Linux command line platforms.
Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
Experience with Agile project management and collaboration tools such as JIRA and Confluence.
Active DoD Secret security clearance prior to start date.
Active Security+ Certification (or other applicable DoD 8570 IAT II certification) prior to start date.