NEOGOV supports public health data systems in Washington. The Data Engineer role manages enterprise data pipelines and supports business intelligence platforms to ensure reliable analytics and reporting for public health initiatives.
Responsibilities:
- Design, build, and maintain data pipelines supporting BI and analytics platforms
- Develop and manage ETL and ELT processes across OLTP and OLAP environments
- Map and transform operational data into analytical environments
- Implement data schemas and structures to support high-usage reporting engines
- Automate data processing workflows to improve reliability and efficiency
- Support Power BI, SSRS, and related reporting technologies
- Ensure accurate, secure, and timely delivery of data to dashboards and reporting tools
- Partner with analysts and subject matter experts to translate business needs into technical solutions
- Maintain and optimize database environments
- Conduct capacity planning and performance monitoring
- Participate in IT Risk Assessment and Security Review processes
- Support application lifecycle management and migration strategies
- Develop and execute unit, integration, and performance testing
- Implement quality control processes to reduce defects and production issues
- Troubleshoot complex data and pipeline failures
- Support system deployments and implementation planning
- Coordinate with project teams, architects, vendors, and technical partners
- Assist during public health emergencies as assigned
Requirements:
- Four (4) years of professional experience in one or more of the following data disciplines: data pipeline, data processing, data migration, ETL, database development, or application development; OR
- An Associate's degree or higher in an Information Technology program or a closely related field, and two (2) years of professional experience in one or more of the data disciplines listed above; OR
- A Bachelor's degree in an Information Technology program or a closely related field, and one (1) year of professional experience in one or more of the data disciplines listed above
- Eight (8) years of full-time equivalent professional experience in one or more of the following IT disciplines: systems development, data administration, database management, or data pipeline engineering; OR
- An Associate's degree or higher in an Information Technology program or a closely related field, and six (6) years of full-time equivalent professional experience in one or more of the IT disciplines listed above; OR
- A Bachelor's degree or higher in an Information Technology program or a closely related field, and four (4) years of full-time equivalent professional experience in one or more of the IT disciplines listed above
- Two (2) or more years of professional experience in the software development field
- Experience in SQL Server data administration at the enterprise level, demonstrating the ability to perform SQL Server 2012/2014/2016/2019 database administration
- Building and maintaining reports in SSRS
- Developing SSIS packages and building cubes and data marts in SSAS
- Experience in data architecture, data administration, data analysis, data modeling, report development, data pipeline development, or database administration
- One (1) or more years of experience using Python or machine learning
- Knowledge of ETL processing
- Knowledge of replication, high availability, and clustering technologies
- Familiarity with version control and deployment processes
- Knowledge of Salesforce or completed Salesforce training
- Knowledge of HL7
- Two (2) or more years of experience at the expert level in MS Power BI
- Demonstrated extensive experience using and/or managing SQL Server Reporting Services (SSRS)
- Experience developing web-based applications, databases, and visualizations
- Demonstrated experience implementing business intelligence tools such as Tableau, and developing and/or using business intelligence reports
- Three (3) years of experience working with large-scale distributed systems (e.g., Hadoop, Spark, Storm), data warehousing systems (e.g., Redshift, BigQuery), event brokers (e.g., Kafka, Google Cloud Pub/Sub), and/or databases (e.g., HBase, Cassandra)
- Experience with cloud computing platforms such as AWS, Google Cloud, or Microsoft Azure
- Experience building data pipelines at Internet scale (terabytes per day)