
Requisition #: 1478
Job Title: Senior Data Engineer
Location: Washington D.C.
Clearance Level: Top Secret
SUMMARY
The Senior Data Engineer will support the development, maintenance, and optimization of enterprise data systems and pipelines that power the Private Sector Portal (PSP) and related data services. This role focuses on designing scalable ETL processes, automating data workflows, and ensuring high-quality, standardized data across the organization. The engineer will work closely with relationship management teams, developers, and enterprise stakeholders to gather technical requirements, troubleshoot issues, and deliver reliable data solutions.
As a direct employee of Agile Defense, you would receive a benefit package that includes health/dental/vision insurance coverage, 401K with company match, PTO & paid holidays, and annual tuition/training assistance. For more information, please visit our website.
JOB DUTIES AND RESPONSIBILITIES
· Update and maintain the information residing on the Private Sector Portal (PSP) tool, using automation when appropriate.
· Assist the relationship management team in identifying and documenting technical requirements around the PSP system.
· Develop guidance around appropriate use of the PSP system and ensure consistent data standards across the enterprise.
· Provide expertise and customer service to HQ/field users.
· Maintain the PSP system, including auditing content for missing or inappropriate metadata and reporting technical user problems to the contractor.
· Ensure the scheduling, processing, and documentation of data delivery and implementation, using automation when appropriate.
· Design, implement, automate, and maintain enterprise-level, scalable Extract, Transform, Load (ETL) data pipelines and data analysis pipelines.
· Build and maintain robust data processing, enrichment, and analysis pipelines that aggregate and normalize large datasets from disparate sources, including web services (a minimal sketch of this pattern follows this list).
· Automate and optimize database and data workflows for best performance and efficiency.
· Identify and propose emerging technologies, automation, or alternative solutions that improve support for developers and application stakeholders.
· Develop and maintain documentation, including inline comments, Confluence pages, and stand-alone documents, wherever necessary for the readability of software design and code.
· Focus on producing results efficiently.
· Work independently and coordinate with team members.
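As a concrete illustration of the pipeline duties above, the following is a minimal PySpark sketch of an ETL job that reads two feeds, normalizes them to a common schema, deduplicates, and writes a consolidated table. All paths, column names, and schemas here are illustrative assumptions, not the actual PSP data model.

    # Minimal ETL sketch: extract two hypothetical feeds, normalize to a
    # shared schema, deduplicate, and load a consolidated table.
    from pyspark.sql import SparkSession, Window, functions as F

    spark = SparkSession.builder.appName("psp-etl-sketch").getOrCreate()

    # Extract: one CSV feed and one JSON feed from disparate sources
    # (paths are placeholders).
    csv_df = spark.read.option("header", True).csv("s3://example-bucket/feed_a/")
    json_df = spark.read.json("s3://example-bucket/feed_b/")

    # Transform: normalize column names and types to a common schema.
    csv_norm = csv_df.select(
        F.col("record_id").cast("long").alias("id"),
        F.lower(F.trim(F.col("org_name"))).alias("organization"),
        F.to_date("updated", "yyyy-MM-dd").alias("updated_on"),
    )
    json_norm = json_df.select(
        F.col("id").cast("long").alias("id"),
        F.lower(F.trim(F.col("organization"))).alias("organization"),
        F.to_date("last_update").alias("updated_on"),
    )

    # Aggregate: union the feeds and keep the most recent record per id.
    combined = csv_norm.unionByName(json_norm)
    w = Window.partitionBy("id").orderBy(F.col("updated_on").desc())
    latest = (
        combined.withColumn("rn", F.row_number().over(w))
        .filter(F.col("rn") == 1)
        .drop("rn")
    )

    # Load: write the normalized, deduplicated table for downstream use.
    latest.write.mode("overwrite").parquet("s3://example-bucket/psp/normalized/")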
QUALIFICATIONS
Education, Background, and Years of Experience
· Bachelor's degree from an accredited university or college in a quantitative field (e.g. Statistics or Operations Research) or technical field (e.g. Computer Science or Engineering).
· Educational requirement may be waived if the candidate has eight (8) or more years of experience.
· Minimum 10 years of relevant experience.
· Four (4) years of technical writing experience for the Federal Government.
· Four (4) years of experience writing and editing with MS Office, with exceptional proficiency.
· Minimum of two to four (2-4) years of work experience in a data analytics or data engineering role.
ADDITIONAL SKILLS & QUALIFICATIONS
Required Skills
· Proficiency with a scripting language (e.g. Python, R, Perl, bash) and advanced SQL.
· Experience with business visualization tools (e.g. Looker, Tableau, Microsoft PowerBI).
· Experience with HTML, CSS, Bootstrap/Reactstrap, and JavaScript frameworks such as React.
· Ability to explain technical concepts to non-technical people.
· Experience with data warehousing.
· Experience with the AWS and Google Cloud ecosystems (e.g. Redshift, BigQuery).
· Experience designing, implementing, and maintaining big data processing, normalization, and enrichment pipelines using Python, PySpark, or Spark.
· Experience architecting, designing, and building modern medallion ETL processes for a data lake, data warehouse, or lakehouse (see the sketch after this list).
· Experience building and automating data pipelines and workflows on Apache NiFi (NiagaraFiles) or Databricks for ease of maintenance.
· Experience implementing automated Continuous Integration/Continuous Delivery (CI/CD) pipelines for data.
· Experience with dataframe processing with filtering conditions.
· Experience with organizing data version history to optimize processing and retrieval performance.
· Experience with data indexing for robust searching and querying.
· Strong verbal and written communication skills to effectively collaborate and integrate with enterprise partners and customers.
· Excellent troubleshooting, debugging, and research skills.
· Ability to focus and a desire to learn.
· Knowledge of PostgreSQL is preferred.
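Several of the skills above (medallion ETL, dataframe filtering, and organizing data for retrieval performance) fit together in one pattern. Below is a minimal bronze/silver/gold sketch in PySpark; paths, column names, and filter conditions are illustrative assumptions only, and on Databricks the same shape would typically use Delta tables rather than plain Parquet.

    # Minimal medallion-style sketch (bronze -> silver -> gold).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

    # Bronze: raw ingested data, kept as-is for auditability.
    bronze = spark.read.json("s3://example-bucket/bronze/events/")

    # Silver: dataframe processing with filtering conditions, conformed
    # types, and deduplication.
    silver = (
        bronze
        .filter(F.col("event_type").isNotNull() & (F.col("status") == "active"))
        .withColumn("event_date", F.to_date("event_ts"))
        .dropDuplicates(["event_id"])
    )
    # Partitioning by date organizes the history on disk and narrows
    # scans, which helps processing and retrieval performance.
    silver.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-bucket/silver/events/"
    )

    # Gold: business-level aggregate ready for visualization tools.
    gold = silver.groupBy("event_date", "event_type").agg(
        F.count("*").alias("event_count")
    )
    gold.write.mode("overwrite").parquet("s3://example-bucket/gold/event_counts/")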
WORKING CONDITIONS
Full-time, on-site with the client in an office setting.