System Automation Corporation is an industry leader specializing in enterprise information management applications for government agencies. The Data Engineer will be responsible for executing data migrations, deploying integration solutions, and collaborating with clients to deliver high-quality implementations of the Evoke™ platform.
Responsibilities:
- Execute data and document migrations for Evoke™ platform implementations, including data extraction, transformation, and loading (ETL) into Evoke solutions
- Review data mapping documentation and contribute to the design of conversion methodologies that enable accurate and efficient data transformation and loading
- Research and evaluate automated and AI-driven tools to enhance data migration processes, championing adoption within the organization
- Develop and deploy integration solutions for Evoke™ platform projects, including API-based and batch-based data imports and exports
- Provide input and guidance to customer data teams and internal consultants during data mapping activities, ensuring alignment with migration and integration requirements
- Collaborate with internal teams to gather data from multiple repositories and create models and visualizations that support operations management
- Design, build, and maintain scalable data pipelines and workflows to ensure efficient data processing
- Ensure data integrity, quality, and security throughout migration and integration processes
- Troubleshoot and resolve data-related issues during implementation and internal projects
- Document processes, standards, and best practices for data engineering and integration activities
Requirements:
- Bachelor's degree in computer science, data engineering, data science, or a related field, or equivalent experience
- 4+ years of experience in data engineering, data integration, or similar roles
- Proficiency in SQL and experience with ETL tools and scripting languages (e.g., Python)
- Ability to design, consume, and troubleshoot RESTful APIs for data integration
- Familiarity with API testing tools such as Postman or Insomnia for testing and validating endpoints
- Proficiency with Git for managing code and configuration changes
- Understanding of encryption, secure data transfer, and compliance standards for data security
- Ability to optimize ETL processes and API calls for speed and scalability
- Knowledge of automated testing frameworks for data pipelines and integrations
- Familiarity with continuous integration and deployment (CI/CD) practices for data engineering projects
- Strong understanding of ETL processes, data modeling, and database technologies (SQL and NoSQL)
- Experience with APIs, batch processing, and integration frameworks
- Familiarity with cloud-based data platforms and SaaS environments
- Knowledge of data visualization tools (e.g., Power BI, Tableau) and ability to create actionable insights
- Strong analytical and problem-solving skills with attention to detail
- Ability to research and implement automation and AI-driven solutions for data workflows
- Excellent communication and collaboration skills across technical and non-technical teams
- Ability to adapt quickly to shifting priorities and project requirements
Preferred Qualifications:
- Experience with government SaaS implementations or regulatory systems
- Familiarity with AI-driven data migration tools
- Knowledge of Microsoft Azure cloud services
- Hands-on experience building reports and dashboards with data visualization and reporting tools