Brooksource is a company focused on transforming healthcare through innovative data solutions. The Senior Data Engineer will be responsible for designing, developing, and optimizing enterprise data pipelines and solutions in a cloud environment, while ensuring operational excellence and collaboration within an agile team.
Responsibilities:
- Build data pipelines: Create, maintain, and optimize workloads from development to production for specific use cases, with a focus on cloud-native solutions and modern frameworks
- Develop efficient, cost-effective implementations, reusing existing features where possible
- Drive operational excellence, including incident management, AI-assisted process automation, and smooth deployments of your technology products and platform features
- Leverage AI across development, testing, and deployment to drive innovation, reduce operational overhead, and improve product quality
- Monitor and manage software configuration changes to anticipate and address data reliability and customer satisfaction issues, leveraging cloud monitoring tools and practices
- Coordinate sustaining support for multiple application platforms or business processes, ensuring seamless integration and operation in a cloud environment
- Apply significant knowledge of IT and healthcare industry trends
- Work in an agile/DevSecOps pod model alongside solution leads, data modelers, analysts, business partners, and other developers in the delivery of data solutions
- Monitor and tune application code to ensure optimal availability, performance, and resource utilization
- Provide technical expertise, working with analysts and business users to translate complex and varied functional specifications into technical designs
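As a rough illustration of the pipeline-building work described above, the sketch below shows a minimal extract-transform-validate step in plain Python. The record type, field names, and rules are illustrative assumptions only, not part of any actual Brooksource codebase; a production pipeline would read from and write to the cloud services named in this posting.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative record type; real pipelines would source from ADLS, Synapse, etc.
@dataclass(frozen=True)
class Encounter:
    patient_id: str
    encounter_date: date
    charge_usd: float

def transform(rows: list[dict]) -> list[Encounter]:
    """Parse raw rows into typed records, skipping malformed input."""
    out = []
    for row in rows:
        try:
            out.append(Encounter(
                patient_id=row["patient_id"].strip(),
                encounter_date=date.fromisoformat(row["date"]),
                charge_usd=float(row["charge"]),
            ))
        except (KeyError, ValueError):
            continue  # a real pipeline would route these to a dead-letter store
    return out

def validate(records: list[Encounter]) -> None:
    """Fail fast on basic data-quality rules before loading downstream."""
    assert all(r.charge_usd >= 0 for r in records), "negative charge"
    keys = {(r.patient_id, r.encounter_date) for r in records}
    assert len(keys) == len(records), "duplicate encounters"

raw = [
    {"patient_id": " P001 ", "date": "2024-05-01", "charge": "125.50"},
    {"patient_id": "P002", "date": "not-a-date", "charge": "80"},
]
records = transform(raw)
validate(records)
print(len(records))  # the malformed row is dropped
```

The validate-before-load step reflects the posting's emphasis on data reliability: bad rows are caught at the pipeline boundary rather than surfacing as customer-facing data issues.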
Requirements:
- Bachelor's degree in Computer Science, Information Technology, Management Information Systems, or a related field (or equivalent experience), with a minimum of 5 years of relevant experience in enterprise application support and cloud-based solution delivery
- Strong experience in design, coding, integration, and deployment on Windows and/or Linux platforms
- Expertise in cloud platforms (preferably Azure; AWS/GCP acceptable) and related services (ADLS, Synapse, Data Factory)
- Proficiency in SQL and Python, with strong scripting/automation skills
- Hands-on experience with modern data stack tools: dbt, Snowflake or Databricks, BigQuery, Airflow or Tidal
- Solid knowledge of data modeling (Data Vault 2.0), data integration, data architecture, warehousing, and data quality
- Familiarity with CI/CD pipelines (Bitbucket, GitHub Actions) and Infrastructure as Code tools (Ansible)
- Familiarity with Collibra for data quality and governance
- Experience implementing testing frameworks and validation processes for data pipelines
- Proven ability to optimize the performance and reliability of data workflows across development, testing, and deployment
- Working knowledge of networking/security protocols (LDAP, Active Directory) and API integrations
- Unix/Linux command-line proficiency and shell scripting
- Understanding of Agile methodologies, and enterprise cloud integration
- Excellent communication skills for translating technical deliverables to business stakeholders
- Awareness of data governance, security standards, and production certification processes
- Healthcare industry experience and Epic software knowledge
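To illustrate the testing-and-validation requirement above, here is a minimal sketch, in plain Python, of generic data-quality checks in the spirit of dbt's not_null and unique tests. The function names, column names, and sample rows are hypothetical, chosen only for illustration.

```python
# Hypothetical minimal data-quality checks, modeled loosely on dbt's
# not_null and unique generic tests. Each check returns the indices
# of failing rows so a pipeline can report or quarantine them.

def check_not_null(rows: list[dict], column: str) -> list[int]:
    """Return indices of rows where `column` is missing or empty."""
    return [i for i, r in enumerate(rows) if r.get(column) in (None, "")]

def check_unique(rows: list[dict], column: str) -> list[int]:
    """Return indices of rows whose `column` value repeats an earlier row."""
    seen: set = set()
    failures = []
    for i, r in enumerate(rows):
        value = r.get(column)
        if value in seen:
            failures.append(i)
        seen.add(value)
    return failures

rows = [
    {"id": "1", "mrn": "A100"},
    {"id": "2", "mrn": "A100"},   # duplicate mrn
    {"id": "3", "mrn": None},     # null mrn
]
print(check_not_null(rows, "mrn"))  # [2]
print(check_unique(rows, "mrn"))    # [1]
```

Returning failing row indices, rather than a bare pass/fail flag, makes it easy to wire checks like these into a test harness or monitoring dashboard of the kind this role is expected to maintain.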