Nordic Global is committed to excellence and innovation in healthcare and is seeking a Senior Consultant focused on Data Engineering. This role supports the design, development, testing, and delivery of data integration and analytics solutions for healthcare clients, while also providing strategic guidance and mentorship to junior team members.
Responsibilities:
- Serve as a trusted advisor to healthcare clients, translating complex business questions into impactful data and analytics solutions
- Support end-to-end delivery of analytics projects, gathering requirements, architecting solutions, and overseeing development, testing, deployment, and adoption
- Collaborate directly with client stakeholders across clinical, operational, and executive teams to align analytics efforts with strategic priorities
- Proactively identify opportunities to enhance analytics maturity, optimize processes, and increase the business value of data assets
- Represent Nordic with professionalism and integrity in all client engagements, ensuring exceptional service, strategic insight, and consistent alignment with Nordic's values
- Guide project teams and support solution governance by providing input into architecture, data modeling, data pipelines, and reporting design decisions
- Design and architect scalable and reliable data pipelines to ingest, process, and transform large volumes of healthcare data from diverse sources, including electronic health records (EHR), clinical systems, research databases, and external data feeds
- Build production-grade data integration solutions using Azure Data Factory to orchestrate data workflows, manage data movement, and coordinate pipeline executions across complex healthcare data environments
- Develop end-to-end data pipelines that leverage ADF for orchestration and Databricks for scalable data processing and transformation
- Implement medallion architecture patterns (bronze/silver/gold layers) within Databricks lakehouse environments to support progressive data refinement and analytics readiness
- Build and maintain data warehousing and lakehouse architectures that support enterprise analytics and reporting requirements
- Design and build robust, scalable data models, data engineering solutions, and products on platforms such as Databricks, Azure Data Factory, and Microsoft SQL Server Integration Services (SSIS)
- Develop advanced SQL queries, optimize data processing logic, and validate outputs across large, complex healthcare data environments (EHR, claims, operational systems, etc.)
- Contribute to performance testing, documentation, upgrades, and validation strategies across multiple client systems and environments
- Demonstrate commitment to continuous learning and stay current with industry trends, best practices, and emerging technologies in data engineering
- Mentor and support junior team members, fostering a culture of knowledge sharing, learning, and professional growth
- Assist leadership in project estimation, resource planning, and task coordination to ensure smooth delivery execution
- Contribute to the development of reusable tools, solution accelerators, and best practice documentation that improve team efficiency and delivery consistency
- Support Nordic’s internal strategic initiatives, participate in team training sessions, and help evolve the Data Services practice through process improvement and thought leadership
- Model Nordic’s core maxims and values by promoting collaboration, excellence, and customer-first thinking across teams and engagements
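The medallion architecture named above refines data in stages: bronze holds raw ingested records, silver holds validated and typed records, and gold holds business-ready aggregates. A minimal sketch of that progression follows, using plain Python in place of Databricks Delta tables; the record fields (patient_id, encounter_date, charge) are hypothetical examples, and a production pipeline would implement each layer as a Delta table written via Spark.

```python
# Minimal illustration of the medallion (bronze/silver/gold) pattern.
# Plain Python stands in for Databricks Delta tables; fields are hypothetical.

from collections import defaultdict
from datetime import date

# Bronze layer: raw records ingested as-is, including malformed rows.
bronze = [
    {"patient_id": "P001", "encounter_date": "2024-03-01", "charge": "125.50"},
    {"patient_id": "P002", "encounter_date": "2024-03-01", "charge": "abc"},    # bad charge
    {"patient_id": "P001", "encounter_date": "2024-03-02", "charge": "80.00"},
    {"patient_id": "",     "encounter_date": "2024-03-02", "charge": "40.00"},  # missing id
]

def to_silver(rows):
    """Silver layer: validate, type-cast, and drop rows that fail quality checks."""
    clean = []
    for row in rows:
        if not row["patient_id"]:
            continue  # reject rows with no patient identifier
        try:
            charge = float(row["charge"])
        except ValueError:
            continue  # reject rows with non-numeric charges
        clean.append({
            "patient_id": row["patient_id"],
            "encounter_date": date.fromisoformat(row["encounter_date"]),
            "charge": charge,
        })
    return clean

def to_gold(rows):
    """Gold layer: business-level aggregate -- total charges per patient."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["patient_id"]] += row["charge"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'P001': 205.5}
```

The design point is that each layer is derived from the previous one and can be rebuilt from it, so quality rules and business logic stay separated and auditable.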
Requirements:
- Generally requires a bachelor's degree and 8 years of relevant experience; a master's degree and 6 years of relevant experience; or 11 years of related experience and no degree
- Bachelor's degree in data science, healthcare administration, information systems, or a related field; advanced degree preferred
- Minimum of 8 years of experience in healthcare analytics, data engineering, or business intelligence, with demonstrated success in client-facing or consulting roles
- Minimum of 4 years of experience specifically in data engineering within healthcare environments using ADF/Databricks
- Advanced proficiency in SQL and experience with large-scale data sets; working knowledge of Python or R for data engineering
- Proficiency in designing and building data integration pipelines, orchestrating workflows, managing data movement activities, and implementing scheduling and monitoring frameworks
- Experience with Git, Azure DevOps, and CI/CD practices for data pipeline deployment
- Strong experience designing, developing, and maintaining enterprise-grade data pipelines, ETL workloads, and analytics frameworks
- Strong documentation skills for technical specifications and data lineage
- Experience with SQL Server Integration Services (SSIS) package development, debugging, and migration
- Ability to lead client discussions, manage stakeholder expectations, and communicate technical concepts to non-technical audiences
- Demonstrated ability to mentor junior staff, lead technical workstreams, and contribute to strategic solution design
- Excellent written and verbal communication skills with the ability to influence decision-making
- High attention to detail, accountability for deliverables, and a proactive approach to problem-solving and continuous improvement
- Experience working across multiple Epic clinical or operational domains
- Exposure to modern data architectures and cloud-based analytics platforms (e.g., Azure, Snowflake, Databricks)
- Experience applying Agile methodologies to analytics project delivery
- Certification in at least one of the following tools is preferred: Databricks or Azure Data Factory; relevant experience with both is also preferred
- Epic certification in one or more modules (e.g., Clarity or Caboodle) and experience working with Epic Clarity and Caboodle data models
- Background in data science, machine learning, or advanced analytics techniques is a plus