Data Modeler, Engineer 3 at NBCUniversal | JobVerse
United States · Full Time · Posted 1 week ago · H1B sponsorship available
Key skills: Apache Spark (Spark SQL, PySpark), SQL, Snowflake, AWS, Azure, Google Cloud Platform (GCP), Analytics, Performance Optimization, SAP PowerDesigner, Communication
Role Overview
Design and maintain conceptual, logical, and physical data models for domain-owned data products
Translate product requirements into consumer-friendly analytical models, entities, and metrics
Build domain-aligned, analytics-ready data structures optimized for modern access patterns
Define and maintain semantic layers and shared business definitions across data products
Align data product models with modern data architecture patterns (lakehouse, multi-layered data platforms)
Partner with data platform and engineering teams to ensure data products are efficiently implemented in Snowflake and Apache Spark
Contribute to standards for schema evolution, data contracts, and backward compatibility
Work closely with data product managers, analytics engineers, and business stakeholders to refine product requirements
Act as a modeling subject-matter expert within data product teams
Support onboarding and adoption of data products by downstream consumers
Embed data quality, consistency, and usability into data product designs
Maintain clear documentation for data products, models, and metrics
Support metadata, lineage, and discoverability initiatives aligned to data product governance
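Several of the responsibilities above (schema evolution standards, data contracts, backward compatibility) come down to verifying that a data product's published schema never breaks downstream consumers. A minimal illustrative sketch in plain Python, assuming schemas are represented as simple column-to-type dicts; the table names and columns are hypothetical, and a real contract check would run against the warehouse catalog in CI:

```python
# Illustrative only: backward-compatibility check for an evolving data
# product schema. Schemas are {column: type} dicts; "orders_v*" are
# hypothetical examples, not tables from the posting.

def is_backward_compatible(old_schema: dict, new_schema: dict) -> list:
    """Return a list of breaking changes when evolving old -> new."""
    breaks = []
    for column, col_type in old_schema.items():
        if column not in new_schema:
            breaks.append(f"removed column: {column}")
        elif new_schema[column] != col_type:
            breaks.append(f"type change on {column}: "
                          f"{col_type} -> {new_schema[column]}")
    # New columns are additive and therefore non-breaking for consumers.
    return breaks

orders_v1 = {"order_id": "string", "amount": "decimal(18,2)", "ts": "timestamp"}
orders_v2 = {**orders_v1, "currency": "string"}        # additive: compatible
orders_v3 = {"order_id": "int", "amount": "decimal(18,2)"}  # breaking

print(is_backward_compatible(orders_v1, orders_v2))  # []
print(is_backward_compatible(orders_v1, orders_v3))  # two breaking changes
```

Additive changes pass while column removals and type changes are flagged, which is the usual contract boundary between producers and consumers.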
Requirements
5+ years of experience in data modeling, analytics engineering, or data architecture roles.
Strong proficiency in conceptual, logical, and physical data modeling techniques.
Hands-on experience with Snowflake, including schema design, performance optimization, and cost governance.
Experience developing data models and transformations in Apache Spark (Spark SQL, PySpark preferred).
Familiarity with modern data architectures (e.g., lakehouse, domain-oriented data products, multi‑layered data platforms).
Experience working with modeling tools such as Erwin or SAP PowerDesigner.
Knowledge of schema evolution practices, data contracts, and metadata/lineage standards.
Proven ability to collaborate with cross-functional data and business teams.
Strong understanding of semantic modeling and shared business metric design.
Experience working in cloud‑native environments (AWS, Azure, or GCP).
Ability to balance scalability, performance, quality, and usability in data product design.
Excellent communication skills with the ability to explain complex data concepts to non‑technical audiences.
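The semantic modeling and shared business metric design the requirements call for can be sketched as a lightweight metric registry: each metric is defined once, owned by one model, and compiled to the same SQL for every consumer. This is an illustrative sketch only; the metric name, expression, and model below are hypothetical, and production semantic layers offer far richer dimension and join handling:

```python
# Illustrative only: a minimal semantic-layer metric registry so that a
# shared business metric has exactly one canonical definition.
from dataclasses import dataclass

@dataclass(frozen=True)
class Metric:
    name: str
    expression: str   # SQL expression applied over the owning model
    model: str        # domain-owned data product the metric belongs to
    description: str

REGISTRY: dict[str, Metric] = {}

def register(metric: Metric) -> None:
    """Refuse duplicate definitions: one metric, one owner."""
    if metric.name in REGISTRY:
        raise ValueError(f"metric already defined: {metric.name}")
    REGISTRY[metric.name] = metric

def compile_metric(name: str) -> str:
    """Render the canonical SQL shared by all consumers."""
    m = REGISTRY[name]
    return f"SELECT {m.expression} AS {m.name} FROM {m.model}"

register(Metric(
    name="gross_revenue",                 # hypothetical metric
    expression="SUM(amount)",
    model="sales.fct_orders",             # hypothetical fact table
    description="Total order amount before refunds.",
))
print(compile_metric("gross_revenue"))
# SELECT SUM(amount) AS gross_revenue FROM sales.fct_orders
```

Rejecting duplicate registrations is the key design choice: it forces teams to reuse the shared definition instead of redefining the metric per dashboard.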
Tech Stack: Apache Spark (PySpark), SQL, AWS, Azure, Google Cloud Platform
Benefits: an array of options, expert guidance, and always-on tools