Marathon TS is seeking a highly skilled and innovative Staff Data Engineer to lead the development and management of its enterprise data infrastructure. The role focuses on enabling scalable, self-service analytics through expertise in semantic layers and dimensional modeling.
Responsibilities:
- Lead the development and management of enterprise data infrastructure
- Build and maintain ETL/ELT pipelines in modern cloud environments
- Design, build, and optimize semantic layers and universal semantic models
- Translate complex business logic into scalable, reusable data models
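One responsibility above is translating business logic into scalable, reusable data models. As a hedged illustration only (the metric names, expressions, and `fact_sales` table are invented, not from the posting), a minimal metric layer in Python defines each business metric once and composes queries from those definitions:

```python
# Minimal sketch of a reusable metric layer: business logic is defined once
# as named SQL expressions, then composed into queries instead of being
# copy-pasted into every report. All names here are illustrative.

METRICS = {
    "gross_revenue": "SUM(quantity * unit_price)",
    "order_count": "COUNT(DISTINCT order_id)",
    "avg_order_value": "SUM(quantity * unit_price) / COUNT(DISTINCT order_id)",
}

def build_query(metrics, dimensions, table="fact_sales"):
    """Render a governed query from the shared metric definitions."""
    select = list(dimensions) + [f"{METRICS[m]} AS {m}" for m in metrics]
    sql = f"SELECT {', '.join(select)} FROM {table}"
    if dimensions:
        sql += f" GROUP BY {', '.join(dimensions)}"
    return sql

print(build_query(["gross_revenue", "order_count"], ["region"]))
```

Because every report renders from the same `METRICS` dictionary, a change to a metric's definition propagates everywhere it is used, which is the point of a governed semantic layer.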
Requirements:
- Proven experience as a Data Engineer or similar role with a focus on big data solutions in a cloud environment (AWS or Azure)
- Strong proficiency in SQL (Microsoft SQL Server, Oracle), including advanced query optimization, complex joins, window functions, and large-scale data processing
- Hands-on experience with Python for data processing, analysis, automation, and pipeline development
- Deep experience building and maintaining ETL/ELT pipelines in modern cloud environments (preferably Google Cloud Platform / BigQuery)
- Experience with orchestration tools (e.g., Airflow, Cloud Composer) and CI/CD pipelines (e.g., Jenkins)
- Strong understanding of data quality, data validation, and production support practices
- Hands-on experience with OLAP cube technologies such as AtScale, SSAS, or similar multi-dimensional semantic modeling/BI tools
- Experience designing, building, and optimizing semantic layers / universal semantic models
- Strong understanding of cube design and optimization (aggregations, performance tuning), measures, hierarchies, and calculated metrics
- Strong understanding of query performance optimization across large datasets
- Experience enabling self-service analytics through governed semantic layers
- Expertise in dimensional modeling
- Strong experience designing star schemas, snowflake schemas, and fact/dimension tables aligned to business processes
- Ability to translate complex business logic into scalable, reusable data models
- Experience building metric layers or reusable business logic abstractions
Experience:
- OLAP (AtScale, SSAS, or other): 3 years (Required)
- SQL: 5 years (Required)
- Python: 3 years (Required)
- ELT/ETL pipeline: 4 years (Required)
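To make the "window functions" requirement concrete, here is a small, self-contained sketch using Python's built-in sqlite3 module (the `sales` table and its data are invented for illustration; SQLite 3.25+ is assumed for window-function support):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('east', '2024-01', 100), ('east', '2024-02', 150),
        ('west', '2024-01', 200), ('west', '2024-02', 120);
""")

# Rank each month within its region by amount, and compute a per-region
# running total ordered by month -- two common window-function patterns.
rows = conn.execute("""
    SELECT region, month, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running
    FROM sales
    ORDER BY region, month
""").fetchall()

for row in rows:
    print(row)
```

The `PARTITION BY region` clause restarts both the rank and the running sum at each region boundary, so no self-join or subquery is needed.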
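Likewise, the dimensional-modeling requirement (fact and dimension tables in a star schema) can be sketched with sqlite3; the table and column names below are invented for illustration, not taken from any Marathon TS system:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables hold descriptive attributes keyed by a surrogate key.
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
    -- The fact table records measurable events, keyed to the dimensions.
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        quantity INTEGER, unit_price REAL
    );
    INSERT INTO dim_product VALUES (1, 'widget', 'hardware'), (2, 'ebook', 'media');
    INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
    INSERT INTO fact_sales VALUES (1, 20240101, 3, 9.99), (2, 20240101, 5, 4.99);
""")

# A typical star join: aggregate fact measures grouped by a dimension attribute.
rows = conn.execute("""
    SELECT p.category, SUM(f.quantity * f.unit_price) AS revenue
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
print(rows)
```

Keeping measures in the fact table and attributes in dimensions is what lets one schema serve many business questions: swapping `p.category` for a `dim_date` attribute answers a time-series question with the same join shape.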