Fidium Fiber is a next-generation fiber internet and network services provider seeking a skilled Data Engineer to support its Telecom Enterprise Data Warehouse (EDW) environment. The role focuses on ensuring the availability, reliability, and quality of the data that powers enterprise analytics and strategic decision-making across the telecom organization.
Responsibilities:
- Design, implement, and manage automated data ingestion pipelines using Fivetran and other ETL/ELT tools
- Integrate data from various telecom systems (OSS/BSS, CRM, billing, network systems, etc.) into Snowflake
- Ensure scalability and reliability of data ingestion processes to meet business SLAs
- Develop and optimize ELT pipelines in Matillion, ensuring efficient data transformations in Snowflake
- Build and maintain data models (staging, core, mart layers) to support reporting and analytics needs
- Follow and enforce data modeling best practices, ensuring consistent naming, lineage, and documentation across datasets
- Implement and monitor data validation, profiling, and quality checks to ensure data accuracy, completeness, and timeliness
- Collaborate with data governance teams to maintain metadata, lineage, and documentation within Secoda
- Troubleshoot and resolve data discrepancies and pipeline failures proactively
- Support daily operations of the EDW, including job scheduling, monitoring, and performance tuning
- Implement alerting, logging, and observability practices to ensure high system uptime
- Participate in on-call rotations or provide after-hours support as needed
- Partner with analysts, BI developers, and business stakeholders to enable data consumption in Tableau and other downstream systems
- Develop and optimize data extracts, dashboards, and APIs for business use cases
- Ensure alignment between data warehouse design and analytical/reporting requirements
- Work closely with data architects, business analysts, and domain experts to translate business needs into data engineering solutions
- Contribute to standards, best practices, and automation within the data engineering team
- Support continuous improvement of ETL/ELT frameworks, CI/CD pipelines, and DevOps processes
Requirements:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or related field
- 4+ years of experience in data engineering or data warehousing, preferably in the telecom domain
- Strong proficiency with:
  - Fivetran (data ingestion and connector management)
  - Snowflake (SQL, warehouse configuration, performance optimization)
  - Matillion (ELT orchestration and transformation development)
  - Tableau (understanding of data delivery and visualization integration)
  - Secoda (data cataloging and lineage documentation) or a similar solution
- Solid understanding of data modeling (star/snowflake schemas, dimensional modeling)
- Experience with data quality frameworks, metadata management, and DevOps for data pipelines
- Strong SQL skills and familiarity with Python or Bash scripting for automation
- Experience with cloud environments (AWS, Azure, or GCP)
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities
- Detail-oriented, organized, and proactive in issue resolution
- Ability to work independently and in a fast-paced, dynamic environment
- Telecom industry experience, particularly with OSS/BSS and network data sources
- Experience in CI/CD for data pipelines (Git, Jenkins, or similar)
- Familiarity with dbt or similar data transformation frameworks
- Exposure to data observability tools and incident management practices
- Exposure to Data360 Analyze