Fidium Fiber is a next-generation fiber internet and network services provider. They are seeking a skilled Senior AI Data Engineer to support and evolve their Telecom Enterprise Data Warehouse (EDW) environment while helping build AI-enabled data products.
Responsibilities:
- Design, implement, and manage automated data ingestion pipelines using Fivetran and other ETL/ELT tools
- Integrate data from various telecom systems (OSS/BSS, CRM, billing, network systems, etc.) into Snowflake
- Ensure scalability and reliability of data ingestion processes to meet business SLAs
- Develop and optimize ELT pipelines in Matillion, ensuring efficient data transformations in Snowflake
- Build and maintain data models (staging, core, mart layers) to support reporting, analytics, and AI/ML use cases
- Design data structures and feature-ready datasets to support machine learning and advanced analytics initiatives
- Partner with data scientists and product teams to enable scalable model training, inference, and deployment workflows
- Implement and monitor data validation, profiling, and quality checks to ensure data accuracy, completeness, and timeliness
- Collaborate with data governance teams to maintain metadata, lineage, and documentation within Secoda
- Troubleshoot and resolve data discrepancies and pipeline failures proactively
- Contribute to the design and development of a self-service analytics platform / web portal for internal stakeholders
- Build APIs and backend services to expose curated data and AI-driven insights to applications
- Collaborate on front-end integrations (e.g., React-based interfaces) to deliver intuitive, data-driven user experiences
- Support development of AI-powered features such as forecasting, anomaly detection, or intelligent data exploration
- Support daily operations of the EDW, including job scheduling, monitoring, and performance tuning
- Implement alerting, logging, and observability practices to ensure high system uptime
- Participate in on-call rotations or provide after-hours support as needed
- Partner with analysts, BI developers, and product stakeholders to enable both dashboard-based and application-based data consumption
- Support delivery of data through APIs and services for downstream applications in addition to Tableau reporting
- Work closely with data architects, data scientists, product managers, and application developers to translate business needs into scalable data and AI solutions
- Contribute to best practices across data engineering, AI/ML pipelines, and application integration
Requirements:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or related field
- 8+ years of experience in data engineering or data warehousing
- Solid understanding of data modeling (star/snowflake schemas, dimensional modeling)
- Experience with data quality frameworks, metadata management, and DevOps for data pipelines
- Strong SQL skills and familiarity with Python or Bash scripting for automation
- Experience with cloud environments (AWS, Azure, or GCP)
- Strong proficiency with:
  - Fivetran (data ingestion and connector management)
  - Snowflake (SQL, warehouse configuration, performance optimization)
  - Matillion (ELT orchestration and transformation development)
  - Tableau (data delivery and visualization integration)
  - Secoda (data cataloging and lineage documentation) or a similar solution
- Telecom industry experience required, particularly with OSS/BSS and network data sources
- Experience building data-driven applications or internal platforms
- Exposure to LLMs, AI-assisted analytics, or ML model deployment pipelines
- Familiarity with real-time or event-driven architectures
- Experience integrating data platforms with user-facing applications
- Exposure to data observability tools and incident management practices
- Exposure to Data360 Analyze