Laivly is an innovative company focused on enhancing customer service technology through artificial intelligence and digital automation. The company is seeking a Senior Data Engineer to design, build, and scale data systems that support reporting, analytics, and machine learning initiatives, while collaborating with teams across the organization to enable data-driven decision-making.
Responsibilities:
- Build cloud-based data ingestion pipelines from multiple sources
- Develop scalable ETL processes and data integration workflows
- Create APIs and endpoints that integrate data across internal and external systems
- Structure and optimize datasets to support reporting, analytics, and the development lifecycle
- Design and document data models, sources, and ingestion techniques
- Ensure system design facilitates good data hygiene, reliability, and observability across all systems and pipelines, and champion data quality as a critical feature
- Partner with Data Science and business teams to discover and organize data across distributed systems
- Support downstream stakeholders by delivering well-structured and accessible datasets
- Contribute to Agile development processes and continuous improvement initiatives
- Experiment with new tools, technologies, and approaches to improve our data ecosystem
- Share ideas and contribute to the ongoing evolution of Laivly’s data platform
Requirements:
- 8-10+ years of experience working with data in SQL environments
- A degree in Computer Science, Mathematics, Statistics, or a related field
- Strong experience with SQL, ETL development, and data integration
- Experience with relational and non-relational databases
- Experience working within a data lakehouse environment
- Python proficiency (Spark/Flink experience is a plus)
- Experience with C# development (JavaScript experience is a bonus)
- Experience working with cloud-based systems and distributed data platforms
- Familiarity with Agile methodologies, CI/CD pipelines, and process automation
- Strong problem-solving and analytical skills
- Ability to communicate complex data concepts to both technical and non-technical audiences
- Comfort working independently within a small, fast-moving team
- Curiosity, flexibility, and a passion for learning and working with data
- Experience with Trino, Iceberg, Kafka, Airflow, or Polaris (considered a bonus)