WorkWave is a customer-centric company seeking a Senior AI/Data Engineer to build and scale the data systems that power its decision intelligence products. The role involves engineering AI/ML solutions, designing data pipelines, and collaborating with cross-functional teams to deliver business value and enhance the customer experience.
Responsibilities:
- AI/ML Ownership: Build trusted AI/ML predictions and forecasts via feature pipelines, model input/output data flows, and robust data validation frameworks
- Pipeline Design: Design and implement reliable, scalable, and secure data pipelines that serve analytical and product use cases
- Leadership: Provide technical leadership and mentorship to other data engineers and cross-functional collaborators, fostering a culture of engineering excellence
- Architecture & Evolution: Own the architecture and evolution of our data platform, ensuring it meets the performance, scalability, and agility needs of our growing business
- Governance & Observability: Implement data governance, quality, and observability best practices to ensure trustworthy insights, proactively managing data health before it impacts the business
- Infrastructure Optimization: Optimize cloud data infrastructure for cost, performance, and maintainability, treating the platform as a product rather than just a utility
- Cross-Functional Partnership: Collaborate closely with product managers, engineers, and customer stakeholders to understand context and needs, and help translate them into engineering solutions
- Customer-Centricity: Ensure engineering efforts are aligned with delivering clear business value and enhancing the customer experience
Requirements:
- 5+ years of experience in data engineering, with 1–2+ years in a senior or lead capacity
- Deep Technical Expertise: A thorough understanding of ML models and the ability to articulate the trade-offs between different architectures (e.g., complexity vs. inference speed, accuracy vs. interpretability) to ensure the right tool is selected for the job
- Coding: Expert-level proficiency in Python for complex data structures and automation, alongside strong SQL expertise
- Strong familiarity with Scikit-Learn and similar libraries, with specific experience building and maintaining associated feature engineering pipelines
- High proficiency in MLOps practices and orchestration tools (e.g., Airflow, dbt, Dagster) to manage model lifecycles and data dependencies
- Solid experience with modern data platforms (e.g., Snowflake, BigQuery, Redshift, or Databricks)
- Strong understanding of data modeling, performance optimization, and cloud computing (AWS, GCP, or Azure)
- Excellent communication and collaboration skills, with a proven ability to work effectively across technical and non-technical teams