Synaptiq is a company that values both the work and the people behind it, focused on solving significant problems through AI. The Senior Data & AI Engineer will bridge the gap between raw data and actionable insights, building data pipelines and applying advanced analytical models to address complex business challenges.
Responsibilities:
- Talk to Clients: Help our customers identify the root causes of their pain points, and clearly communicate complex technical ideas to a variety of audiences
- Design & Build Pipelines: Develop end-to-end data workflows, integrating data from various APIs and databases
- Architect Data Solutions: Implement Medallion architectures (Bronze/Silver/Gold) to organize data for analytics and AI
- Develop Models: Build, track, and manage machine learning and data analysis experiments (e.g., using MLflow)
- Integrate AI into Workflow: Incorporate AI tools into daily development workflows to accelerate prototyping, streamline debugging, and reduce repetitive work, while critically evaluating outputs and ensuring production-quality results
- Collaborate: Partner with Product Managers and Designers to translate vague business requests like "I want better analytics" into technical requirements and dashboard prototypes
- Ensure Quality: Implement data governance, schema enforcement, and unit tests to maintain high data quality standards
Requirements:
- 5–8 years of experience in data engineering and/or data science roles
- Strong proficiency in Databricks, Spark/PySpark, and the Delta Lake lakehouse architecture
- Expertise in Python, SQL, and Power BI
- Experience with Azure tools (e.g., Data Factory, Key Vault)
- Comfortable with Git-based version control and collaborative development workflows
- Strong foundation in statistics (experimental design, hypothesis testing, causal inference) and proven ability to apply these methods to complex datasets
- Ability to explore, clean, and analyze large datasets to identify trends, anomalies, and actionable insights that inform product and business decisions
- Ability to distill complex analyses into clear, compelling narratives and visualizations that drive decision-making for technical and non-technical audiences alike
- Experience scheduling and monitoring data workflows using tools like Databricks Workflows, Airflow, or Azure Data Factory
- Proficiency in leveraging AI tools (e.g., code copilots, LLMs) to accelerate development, automate repetitive tasks, generate and refactor code, and improve code quality, while maintaining strong judgment around validation, correctness, and security
- Empathy and strong communication skills to support teammates and engage with stakeholders
- Ability to navigate ambiguity and resolve technical blockers proactively
- Bachelor's degree in a technical field or equivalent practical experience
Nice to Have:
- Consulting or professional services background
- Experience in healthcare or other data-intensive industries, or with AI/ML products
- Experience building Power BI dashboards