Ncontracts is a leader in integrated risk management and compliance solutions for financial institutions. The company is seeking an Analytics Engineer to build data models and analytical frameworks that drive product improvement and deliver actionable insights across its suite of solutions.
Responsibilities:
- Design, build, and maintain the dbt data models in Snowflake that power product analytics, from raw source data through to business-ready, well-documented datasets
- Define and maintain metrics frameworks and KPIs that drive product and business decisions across the organization
- Build interactive dashboards and self-service reports in Sigma that empower stakeholders to explore data and answer their own questions
- Design instrumentation strategies with engineering teams to ensure we’re capturing the right signals from Ncontracts’ product suite
- Partner with product managers to translate fuzzy questions (“why is this feature underperforming?”) into precise, answerable analyses
- Conduct deep-dive analyses including funnel analysis, cohort analysis, and root cause investigations to surface product insights
- Own data quality end-to-end, including dbt tests, validation, documentation, and monitoring across all models
- Optimize query performance and warehouse costs in Snowflake through efficient modeling patterns, materialization strategies, and resource monitoring
- Establish and enforce data governance standards, including naming conventions, documentation requirements, and consistent metric definitions to maintain a single source of truth
- Contribute to the evolution of the data platform by evaluating new tools, technologies, and best practices
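To give a flavor of the cohort analysis mentioned above, here is a minimal Python sketch; the event records and field names are hypothetical, and in practice this work would be modeled in SQL/dbt against Snowflake:

```python
from collections import defaultdict
from datetime import date

# Hypothetical event log: (user_id, event_date, event_name)
events = [
    ("u1", date(2024, 1, 3), "signup"),
    ("u1", date(2024, 2, 10), "login"),
    ("u2", date(2024, 1, 15), "signup"),
    ("u3", date(2024, 2, 1), "signup"),
    ("u3", date(2024, 3, 5), "login"),
]

def month_key(d):
    return (d.year, d.month)

def months_between(start, end):
    return (end[0] - start[0]) * 12 + (end[1] - start[1])

def cohort_retention(events):
    """Group users into cohorts by signup month, then count how many
    were active 0, 1, 2, ... months after signing up."""
    signup_month = {}
    active_months = defaultdict(set)
    for user, day, name in events:
        if name == "signup":
            signup_month[user] = month_key(day)
        active_months[user].add(month_key(day))
    retention = defaultdict(lambda: defaultdict(int))
    for user, cohort in signup_month.items():
        for m in active_months[user]:
            offset = months_between(cohort, m)
            if offset >= 0:
                retention[cohort][offset] += 1
    return retention

r = cohort_retention(events)
# e.g. r[(2024, 1)] counts the January 2024 signup cohort by month offset
```

The same shape generalizes to funnel analysis: replace the month offset with ordered funnel steps and count users who reach each one.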
Requirements:
- 5+ years of experience in analytics engineering, data analytics, or a related data role with demonstrated technical expertise
- Strong SQL proficiency with experience writing complex queries for data transformation, analysis, and performance optimization
- Hands-on experience with dbt (dbt Core or dbt Cloud), including model design, testing, documentation, and deployment workflows
- Experience working with Snowflake or a comparable cloud data warehouse (BigQuery, Redshift, Databricks)
- Solid understanding of data modeling concepts, including dimensional modeling, slowly changing dimensions, and star/snowflake schemas
- Proficiency with data visualization and BI tools, preferably Sigma, with the ability to create compelling dashboards and reports for diverse audiences
- Understanding of software engineering fundamentals: version control (Git), testing, code review, and CI/CD practices
- Comfort working with semi-structured data (JSON, nested structures)
- Bachelor's degree in Computer Science, Data Science, Statistics, Mathematics, or equivalent practical experience
- Experience building metrics frameworks and defining KPIs that drive product and business decisions
- Track record of error analysis and root cause investigation on product and data quality issues
- Ability to work backwards from business questions to data requirements
- Healthy skepticism about data quality and metric definitions—you ask “should we trust this number?” before building on it
- Genuine curiosity about how products work and why users behave the way they do
- Experience partnering directly with product and engineering teams in a collaborative, cross-functional environment
- Communication skills to make technical findings accessible to non-technical stakeholders
- Comfort with ambiguity and the ability to iterate toward the right answer rather than waiting for perfect requirements
- Knowledge of financial services data domains such as lending, compliance, risk management, or regulatory reporting
- Background in high-volume event data, behavioral analytics, or product telemetry systems
- Exposure to funnel analysis, cohort analysis, or experimentation frameworks (A/B testing)
- Understanding of software system architectures and how they generate telemetry
- Enthusiasm for AI and a habit of using AI tools to accelerate your own work—writing code, automating tasks, exploring data, or prototyping solutions
- Experience measuring or evaluating AI-powered product features, including defining success metrics, tracking adoption, or analyzing model performance from a product analytics perspective
- Familiarity with emerging data and workflow patterns, including tracing, observability, and agent-based systems
- Experience with Python for data transformation, scripting, or automation tasks
- Experience with data observability platforms (Monte Carlo, Elementary, Soda)
- Knowledge of compliance frameworks relevant to financial services (SOC 2, GLBA, FFIEC)
- Experience implementing data contracts, data mesh, or modern data governance frameworks
- Exposure to orchestration tools such as Airflow, Dagster, or Prefect
- dbt Analytics Engineering Certification or SnowPro certification
- Familiarity with reverse ETL tools (Census, Hightouch) for activating data in operational systems
- Contributions to open-source data or analytics projects
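As an illustration of the slowly changing dimensions named in the requirements, here is a minimal Python sketch of a Type 2 upsert; the row shape and function name are hypothetical, and in practice this would typically be a dbt snapshot in Snowflake:

```python
from datetime import date

def scd2_upsert(dim, incoming, today):
    """Apply a new snapshot to a Type 2 slowly changing dimension.

    dim: list of dicts with keys id, attr, valid_from, valid_to (None = current row).
    incoming: dict mapping id -> latest attr value.
    """
    current = {row["id"]: row for row in dim if row["valid_to"] is None}
    for key, attr in incoming.items():
        row = current.get(key)
        if row is None:
            # Brand-new key: open a current row.
            dim.append({"id": key, "attr": attr, "valid_from": today, "valid_to": None})
        elif row["attr"] != attr:
            # Changed attribute: close the old version, open a new one.
            row["valid_to"] = today
            dim.append({"id": key, "attr": attr, "valid_from": today, "valid_to": None})
    return dim

dim = [{"id": "c1", "attr": "low", "valid_from": date(2024, 1, 1), "valid_to": None}]
dim = scd2_upsert(dim, {"c1": "high", "c2": "low"}, date(2024, 6, 1))
```

Each change closes the prior row and opens a new current one, so point-in-time history is preserved rather than overwritten.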