TRM Labs is a company that provides blockchain analytics and AI solutions to various sectors including law enforcement and financial institutions. As a Senior Analytics Engineer, you will be responsible for developing and optimizing analytics pipelines and data models, establishing best practices in analytics engineering, and improving the overall scalability and reliability of the analytics data ecosystem.
Responsibilities:
- Lead the development and optimization of analytics pipelines and data models that power TRM’s products, investigations, and decision-making — enabling teams across the company and our customers (including former FBI, Secret Service, and Europol agents) to detect and respond to financial crime in the crypto space
- Define and implement best practices in analytics engineering (e.g., testing, observability, versioning, documentation), helping to level up the team’s development workflows and data reliability
- Improve the scalability and maintainability of our analytics data ecosystem, which processes large volumes of data, through thoughtful architectural decisions and tooling improvements
- Partner closely with data scientists, data engineers, and product and business teams to deliver production-ready datasets that support models, metrics, and investigative workflows
- Establish best-in-class data quality solutions to increase the reliability and accuracy of our data
- Investigate performance issues and bring creative, durable solutions that improve long-term reliability, cost, and developer experience
- Drive adoption of modern data tools and workflows, helping the team evolve toward best-in-class analytics engineering practices
- Contribute to team on-call responsibilities that support the health and availability of our analytics and data science infrastructure (on-call is lightweight and shared equitably)
Requirements:
- 8+ years of experience in analytics engineering, data engineering, or data science with a strong focus on building and scaling analytics workflows
- Strong experience across the entire data engineering lifecycle (ETL, data model design, infrastructure, data quality, architecture, etc.)
- Deep proficiency in SQL and experience developing robust, modular data models using dbt (or equivalent tools) in a production environment
- Strong software engineering fundamentals, including experience with Python, CI/CD pipelines, and automated testing
- Proficiency in defining robust and scalable data models using best practices
- Experience using LLMs, as well as enabling AI through high-quality data infrastructure
- Hands-on experience with cloud data warehouses and infrastructure (e.g., Snowflake, BigQuery, Redshift) and data orchestration tools (e.g., Airflow, Dagster, Prefect)
- Proficiency in developing compelling dashboards using tools like Looker, Tableau, Power BI, Plotly, or similar
- Excellent communication skills — you can explain complex technical concepts to non-technical audiences and influence key decisions with clarity and data