Motive is a company that empowers physical operations with tools for safety, productivity, and profitability. As a Data Engineer on the BI team, you will deliver data infrastructure and high-quality datasets that drive global strategy, while implementing cutting-edge tooling and managing the entire data lifecycle.
Responsibilities:
- Collaborate & Strategize: Partner closely with business stakeholders to understand their challenges and design end-to-end data architectures that solve complex business problems
- Build & Maintain Data Models: Design, develop, and own robust, efficient, and scalable data models in Snowflake and Iceberg using dbt and advanced SQL
- Orchestrate & Automate: Build and manage reliable data pipelines and CI/CD workflows using tools like Airflow, Python, and Terraform, ensuring data stays fresh and trustworthy and infrastructure is version-controlled
- Champion Data Quality: Implement rigorous testing, documentation, and data governance practices to maintain a single source of truth
- Enable Analytics & Workflows: Act as the Product Owner and Tech Lead for your data domains, taking responsibility for end-to-end data product delivery, from raw ingestion to data models that power analytics and data apps in tools like Tableau and Retool
- Innovate with AI: Help us build our next-generation data infrastructure by integrating AI capabilities (like Snowflake Cortex AI) to democratize analytics and empower the business
- Architect Observability: Implement monitoring and alerting frameworks (e.g., dbt packages or Monte Carlo monitors) to proactively catch "silent" data failures before stakeholders do
Requirements:
- 6+ years of experience in Analytics Engineering, Data Engineering, or a similar role
- Deep expertise in SQL and developing complex data models for analytical purposes (e.g., dimensional modeling)
- Hands-on experience with Data Warehousing: High proficiency in Snowflake (preferred) and experience with Open Table Formats like Iceberg
- Hands-on experience with Data Transformation: dbt
- Hands-on experience with Orchestration & ETL: Airflow, Fivetran, Airbyte
- Hands-on experience with Cloud Platform: AWS
- Hands-on experience with Programming/Ingestion: Python
- Hands-on experience with Infrastructure as Code: Terraform
- Hands-on experience with AI-Augmented Development: Proficiency using AI coding assistants (Cursor, Copilot, or Claude) to accelerate development and automate routine tasks
- A strong analytical mindset with a proven ability to solve ambiguous business problems with data
- Excellent communication skills and experience working cross-functionally
- Self-starter able to scope, plan, and manage your own projects
- A user focus with the ability to understand how a data consumer will use the data products you build
Preferred Qualifications:
- Experience building semantic models for natural language querying
- Direct experience with advanced Snowflake features (e.g., Snowpark, Cortex AI)
- Experience building visualizations and dashboards in tools like Tableau, Retool, or Thoughtspot
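As an illustration of the Python ingestion work listed above, here is a hedged sketch of an idempotent MERGE-style load, the pattern that lets a pipeline replay a batch without creating duplicates. SQLite stands in for Snowflake, and the `raw_events` table and its columns are invented for the example:

```python
import sqlite3

# Sketch of an idempotent upsert load: re-running the same batch must
# not create duplicate rows. SQLite stands in for Snowflake here; the
# `raw_events` table and columns are hypothetical.

def upsert_events(conn, batch):
    conn.execute(
        """CREATE TABLE IF NOT EXISTS raw_events (
               event_id TEXT PRIMARY KEY,
               payload  TEXT NOT NULL
           )"""
    )
    conn.executemany(
        # ON CONFLICT makes the load idempotent: replaying a batch
        # updates existing rows instead of inserting duplicates.
        """INSERT INTO raw_events (event_id, payload) VALUES (?, ?)
           ON CONFLICT(event_id) DO UPDATE SET payload = excluded.payload""",
        batch,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
batch = [("e1", "start"), ("e2", "stop")]
upsert_events(conn, batch)
upsert_events(conn, batch)  # replay: row count stays at 2
count = conn.execute("SELECT COUNT(*) FROM raw_events").fetchone()[0]
```

In Snowflake the same idea is expressed with a `MERGE` statement or a dbt incremental model keyed on a unique id; the point is that reruns are safe by construction.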