Join a tightly knit team solving hard problems the right way
Understand the various sensors and environments critical to our customers’ success
Know the data flows and technology that are currently in use to transform raw data into analytic products
Build relationships with the awesome team members across other functional groups
Learn our code practice, work in our code base, write tests, and collaborate with us in our workflows
Contribute to onboarding processes and recommend improvements to them
Demonstrate your capabilities by defining, implementing, and delivering data products for your user stories and tasks
Contribute to systems and processes that implement and automate quality checks on data pipeline deliverables
Implement data-quality tests, improve inefficient tooling, and adopt transformative new technologies while maintaining operational continuity
Work closely with the product team and stakeholders to understand how our products are used, and apply that insight to improve data and pipeline quality
Identify opportunities to improve our infrastructure, operational performance, and data pipeline deliverables
Evaluate new technologies and build proof-of-concept systems to enhance Data Engineering capabilities and data products
Contribute to improving the efficiency of our pipeline scripts, automation, and general data operations
Demonstrate command and accountability for the design and implementation of new features
Requirements
Bachelor’s or master’s degree in a technical or quantitative field
5+ years of hands-on Data Engineering experience, delivering production-grade solutions at scale
Expert in Snowflake, with proven ability to design, optimize, and deploy high-quality solutions for large-scale environments
Advanced SQL and Python skills, including writing efficient, reusable, and well-documented code
Proven experience building and maintaining ETL/data pipelines, including orchestration, monitoring, and optimization for performance and cost
Strong knowledge of data warehousing, data lakes, and relational/non-relational databases
Experience with managed cloud services (AWS or GCP) and implementing secure, scalable data solutions
Experience delivering and articulating data models to support enterprise and data product needs
Proficiency in dbt, including authoring transformations and automated tests
Experience implementing automated testing frameworks (unit tests, integration tests, data-quality checks) for data pipelines
Strong Git and Agile/Scrum experience, including code reviews and collaborative workflows
Tech Stack
AWS
Cloud
ETL
Google Cloud Platform
Python
SQL
Benefits
Competitive salary
Paid parental leave
Open (uncapped) PTO
Hybrid work environment
Medical, health & wellbeing benefits
Senior Data Engineer at Digi International | JobVerse