Mitek Systems is a global leader in digital and biometric identity authentication, fraud prevention, and mobile deposit solutions. The company is seeking a talented Senior Data Engineer to join its Data Engineering team, designing, developing, and maintaining large-scale data systems that support advanced analytics and business intelligence initiatives.
Responsibilities:
- Design, develop, and maintain large-scale data systems that support advanced analytics and business intelligence initiatives across Mitek
- Build sustainable, scalable data pipelines, processes, and platforms that enable efficient collection, transformation, and analysis of data throughout the organization
- Shape how the business operates by building infrastructure that answers questions with data, applying software engineering best practices, data management fundamentals, and recent advances in distributed systems (e.g., MapReduce, NoSQL databases)
- Work closely with Product Management, Data Analytics, Software Development, and business stakeholders to deliver reliable, high-performing data feeds, enterprise-level reports, and data solutions that support both strategic business decisions and offline machine learning use cases
Requirements:
- Bachelor's degree in Mathematics, Statistics, Computer Science, or a related field
- 5+ years of experience as a data engineer, software engineer, or in a similar role
- Experience in designing and implementing scalable data architecture, including building data pipelines for stream and batch processing
- Experience with data modeling, data warehouses, and data lakehouses
- Experience with AWS including big data technologies (AWS Glue, Kinesis, Lambda)
- Experience building ELT pipelines with dbt and Snowflake
- Hands-on experience and advanced knowledge of SQL (e.g., Postgres, Snowflake)
- Intermediate to advanced Python software development skills
- Knowledge of the Software Development Lifecycle
- Experience working with data orchestration tools (Airflow)
- Experience with infrastructure-as-code (Terraform)
- Experience with open table formats and data catalogs
- Experience with analytics tools (Tableau, Power BI)
- Knowledge of encryption, anonymization, and tokenization
- Experience with CI/CD pipelines
- Experience with containerization
- Previous experience working in a SaaS or technology company