Take ownership of existing data processing pipelines; improve, scale, and iterate on them.
Design and implement new data processing pipelines.
Collect and monitor performance metrics.
Play a key role in discussing and implementing security best practices.
Requirements
Good knowledge of at least one programming language (C++, Python, Scala, or Java).
Experience with AWS services (S3, EC2, IAM, EMR, Glue, Athena, Kinesis) or a comparable cloud platform.
Comfortable with SQL, with a good understanding of SQL engine basics.
Solid understanding of distributed computing principles.
Experience with at least one workflow management tool, such as Airflow, Oozie, or Luigi.
Strong analytical thinking and the ability to justify technical decisions.
Creative, resourceful, and innovative problem solver.
Excellent communication skills in English, both written and spoken.
Prior experience in the mapping, navigation, or automotive industry (Nice to Have).
Hands-on experience with data processing platforms/frameworks (Spark or other) (Nice to Have).
Tech Stack
Airflow
AWS
Cloud
EC2
Java
Python
Scala
Spark
SQL
Benefits
We value high-performing creative individuals who dig into problems and opportunities.
We believe in individuals being their whole selves at work. We commit to this through supportive health care, parental leave, flexibility for the things that come up in life, and innovating on how we think about supporting our people.
We emphasize an environment of teaching and learning to equip employees with the tools needed to be successful in their function and the company.
We strongly believe in the value of growing a diverse team and encourage people of all backgrounds, genders, ethnicities, abilities, and sexual orientations to apply.