Kin Insurance is redesigning insurance to be smarter and more customer-centric. As a Senior Data Engineer, you will ensure the quality and security of our data assets by building data pipelines and collaborating with cross-functional teams to deliver effective reporting.
Responsibilities:
- Designing and developing scalable data pipelines and models for downstream analytics and reporting
- Leading and collaborating with a cross-functional project team to implement data validation, QA standards, and effective data lifecycle management
- Optimizing pipeline performance, cost, and data quality in a large-scale data environment
- Migrating the data warehouse architecture (dbt, Redshift) to a lakehouse architecture (e.g., S3, Glue, Databricks, Unity Catalog)
- Mentoring data engineers and promoting best practices in software engineering, documentation, and metadata management
- Ensuring data security and compliance with regulations (e.g., GDPR, CCPA, GLBA) through robust pipeline design and access monitoring
- Translating ambiguous business requirements into technical solutions using marketing domain knowledge
Requirements:
- 4+ years of hands-on data engineering experience, including:
  - Data structures, cloud platform environments, and best practices (AWS strongly preferred; Azure or GCP also considered)
  - ETL performance tuning and cost optimization
  - Data lake and lakehouse patterns, including open table formats (e.g., Iceberg, Hudi, Delta)
- Proficiency in Python (Pandas, NumPy, etc.) and SQL for advanced data processing and querying
- Expertise in distributed data processing and storage (e.g., Apache Spark, Kafka, Hadoop, or similar)
- Excellent communication skills and ability to explain complex concepts clearly and concisely
- Detail-oriented with strong data intuition and a passion for data quality
- Proven ability to model data and build production-ready ETL pipelines handling TBs of data
- Great time management and prioritization skills