KERV.ai is an award-winning company transforming how content and commerce converge in media. The Senior Data Engineer will lead innovation across our data infrastructure and analytics capabilities, focusing on building robust ETL pipelines, data warehousing solutions, and BI tools.
Responsibilities:
- Design and build scalable ETL pipelines to process high volumes of advertising data
- Develop and maintain serverless applications using AWS Lambda for data processing workflows
- Architect and optimize data warehousing solutions using Redshift and S3
- Troubleshoot complex data quality issues and implement robust data validation systems
- Build BI tools and data insight solutions to support business decision-making
- Integrate with third-party data APIs and manage data collection systems
- Write excellent code that is simple to test, understand and maintain
- Solve real business problems through software and consistently deliver stable solutions
- Recommend architectural decisions based on business requirements, considering long-term and short-term needs
- Drive participation in team code reviews, teaching less experienced developers how to improve the quality of their work
- Develop a culture of quality within the engineering team
- Participate in sprint planning, ensuring that realistic plans are set forth and delivered
- Communicate challenges and strategies effectively to technical and non-technical team members, including executives, operations, and fellow developers
- Have the discipline to shift between building new features and maintaining legacy ones based on the needs of the business
- Recommend adoption of tools and processes to improve our culture and results
Requirements:
- 7+ years of professional experience with a variety of software applications
- Strong proficiency in Python
- Extensive experience with AWS services, particularly Lambda, S3, and serverless architectures
- Hands-on experience with database design/optimization
- Proven experience building and maintaining ETL pipelines
- Experience troubleshooting complex data issues and ensuring data quality
- Experience building BI/data insight tools and dashboards
- Experience integrating with third-party data APIs
- Experience with data collection systems and data warehousing
- Proficiency with Git and version control workflows
- Professional experience delivering customer-facing production systems
- Strong academic background in software engineering, or equivalent experience
- Proven leadership ability - experience mentoring and coaching less experienced developers
- Excellent communication skills with ability to explain technical concepts to non-technical stakeholders
- Self-starter mentality with ability to drive projects independently
- Experience with Scala and/or Java
- Experience with advertising technology industry data (DCM, DV360, programmatic platforms)
- Experience with GCP (Google Cloud Platform)
- Experience with Apache Spark for large-scale data processing
- JavaScript and/or Node.js experience
- Experience with Redshift
- Experience developing an enterprise or SaaS product
- Previous experience in a startup environment
- Familiarity with using or developing APIs (HTTP, REST, etc.)
- Experience utilizing AI coding assistants (e.g., Cursor, Copilot, ChatGPT)