YLD is a software engineering and design consultancy focused on empowering clients to outperform their competitors. The role of Data Engineer involves building core infrastructure software and mentoring other engineers while ensuring secure data access and performance enhancements.
Responsibilities:
- Building core infrastructure software (pipelines, APIs, data modelling) as part of the client's data platform team
- Instrumenting systems for performance monitoring and enhancement throughout the platform
- Ensuring data offerings are provided to various internal & external stakeholders using secure authentication patterns
- Choosing and implementing appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption
- Coaching & mentoring other engineers to support the growth of their technical expertise
Requirements:
- Proven experience writing highly maintainable, performant Python code
- Experience building modern data pipelines using dbt, Kafka, Spark, AWS Kinesis, AWS Lambda, and Apache Airflow (or similar)
- Understanding of data modelling patterns
- Deep knowledge of complex SQL, with emphasis on Common Table Expressions, window functions, and their performance
- Experience with end-to-end monitoring & alerting (CloudWatch, Datadog, etc.)
- Problem-solving skills that balance innovation with pragmatic technology choices to solve business needs
- Comfortable working in a dynamic production environment and managing client expectations effectively
- A strong customer focus and quality mindset
- Experience working closely with engineering leadership and architects to deliver high-quality solutions
- Experience maintaining a high degree of ownership and transparency in deliverables
- An exemplar of YLD's brand and a safeguard of our reputation
- Exceptional communication skills, able to communicate complex ideas simply