Zillow is a leading real estate platform in the U.S. seeking a Senior Software Development Engineer, Big Data to enhance its data engineering solutions. The role involves designing scalable data pipelines, ensuring data reliability, and collaborating with various teams to translate business needs into technical requirements.
Responsibilities:
- Design and implement scalable data pipelines to collect, process, and store large volumes of critical data from various sources
- Ensure data reliability and uptime by monitoring and troubleshooting data pipeline performance and scalability
- Be a 'lazy' engineer - continuously seek to improve the team’s efficiency by automating repeatable processes
- Facilitate engineering discussions with collaborators, customers, partners, and team members from various departments to understand business needs and convert them into technical requirements
- Learn quickly, apply strong business insight, and collaborate with various product partners to rapidly build a deep understanding of the business processes that feed into our team’s data pipelines
- Confidently translate business use cases into well-thought-out data models that evolve easily with the business
- Communicate technical concepts effectively to non-technical audiences
- Be a bar-raiser for engineering best practices - carefully reviewing specifications, designs, pull requests and providing constructive/helpful feedback to raise the quality of our team’s output
- Consistently write high-quality code, refactor, and optimize for better scalability, performance, and readability
- Provide leadership within the team and mentor junior engineers
Requirements:
- A degree (BS+) in Computer Science or a related field
- 5+ years of experience building and maintaining data-intensive applications
- Experience developing sophisticated data pipelines that scale to billions of rows and petabytes of data, with production-quality deployment, monitoring, and reliability
- Extensive experience with modern data technologies such as Spark and Airflow
- Strong proficiency in programming languages such as Python, Java, or Scala
- Extensive experience with SQL
- Proven data modeling experience, translating business requirements into clean and easily evolvable data models
- Excellent interpersonal skills and a passion for collaborating across organizational boundaries
- Experience working with cloud services (AWS/Azure/GCP)
- Experience with Databricks