GFiber is an Alphabet company that brings Google Fiber and Google Fiber Webpass internet services to homes and businesses across the United States. As a Data Analytics Engineer, you will work with multiple stakeholders to provide quantitative support, create data pipelines, and solve business problems using large data sets and GCP technologies.
Responsibilities:
- Create foundational data pipelines and extractions over large data sets using SQL and other GCP technologies, applying software-engineering best practices to pipeline development and data management
- Work with large, complex data sets to solve difficult analysis problems, assess the impact of initiatives, and create ongoing tooling and resources to run the business
- Collaborate with cross-functional partners to understand the business context behind the solutions, analysis, and tooling they need, becoming the single point of contact for all financial data engineering questions
- Handle specific tasks, bug fixes, and maintenance work, and own complex projects with minimal guidance, demonstrating mastery through the accuracy, timeliness, and volume of your work
Requirements:
- Bachelor's degree or equivalent practical experience
- 5 years of experience in analytics, data science, and/or computer science or engineering
- 5 years of experience writing SQL queries from scratch on a daily basis, along with experience in data engineering, architecture, pipeline management, and Extract, Transform, Load (ETL)
- Expertise with GCP tools
- Experience with HTML, Python, or other programming languages
- Experience using data to identify opportunities for business improvement and defining/measuring the success of those initiatives
- Ability to use written and verbal communication to build relationships with cross-functional partners and conduct presentations for stakeholders, including executive leadership
- Ability to engage with PMO teams to prioritize work, provide level-of-effort estimates, and manage bug queues and bandwidth