OneMagnify is a global performance marketing organization focused on brand marketing, technology, and analytics. The company is seeking a Senior Data Engineer to collaborate across teams and design large-scale data solutions on GCP, ensuring data quality and effective delivery.
Responsibilities:
- Work as part of a GCP implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment of Data Platform automation
- Provide technical guidance, mentorship, and code-level support to the development team
- Drive effective and efficient delivery with a focus on speed: identify risks, implement mitigation/contingency plans, and test and compare competing solutions
- Design and build production data engineering solutions that deliver our pipeline patterns using GCP services
- Ensure data quality, maintain profiles of data products, and monitor system performance
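
The data-quality and profiling responsibility above can be sketched in plain Python. This is an illustrative example only, not part of the posting: the column names, thresholds, and function names are hypothetical stand-ins for the kind of checks a pipeline step might run before loading a batch into a warehouse.

```python
# Hypothetical data-quality profile for a batch of records in a pipeline step.
# Thresholds and helper names are illustrative, not taken from the posting.

def profile(rows, columns):
    """Compute row count and per-column null rate for a batch of dict records."""
    total = len(rows)
    nulls = {c: sum(1 for r in rows if r.get(c) is None) for c in columns}
    return {
        "row_count": total,
        "null_rate": {c: (nulls[c] / total if total else 0.0) for c in columns},
    }

def passes_quality_gate(prof, max_null_rate=0.05, min_rows=1):
    """Fail the load if the batch is empty or any column is too sparse."""
    if prof["row_count"] < min_rows:
        return False
    return all(rate <= max_null_rate for rate in prof["null_rate"].values())
```

In practice a gate like this would run as a task in an orchestrated DAG, with the profile written out alongside the data product so drift can be monitored over time.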
Requirements:
- Experience with data engineering pipelines, data warehouse systems, ETL principles, and complex SQL queries
- GCP experience working in Big Data deployments leveraging BigQuery, Google Cloud Storage, Dataflow, Dataproc, Cloud Run
- Proficiency with Python, VS Code, GitHub, Tekton, and Terraform to deploy data solutions and products via DAGs with Astronomer and Apache Airflow
- Understanding of data architecture and design independent of the underlying technology
- Experience working with Agile and Lean methodologies
- Experience with Test-Driven Development
- Exposure to AI/LLM
- Problem-solving and communication skills, including management of multiple stakeholders
- Ability to provide analytic and creative solutions to business problems through deep dives into data
- GCP certification preferred: Professional Data Engineer or Associate Cloud Engineer
- Experience with Qlik Sense, Looker, and Power BI preferred but not required