GrowthLoop is a pioneer in AI-powered marketing on the data cloud, helping innovative companies transform how they market and drive business impact. In this role, you will serve as the primary technical lead for a single enterprise client, owning their data transformation journey, driving project execution, and keeping stakeholders aligned on strategic direction.
Responsibilities:
- Lead the migration of large volumes of data into cloud data warehouses, with a focus on performance, reliability, and scale
- Serve as the primary technical point of contact for a single enterprise client, building a deep understanding of their business needs, data landscape, and priorities
- Design and build scalable data models that support the client's analytical and operational goals, from initial design through delivery
- Drive end-to-end project execution, coordinating across workstreams and keeping stakeholders aligned on progress and decisions
- Identify and implement improvements to internal workflows and tooling that increase efficiency and repeatability across the engagement
Requirements:
- 4–7 years of experience as a Data Engineer or Analytics Engineer, with strong proficiency in SQL and Python
- Proven experience designing, building, and maintaining cloud data warehouses (e.g., BigQuery, Snowflake, Redshift)
- Hands-on experience with Google Cloud Platform (GCP), including core services such as BigQuery and Dataform; GCP Professional Data Engineer certification strongly preferred
- Track record of owning data projects end-to-end — from scoping and design through delivery — while coordinating the work of other team members
- Strong data modeling skills, with the ability to design scalable, well-structured models for both analytical and operational use cases
- Experience working with BI and visualization tools (e.g., Looker, Tableau, Power BI, or similar)
- Solid understanding of relational databases and core data warehousing principles
- Strong analytical thinking and problem-solving abilities
- Self-starter who operates with autonomy and takes ownership of outcomes
- Experience with transformation and modeling tools such as dbt
- Experience with workflow orchestration platforms like Airflow
- Exposure to machine learning workflows, including model development, training, deployment, or productionization
- Experience working with or within large enterprise organizations