GID is a privately-held, vertically-integrated real estate company that owns and manages a portfolio of multifamily and industrial assets. The Senior Data & Platform Engineer is responsible for the technical execution and evolution of GID’s enterprise data platform, ensuring it is scalable, secure, and ready to power analytics and internal applications.
Responsibilities:
- Platform Ownership & Architecture
  - Own the design, architecture, reliability, and performance of the enterprise data platform, including ingestion, storage, processing, orchestration, and consumption layers
  - Implement and maintain modern data stack components (e.g., Snowflake, dbt, orchestration frameworks, metadata tools, quality frameworks)
  - Ensure platform scalability, availability, security posture, and cost efficiency
- Pipeline Engineering & Data Products
  - Build and maintain analytics-ready and AI-ready data pipelines, transformation models, semantic layers, and shared data services
  - Develop and execute a unified, forward-looking vision for data products and engineering
  - Lay the foundation for AI use cases by implementing a semantic layer, context graphs, and vector databases, staying current with the latest best practices
  - Design and implement reusable data assets, domain models, and standardized transformation patterns
- Governance, Quality, and Controls
  - Establish and enforce standards for data quality, data contracts, observability, lineage, and metadata management
  - Implement access controls, RBAC, PII protection, and compliance with privacy regulations (GDPR, CCPA, internal retention policies)
  - Partner with data governance to establish stewardship practices, certified datasets, and SLA expectations
- Collaboration & Delivery
  - Partner with application engineering, data science, and analytics teams to develop data products that unlock enterprise value
  - Translate business requirements into scalable data architecture and reusable technical solutions
  - Drive technical prioritization, sprint planning, and execution of the platform roadmap
- Leadership & Team Development
  - Mentor data engineers and act as the principal technical lead for engineering best practices
  - Introduce modern engineering patterns
  - Create documentation standards, operational runbooks, and incident response processes
Requirements:
- 7–10+ years in data & analytics engineering, platform engineering, data architecture or related technical fields
- Proven experience designing and operating modern cloud data platforms, especially Snowflake and dbt
- Strong understanding of the Snowflake platform, including its advanced features, and of Microsoft Azure, including its data services, security, and cost governance
- Hands-on expertise building data pipelines with tools such as Azure Data Factory, Fivetran, Matillion, or similar ingestion and ETL frameworks
- Proficiency in SQL, Python, and data modeling
- Strong understanding of data lifecycle management, DevOps practices, data orchestration, and production data operations
- Strong problem‑solving and systems thinking abilities
- Ability to work in fast‑moving environments with ambiguous or evolving requirements
- Excellent communication skills with the ability to simplify complexity for non‑technical audiences
- A mindset of automation, reusability, and continuous improvement
- Experience operating in organizations that are modernizing legacy data landscapes into a cloud data stack