GLOBO is a rapidly growing technology company specializing in translation and interpretation services. The Director of Data & AI Engineering is responsible for building and leading the data infrastructure and AI/ML capabilities, managing a team of engineers, and driving the development of AI-powered features.
Responsibilities:
- Own the design and governance of GLOBO’s modern data stack. Architect and maintain data pipelines using Fivetran for ingestion, dbt for transformation, and Snowflake for warehousing and analytics. Define data modeling standards, ensure data quality and integrity, and establish governance practices across the organization
- Manage, mentor, and grow a team of 3–5 AI and data engineers. Set team priorities aligned with company objectives, conduct regular 1:1s and performance reviews, and foster a culture of engineering excellence. Hire and onboard new team members as the function scales
- Lead the design and development of AI-powered features within the GLOBO platform. Build and deploy LLM integrations (AWS Bedrock, Anthropic Claude) and agentic workflows (CrewAI, LangChain) that solve specific business problems such as automated QA, intelligent routing, and context-aware translation aids. Implement guardrails and evaluation frameworks to detect and mitigate hallucinations, bias, and errors
- Partner with business stakeholders to build and maintain analytics infrastructure that powers reporting, dashboards, and data-driven decision making. Ensure clean, well-modeled data is accessible to analysts and business users through Snowflake and connected BI tools
- Monitor and optimize performance and cost across data and AI services. Manage Snowflake compute costs, Fivetran sync volumes, and AI inference spend (AWS Bedrock). Promote best practices in version control, CI/CD for data and ML pipelines, testing, and documentation
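The guardrail and evaluation work described above can be sketched in miniature. The example below is a hypothetical, minimal illustration (the function names and the heuristic are illustrative, not GLOBO's actual framework): one cheap check a translation QA pipeline might run on LLM output, verifying that numbers and {placeholder} tokens from the source text survive translation before flagging the result for review.

```python
import re

# Invariant tokens: {placeholders} and numbers (with optional separators)
# that should appear unchanged in any faithful translation.
TOKEN_PATTERN = re.compile(r"\{[^{}]+\}|\d+(?:[.,]\d+)*")

def find_invariant_tokens(text: str) -> list[str]:
    """Return the numbers and {placeholder} tokens found in `text`."""
    return TOKEN_PATTERN.findall(text)

def check_translation(source: str, translation: str) -> list[str]:
    """Return invariant tokens from `source` that are missing from `translation`.

    An empty list means this check passed; a non-empty list flags possible
    dropped or hallucinated content for human review.
    """
    return [t for t in find_invariant_tokens(source) if t not in translation]

# Example: a dropped duration should be flagged.
issues = check_translation(
    "Take {medication} 2 times per day for 10 days.",
    "Tome {medication} 2 veces al día.",  # "10 days" was dropped
)
print(issues)  # → ['10']
```

Real evaluation frameworks layer many such checks (semantic similarity scoring, toxicity/bias classifiers, LLM-as-judge), but simple deterministic invariants like this are a common first guardrail because they are fast, auditable, and free of model error.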
Requirements:
- Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field
- 5+ years of experience in data engineering, software development, or ML engineering, with at least 2 years in a technical leadership or management role
- Experience with Snowflake (data warehousing, query optimization, cost management)
- Experience with dbt (data transformation, modeling, testing)
- Experience with Fivetran (or similar ELT/ingestion tooling)
- Advanced proficiency in Python
- Experience with LLM integration (AWS Bedrock, Anthropic Claude, or the OpenAI API)
- Experience with AWS Lambda and serverless architecture
- Advanced proficiency in SQL
- Demonstrated experience managing and mentoring engineers, including hiring, performance management, and career development
- Strong ability to communicate technical concepts to non-technical stakeholders and to translate business requirements into data and AI solutions
- Ability to work independently and lead a team in a decentralized, hybrid environment
Preferred:
- Master's degree
- Experience with agentic frameworks (CrewAI, LangChain, or similar)
- Experience with Airflow (or similar workflow orchestration)
- Experience with AWS ECS/EKS
- Experience with AWS CDK or CloudFormation for automated deployments
- Experience with vector databases (Pinecone, pgvector, or OpenSearch)
- Ability to read and debug Ruby on Rails core platform code
- Experience with Redis
- Experience with PostgreSQL
- Experience with React
- Experience with BI/visualization tools (Looker, Metabase, or similar)