10a Labs is a safety and threat-intelligence company trusted by leading AI labs and technology platforms. The company is seeking a Data Engineer to design and optimize data pipelines, automate workflows, and contribute to research initiatives involving AI and data. The role involves collaborating with teams across the company to deliver insights and tools for safety and security in AI systems.
Responsibilities:
- Design, implement, and optimize end-to-end data pipelines for scraping and processing structured and unstructured data using Google Cloud Platform (or similar) and best practices
- Automate red teaming, including building workflows for prompt generation, model evaluation, and execution of AI experiments
- Conduct ad hoc web scraping and data collection to support research and intelligence initiatives
- Design and automate workflows and research experiments, including workflows for data curation, storage, and organization
- Brainstorm novel research approaches to both known and emerging problems involving AI, data, and the internet
- Implement robust error handling, logging, and monitoring
- Design and maintain database schemas and pipeline infrastructure
- Prepare data for further analysis, including data cleaning, transformation, anonymization, and masking
- Contribute to the development of internal and external APIs, following best practices
- Collaborate with ML engineers, data engineers, and software developers to deliver actionable insights and functional tools, including internal and external dashboards, APIs, and data dumps
Requirements:
- Degree (or equivalent work experience) in Computer Science, Engineering, Information Science, Data Science or a related field (graduate degree preferred)
- 2+ years of professional experience in data engineering or a closely related field
- Ability to communicate complex technical ideas clearly to non-technical audiences
- Proficiency in Python and SQL
- Experience with web scraping/crawling (e.g., Beautiful Soup, Selenium, Scrapy)
- Familiarity with Google Cloud Platform (or similar), including storage and database services (e.g., Cloud Storage, Cloud SQL, Cloud Spanner) and workflow orchestration (e.g., Cloud Composer/Airflow, Cloud Run, Pub/Sub)
- Experience building and managing data pipelines, especially for text data
- Comfort working in fast-moving, high-impact environments, such as startups, AI research labs, or security-focused teams
- Experience deploying APIs on cloud platforms (GCP, AWS, Azure) with robust testing, CI/CD, and performance monitoring practices