Design and implement analytical and technical solutions for complex data problems with a high degree of autonomy;
Define technical strategy: solution design, effort estimates, phases, and timelines;
Lead exploratory data analyses (EDA) to understand business scenarios, identify inconsistencies, and propose high-impact solutions;
Develop and deploy Machine Learning models (predictive, NLP, computer vision, LLMs, etc.), ensuring performance and scalability;
Define and monitor quality and performance metrics for deployed models;
Prepare and present technical results to stakeholders and product teams;
Serve as a technical guide and mentor junior data scientists.
Requirements
Strong experience developing solutions in cloud environments, preferably GCP, using services such as Vertex AI, BigQuery, Airflow (Cloud Composer), Dataproc, Cloud Run, and GKE (Kubernetes);
Advanced knowledge of data modeling, statistics, machine learning, and the mathematical fundamentals behind AI models;
Hands-on experience with a mature MLOps lifecycle: training and deployment pipelines, job orchestration, model and dataset versioning, and CI/CD;
Experience with various ML modeling types: predictive models, NLP, computer vision, embeddings, LLMs;
Previous experience deploying models to production and monitoring their performance;
Software engineering best practices: clean, modular, and reusable code design.
Tech Stack
Airflow
BigQuery
Google Cloud Platform
Kubernetes
Benefits
Meal support: Meal Allowance/Food Voucher or on-site cafeteria (depending on location);
Health support: Health Plan and Life Insurance;
Development support: Dasa University, Development and Career Cycle, Technology Academies/PMAX and Dasa's Crescer Program;
Other: Transportation Allowance and Performance Bonus (PPR).