Thursday September 4, 2025 3:30pm - 3:55pm PDT
Lauren Peate, Multitudes, CEO & founder

We’re all building AI features now. But building with LLMs brings its own challenges, namely: How can we use cutting-edge practices, weave in AI ethics, and consider the cost of different models without blowing past delivery dates? Not to mention making sure that the features we build will be stable, reliable, and maintainable in the future.

We recently built our first LLM feature, which surfaces the quality of feedback given in code reviews. In one month, we did a literature review, consultation with academic experts, data labelling, model experimentation, a cost assessment, and finally, all the ML engineering to launch it into production. The outcome: <1% extreme misclassification and zero hallucinations. In this talk, we’ll share our approach to building LLM features: how we partnered with academia (without being delayed by their timelines), what tooling we used, and how we made the cost tradeoffs to keep business stakeholders happy. I’ll also speak to how we built this into our microservices architecture, including how we used tools to generate structured outputs from LLMs on top of AWS’s Bedrock API, so we’d have parseable responses from a range of models.
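The abstract doesn’t include the speaker’s actual code, but the structured-output pattern it describes can be sketched minimally: ask the model for JSON matching a fixed schema, then validate that JSON before anything downstream trusts it. The schema below (`label`, `confidence`, the allowed label set) is an illustrative assumption, not Multitudes’ real classification scheme, and the Bedrock call itself is elided.

```python
import json
from dataclasses import dataclass

# Illustrative label set for classifying code-review feedback quality;
# these names are assumptions for the sketch, not the real schema.
ALLOWED_LABELS = {"praise", "neutral", "constructive", "unclear"}

@dataclass
class FeedbackClassification:
    label: str
    confidence: float

def parse_llm_response(raw: str) -> FeedbackClassification:
    """Validate a model's JSON reply before using it downstream.

    Raises ValueError on any deviation from the schema, so malformed
    or hallucinated output never reaches the rest of the pipeline.
    """
    data = json.loads(raw)
    label = data["label"]
    confidence = float(data["confidence"])
    if label not in ALLOWED_LABELS:
        raise ValueError(f"unexpected label: {label!r}")
    if not 0.0 <= confidence <= 1.0:
        raise ValueError(f"confidence out of range: {confidence}")
    return FeedbackClassification(label=label, confidence=confidence)

# A well-formed reply parses cleanly; anything off-schema raises.
ok = parse_llm_response('{"label": "constructive", "confidence": 0.92}')
```

In a Bedrock setup, the raw string would come from the model-invocation response body; validating at this boundary is what makes responses "parseable from a range of models," since every model's output must pass the same gate.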

You'll walk away with practical strategies for leading your own teams through AI implementations, identifying ethical issues early, addressing them efficiently, and still delivering on time and on budget.
Speakers
Lauren Peate

CEO & founder, Multitudes
Lauren Peate is the CEO and founder of Multitudes, which helps engineering teams improve delivery sustainably. She’s focused her career on using data to support people, including as the founder of Ally Skills NZ, a consultancy helping global tech companies improve team performance...
DataWeek -- Main Stage

