Cheat at Search Essentials: Evaluation, NDCG, and pals

Hosted by Doug Turnbull

Tue, Jan 27, 2026

5:30 PM UTC (1 hour)

Virtual (Zoom)

Free to join


Go deeper with a course

'Relevant Search' masterclass
Doug Turnbull and Nick Zadrozny
View syllabus

What you'll learn

Basics of search relevance evaluation

How the best search teams evaluate and measure search relevance

Where practice diverges from theory

When to ignore traditional search metrics and just trust your A/B tests. How to interpret offline metrics.

Pros / cons of different types of labeled relevance data

Do you use clicks? Human labels? LLM as a judge? What are the pros/cons of these approaches?

Why this topic matters

Search is central to RAG and modern AI systems. To get better at search, you need to know the core metrics teams have historically used to evaluate relevance.
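As a taste of what the session covers, here's a minimal sketch of NDCG, the metric named in the title. This uses the linear-gain formulation (relevance grade divided by a log rank discount); the exponential gain 2^rel − 1 is also common, and real evaluation harnesses average NDCG@k over a query set.

```python
import math

def dcg(relevances):
    """Discounted Cumulative Gain: graded relevance, discounted by log2 of rank."""
    return sum(rel / math.log2(rank + 2) for rank, rel in enumerate(relevances))

def ndcg(relevances, k=None):
    """NDCG: DCG of the ranking divided by DCG of the ideal (sorted) ranking."""
    ranked = relevances[:k] if k else relevances
    ideal = sorted(relevances, reverse=True)
    ideal = ideal[:k] if k else ideal
    ideal_dcg = dcg(ideal)
    return dcg(ranked) / ideal_dcg if ideal_dcg > 0 else 0.0

# Relevance grades for the top 4 results of one query, in ranked order.
print(ndcg([3, 2, 1, 0]))  # perfect ordering -> 1.0
print(ndcg([0, 1, 2, 3]))  # reversed ordering -> well below 1.0
```

A perfect ranking always scores 1.0 regardless of how many relevant documents exist, which is what makes NDCG comparable across queries.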

You'll learn from

Doug Turnbull

Ex-Reddit, Ex-Shopify. Author of AI Powered Search and Relevant Search

Doug leads search teams past the BS to find real opportunity in emerging search technologies. He’s enthusiastic about the evolving landscape, while staying mindful of the gap between marketing and reality. Good search strategy separates promising opportunities from dangerous sand traps. Doug helps teams find a clear, practical path forward.

He led machine-learning-driven search at Reddit and Shopify, served as CTO of OpenSource Connections, and co-authored Relevant Search and AI Powered Search.

Doug has trained and advised teams at the Wikimedia Foundation, Wayfair, and AWS, and created Quepid, SearchArray, and the Elasticsearch Learning to Rank plugin.


Sign up to join this lesson
