
LiveBench licence + access — outreach draft

To: [email protected] From: Tom Watson, The Good Ship (good-ship.co.uk) Subject: Bearing — using LiveBench results in an open AI model recommendation tool


Hi LiveBench team,

I run Bearing (https://github.com/dataforaction-tom/bearing), an open-source tool that helps non-technical users pick AI models based on their task. It scores models across seven factors — quality, capability, cost, speed, privacy, transparency, sustainability — and produces ranked recommendations with transparent reasoning. The dataset behind it is published under CC BY-NC 4.0.

I'd like to use LiveBench scores as a quality signal, blended with our hand-curated per-task fitness numbers. Three quick questions:

  1. Licensing. The LiveBench GitHub repo carries Apache-2.0 / MIT, but I couldn't find an explicit licence on the livebench/model_judgment Hugging Face dataset card. Are the leaderboard scores themselves redistributable with attribution? If so, what attribution form do you prefer?

  2. Latest release. The most recent public release I can see on Hugging Face is 2025-03-31. The 2025-04-25 release appears to be gated. Is there a path to access more recent results, or a planned public refresh cadence?

  3. Citation. If we surface a LiveBench-derived sub-score in our public model registry, what citation should we use — the LiveBench paper, the GitHub repo, or both?

Happy to share more about how we'd present and credit the data. We'd also be glad to feed back any signal from the real-world outcome reports our users submit, if that's of interest.

Thanks,
Tom Watson
The Good Ship — good-ship.co.uk