How to read a UK liveability score (and when not to trust one)
Area scores are everywhere on UK property sites, and most come with no published formula. Here's what ours actually means, where it's strong, and where it isn't.
Every UK property site offers some flavour of area score now. Move iQ has a "living rating". Plumplot has a crime score. Crystal Roof has a "quality" index. Streetcheck cobbles together a council-tax-and-Ofsted-and-IMD blend. The numbers don't agree. Worse, almost none of them tell you the formula.
Our liveability score is published in full at /methodology/scoring/liveability. The formula is 0.30·safety + 0.25·transport + 0.20·schools + 0.15·rent_affordability + 0.10·EPC. Each input is itself a 0–100 score derived from named public datasets. You can disagree with the weights — they're editorial. This article explains why we picked them and where the score has known limits.
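To make the arithmetic unambiguous, here is a minimal sketch of the weighted sum in Python. The weights are the published ones; the function name and input structure are illustrative, not our production code.

```python
# A minimal sketch of the published liveability formula. Weights match
# /methodology/scoring/liveability; the rest is illustrative only.
WEIGHTS = {
    "safety": 0.30,
    "transport": 0.25,
    "schools": 0.20,
    "rent_affordability": 0.15,
    "epc": 0.10,
}

def liveability(components: dict[str, float]) -> float:
    """Combine five 0-100 component scores into one 0-100 weighted score."""
    for name, value in components.items():
        if not 0 <= value <= 100:
            raise ValueError(f"{name} must be 0-100, got {value}")
    return sum(WEIGHTS[name] * components[name] for name in WEIGHTS)

# Example: strong on safety and transport, middling on everything else.
print(round(liveability({
    "safety": 82, "transport": 74, "schools": 55,
    "rent_affordability": 48, "epc": 60,
}), 1))
# 0.30*82 + 0.25*74 + 0.20*55 + 0.15*48 + 0.10*60 = 67.3
```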
Why these five inputs
Every two years YouGov, Knight Frank and Savills run UK area-choice surveys asking renters and recent buyers what mattered most when they picked their current home. The same five categories surface every time, in roughly this order:
- Crime / safety (mentioned by 70%+ of respondents in every survey)
- Transport links (60–65%)
- Cost / affordability (50–55%)
- Schools — but ~70% of respondents have no school-age children
- Energy / running costs (recently overtook schools for under-35s)
We weighted safety highest because it's consistently the top-cited factor across every demographic. Transport beats cost because cost is already partially baked into transport (a one-hour commute is itself a cost). Schools sit below transport because, again, most households don't have school-age children. EPC gets a 10% slot because energy bills meaningfully change the cost picture and, until recently, were invisible to renters.
Where the score is strong
For comparing two roughly similar UK neighbourhoods — a Brockley vs. Forest Hill, or a Didsbury vs. Chorlton — the score is genuinely useful. All five inputs are local-area truth or close to it, the weights aren't ridiculous, and the score is rebased to a national percentile after every refresh, so a 70/100 always means "safer/better-connected/better-schooled than 70% of England and Wales".
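What "rebased to a national percentile" means in practice is easiest to show with a sketch. This assumes a simple rank over all scored areas; tie handling and the real area list differ in the actual pipeline.

```python
# Sketch of percentile rebasing, assuming a plain rank-based approach.
# Area names and raw scores below are invented for illustration.
def rebase_to_percentiles(raw: dict[str, float]) -> dict[str, float]:
    """Map each area's raw weighted score to the share of areas it outscores."""
    all_scores = sorted(raw.values())
    n = len(all_scores)
    return {
        area: round(100 * sum(1 for s in all_scores if s < score) / n, 1)
        for area, score in raw.items()
    }

print(rebase_to_percentiles({
    "Brockley": 67.3, "Forest Hill": 64.8, "Didsbury": 71.0, "Chorlton": 69.5,
}))
# {'Brockley': 25.0, 'Forest Hill': 0.0, 'Didsbury': 75.0, 'Chorlton': 50.0}
```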
Where the score is weak
- Crime is per-resident, not per-visitor. A central nightlife area with 200 residents and a busy weekend strip looks unsafe because the denominator is wrong. We don't adjust for daytime/nightlife population. A Soho vs. Hampstead comparison is mis-weighted in Hampstead's favour for this reason; see the worked numbers after this list.
- The Greater Manchester crime gap is filled with a proxy. data.police.uk doesn't carry GMP data after mid-2020. The 1,294 affected areas are filled using the IMD crime sub-score from 2019. See the next article in this series for the full detail.
- Schools score reflects access, not admissions reality. We count Good/Outstanding schools within 2 km. We don't check catchment overlap, faith-school selectivity, or where last year's admissions cliff actually fell. For a household that needs a specific school place, the score is a starting point, not an answer.
- Rent affordability is council-area-level. Rent and salary both come from sources (ONS PIPR / ASHE) that publish at council level, not at the local-area level. So this 15% input is identical for every local area in the same council — it differentiates councils, not neighbourhoods.
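The denominator problem in the first bullet is easiest to see with numbers. These figures are invented purely for illustration, not real crime counts:

```python
# Invented figures showing the per-resident denominator problem.
def rate_per_1000(offences: int, population: int) -> float:
    return 1000 * offences / population

# A nightlife strip: 200 residents, but thousands of weekend visitors.
per_resident = rate_per_1000(offences=120, population=200)      # 600.0
per_footfall = rate_per_1000(offences=120, population=15_000)   # 8.0

# A quiet suburb with the same offence count looks ~125x "safer" per
# resident, even though most nightlife offences involve visitors.
suburb = rate_per_1000(offences=120, population=25_000)         # 4.8
print(per_resident, per_footfall, suburb)
```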
Don't single-rank a place
Liveability is a starting point, not a verdict. The real value is in decomposing it: the area page shows the component scores side by side so you can see whether a 65/100 came from high safety + low transport or from balanced everything. The two feel very different to live in.
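To make that concrete, here are two invented component profiles that land on exactly the same headline number under the published weights. Neither corresponds to a real area.

```python
# Two invented profiles with identical headline scores but very
# different shapes; numbers are illustrative only.
WEIGHTS = {"safety": 0.30, "transport": 0.25, "schools": 0.20,
           "rent_affordability": 0.15, "epc": 0.10}

def headline(components: dict[str, float]) -> float:
    return sum(WEIGHTS[k] * components[k] for k in WEIGHTS)

lopsided = {"safety": 90, "transport": 30, "schools": 75,
            "rent_affordability": 60, "epc": 65}
balanced = {k: 65 for k in WEIGHTS}

print(round(headline(lopsided), 1), round(headline(balanced), 1))  # 65.0 65.0
```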
If you're comparing two areas, use /compare rather than eyeballing two single numbers. It surfaces the full set side-by-side with the winner subtly bolded — which is what the score is good for.
Got questions about a specific neighbourhood's score? Email hello@placetrics.co.uk and we'll explain how it broke down.