In October, the Care Quality Commission (CQC) will have completed five years of inspections under the ‘Mum Test’ process, assessing services against its five Key Lines of Enquiry (KLOEs) to assign one of four ratings to every location under its supervision.
Over the last four and a half years, the ratings, especially the non-compliant ones, have caused controversy, sparking a number of national media stories slating the state of care quality.
Not being a care provider, I have never had a professional axe to grind with CQC. However, to build up market intelligence on the care sector, I keep a constant eye out for trends.
My first editorial on the subject was in May 2015, just a few months after the introduction of the Mum Test, when I analysed a snapshot of early ratings published on the CQC website. My headline was ‘Why is a Care Home in West Sussex Ten Times More Likely to be Classified Inadequate than a Care Home in the West Midlands?’
Okay, maybe I did have an axe to grind, because the ‘victim’ was West Sussex, my home county. The analysis established that only 2.5% of all care homes in the West Midlands were rated Inadequate, compared to 17.5% in the South East. West Sussex had an even greater proportion than its mother region, with 25% given the Inadequate rating, ten times that of the West Midlands.
Over time, regional variances have reduced, probably due to the increased volume of ratings. Care standards will vary between providers for numerous reasons, but my concern is that the differing subjectivities of local inspection teams, along with other issues, affect whether care quality is compared on a level playing field.
Variations between sectors
Care providers are used to the CQC knocking on the door on a regular basis, turning up unannounced to undertake an inspection.
Around 95% of care homes in the UK have an inspection rating on their public records, and many will have been inspected a number of times over the last four and a half years. 94% of GPs and 75% of homecare businesses have also been through the process, but only 37% of NHS hospitals currently have a published quality rating. And even this is greater than the 25% of their independent ‘competitors’ that have been rated.
Why should certain sectors be under constant scrutiny, while the mighty NHS is left relatively untouched by CQC? And if you look at the results of the inspections that have been undertaken at NHS hospitals, it is worrying that the proportion achieving compliant ratings is not higher.
The percentages of services rated Good or Outstanding are as follows:
- Care homes: 78.4%.
- GPs: 95%.
- Homecare: 87%.
- NHS acute hospitals: 44.6%.
Given that so few NHS acute hospitals have had an inspection report published, only around one in six of all NHS acute hospitals (37% inspected, of which 44.6% were rated Good or Outstanding) have proved they are compliant. So why have the rest not been inspected?
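The ‘one in six’ figure is a simple back-of-envelope calculation from the two percentages above, which can be checked as follows:

```python
# Back-of-envelope check: share of ALL NHS acute hospitals that have
# a published compliant (Good or Outstanding) rating.
inspected = 0.37            # 37% have a published quality rating
compliant_of_rated = 0.446  # 44.6% of those rated are Good or Outstanding

share_compliant = inspected * compliant_of_rated
print(f"{share_compliant:.1%}")  # roughly 16.5%, i.e. about one in six
```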
The Mum Test ratings mix
As all providers know, the five KLOEs are Safe, Effective, Caring, Responsive and Well-Led, and each is given an Outstanding, Good, Requires Improvement or Inadequate rating. The KLOE ratings are then aggregated to produce the Overall rating.
If the same rating is given for each KLOE, then the Overall rating will match these. But what if there is a mix across the ratings given for individual KLOEs?
Amongst many other permutations, these are a few of the more pertinent examples I came across when analysing the latest list of CQC care home ratings:
Homes rated Requires Improvement:
Home A: Four Requires Improvement and one Inadequate.
Home B: Three Good and two Requires Improvement.
Homes rated Inadequate:
Home C: Three Good and two Inadequate.
Homes rated Outstanding:
Home D: Three Good and two Outstanding.
Home A, which did not achieve a single compliant rating across any of the KLOEs, is given a Requires Improvement rating, while Home C, which is compliant with three Goods but has two Inadequate ratings, is relegated to Inadequate. Is that fair? One predominantly compliant home rated worse than one without any compliance at all?
And what does Home B think? It achieved three Goods against Home A’s zero, and no Inadequate ratings, and yet the two share the same Requires Improvement rating.
Out of all the five KLOEs, safety will be top of the CQC Inspector’s list. If a home is not Safe, this in itself is an inadequacy. However, looking at a group of Home A examples, in 50% of cases Safe was the one KLOE that was judged to be Inadequate.
There are some peculiarities at the top end of the ratings as well. As we can see above, Home B is rated Requires Improvement. Now consider Home D which, like Home B, achieved three Good ratings, but instead of two Requires Improvement ratings gained two Outstandings, and was awarded an Outstanding overall rating. It’s almost as if all the hard work Home B has put into achieving three Goods is totally ignored.
And of all the care homes currently rated Outstanding, only 4% gained the top rating across all five KLOEs, whilst 58% achieved the top rating in only two.
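The anomaly can be made concrete with a small sketch. The KLOE mixes and overall ratings below are the four example homes described above; counting compliant KLOEs is my own illustrative yardstick, not CQC’s published aggregation rules:

```python
# Illustrative sketch only: compare the number of compliant KLOEs
# (Good or Outstanding) with the overall rating each example home
# actually received. Home names and mixes are the article's examples.
homes = {
    "Home A": (["Requires Improvement"] * 4 + ["Inadequate"], "Requires Improvement"),
    "Home B": (["Good"] * 3 + ["Requires Improvement"] * 2, "Requires Improvement"),
    "Home C": (["Good"] * 3 + ["Inadequate"] * 2, "Inadequate"),
    "Home D": (["Good"] * 3 + ["Outstanding"] * 2, "Outstanding"),
}

for name, (kloes, overall) in homes.items():
    compliant = sum(k in ("Good", "Outstanding") for k in kloes)
    print(f"{name}: {compliant}/5 compliant KLOEs -> overall {overall}")
```

The output shows the oddity in one view: Homes B, C and D all have three compliant KLOEs yet span three different overall ratings, while Home A, with none, outranks Home C.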
The rise of Inadequate
Having noticed the start of a sharp increase in Inadequate ratings, we have been monitoring care ratings on a quarterly basis since the beginning of 2017. The analysis measures all care home ratings published during the relevant quarter.
Only 0.1% of homes rated in the first quarter (Q1) of 2017 were rated Inadequate, but this rose to 1.7% of those rated in Q2, 5% in Q3 and 11% in Q4 – a proportion more than 100 times higher than in Q1.
In 2018, the proportion of Inadequate ratings each quarter dropped slightly to 9% of homes rated in Q1, 8.4% for those rated in Q2, 7% in Q3 and 6.8% in Q4, but this is still much higher than early 2017.
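Lining the quarterly figures up makes the scale of the 2017 jump, and the partial easing through 2018, easier to see:

```python
# Quarterly share (%) of care home ratings that were Inadequate,
# using the figures quoted above.
inadequate_share = {
    "2017 Q1": 0.1, "2017 Q2": 1.7, "2017 Q3": 5.0, "2017 Q4": 11.0,
    "2018 Q1": 9.0, "2018 Q2": 8.4, "2018 Q3": 7.0, "2018 Q4": 6.8,
}

ratio = inadequate_share["2017 Q4"] / inadequate_share["2017 Q1"]
print(f"Q4 2017 vs Q1 2017: ~{ratio:.0f}x")  # approximately 110x
```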
We have to ask: has care quality declined at such a rate, or has CQC moved the goalposts?
Do ratings impact closures?
In 2018, we lost 300 care homes through closure, and with them nearly 10,000 beds. This led to thousands of vulnerable older people being displaced and losing the familiarity and continuity of their home and the people who cared for them.
150 care homes were opened during the same period, continuing the pattern of one opening for every two closures that we’ve seen over the last few years. Supply continues to fall away whilst demand for beds continues to rise.
There are many reasons for care home closures: too much supply against local demand, low local authority contributions to care combined with a low percentage of self-funders, unsuitable and ageing properties, challenges in recruiting and retaining staff, or even owners simply throwing in the towel, unimpressed with their local authority and the Government’s lethargy towards the sector.
Nationally, there’s a fairly even split amongst ratings levels for those homes which have closed. Of all the care homes that closed in 2018, 34% had an Inadequate rating at the time of closure, 29% Requires Improvement and 36% Good.
On a regional basis, however, there were massive variances in the care ratings of the homes that closed.
In the North West, 51% of homes that closed had an Inadequate rating, compared to just 22% for their neighbours, Yorkshire and the Humber. And yet only 1.8% of all care homes in the North West are Inadequate, and 3.1% in Yorkshire and the Humber.
This suggests care quality is a much greater factor for closure in the North West, and conversely there are more commercial pressures causing homes to close in Yorkshire and the Humber.
Across the country, smaller homes of up to 40 beds accounted for around 75% of all homes that closed, and yet around 40% of them had a Good rating. It is a shame that so many care homes that worked hard to achieve Good CQC ratings have been lost.
Homes with 60 beds or more accounted for around 20% of all beds lost through closure. Around 90% of these homes were rated either Inadequate or Requires Improvement, and it is likely that poor care quality was a key factor behind the closures.
Closures due to poor CQC ratings cause major issues for care providers and their livelihoods, their employees (how do we encourage people into a career in care if there is a chance of closures and job losses?), and heartache for the residents and their families.
A lot more detail on closures and CQC ratings is included in CSI Market Intelligence’s report.
Very recently, CQC was forced to abandon its local system review programme after the Department of Health and Social Care ignored a request for approval to continue.
I strongly believe that this, or something similar, should be maintained and certainly include ongoing analysis of the type of anomalies that I have highlighted here to ensure that the systems across the country are fair and that providers are given an equal chance, no matter their sector, location or size. Perhaps we should all be lobbying to bring this into effect?
How do these statistics relate to care provision in your area? Are you facing closure following a CQC inspection? Share your thoughts and feedback on this feature on the CMM website www.caremanagementmatters.co.uk