Every year the Picker Institute conducts an independent survey of NHS staff, and the results of the 2013 survey were published on February 25th. In recent years, staff of 2gether NHS Foundation Trust have not given it great marks, and these latest results show the trust’s problems continuing.
Last year’s poor results were never fully discussed by the trust’s Council of Governors. In April, a verbal report to the Council by the Chief Executive covered various matters, including the “Staff Attitude [sic] Survey”. As minuted (paragraph 12.3, seemingly not available online), “it was agreed that a presentation outlining the progress made would be given to the full Council at the July meeting.” This never happened. It was put off and put off, and now the 2013 Staff Survey results have overtaken it.
In August last year, Gloucestershire’s NHS commissioners’ (CCG) Integrated Governance and Quality Committee discussed 2gether’s 2012 staff survey results, and the minutes read (p. 277 in the PDF):
“The Committee were disappointed to see the results of the 2gether staff survey in their Quality Annual Report, which indicated the Trust’s score was in the lowest 20% when compared with trusts of a similar type. The Committee welcomed the initiatives described in the report to address these concerns and noted that the CCG will be working with 2gether over the next few months to ensure improvement in these key areas.”
In the Quality Report, published as part of the trust’s Annual Report, the trust essentially blamed its recent tinkering with the configuration of services for the poor results, implying that the problems would be temporary. Nevertheless, it made a number of promises about the things it would do to improve matters (p. 87):
“Whilst the results of the 2012 survey were disappointing, they were not wholly unexpected in the context of the significant change which services and staff have experienced in the preceding 12 months.
“The Trust has therefore been pro-active in starting a staff engagement programme…
“Three key findings have been set at [sic] the priorities, these [sic] are ‘stress at work’; ‘communication with senior managers’ and ‘recommendation of the Trust as a place to work and receive treatment’.”
Only one of these supposed priorities shows significant improvement in the 2013 survey, and even on that improved measure the trust remains in the bottom third of mental health trusts.
2013 survey results
The survey results are quite hard to interpret. There are 23 multi-part questions (apart from background questions such as gender). The results are expressed as 28 Key Findings (KFs), which are grouped under six headings. Four of the headings correspond to staff pledges in the NHS Constitution, and the other two are additional themes made up for the purposes of the survey. In addition, there’s an overall staff engagement score.
One useful way to look at the results is to compare the key findings with previous years, and another way is to compare them with other mental health trusts. But taking a view of the overall results for the whole NHS can also be useful. The Picker Institute’s Chief Executive, Dr Andrew McCulloch, took this approach when he was quoted in its press release as saying:
“We are particularly pleased to see improvements on key measures of whether staff would recommend their organisations as a place to work or receive treatment. But whilst these have improved, there is still a long way to go. One in three members of staff did not agree with the statement ‘if a friend or relative needed treatment, I would be happy with the standard of care provided by this organisation’ – and fewer than three in every five said that they would recommend their organisations as a place to work.”
The proportion of 2gether staff who responded to the survey was higher than average for mental health trusts, and higher than last year. Of the trust’s 1800 staff, 750 were invited to participate and 418 responded. Last year only 373 staff responded, and the previous year 478 (a higher rate, even though the trust had more staff back then).
I don’t plan to plough through all the details of the results. I’ll just pick out some of the things that I find interesting. In this post I’ll only use the word “significant” to mean statistically significant, which is not always the same thing as interesting.
Insignificant changes are statistically indistinguishable from no change at all (because of effects like sampling error, sample size, and regression to the mean). So when a change is reported as “insignificant”, you should be careful to interpret it as “no evidence of any change”, no matter how much you were hoping for a real one.
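To make that concrete, here is a minimal sketch of the kind of test involved — a simple two-proportion z-test on hypothetical figures. The percentages below are invented for illustration, and the survey’s own methodology may well differ; the point is only to show why a few percentage points of movement in samples of around 400 can fail to register as significant.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test: is the change from p1 to p2 bigger than
    sampling error alone could plausibly produce?"""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Hypothetical figures, roughly the size of the samples in this survey:
# a score moving from 18% to 22% with last year's 373 and this year's
# 418 respondents.
z = two_proportion_z(0.18, 373, 0.22, 418)
print(round(z, 2))  # 1.4 — well below 1.96, so not significant at the 5% level
```

A four-point rise looks encouraging, but with samples this size it is within the range that sampling error alone could produce.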
Tables referenced in this post can be found in the full report (PDF) for the trust. Some figures are from the accompanying spreadsheets and from last year’s results. Some of the comparisons between trusts have had to be calculated from the data in the spreadsheets.
Comparisons with last year
Compared with last year, only one change in the key findings is both significant and interesting, and that’s the improvement in KF24: Staff recommendation of the trust as a place to work or receive treatment, which increased from 3.19 to 3.46 (Table A2.1, although slightly different figures are given in other places in the report).
The average KF24 for all mental health trusts increased only a little (from 3.52 last year to 3.55), so very little of the improvement in the trust’s score can be attributed to generally improved morale in the NHS. This is a real improvement in the trust. However, the trust remains well below average, still ranking in the bottom third of all mental health trusts for this key finding.
Amongst the different parts of the trust, the best KF24 score (3.79, Table 5.2) was from corporate staff. Their score improved (by 0.37) more than the overall improvement in the trust as a whole (0.27), and more of them (25% more) responded to the survey.
In contrast, the Children and Young People’s Service (CYPS) score got worse and fewer of them responded to the survey. The CYPS score (2.85) was worse than the worst mental health trust in England (Cambridgeshire and Peterborough, 3.01) for KF24. But because there were so few CYPS responses this made little difference to the trust’s overall result.
Two other changes in the key findings were significant but not particularly interesting — KF8: Percentage having well structured appraisals in last 12 months, and KF26: Percentage having equality and diversity training in last 12 months. Both scores improved. Improving appraisals was a stated aim of the trust, though not a priority, in its Quality Report last year, but it remains below average for mental health trusts. On equality and diversity training only four trusts performed worse.
The trust’s other priorities for improvement were KF11: Percentage of staff suffering work-related stress in last 12 months, which remained almost exactly the same as last year, and KF21: Percentage of staff reporting good communication between senior management and staff, which improved only insignificantly and is discussed further below.
Comparison with other mental health trusts
2gether’s worst ranking amongst the 57 mental health trusts was for KF21: Percentage of staff reporting good communication between senior management and staff. Only two trusts (Norfolk and Suffolk, and Cumbria Partnership) performed worse. The trust’s score improved (to 22%), but the improvement was insignificant, despite this being one of the trust’s stated priorities. Put another way, if you do the arithmetic, some 1,400 of the trust’s staff do not think there is good communication with senior management.
2gether’s best ranking was 10th, for both KF4: Effective team working, and KF27: Percentage believing that trust provides equal opportunities for career progression or promotion. The trust’s score for both these KFs improved insignificantly since last year.
Based on median rankings for all 28 KFs, 2gether is 39th out of the 57 overall, which puts it in the bottom third of mental health trusts. The best trust, according to this method, is Tavistock and Portman (in North London), while the worst is Norfolk and Suffolk.
In the Bristol area, the CCG is considering proposals from various consortia for the provision of various mental health services, which will replace existing services from October. 2gether is in the running to provide community mental health services (partly through a wholly owned subsidiary, “2gether for Bristol”, whose staff might not be eligible to take part in the next NHS Staff Survey, as far as I can tell).
The competing consortium involves Avon and Wiltshire Partnership trust (AWP) and nine local community organisations. AWP ranks 46th overall, a fair bit worse than 2gether. For dementia services in Bristol, the bidders against 2gether include another consortium involving AWP, and a consortium involving Devon Partnership trust, which ranks 47th.
Earlier information about the Bristol tender had suggested that some much higher ranking trusts had submitted proposals, but it seems that either this information was wrong or Bristol CCG has now excluded them.
Other interesting results
The results I think are most interesting are the ones that seem to have direct implications for patient care.
KF2: Percentage agreeing that their role makes a difference to patients is not very interesting because the results differ so little between the worst (84%) and the best (95%) trusts. Almost all NHS staff think they make some difference.
KF1: Percentage feeling satisfied with the quality of work and patient care they are able to deliver is the most directly relevant of these KFs. The interesting thing is that more than a quarter of trust staff, more than 500 employees if you do the arithmetic, are not satisfied with the quality of their work and the patient care they are able to deliver. Five out of six mental health trusts did better (the trust ranks 49th with a score of only 71%). Why are these 2gether staff not satisfied? What would they like to do that they don’t feel they are able to do? It’s not evident that anyone has ever tried to find out, despite promises about this very issue in last year’s Quality Report.
KF13: Percentage witnessing potentially harmful errors, near misses or incidents in the last month is also relevant. The trust scored 26%, ranking 26th, which is just better than average. But the interesting thing is that getting on for 500 staff witnessing incidents each month (if you do the arithmetic) adds up to some 5,500 incidents a year, and KF14 shows that almost all of them are being reported. So somewhere or other there should be a huge database allowing tens of thousands of incidents reported over the years to be analysed so as to drive improvements. Yet there’s no sign that anyone has ever tried to implement this.
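For what it’s worth, the “do the arithmetic” estimates in this and the preceding paragraphs come from applying the survey percentages to the trust’s roughly 1,800 staff — a crude extrapolation, since the percentages are measured on a sample of respondents, not the whole workforce:

```python
# Back-of-envelope headcounts behind the figures quoted above,
# taking the trust's total of about 1,800 staff as the base.
staff = 1800

poor_communication = staff * (1 - 0.22)  # KF21: only 22% report good communication
not_satisfied = staff * (1 - 0.71)       # KF1: only 71% satisfied with quality of care
witnesses_per_month = staff * 0.26       # KF13: 26% witnessed an incident last month

print(round(poor_communication))        # 1404 — the "some 1,400" above
print(round(not_satisfied))             # 522 — "more than 500"
print(round(witnesses_per_month * 12))  # 5616 — "some 5,500 incidents a year"
```

The yearly incidents figure additionally assumes each witness sees about one incident a month, so it is if anything an underestimate.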
In summary, although it would be true to say that the only significant changes since last year have been improvements, 2gether’s staff still don’t give it great marks. Two out of three of the trust’s stated priorities for improvement didn’t change to any significant extent. The findings that did improve (and there were only three of them) remain below average. There are worrying implications for patient care, particularly in Gloucestershire’s Children and Young People’s Service (CYPS). Gloucestershire CCG’s Integrated Governance and Quality Committee is likely to remain disappointed, and Bristol CCG is unlikely to be impressed.