Author(s): 

Matthew Roycroft1, Sunil Bhandari2,3

Author Affiliations: 

1ST6 Geriatric and General Internal Medicine, Rotherham Hospital, Rotherham, UK; 2Consultant Nephrologist, Hull University Teaching Hospitals NHS Trust, Hull, UK; 3International Director, Royal College of Physicians of Edinburgh, Edinburgh, UK

Correspondence to: 

Matthew Roycroft, Healthcare of the Older Person, Rotherham Hospital, Moorgate Road, Rotherham S60 2UD, UK  Email: matthew.roycroft@nhs.net

Journal Issue: 
Volume 49: Issue 2: 2019
Cite paper as: 
J R Coll Physicians Edinb 2019; 49: 147–50

Abstract

Background This study explores the importance of various factors to the overall satisfaction of Core Medical Trainees (CMTs) in the Yorkshire and Humber Deanery, to aid the targeting of improvement efforts.

Methods Responses for all CMTs in Yorkshire and the Humber to all questions and domains from the UK National Training Survey 2017 were correlated with a marker of overall trainee satisfaction. Questions with high and low degrees of correlation were identified, as well as recurrent themes.

Results Clinical supervision appears to be closely related to overall trainee satisfaction, frequently correlating well, whereas educational supervision does not. Almost all themes with high correlation related to day-to-day experience, whereas those correlating poorly related to infrastructure issues.

Conclusions Assuming similar costs, limited resources intended to improve overall trainee experience are probably best targeted at factors influencing day-to-day experience, such as freeing consultant time for clinical supervision, rather than at developing one-off trainee experiences.

Introduction

The UK National Training Survey (NTS), administered by the General Medical Council (GMC), was first carried out in 20061 and has subsequently been published annually. The mandatory 2017 survey had a response rate of 98.3%, reflecting the views of 53,335 doctors in training across the UK, and is seen by the GMC as ‘crucial in helping them make sure doctors in training receive high quality education and training in a safe, supportive and effective clinical environment’.2 Much of the data from the 2012 survey onwards is publicly available via an online reporting tool3 and allows complex trend analysis and year-on-year comparisons. The main areas of concern identified in recent years include heavy workloads (which are getting heavier) and rota gaps, as well as the consequences of these issues, with trainees often feeling short of sleep whilst working and missing educational opportunities.

As well as the NTS report, complex local reports are produced showing results in predetermined domains. Each domain is made up of a number of questions felt by the GMC to be relevant. Where the overall score for a domain makes it a statistical national outlier (either positively or negatively) it is highlighted. This is not carried out for individual questions, and it is therefore often hard to work out which specific question or questions are scoring poorly. Alongside this, there is no prioritisation of the different areas reported upon, nor any indication of which areas make the most difference to trainees: domains that may well be of very different importance to trainees, such as ‘educational governance’ and ‘workload’, are given equal representation in the results.

The NHS in 2016 was described as being ‘underfunded, underdoctored and overstretched’.4 With such stretched resources it is more important than ever that we use them in the most efficient ways, by improving the areas that make the most difference. Presently, however, we do not know which of these areas to prioritise to make the biggest improvement to the overall trainee experience.

In this study we sought to better understand, via a correlation analysis of NTS data, what factors have the most impact upon the overall Core Medical Trainee (CMT) experience to enable hospital trusts, colleges, arm’s length regulatory bodies and others to better target resources to these areas or domains and, therefore, significantly improve the trainee clinical and educational experience.

Methods

We carried out a novel two-step retrospective study analysing results of the GMC’s 2017 NTS, extracted from the online reporting tool using the ‘programme group by trust/board’ function and looking only at CMTs in the authors’ Yorkshire and the Humber region, broken down by trust. The first step used a scoring system to convert responses to a total score, followed by a correlation analysis comparing these scores to a question reflecting overall satisfaction with the post. The second step was to look for recurrent themes in the domains with high and low correlation.

The 2017 NTS consisted of 17 domains encompassing a total of 69 questions (Appendix A). At the time of the 2017 survey almost all trainees would have been working in the trust they were reporting on for at least 9 months.

Step one

To allow comparison between question responses, a single score out of 100 was calculated for each of the NTS’s individual questions, all of which used Likert-type scales. This was carried out by weighting the proportion of respondents giving the most positive response by 100, the least positive by 0 and intermediate responses by a proportionate figure. For example, where a question had five response options, the proportions of respondents in each category were weighted by 100, 75, 50, 25 and 0, respectively. For questions where ‘N/A’ (not applicable) was an optional response a simple adjustment was made, proportionately increasing all other responses so that the scoring categories totalled 100%. These weighted figures were then added together to create a score out of 100 for each question for each trust. Each major domain of the NTS already has a score out of 100, which could be used unadjusted.
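
As an illustration of this scoring step, a minimal sketch in Python is shown below. The function name and input format are our own; the study’s calculations were performed in a spreadsheet rather than in code.

```python
# Minimal sketch of the scoring step described above (illustrative only; the
# authors' calculations used Excel). Assumes `percentages` lists the
# percentage of respondents choosing each option, ordered from most positive
# to least positive, with any 'N/A' responses already removed.

def likert_score(percentages):
    """Convert Likert response percentages into a single score out of 100."""
    # Renormalise so the scoring categories total 100% (the 'N/A' adjustment).
    total = sum(percentages)
    if total == 0:
        raise ValueError("No scoreable responses")
    adjusted = [p * 100 / total for p in percentages]

    # Evenly spaced weights: 100 for the most positive option, 0 for the least.
    n = len(adjusted)
    weights = [100 * (n - 1 - i) / (n - 1) for i in range(n)]

    # Weighted sum of the response proportions gives the score out of 100.
    return sum(w * p / 100 for w, p in zip(weights, adjusted))

# Five-point example: 40%, 30%, 20%, 10% and 0% of respondents per option
# gives 0.4*100 + 0.3*75 + 0.2*50 + 0.1*25 + 0*0 = 75.
print(likert_score([40, 30, 20, 10, 0]))  # 75.0
```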

Scores for all domains and questions were compared to the results of the question ‘how would you describe this post to a friend that was thinking of applying for it’ (hereafter referred to as ‘the overall satisfaction question’) using Pearson’s correlation coefficient. This was calculated using the regression analysis function from the inbuilt data analysis suite of Excel (Microsoft Corporation, USA).
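
Purely for illustration, the same Pearson correlation can be reproduced outside Excel. The sketch below uses hypothetical per-trust scores (one value per trust) for a single question and for the overall satisfaction question.

```python
# Illustrative sketch only: the study used Excel's inbuilt data analysis
# suite, but Pearson's correlation coefficient can equally be computed with
# Python's standard library. The per-trust scores below are hypothetical.
from statistics import correlation  # available from Python 3.10

question_scores = [72.5, 80.0, 65.4, 90.1, 58.2]       # one value per trust
satisfaction_scores = [70.0, 85.5, 60.3, 88.0, 62.1]   # overall satisfaction question

r = correlation(question_scores, satisfaction_scores)
print(f"Pearson correlation coefficient: {r:.2f}")
```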

Step two

Once correlation coefficients were calculated, questions with either high degrees of correlation (coefficients greater than 0.70) or low degrees of correlation (coefficients between -0.2 and 0.2) were identified. Common themes in each group were then identified by the authors; these themes were not predefined.
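
This selection is a simple threshold filter; a short sketch follows using the predefined cut-offs above and a few illustrative coefficients taken from Table 1.

```python
# Sketch of the selection step using the predefined cut-offs; the example
# coefficients are illustrative values taken from Table 1.
coefficients = {
    "Clinical supervision": 0.88,
    "Supportive environment": 0.79,
    "Reporting systems": 0.18,
    "Regional teaching": -0.13,
}

high_correlation = {k: r for k, r in coefficients.items() if r > 0.70}
low_correlation = {k: r for k, r in coefficients.items() if -0.2 <= r <= 0.2}

print(high_correlation)  # {'Clinical supervision': 0.88, 'Supportive environment': 0.79}
print(low_correlation)   # {'Reporting systems': 0.18, 'Regional teaching': -0.13}
```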

Criteria and definitions

Defining a baseline for trainee satisfaction is a complex topic, with arguments for and against many different options. Given the retrospective nature of this study, the most straightforward way to obtain a contemporaneous opinion was to review the questions themselves. Although the ‘overall satisfaction’ domain could in theory be used as a marker of overall trainee experience, it is made up of a collection of questions, not all of which appear to adequately address the topic (Appendix A). The overall satisfaction question chosen encompasses many different things, from learning opportunities and quality of supervision to intensity of workload, among others, and gives the respondent an opportunity to bring them all together into a single score. The authors felt this question was the best marker of satisfaction, as it does not make sense for a trainee to recommend a job they do not like to a friend, and vice versa.

Alternative approaches to determining which factors affect trainee satisfaction were considered, such as a dedicated survey and interviews. A dedicated survey was felt to add little on top of the NTS questions but could introduce significant issues with response numbers, respondent bias and possible questionnaire fatigue. Interviews, although very good at identifying issues, make it very hard to rank them unless carried out with large numbers of people, which would have significant resource implications and again would be highly likely to introduce respondent bias.

The predefined criteria for high and low degrees of correlation were selected based upon the lead author’s experience of working with similar data; arguments could have been made for alternative thresholds, or for thresholds encompassing a certain percentage of respondents, but no one approach is clearly better than any other.

Results

In total, response data from around 230 CMTs were analysed. Of the 13 trusts in the region, one did not have enough responses (<3) for the GMC to present data and so was excluded from the analysis. For one domain (regional teaching) four trusts did not have an adequate number of respondents and so analysis of this domain was carried out using only eight trusts. All other domains and questions had responses from 12 trusts.

Table 1 shows the correlation coefficient for each of the 17 domains compared to the overall satisfaction question. Five of the domains met the predefined definition of high degrees of correlation with the overall satisfaction question. One of these domains, overall satisfaction (coefficient of 0.98), included the overall satisfaction question within it. The other domains identified as having strong correlation with the overall satisfaction question were clinical supervision, supportive environment, adequate experience and curriculum coverage.

Table 1 Correlation between National Training Survey (NTS) domains and the overall satisfaction question

NTS domain | Correlation coefficient
Overall satisfaction | 0.98
Clinical supervision | 0.88
Clinical supervision out of hours | 0.67
Reporting systems | 0.18
Workload | 0.61
Teamwork | 0.61
Handover | 0.46
Supportive environment | 0.79
Induction | 0.67
Adequate experience | 0.80
Curriculum coverage | 0.77
Educational governance | 0.43
Educational supervision | 0.29
Feedback | 0.42
Local teaching | 0.55
Regional teaching | -0.13
Study leave | 0.45

The full results of the correlation analysis by individual question can be seen in Appendix A. All questions with high degrees of correlation to the overall satisfaction question are shown in Table 2 and those with low degrees of correlation in Appendix B. Of note, the first four questions in Table 2 are the other four questions from the overall satisfaction domain.

Table 2 Questions with high degrees of correlation (>0.70) with the overall satisfaction question

Question | Correlation coefficient
Please rate the quality of teaching (informal and bedside as well as formal and organised sessions) in this post | 0.85
Please rate the quality of clinical supervision in this post | 0.86
How would you rate the quality of experience in this post? | 0.93
This post will be useful for my future career | 0.94
In this post how often (if ever) are you supervised by someone who you feel isn’t competent to do so? | 0.72
Please rate the quality of clinical supervision in this post | 0.86
In this post, OUT OF HOURS, how often (if ever) are you clinically supervised by someone who you felt wasn’t competent to do so? | 0.72
In this post, how often (if at all) did your working patterns leave you feeling short of sleep when at work | 0.73
Please state whether you agree or disagree with the following statement about your post. The working environment is a fully supportive one | 0.89
Please state whether you agree or disagree with the following statement about your post. The working environment is one which fully supports the confidence building of doctors in training | 0.75
Please rate the quality of the induction you received for this post | 0.82
How would you rate the practical experience you were receiving in this post? | 0.72
I am confident that this post will help me achieve the competencies I need at my current stage of training | 0.75
I am confident that this post will give the opportunities set out in my development plan relating to professional experience (for example leadership, management, teaching, research, quality improvement etc.) | 0.72

Recurrent themes

The domains with strong correlation (listed above) could be thematically linked as areas that affect day-to-day experience. Those with poor correlation (reporting systems and regional teaching) reflect more of an infrastructure theme. If the definition of poor correlation were expanded to -0.3 to +0.3, educational supervision would be added to the list, which again relates more to infrastructure than to day-to-day experience.

Common themes were identified for the questions with high and low degrees of correlation (Appendix C). These agree with the suggestion from the domain analysis that it is day-to-day experience, not infrastructure issues, that most positively affects the overall satisfaction of trainees, with strong correlation shown especially for clinical supervision, ease of achieving annual review of competence progression outcomes (which, with present medical curricula, primarily suggests ease of getting assessments) and a supportive environment. Themes coming up more than once among the questions with poor correlation were clinical governance, educational supervision, local teaching, regional teaching and consent. As with the domains, these are primarily infrastructure issues and do not relate to the day-to-day job.

Discussion

This study, looking at the correlation between responses to various NTS questions and a question reflecting overall satisfaction (likelihood of recommending the post to a friend), revealed several interesting results. Both questions and domains that looked at the day-to-day experience correlated very well with overall satisfaction, whereas those looking at infrastructure issues did not.

Examining recurrent themes, the quality of clinical supervision was most prominent amongst those positively correlated with overall satisfaction, whereas educational supervision was one of the themes regularly identified as least important. This may be partly because, at the time of the NTS, clinical and educational supervision were delivered by different people (whereas in the August–November rotation they are often the same person locally). Despite this, it still shows the importance of the relationship between the trainee and the consultant supervising them at that point in time. Educationalists have historically placed a great degree of emphasis on both training programme directors and educational supervision, but this suggests that, from a trainee point of view, it is actually the more numerous (and often less educationally trained) clinical supervisors who are more important.

NTS questions are not necessarily reliable5 and both questions with negative correlation [days subtracted from study leave allowance to attend compulsory training (-0.42) and the quality of regional teaching (-0.53)] are likely unreliable amongst this group. The first question is likely irrelevant to most CMTs, as few trainees in the region come close to using their full allocation (as is the case in most of the country). The second question was likely difficult for trainees to interpret, as many trusts did not have regional training programmes at the time of this survey (despite trainees scoring them as if they did).

The vast majority of questions had a low or moderate degree of positive correlation. There are three putative explanations for this: that trusts which are generally good invest widely in education, making most areas positively rated; that everything affects trainee experience to one degree or another; and that trainees who rate the entire placement well are more likely to rate every other question well when deciding between two options. It is the authors’ opinion that all three of these reasons have a degree of influence, rather than any one alone.

Working patterns are a theme presently being examined in many hospitals in the region, and responses to questions in this area were likely influenced by trainees being very aware of the new junior doctor contract discussions. Despite its importance, clinical governance is not an area many trainees will have much experience of, so it is unsurprising to see it correlate poorly. What is more surprising is that work intensity (specifically at night) did not correlate with satisfaction, potentially suggesting that junior doctors are still prepared to work hard and see this as part of the job, and possibly that they see the type of workload on night shifts as being of significant educational value. Both regional and local teaching correlate poorly; despite teaching being an important area for trainee development, it is possibly simply not a big enough part of the day-to-day job to affect whether or not trainees would recommend a placement.

Although correlation does not necessarily imply causation, this paper has demonstrated the importance of the day-to-day experience, and particularly high-quality clinical supervision, above all other areas for trainee satisfaction and likelihood of recommending the post to a friend. Issues that are more infrastructure related and affect a trainee only occasionally, although still important, show far less correlation with trainee satisfaction.

Improving clinical supervision and day-to-day work most likely comes down to workforce issues (at all levels) and consultant and junior job planning, both already large national issues, but these are also areas trusts could focus upon. Further research is suggested into how this can best be implemented in a climate where consultants increasingly have little time for nonclinical activities. Many places are presently putting large amounts of money into one-off infrastructure projects to improve trainee morale (such as redevelopment of on-call rooms or doctors’ messes). Although important from many points of view, it would be very interesting to see the impact of such projects on training experience, as this study suggests infrastructure projects are likely not the best use of resources.

Online Supplementary Material

Appendices A–C are available with the online version of this paper, which can be accessed at https://www.rcpe.ac.uk/journal.

References

1 Smith D, Le Rolland P, Paice E. National Training Survey 2006 – Key Findings. London: PMETB; 2007.

2 General Medical Council. 2017 National training surveys summary report: initial results on doctors’ training and progression. 2017. https://www.gmc-uk.org/about/what-we-do-and-why/data-and-research/nation... (accessed 10/04/2019).

3 General Medical Council. National training survey portal. https://webcache.gmc-uk.org/ntsportal/Account/GuestLogin.mvc (accessed 10/05/2018).

4 Royal College of Physicians. Underfunded. Underdoctored. Overstretched. The NHS in 2016. 2016. https://www.rcplondon.ac.uk/guidelines-policy/underfunded-underdoctored-... (accessed 10/05/2016).

5 Wood S, Gough M. Online training surveys: not worth the paper they’re written on. RCS Bulletin 2018; 100: 32–7.

Financial and Competing Interests: 
SB is the International Director of the Royal College of Physicians of Edinburgh