Original Research

Formal assessment of the educational environment experienced by interns placed in rural hospitals in Western Australia

AUTHORS

Kirsten Auret
1 FRACP, Professor, Rural and Remote Medicine *

Lesley Skinner
2 FRACGP, Director, Postgraduate Medical Education

Craig Sinclair
3 PhD, Associate Professor, Research

Sharon Evans
4 PhD, Associate Professor, Research

CORRESPONDENCE

* Kirsten Auret

AFFILIATIONS

1 Rural Clinical School of Western Australia, University of Western Australia, Albany, Western Australia, Australia

2, 3, 4 Rural Clinical School of Western Australia, University of Western Australia, Perth, Western Australia, Australia

PUBLISHED

20 October 2013 Volume 13 Issue 4

HISTORY

RECEIVED: 21 February 2013

REVISED: 27 June 2013

ACCEPTED: 30 July 2013

CITATION

Auret K, Skinner L, Sinclair C, Evans S.  Formal assessment of the educational environment experienced by interns placed in rural hospitals in Western Australia. Rural and Remote Health 2013; 13: 2549. https://doi.org/10.22605/RRH2549

AUTHOR CONTRIBUTIONS

© Kirsten Auret, Lesley Skinner, Craig Sinclair, Sharon Evans 2013 A licence to publish this material has been given to James Cook University, jcu.edu.au


ABSTRACT

Introduction: Increased numbers of rural hospital rotations for interns (first postgraduate year) are being created in Western Australia (WA). This study utilised the Postgraduate Hospital Educational Environment Measure (PHEEM), an internationally validated multidimensional questionnaire, to assess and compare the educational environment of five new or established rural sites and an urban teaching hospital.
Methods: The PHEEM was modified slightly to make the language appropriate for the WA context and to collect information about intern location. It was administered at the completion of each 10-week term over the first postgraduate year in 2009.
Results: A total of 147 completed PHEEM questionnaires were returned from 74 interns, evaluating a maximum of 210 individual 10-week terms (105 each from urban and combined rural). Average completion time was 6.4 minutes. The median score for teaching was 45.0 (interquartile range 39.0, 50.0) in rural locations and 43.0 (37.5, 46.0) in urban locations (p=0.046). The median scores for autonomy were 39.0 (35.0, 45.0) and 39.0 (36.0, 41.5) (p=0.672), and for support 33.0 (28.0, 35.0) and 31.0 (28.0, 33.0) (p=0.019), in rural and urban locations respectively.
Conclusion: This study has utilised an Australian-appropriate version of the PHEEM and has provided the first confirmation that, in terms of educational environment, rural intern rotations compare favourably with those in urban settings in WA.

Key words: rural hospitals, educational environment, graduate medical education, internship and residency.

FULL ARTICLE

Introduction

In 1997-98 the Australian Medical Training Review Panel recommended that 'all postgraduate medical officer training (should) include at least one rural ... term'1, and in 2002 a national report commissioned by the Commonwealth Government reiterated this statement, further adding that state health departments should aim to locate a minimum of 20% of junior medical officer training positions rurally, reflecting the distribution of the population1.

In Western Australia (WA), only a small number of rural and remote hospitals receive interns (first postgraduate year, PGY1, prior to full registration with the Medical Board) on secondment from urban teaching hospitals. For a decade there have been three established rural sites, and in 2009 two new sites were added. It is likely, however, that rural allocation places for interns will continue to expand as demand grows from those graduating from rural clinical schools, as rural postgraduate training pathways are created and as more placements are needed for the increasing number of graduates from local universities2.

As these new sites open up it is imperative to demonstrate that the quality of the terms is comparable with more established teaching hospital rotations. This evidence may then encourage further initiatives and opportunities in country medical education, reassure junior medical staff of the value of rural allocations, support health services to achieve and maintain accreditation of training positions and develop partnerships between medical education research, administration and service delivery3,4.

This research focuses on the 'educational environment' of the five rural hospitals and one comparator metropolitan hospital as experienced by interns in 2009. The educational environment is a way of portraying the combined factors that determine what it is like to be a learner in a particular organisation. It is considered in three parts:

  • physical environment (eg facilities, safety, food, shelter)
  • emotional climate (eg feedback, feeling supported, presence/absence of bullying)
  • intellectual tone (eg learning with patients, evidence-based care, planned education)5.

A robust clinical educational environment has been described as one in which good medicine is practised; which has teaching and learning opportunities that are patient-focused, well planned, reflective and motivating; which values professional behaviour; and which evaluates its teaching5. Problems in this environment may include lack of clear objectives and direction, passivity, poor clinical supervision, little time for reflection, and humiliation and bullying5.

One multidimensional tool created to measure this educational environment is the Postgraduate Hospital Educational Environment Measure (PHEEM). Its development by Roff, McAleer and Skinner in the UK was first reported in 20056 and since then it has been utilised in a growing number of countries and clinical placements5,7. The PHEEM has consistently demonstrated high levels of reliability (Cronbach's alpha coefficient > 0.91) and acceptability to participants6-8. The global score is reported to provide a reliable measure of the overall learning environment6. It has three subscales measuring perceptions of teaching (scores indicative of a range from very poor quality through to model teachers), perceptions of role autonomy (very poor through to excellent perception of one's job) and perceptions of social support (non-existent through to a good supportive environment)6.

At the time of the study, one group in Queensland had been using the PHEEM in the Australian context9, although there has since been one further publication from a group of nine metropolitan and regional hospitals in Victoria which confirmed the tool's value in systematically collecting information about educational environment in Australia10.

Methods

The participants in this study were interns undertaking rotations in any rural hospital in WA and those based at a single urban teaching hospital in Perth in 2009. Information about the study, recruitment and signed informed consent was facilitated by staff from the Rural Clinical School of WA and by directors of clinical training who are local medical staff appointed to provide support, supervision and counsel to junior doctors.

At the end of each of the five 10-week terms in 2009, recruited interns were sent an email inviting them to fill in an online version of the PHEEM questionnaire. The PHEEM is a 40-item inventory utilising a five-point Likert scale with responses ranging from 'strongly agree' (4) to 'strongly disagree' (0). Four questions are negative statements and are scored in reverse. The perceptions of teaching subscale contains 15 items with a maximum score of 60, the perceptions of role autonomy subscale 14 items with a maximum score of 56, and the perceptions of social support subscale 11 items with a maximum score of 44. The summation of these scores has a combined maximum of 160 and a minimum of zero. An approximate guide to interpreting these scores is given by the tool developers: global scores of 0-40 indicate a very poor educational environment, 41-80 indicate plenty of problems, 81-120 indicate more positive than negative but room for improvement, and 121-160 indicate an excellent educational environment6.
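To make this scoring scheme concrete, the minimal Python sketch below computes the subscale and global scores for a single completed questionnaire and applies the developers' interpretation bands. The item-to-subscale grouping and the four reverse-scored item numbers are illustrative placeholders only, not the official PHEEM scoring key.

```python
# Minimal sketch of PHEEM scoring, assuming a completed questionnaire is a dict
# mapping item number (1-40) to its Likert value (0 = strongly disagree ... 4 = strongly agree).
# Item groupings and reverse-scored item numbers are placeholders, not the official key.

SUBSCALES = {
    "teaching": list(range(1, 16)),          # 15 items, maximum 60
    "autonomy": list(range(16, 30)),         # 14 items, maximum 56
    "social_support": list(range(30, 41)),   # 11 items, maximum 44
}
REVERSE_SCORED = {7, 8, 11, 13}              # placeholder numbers for the 4 negative statements

def score_pheem(response):
    """Return subscale scores and the global PHEEM score (0-160) for one response."""
    def item_score(item):
        raw = response[item]
        return 4 - raw if item in REVERSE_SCORED else raw   # reverse-score negative statements

    scores = {name: sum(item_score(i) for i in items)
              for name, items in SUBSCALES.items()}
    scores["global"] = sum(scores.values())                  # combined maximum of 160
    return scores

def interpret_global(score):
    """Interpretation bands given by the PHEEM developers."""
    if score <= 40:
        return "very poor educational environment"
    if score <= 80:
        return "plenty of problems"
    if score <= 120:
        return "more positive than negative, but room for improvement"
    return "excellent educational environment"
```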

The PHEEM was slightly modified to suit the WA context (eg 'bleeped' was changed to 'paged' and 'the New Deal' to 'hospital contracts'; Appendix 1) and an associated database was created. Space for free-text comment on the questionnaire was added. Two reminder emails were sent if required. An administrative assistant de-identified the questionnaires upon receipt. Any one intern could complete a maximum of five questionnaires over the course of the study, depending on their hospital location. Although the PHEEM's authors designed the questionnaire to generate normally distributed responses, there was a marked skew on some scales at some sites, so the Kruskal-Wallis test was used for the most part to analyse the data, and results were displayed using box plots. Differences between sites were identified using analysis of variance. Subscale scores were compared using the chi-square test, with individual cell chi-square results identifying the influential cells. Changes over time were analysed using repeated measures analysis of variance. The data were analysed using SAS v9 (SAS Institute Inc.; http://www.sas.com/software/sas9).
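For illustration only, the rural-versus-urban Kruskal-Wallis comparison could be sketched as follows. The study's analyses were run in SAS v9; the Python/SciPy version below is a re-expression under assumed data structures, with the column names 'location' and the subscale names invented for the example.

```python
# Illustrative re-expression (Python/SciPy, not the SAS used in the study) of the
# rural-versus-urban comparison on one PHEEM subscale. Assumes a DataFrame with one
# row per completed questionnaire and columns 'location' ('rural'/'urban') plus a
# column per subscale score; these names are assumptions for the example.
import pandas as pd
from scipy import stats

def compare_locations(df: pd.DataFrame, subscale: str = "teaching") -> dict:
    """Kruskal-Wallis comparison of one subscale score between rural and urban terms."""
    rural = df.loc[df["location"] == "rural", subscale]
    urban = df.loc[df["location"] == "urban", subscale]
    h_stat, p_value = stats.kruskal(rural, urban)   # non-parametric test, robust to the skew noted above
    return {
        "rural_median": float(rural.median()),
        "urban_median": float(urban.median()),
        "H": float(h_stat),
        "p": float(p_value),
    }
```

Calling, say, compare_locations(pheem_df, "support") on such a data frame would return the rural and urban medians together with the Kruskal-Wallis statistic and p-value for that subscale.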

Descriptive information about each rural site was collected through email and telephone conversations with each hospital's medical administration and director of clinical training (eg types of rotations offered, example rosters, facilities, length of time interns have been placed at the hospital, presence of a director of clinical training).

Ethics approval

The Ethics Committees of the University of WA, South Metropolitan Area Health Service and WA Country Health Service approved the study; ethics approval number RA/4/1/2271.

Results

Table 1 displays the characteristics of the rural hospitals compared with the teaching hospital comparator in the capital city, Perth.

PHEEM questionnaire responses

A total of 147 completed PHEEM questionnaires were returned from 74 interns, evaluating a maximum of 210 individual 10-week terms (105 each from urban and combined rural). Seventy-one responses came from rural locations and 76 from the urban site. The response rate over the full year was 72% for the urban hospital and 69% (range 56-93%) for the combined rural hospitals. Average completion time for the PHEEM (by self-report) was 6.4 minutes (n=132).

Table 1: Features of Western Australian country hospitals receiving interns in 2009


Urban versus rural locations for the three PHEEM dimensions

The median score for teaching was 45.0 (interquartile range 39.0, 50.0) in rural locations and 43.0 (37.5, 46.0) in urban locations (p=0.046). The median scores for autonomy were 39.0 (35.0, 45.0) and 39.0 (36.0, 41.5) (p=0.672) and for support 33.0 (28.0, 35.0) and 31.0 (28.0, 33.0) (p=0.019) in rural and urban locations respectively (Fig1).

Classifying the dimension scores into subscales shows a significant difference between the rural and metropolitan sites, with the rural locations performing better on the teaching (p=0.039), autonomy (p=0.028) and support subscales (p=0.001) as shown in Figure 1. No site scored in the worst category for any subscale.



Figure 1: Box plots of summary scores for the teaching, autonomy and support dimensions of the PHEEM at rural and urban locations.

Comparison of individual rural sites across the three dimensions

Teaching dimension total scores did not differ significantly from the urban site, except for site 2 (p=0.016). The subscale scores for site 2 ('Model teachers', p=0.021) and site 4 ('Need retraining', p=0.045) were significantly different from the other sites (Table 2).

Autonomy dimension total scores did not differ significantly from the urban site, except for site 2 (p=0.023). The subscale scores for site 2 ('Excellent perception', p=0.031) and site 5 ('Negative view', p=0.001) were significantly different from the other sites, with site 1 ('Excellent perception', p=0.063) approaching significance. These results are summarised in Table 3.

Support dimension total scores did not differ significantly from the urban site, except for site 2 (p=0.007). The subscale scores for the urban site ('Supportive environment', low, p=0.031) and site 4 ('Not pleasant', p=0.004) were significantly different from the other sites. These results are summarised in Table 4.

Table 2: Teaching dimension across rural sites (and urban comparator)




Table 3: Autonomy dimension across rural sites (and urban comparator)





Table 4: Support dimension across rural sites (and urban comparator)

Comparison of dimensions over time

Repeated measures analysis controlling for rural or urban location showed no statistically significant change over time in the teaching or autonomy scores (p=0.413 and p=0.137 respectively). However, the support score decreased significantly, by 0.6 per term overall (p=0.011).

Comparison of first rural and first urban terms

Seven interns were available to compare their perceptions across the three domains in their first rural and first metropolitan terms. The average scores were higher for the first rural terms in each domain (teaching: 46.9 vs 42.9, p=0.38; autonomy: 40.1 vs 39.4, p=0.78; support: 34.7 vs 30.6, p=0.015), but only the difference in support was statistically significant.

Comparison of different types of terms across rural and urban locations

Comparisons were made for urban and rural general medicine, general surgery, emergency department and surgical specialties terms. No differences were seen in the scores between urban and rural terms for teaching, autonomy or support, except for the surgical specialties (orthopaedics in the rural sites), where the rural interns rated the teaching significantly higher (p=0.046) and there was a trend for the rural interns to rate the support higher (p=0.075).

'Red flagged' responses

In each of the domains of the PHEEM there are scores that indicate an unsatisfactory result (Fig2), and so these cut-offs were examined individually.

Of the 147 responses, 16 (10.9%) rated teaching less than 30. Of these, seven were from urban terms (9.2%), four from site 4 (28.6%), two from site 3 (7.4%), two from site 5 (22.2%) and one from site 1 (7.1%). Over half were in general surgical terms and 10 (62.5%) were female respondents.

Eight interns (5.4%) rated autonomy less than 28. Three of these were from urban terms, three from site 5 and one each from site 1 and site 3. There was no effect of gender or specialty.

Only five interns (3.4%) rated social support less than 22: three from site 4 (21.4%), one from site 3 (3.7%) and one from site 5 (11.1%). Four of these interns were female.
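For illustration, the 'red flag' screen described above can be expressed as a simple filter, assuming the per-term subscale scores sit in a pandas DataFrame with one row per completed questionnaire; the cut-offs are those reported in this section and the column names are assumptions.

```python
# Minimal sketch of the 'red flag' screen, using the unsatisfactory cut-offs reported
# above (teaching < 30, autonomy < 28, social support < 22). Column names are assumptions.
import pandas as pd

CUTOFFS = {"teaching": 30, "autonomy": 28, "social_support": 22}

def red_flags(df: pd.DataFrame) -> pd.DataFrame:
    """Return the term evaluations scoring below any of the unsatisfactory cut-offs."""
    below_any = pd.concat(
        [df[col] < cutoff for col, cutoff in CUTOFFS.items()], axis=1
    ).any(axis=1)
    return df[below_any]
```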

Within the PHEEM, several questions were also flagged separately to ensure important results were not lost:

  • 'There is racism in this post' - four respondents agreed (three from the same rural site and one urban) and two strongly agreed (both urban)
  • 'There is sex discrimination in this post' - three respondents agreed (one each from three different rural sites)
  • 'I feel physically safe in the hospital environment' - no respondent disagreed or strongly disagreed.



Figure 2: Subscales for the teaching, autonomy and support dimensions of the PHEEM at rural and urban locations.

Discussion

The national review of rural terms for junior doctors undertaken in Australia in 20021 highlighted a number of issues that appear to be well met in rural sites in WA. These include the requirement for an orientation program and term descriptions, management of workload, balance between service and training, provision of adequate levels of supervision, opportunities for professional development, access to clinical information resources, satisfactory standard of accommodation and formal processes for assessment and feedback.

Teaching, autonomy and support all rated highly in the interns' responses, and the rural rotations scored significantly higher than the urban rotations for teaching and support. Generally, established and new rural locations were equally successful in delivering well-rated intern terms; however, one of the new locations (site 2) scored significantly higher in all dimensions. Comparison across rotation types also showed that the educational environments of specific disciplines in rural postings were equivalent to those in urban postings, except for teaching, which was rated significantly higher for the surgical specialties in the rural locations. Interns reported a consistent experience regardless of how long interns had been placed within each hospital system.

Only one urban hospital was approached to be the comparator site as it was a significant provider of rural intern placements in WA and willing to be involved with the study. It is possible that interns at the other two tertiary hospitals had diverse experiences of the educational environments there; if included, comparisons across rural and urban may therefore have been different.

Eley and Morrissey's survey of pre-intern students in Queensland4 demonstrated that most felt they would have more responsibility for patients, feel more part of a team and have greater contact with senior staff as interns in a rural setting. This positive perception was felt to be important in recruitment to rural jobs and so it is significant that the present study's findings support this view.

A small number of interns indicated the experience of racism or sexual discrimination in their post. Unfortunately the design of the PHEEM makes interpretation of these results unclear: it is not possible to say whether the interns themselves experienced these occurrences, or whether they saw these attitudes being directed towards others (eg patients).

Interestingly, the new sites that had been long-term rural clinical school locations, and hence perhaps particularly orientated towards undergraduate curricula, seem to have managed the transition to postgraduate teaching and learning adequately. The only negative impact demonstrated on the autonomy experienced by interns was at a long-established rural site (site 4). The interns at the new sites seem to have been neither 'chucked in the deep end' nor overly protected within their roles.

The results also demonstrate that all the sites were able to accommodate interns' changing requirements for supervision and independent practice as they progressed through their first year, with no change in interns' perceptions of autonomy over time.

Additional useful results of this study include the development of a locally appropriate version of the PHEEM (and associated electronic template) and evidence that interns are able to complete it rapidly.

Conclusion

This study has been important in formally evaluating the educational environment of rural intern rotations in WA and has provided the first confirmation that, in terms of this measure, rural intern rotations compare favourably with those in urban settings in WA.

Acknowledgements

The researchers acknowledge Fiona Lee, Mark Gallop and Sarah Devereux for administrative support; Viv Duggan, Tim Ford, Bernie Cregan, David Oldham, rural directors of clinical training and rural medical administration staff for promoting the study and facilitating recruitment; and the rural and urban comparator hospital interns in 2009 for completing the surveys. Funding for this study was received in 2008 from the Postgraduate Medical Council of Western Australia.

References

1. Postgraduate Medical Council of New South Wales. Community and rural terms for junior doctors in Australia - a national review. Canberra, ACT: Postgraduate Medical Council of New South Wales, 2002.

2. Snadden D. From medicine's margins - why all the interest in rural and remote education? Medical Teacher 2009; 31(11): 967-968.

3. Postgraduate Medical Council of Western Australia. Strategic directions for postgraduate medical education and training in Western Australia 2004-2008. Perth, WA: Postgraduate Medical Council of Western Australia, 2004.

4. Eley DS, Morrissey DK. Challenge or opportunity: can regional training hospitals capitalise on the impending influx of interns? Medical Journal of Australia 2007; 187(3): 196-197.

5. Clapham M, Wall D, Batchelor A. Educational environment in intensive care medicine - use of Postgraduate Hospital Educational Environment Measure (PHEEM). Medical Teacher 2007; 29(6): e184-191.

6. Roff S, McAleer S, Skinner A. Development and validation of an instrument to measure the postgraduate clinical learning and teaching educational environment for hospital-based junior doctors in the UK. Medical Teacher 2005; 27(4): 326-331.

7. Wall D, Clapham M, Riquelme A, Vieira J, Cartmill R, Aspegren K, et al. Is PHEEM a multi-dimensional instrument? An international perspective. Medical Teacher 2009; 31(11): e521-527.

8. Boor K, Scheele F, Scherpbier A, Teunissen P, Sijtsma K. Psychometric properties of an instrument to measure the clinical learning environment. Medical Education 2007; 41(1): 92-99.

9. Eley DS. Junior doctors' perceptions of their preparedness for hospital work: support for the rural clinical school model as a key to better preparation. Medical Journal of Australia 2010; 192(2): 109-110.

10. Gough J, Bullen M, Donath S. PHEEM 'Downunder'. Medical Teacher 2010; 32: 161-163.

___________________________


Appendix 1: Postgraduate Hospital Educational Environment Measure used in this study6

