The Australian Commonwealth Government has recently funded all Australian medical schools to transfer a significant proportion of their undergraduate clinical teaching to rural regions as part of its rural medical workforce policy1. Is this educationally sound? One important measure of educational quality is student examination performance. Recently published Australian evidence suggests that academic performance by medical students based in rural practice is excellent2.
A further measure of quality is the students' perception of their experience and competence in managing common conditions and procedures. Evidence from a short-term rural placement in Western Australia3 was encouraging, but there is no published evidence that relates to the length of study that students will be undertaking in the new Rural Clinical Schools: at least an entire clinical year.
The School of Medicine at Flinders University, South Australia, offers a four-year graduate-entry medical program4. Students undertake their major clinical examination at the end of Year 3. Students study Years 1 and 2 together in Adelaide, the state capital, but have three alternatives in Year 3. They can choose to remain at the University's principal urban tertiary teaching hospital, Flinders Medical Centre (FMC), in Adelaide. Alternatively, they can undertake Year 3 based in the Northern Territory Clinical School at Royal Darwin Hospital, a tropical secondary referral base 3000 km north of Adelaide. The final option is to spend the entire year based in rural primary care in either the Riverland5 or the Greater Green Triangle6. This program, known as the Parallel Rural Community Curriculum (PRCC), enables students to learn all disciplines concurrently, in contrast to the block rotations at FMC and Darwin.
Aim of the study
The Darwin and PRCC programs represent the most common educational models proposed by Rural Clinical Schools throughout Australia, providing an example of a base hospital model (secondary care) and a longitudinal rural practice model (primary care). This study, using the same instrument as that devised by Culhane et al.3, aims to compare the self-reported experience and competence of students undertaking their Year 3 study in these three very different locations.
This study formed part of a comprehensive case-study of the three teaching locations conducted from 1998 to 2002. The 1998 Year 3 cohort was chosen for this part of the study. All six students in the PRCC and all ten at Darwin were invited to participate. Sixteen students were chosen by the Department of Medical Education to match these students in terms of age, gender, rural origin and previous academic study, and were also invited to participate. For simplicity, the three locations will be referred to as the primary, secondary and tertiary sites.
The 'Confidence in common procedures and conditions' survey instrument3 investigated both the reported 'experience' of patient-related opportunities available to the student and the perceived 'outcome' of confidence in managing common conditions. Eight items were added to this instrument due to their importance to Year 3 at Flinders University. The instrument then consisted of 140 items: 78 common procedures and 62 common conditions. This list of 140 items was agreed by the specialist disciplines to be indicative of the core skills and conditions for a Year 3 student (Appendix I).
For each item the students indicated both their experience (four-point ordinal scale) and self-perceived competence (four-point ordinal scale). From these items, the three groups of students were compared for exposure to different clinical experiences and achievement of competency in common procedural skills using SPSS vers. 10.0 (SPSS Inc, Chicago, IL, USA).
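The group comparisons reported below rest on non-parametric rank tests of these ordinal responses. As a minimal sketch of that approach (not the authors' SPSS procedure, and using invented data), the equivalent Wilcoxon rank-sum (Mann-Whitney U) comparison can be run in Python with scipy:

```python
# Illustrative sketch only: comparing two groups' four-point ordinal
# responses with a Wilcoxon rank-sum (Mann-Whitney U) test, the family
# of tests reported in the Results. All response data are hypothetical.
from scipy.stats import mannwhitneyu

# Four-point experience scale coded 1-4 (invented responses per student)
primary_care = [4, 3, 4, 2, 4, 3, 3, 4]
tertiary = [2, 1, 3, 2, 2, 3, 1, 2]

stat, p = mannwhitneyu(primary_care, tertiary, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")
```

A rank-based test is appropriate here because the response categories are ordered but the distances between them are not meaningful, so means and t-tests would be misleading.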
In the Western Australian study, students were given only one ranking scale for each item, with five choices: have never seen; have observed; have had some hands-on experience; could manage with assistance; could perform alone. These were presented in the paper as representing an ordinal sequence. This single ranking scale, combining experience and competence, was seen as a potential weakness of this research tool. In the present study, it was decided to separate self-perceived competence from reported experience to give greater clarity to the responses, and to test the hypothesis that the sequence used by both Culhane et al.3 and Spike and Veitch7 before them was not, in fact, ordinal.
The survey was sent to the thirty-two students described previously. Twenty-nine students (15 tertiary hospital, eight secondary hospital, and six primary care) completed this 140-item survey in week 35 of their Year 3 studies (response rate 91%; Table 1).
The aggregated response for each of the categories for the entire study population is presented in Table 2.
Table 1: Demographic description of study participants
Table 2: Reported experience and perceived competence in 78 procedural skills. Aggregated student responses (%) by location
From these figures it can be seen that the primary care students had greater experience of common procedures than their tertiary peers (Wilcoxon W = 917133.5, Z = -3.715, p < 0.001), and a less marked, but still significantly greater, level of experience than the secondary care students (Wilcoxon W = 324307.5, Z = -2.224, p = 0.026). The primary care students also reported greater competence in these procedures than both the tertiary (Wilcoxon W = 911888.5, Z = -2.617, p = 0.009) and secondary (Wilcoxon W = 317410.5, Z = -2.79, p = 0.005) students.
There was considerable variation amongst the students. For example, the amount of reported 'hands-on experience' varied from 15.6% to 77% and the perceived competency ratings varied from 20.5% to 89.7%. Interestingly, the students who reported the lowest experience also reported the lowest competence, but the converse did not hold for the highest-reporting students. This phenomenon is investigated further (Table 3).
Table 3: Student reported experience by student perceived competence, whole cohort n = 2225, missing = 37
Table 3 demonstrates that there is a significant correlation between reported experience and perceived competence among all students (Spearman's rho = 0.69, p < 0.001). Students who had some hands-on experience with a procedure rated themselves as competent to perform that procedure with, at most, minimal assistance in 95% of cases (χ2 = 842.75, 1 df, p < 0.001). There was no clear relationship, however, in the 34% of cases when the student had only observed the procedure.
Importantly, although not having seen a procedure at all was associated with a lack of perceived competence in 83% of cases (χ2 = 520.16, 1 df, p < 0.001), students did report competence to perform a procedure alone in 5% of the cases when they had not seen the procedure at all. These procedures included mouth-to-mouth and mouth-to-bag ventilation at all three locations; vaginal swabs, vaccinating a child, and incision and drainage of an abscess in the tertiary hospital; and using a rotohaler and controlling epistaxis in primary care.
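The two association measures used above, a Spearman rank correlation between the ordinal experience and competence codes, and a chi-square test on a 2x2 collapse of the responses, can be sketched as follows. The data are invented for illustration and are not the study dataset:

```python
# Illustrative sketch with invented data: Spearman's rank correlation on
# paired ordinal codes, then a chi-square test of association on a 2x2
# collapse (hands-on experience yes/no vs competent yes/no).
from scipy.stats import spearmanr, chi2_contingency

# Hypothetical paired ordinal codes (1-4) for experience and competence
experience = [1, 1, 2, 2, 3, 3, 4, 4, 4, 2]
competence = [1, 2, 1, 2, 3, 4, 3, 4, 4, 3]

rho, p_rho = spearmanr(experience, competence)
print(f"Spearman's rho = {rho:.2f}, p = {p_rho:.3f}")

# 2x2 table of invented counts: rows = hands-on experience (yes/no),
# columns = competent with at most minimal assistance (yes/no)
table = [[640, 35],
         [210, 460]]
chi2, p_chi, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p_chi:.4g}")
```

Spearman's rho is chosen over Pearson's r because both scales are ordinal, so only the rank ordering of responses should contribute to the correlation.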
The data were also analysed on a procedure-by-procedure basis. Analysing the data in this way revealed the reported gaps in student experience at each location. Once again, due to the small numbers, caution must be used in interpreting these results. The following trends between sites were apparent (Table 4); the cells show the percentage of students who reported no hands-on experience with the procedure.
Table 4: Percentage of students with no reported hands-on experience of indicative common procedures
As with the results for procedures, a pattern emerged of primary care students reporting greater clinical experience of common conditions than their tertiary peers (Table 5; Wilcoxon W = 579319, Z = -4.347, p < 0.001).
Table 5: Student reported experience and perceived competence with 62 common conditions. Mean student responses (%) by location, n = 1798
When comparing the secondary and tertiary students, a similar pattern was found. The secondary students reported greater experience with common conditions (Wilcoxon W = 642872.5, Z = -2.73, p = 0.006), but the tertiary students reported a higher level of perceived competence (Table 5; Wilcoxon W = 318612.5, Z = -4.645, p < 0.001). There was no significant difference between the experience reported by the primary and secondary care students (p > 0.05) but, as above, the tertiary students reported less competence than the primary care students (Wilcoxon W = 203969, Z = -2.986, p = 0.003). These apparent differences between reported experience and perceived competence with common conditions were analysed further (Table 6).
Table 6: Student reported experience by perceived competence with common conditions. Whole cohort, n = 1790, missing = 8
It is clear from Table 6 that, for the cohort as a whole, greater experience is correlated with greater self-reported competence to manage conditions (Spearman's rho = 0.476, p < 0.001). However, in addition to the differences noted previously between the locations, there are significant exceptions that prevent this being a simple ordinal relationship. For example, in 18% of cases where students had not seen a condition at all, they nevertheless reported being prepared to manage it with, at most, minimal assistance.
As with the common procedures, the data were analysed by individual condition. Again, due to the small numbers of students in the cohort, the results of this part of the analysis need to be viewed with some caution. Table 7 documents indicative conditions in each of the specialist disciplines for which a student reported completing the year without seeing the condition at all.
Table 7: Percentages of students with no reported experience of indicative common conditions
These self-reported outcome data have documented a pattern of increased clinical exposure to common clinical conditions and procedures for the rural primary care students, in comparison with their hospital-based peers. There were also significant gaps in the reported experience at each of the locations, but these gaps were fewer in rural primary care. These data, in particular the experience with common conditions, correlate well with the findings of the previous study on short-term placements3, with the improved examination performance, based on written and Objective Structured Clinical Examination (OSCE) tests, achieved by the primary and secondary care students2, and with improvements in student self-confidence noted in students in a similar program in the USA9.
These data have also demonstrated a positive correlation between reported experience and self-perceived competence, one that was stronger for procedural skills than for the management of common conditions. However, these data have shown that self-reported experience and competence were not part of a single ordinal scale. Combining the two attributes may miss important data for some levels of experience (eg, merely having seen a procedure had no predictive value for competence), and for some students (eg, those who reported competence without any experience).
It was concerning to find that any students could complete their entire clinical year without having performed many simple, yet important manual skills, such as a rectal examination, a Pap smear, or suturing a laceration. Of greater concern was the lack of practical experience in emergency, potentially life-saving, procedures such as airway management. In addition, it was three tertiary hospital students, who might have been expected to see more 'acute' medicine, who reported not having seen a patient with an acute myocardial infarction during the year. Following internal analysis of these figures, the School of Medicine has placed increased emphasis on the systematic teaching of clinical procedures through a clinical skills laboratory10 and has transformed the data collection tool into a Clinical Learning Logbook, in both on-line and paper-based formats, to enable students at all sites to track their learning throughout the year.
This study provides evidence that medical students' exposure to core clinical conditions was increased overall, and with fewer gaps, in the rural primary care setting when compared with a tertiary teaching hospital. When this is correlated with the improved examination performance, it provides further evidence that rural primary care is an excellent setting for high quality clinical and educational experiences.
The findings of this study should also serve to encourage students and staff involved with the new Rural Clinical School programs. While further research will be needed to determine the generalisability of these findings to other institutions, it will not be surprising if these new extended rural placements prove to be extremely popular with students and satisfying to the clinicians, academic staff and communities who host them.
The Parallel Rural Community Curriculum was funded through a grant from the Commonwealth Department of Health and Ageing. The Northern Territory Clinical School at Royal Darwin Hospital is funded principally by the Northern Territory Government.
1. Commonwealth Department of Health and Ageing. Workforce, Education and Training. Rural Clinical Schools. (Online) 2001. Available: http://www.health.gov.au/workforce/education/clinical.htm (Accessed 1 June 2004).
2. Worley P, Esterman A, Prideaux D. Cohort study of examination performance of undergraduate medical students learning in community settings. BMJ 2004; 328: 207-209.
3. Culhane A, Kamien M, Ward A. The contribution of the undergraduate rural attachment to the learning of basic practical and emergency procedural skills. Medical Journal of Australia 1993; 159: 450-452.
4. Finucane P, Nichols F, Gannon B, Runciman S, Prideaux D, Nicholas T. Recruiting problem-based learning (PBL) tutors for a PBL-based curriculum: the Flinders University experience. Medical Education 2001; 35: 56-61.
5. Worley P, Silagy C, Prideaux D, Newble D, Jones A. The Parallel Rural Community Curriculum: an integrated clinical curriculum based in rural general practice. Medical Education 2000; 34: 558-565.
6. Walters LK, Worley P, Mugford B. The Parallel Rural Community Curriculum: is it a transferable model? Rural and Remote Health 2003; 3: 236. (Online) Available: http://rrh.org.au (Accessed 1 June 2004).
7. Spike NA, Veitch PC. Competency of medical students in general practice procedural skills. Australian Family Physician 1991; 20: 586-591.
8. Department of Human Services and Health. Rural, Remote and Metropolitan Areas Classification 1991 Census Edition. Canberra, ACT: Australian Government Publishing Service, 1994.
9. Verby JA, Newell JP, Andresen SA, Swentko WM. Changing the medical school curriculum to improve patient access to primary care. Journal of the American Medical Association 1991; 266: 110-113.
10. Plummer JL, Owen H. Improving learning of a clinical skill: the first year's experience of teaching endotracheal intubation in a clinical simulation facility. Medical Education 2002; 36: 635-642.