Development of a common national questionnaire to evaluate student perceptions about the Australian Rural Clinical Schools Program
Citation: DeWitt DE, McLean R, Newbury J, Shannon S, Critchley J. Development of a common national questionnaire to evaluate student perceptions about the Australian Rural Clinical Schools Program. Rural and Remote Health (Internet) 2005; 5: 486. Available: http://www.rrh.org.au/articles/subviewnew.asp?ArticleID=486 (Accessed 20 October 2017)
The Australian Commonwealth Department of Health and Ageing provided funds for the Australian medical schools to establish Rural Clinical Schools. This workforce initiative has enabled medical students to learn in a diverse range of rural and remote healthcare settings. A common questionnaire was developed and agreed on by all the directors of the Rural Clinical Schools. Use of this common questionnaire will facilitate reports on student attitudes and program outcomes, both within individual Rural Clinical Schools and at a national program level. The data analysis will inform the community and the Australian Government about the effectiveness of the national Rural Clinical School program in (1) meeting the primary aims of providing high quality rural medical education; and (2) addressing the medical workforce shortage in rural and remote areas.
Key words: evaluation, undergraduate medical education, medical workforce, Rural Clinical Schools.
In order to address the shortage of rural doctors, the Commonwealth Department of Health and Ageing (DoHA) established nine rural clinical schools (RCS) across Australia in 2000 as part of the Regional Health Strategy1. DoHA initially mandated that, at each participating university, 25% of each cohort of Australian medical students spend 50% of their clinical training in the RCS environment. Two pre-existing rural medical student programs2,3 were included and, by 2003, all RCS programs were placing students in rural Australia.
In 2002 the RCS directors met to discuss common issues and subsequently established the Federation of Rural Australian Medical Educators (FRAME) in late 2003. Among the common issues identified, evaluation of the RCS programs, including student perceptions and attitudes, was considered one of the most important by both FRAME and DoHA.
In 2001, DoHA asked the Committee of Deans of Australian Medical Schools (CDAMS) to develop a questionnaire for use by DoHA in its biannual evaluation of the RCSs. As part of that process, members of the CDAMS-DoHA working group recognized the need for a tool for ongoing evaluation, not only of the RCS program, but also for tracking medical students more generally for the purposes of workforce evaluation and planning (Fig 1). This spurred development of a minimum data set questionnaire and national database to track all medical students in Australia. The FRAME questionnaire was designed to link with the CDAMS questionnaire and national database.
Figure 1: Australian universities participating in the FRAME project.
This article describes the development of the FRAME questionnaire and presents the questionnaire, including the minimum dataset questions as developed through the CDAMS Steering Group, so that it might be adapted and used more broadly by those with an interest in this area.
In 2003, several of the RCS began to develop questionnaires to assess student perceptions and educational effectiveness. Important issues included student recruitment, student perceptions about their academic and clinical education, and any effect that their RCS experience had on their intentions about training (pre-vocational and vocational) or practising in rural Australia. Two of the authors (DD, RM) developed a draft questionnaire, based on published work by one of the authors (DD)4,5 and presented it to the FRAME membership. A modified Delphi process6,7 coordinated by one of the authors (SS), was used over the following year to refine the questionnaire.
Delphi is an expert opinion survey method with three special features: anonymous response; iteration with controlled feedback; and statistical group response. The number of Delphi questionnaire rounds may vary from three to five, depending on the degree of agreement reached and the amount of additional information being sought or obtained. Each subsequent questionnaire builds on responses to the preceding one. The process stops when consensus has been approached among participants, or when sufficient information exchange has occurred8.
Additional questions originated from unpublished work at participating institutions (Leahy C, Peterson R [Adelaide University]. Pers. data, 1999), from FRAME members, and from the published literature identified through MEDLINE9. An online modular version was developed to aid distribution and data collection in a dispersed environment across multiple universities. Two of the RCS (The University of Adelaide and Flinders University) piloted the online draft of the questionnaire. To avoid excessive length and duplication with the CDAMS questionnaire, the FRAME questionnaire was shortened, a common methodology was developed, and the CDAMS questionnaire was incorporated into the FRAME questionnaire so that demographic data would be collected in a consistent manner (JC).
The FRAME questionnaire was finalised at a meeting in May 2005 and is presented in Appendix I.
The RCS directors have developed and accepted the FRAME questionnaire as the common evaluation tool for core educational outcomes and student perceptions. The questionnaire will be delivered online and each RCS will maintain ownership and security of data relating to its students.
To achieve both consensus (of FRAME members) and brevity (to encourage completion), the FRAME questionnaire has been limited in size. Larger amounts of data could be collected, but analysis and reporting would be more difficult and would not necessarily produce clearer outcomes. FRAME is confident that enough data will be collected for meaningful analysis. Because students will be identifiable, to allow linkage with the CDAMS database and longitudinal tracking of training and practice location, full ethics committee approval is being sought at each participating institution.
DoHA and the medical schools have invested considerable energy and funds in establishing the Australian RCS. The FRAME questionnaire will ensure not only that comprehensive national outcomes are measured and progressively reported as students enter the rural workforce, but also that individual RCS have a tool for modifying their programs as they mature and develop, so that results can be optimised.
Substantial contributors to development of the FRAME RCS educational outcomes questionnaire and the CDAMS minimum dataset outcomes database questionnaire are acknowledged as follows.
1. FRAME Investigators (RCS Directors): Peter Baker, Dawn DeWitt, Rick McLean, Steve Margolis, Campbell Murdoch, Jonathan Newbury, Louis Pilotto, Sandy Reid, Geoff Solarsh, Judith Walker, Paul Worley.
2. CDAMS Outcomes Database Steering Committee (2005): Convenor: Professor Andrew Coats (Dean, Faculty of Medicine, University of Sydney).
Executive: Professor David Prideaux (Head, Department of Medical Education, Flinders University; Deputy Head, School of Medicine, Griffith University); Professor John Humphreys (Rural Undergraduate Support & Coordination Reference Group/Monash University); Ms Danielle Brown (CDAMS, Executive Officer); Ms Baldeep Kaur (CDAMS, Database Project Officer).
Members: Professor Dawn DeWitt (FRAME representative); Professor Teng Liaw (ARHEN representative); Dr Peter Vine (CDAMS/University of New South Wales); Mr Ped Ristic (CDAMS/University of Western Australia); Mr David Meredyth (Department of Health & Ageing); Professor Richard Hays (Rural Undergraduate Support & Coordination Reference Group/James Cook University); Dr Mary Harris (Australian Medical Workforce Advisory Committee); Prof Peter Roeser (Confederation of Postgraduate Medical Education Councils); Ms Dana Stanko (Vice-President, Australian Medical Students Association); Ms Lydia Scott (Chair, National Rural Health Network).
1. Department of Health and Ageing. Workforce Education and Training: Rural Clinical Schools. (Online) 2002. Available: http://www.health.gov.au/internet/wcms/Publishing.nsf/Content/health-workforce-education-clinical.htm (Accessed 12 September 2005)
2. Worley P, Lines D. Can specialist disciplines be learned by undergraduates in a rural general practice setting? Preliminary results of an Australian pilot study. Medical Teacher 1999; 21: 482-484.
3. Sturmberg J, Reid S, Khadra M. A longitudinal, patient-centered, integrated curriculum: facilitating community-based education in a rural clinical school. Education and Health (Abingdon) 2002; 15: 294-304.
4. DeWitt DE, Curtis JR, Burke W. What influences career choices among graduates of a primary care training program? Journal of General Internal Medicine 1998; 13: 257-261.
5. DeWitt DE, Migeon M, LeBlond R, Carline JD, Francis L, Irby DM. Insights from outstanding rural internal medicine residency rotations at the University of Washington. Academic Medicine 2001; 76: 273-281.
6. Turoff M. The policy Delphi. London: Addison-Wesley, 1975.
7. Williams PL, Webb C. The Delphi technique: a methodological discussion. Journal of Advanced Nursing 1994; 19: 180-186.
8. Hwang C-L, Lin M-J. Group Decision Making Under Multiple Criteria: methods and applications. Berlin: Springer-Verlag, 1987.
9. Adams ME, Dollard J, Hollins J, Petkov J. Development of a questionnaire measuring student attitudes to working and living in rural areas. Rural and Remote Health 2005; 5: 327. (Online). Available: http://rrh.deakin.edu.au (Accessed 12 September 2005).
© Dawn DeWitt, Rick McLean, Jonathon Newbury, Susan Shannon, Jennifer Critchley 2005 A licence to publish this material has been given to ARHEN, http://www.arhen.org.au