Original Research

Developing the accredited postgraduate assessment program for Fellowship of the Australian College of Rural and Remote Medicine

AUTHORS

Janie D Smith
1 EdD, Director *

David Prideaux
2 PhD, Professor and Head of Medical Education

Christina L Wolfe
3 MBA, Director

Tim J Wilkinson
4 PhD, Associate Dean (Medical Education)

Tarun Sen Gupta
5 PhD, Head of General Practice and Rural Medicine, Director of Medical Education

Dawn DeWitt
6 FRACP, Dean, Rural Clinical School

Paul Worley
7 PhD, Dean (Medicine)

Richard B Hays
8 PhD, Dean of Medicine

Marita Cowie
9 MBA, CEO

CORRESPONDENCE

*Prof Janie D Smith

AFFILIATIONS

1 RhED Consulting Pty Ltd, Ocean Shores, New South Wales, Australia

2 School of Medicine, Flinders University, Adelaide, South Australia, Australia

3 Wolfe and Associates, Eaglehawk Neck, Tasmania, Australia

4 Medical Education, Christchurch School of Medicine & Health Sciences, Christchurch, New Zealand

5 School of Medicine, James Cook University, Queensland, Australia

6 University of Melbourne and School of Rural Health, Shepparton Victoria, Australia

7 Flinders University, Adelaide, South Australia, Australia

8 School of Medicine, Keele University, United Kingdom

9 Australian College of Rural and Remote Medicine, Brisbane, Queensland, Australia

PUBLISHED

16 October 2007 Volume 7 Issue 4

HISTORY

RECEIVED: 18 June 2007

REVISED: 10 September 2007

ACCEPTED: 16 October 2007

CITATION

Smith JD, Prideaux D, Wolfe CL, Wilkinson TJ, Sen Gupta T, DeWitt D, Worley P, Hays RB, Cowie M.  Developing the accredited postgraduate assessment program for Fellowship of the Australian College of Rural and Remote Medicine. Rural and Remote Health 2007; 7: 805. https://doi.org/10.22605/RRH805

AUTHOR CONTRIBUTIONS

© Janie D Smith, David Prideaux, Christina L Wolfe, Tim J Wilkinson, Tarun Sen Gupta, Dawn DeWitt, Paul Worley, Richard B Hays, Marita Cowie 2007 A licence to publish this material has been given to ARHEN, arhen.org.au


abstract:

Introduction: The accreditation of the Australian College of Rural and Remote Medicine (ACRRM) as a standards and training provider by the Australian Medical Council (AMC) in 2007 marked the first time in the world that a peak professional organisation for rural and remote medical education had been formally recognised. As a consequence, the Australian Government provided rural and remote medicine with formal recognition under Medicare as a generalist discipline. This accreditation was based on the ability of ACRRM to meet the AMC's guidelines for its training and assessment program.
Methods: The methodology was a six-step process that included: developing an assessment blueprint and a classification scheme; identifying an assessment model; choosing innovative summative and formative assessment methods that met the needs of candidates practising in rural and remote locations; developing the assessment items with 21 rural doctors and academics in a week-long writing workshop; investigating the feasibility of purchasing assessment items; and piloting three of the assessment items with 48 rural candidates to ensure they would meet the guidelines for national accreditation.
Results: The project resulted in an innovative formative and summative assessment program that occurs throughout 4 years of vocational training, using reliable, valid and acceptable methods with educational impact. Three of the six assessment tools were piloted. Structured Assessment Using Multiple Patient Scenarios (StAMPS) is a new assessment method developed as part of this project. The StAMPS pilot found it to be reliable, with a generalisability coefficient of 0.76, and a valid, acceptable and feasible assessment tool with the desired educational impact. The multiple choice question (MCQ) examination pilot found that the applied clinical nature of the questions and their wide range of scenarios made the examination very acceptable to the profession. The web-based in-training assessment examination pilot indicated that it would serve well as a formative process until ACRRM can further develop its MCQ database.
Conclusions: The ACRRM assessment program breaks new ground for assessing rural and remote doctors in Australia, and provides new evidence regarding how a comprehensive and contemporary assessment system can work within a postgraduate medical setting.

Key words: assessment program, Australia, Australian College of Rural and Remote Medicine, distance based assessment, rural and remote medical practice, StAMPS, vocational training.

full article:

Introduction

In June 2006 Ruralhealth Education Development Consulting Pty Ltd (RhED) was commissioned by the Australian College of Rural and Remote Medicine (ACRRM) to develop an assessment pathway for Fellowship of ACRRM. The RhED team consisted of eight academics and rural doctors from Australia, New Zealand, the USA and the United Kingdom who, with ACRRM's steering committee, worked to develop a rigorous, valid, reliable, acceptable, fair and educationally sound assessment program. The program was to be of a standard suitable for accreditation by the Australian Medical Council (AMC), and appropriate to the diverse needs of Australian rural and remote medical practitioners.

The scope of practice of Australian rural and remote medical practitioners is both different from and additional to that of their urban colleagues1-4. It is a broad horizontal discipline that intersects many medical specialities and general practice, and increases with geographical remoteness. Rural and remote doctors are commonly called on to provide a continuum of care from primary presentation to resolution in communities characterised by geographic isolation, cultural diversity, socio-economic inequality, resource inequity and a full range of extreme climatic conditions5,6. Their practice is both advanced and extended because they undertake roles that would be referred to a specialist in the city, such as obstetrics, surgery, anaesthetics and emergency care. Rural and remote office-based presentations often require complex decision-making and diverse skills that extend seamlessly into extensive hospital-based procedural care7. There is also considerable evidence of the much greater provision of procedural, emergency and other advanced care by rural medical practitioners, both in Australia8-10 and in other countries11-13.

These factors offered the team both challenges and real opportunities for innovation as they considered how to develop a quality assessment program that would meet the needs of rural and remotely located candidates. The AMC accredited the program in February 2007. This was the first time in the world that a peak professional organisation for rural and remote medical education had been formally recognised as a standards and training provider. Prior to this, the only way Australian rural doctors could obtain vocational recognition following training was via an endpoint examination conducted by the Royal Australian College of General Practitioners.

Assessment can have many purposes. In an ideal world the most important aim of assessment is to inspire, measure and guide learning14. For ACRRM, the purpose was also to enable summative judgments about a candidate's level of competence and safety to practise in rural and remote locations; to identify and provide educational guidance and support for poorly performing candidates; and to allow for external certification15.

There were some parameters to work within, because ACRRM's training program had been in existence for almost 10 years and a great amount of curricular material had been developed and used extensively by members. This material included a second edition of the ACRRM Primary Curriculum and an extensive number of electronic resources that were delivered via a robust internet-based system - Rural and Remote Medical Education Online (RRMEO)16. The majority of the resources were also based on ACRRM's domains of rural and remote medical practice, which describe the unique aspects of the horizontal discipline of rural and remote medical practice. However, because the domains were developed after the Primary Curriculum, they were not yet reflected in it. This created challenges in developing an assessment blueprint, because existing work had to be considered in all aspects of the development of the assessment program in order to ensure maintenance of the integrity, intent and accessibility of existing curricula.

Ethics approval

Approval of the protocol to conduct piloting of the assessment tools was provided by the Australian College of Rural and Remote Medicine.

Methods

The development of the assessment program for the Fellowship of ACRRM had six main steps, which were consistent with approaches described in the literature15,17,18:

  1. Developing the assessment blueprint and classification system
  2. Identifying the assessment model
  3. Choosing the assessment methods
  4. Writing the assessment items
  5. Investigating the feasibility of purchasing assessment items and feasibility of the proposed assessment tools
  6. Piloting the chosen assessment methods.

Step 1: Developing the assessment blueprint

The most important phase in developing an assessment program is defining what is to be tested. In our case, this was done by identifying the desired learning outcomes19. This included identifying the range of knowledge, skills and attitudes expected of the candidate, and the level of competence to which they were expected to perform specific clinical tasks within specific contexts.

This step included four phases:

  1. Reviewing the ACRRM Primary Curriculum to identify and write the learning outcomes to be assessed. This process included the development of a new curriculum framework, which enabled the identification of the 72 learning outcomes that linked across the existing 22 curriculum content areas.
  2. Reviewing, organising and numbering the learning outcomes using ACRRM's existing domains of rural and remote medical practice. These, described in Figure 1, formed the basis for an organising framework for the assessment blueprint.
  3. Allocating each learning outcome to the most appropriate evidence based assessment method/s15. These are listed under 'the assessment program' heading in Step 3.
  4. Developing a common classification scheme that identified each learning outcome by curriculum area, domain, and the gender, age, presenting problem and taxonomy of the case. This would enable sorting by classification in the database, thereby systematically blueprinting each assessment tool (see the sketch after this list).
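
To make the blueprinting step concrete, the following is a minimal sketch of how a bank of classified items can be filtered and tallied to expose coverage gaps. It is illustrative only: the field names and example values are hypothetical, not ACRRM's actual scheme.

    from dataclasses import dataclass

    @dataclass
    class AssessmentItem:
        """One classified assessment item (fields are illustrative only)."""
        item_id: str
        curriculum_area: str    # one of the 22 curriculum content areas
        domain: int             # one of ACRRM's domains of rural and remote practice
        gender: str
        age: int
        presenting_problem: str
        taxonomy: str           # cognitive level of the case, eg 'recall' or 'applied'

    def blueprint_coverage(bank, field):
        """Count items per value of one classification field, to expose gaps."""
        counts = {}
        for item in bank:
            key = getattr(item, field)
            counts[key] = counts.get(key, 0) + 1
        return counts

    # Example: check how a draft examination covers the curriculum areas
    bank = [
        AssessmentItem('MCQ001', 'Obstetrics', 2, 'F', 28, 'antepartum haemorrhage', 'applied'),
        AssessmentItem('MCQ002', 'Emergency medicine', 1, 'M', 54, 'chest pain', 'applied'),
    ]
    print(blueprint_coverage(bank, 'curriculum_area'))

A classification held in this structured form is what allows an examination to be assembled so that every domain and curriculum area receives its intended weight.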



Figure 1: The ACRRM domains of rural and remote medical practice.

Step 2: Identifying the assessment model

Numerous factors influenced the decisions that were made in choosing the best assessment model for Fellowship of ACRRM. Overall, these included developing a set of assessment principles and a program of assessment that was acceptable to the profession, cost effective, valid, reliable, timely, and legally defensible, and that would meet the needs of the profession and the AMC Standards for Accreditation20. A literature review was undertaken to identify the evidence on which to base ACRRM's assessment program.

The assessment program is based on the 'programmatic model' as described by van der Vleuten and Schuwirth18, in which assessment is seen as a 'program' across the entire training, rather than a specific instrument. This model enables ACRRM to collect a range of information about candidates over their entire training time, which can be 'aggregated' into summaries on which final decisions about their performance are made. The benefit of this approach is that it enables multiple methods to be used that suit a variety of candidate learning styles and needs, cover the range of learning outcomes, and provide sufficient flexibility to assist candidates to plan their learning to meet the requirements of the FACRRM. An important consideration in planning the assessment blueprint was the balance between formative and summative assessment for FACRRM candidates, discussed in the next section.
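
As a minimal sketch of what such 'aggregation' can mean in data terms (the record structure and outcome labels here are invented for illustration, not ACRRM's actual decision rules), results from many assessment points are pooled per candidate and summarised before any final judgment is made:

    from collections import defaultdict

    # (candidate, method, year_of_training, outcome) - illustrative records only
    results = [
        ('C01', 'mini-CEX', 1, 'satisfactory'),
        ('C01', 'multi-source feedback', 2, 'satisfactory'),
        ('C01', 'MCQ examination', 3, 'pass'),
        ('C01', 'StAMPS', 4, 'pass'),
    ]

    # Pool every result per candidate across the whole training period
    record = defaultdict(list)
    for candidate, method, year, outcome in results:
        record[candidate].append((year, method, outcome))

    # The aggregated record, not any single instrument, informs the final decision
    for candidate, entries in record.items():
        print(candidate, sorted(entries))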

Step 3: Choosing the assessment methods

In high stakes medical assessment processes it is vitally important that the assessment program is defensible. No single assessment method has all of the required qualities. Therefore, a combination of methods over a range of times was developed, which together forms a rigorous, valid, reliable, clinically relevant and educationally sound assessment program.

The assessment program was structurally based on the learning and performance benchmarks of Miller's Pyramid (Fig 2)21. This model diagrammatically represents a behavioural approach to teaching, learning and assessment with four progressive hierarchical phases of competence. The first is that the candidate 'knows', the second that they 'know how', the third that they can 'show how' and, finally, what the candidate actually 'does' in the workplace22. Assessment methods were then allocated to each phase based on the literature, with the greatest importance being attributed to the higher levels of the pyramid. For example, multiple choice questions (MCQ) are best for the testing of knowledge, which indicates what the candidate knows, and are therefore placed at the bottom of the pyramid and built upon.

The representation of the pyramid highlights the well-established principle that assessment of a candidate's knowledge is important, but it is not sufficient to predict that they can and will apply their knowledge in their practice22,23. Therefore, in a high-stakes medical assessment process a variety of different formative and summative assessment methods, across the four stages of Miller's Pyramid, were chosen to make up the assessment program for the Fellowship of ACRRM. When an appropriate mix of assessment methods is combined, the aggregation contributes to a rigorous, defensible, formative and summative assessment program with a desirable educational impact.



Figure 2: Miller's Pyramid, used to select and combine assessment methods - adapted from21.

The assessment program: Six summative assessment methods were chosen, some of which are also used formatively. They are summarised and described in Table 1.

  1. Written examination - a 3 hour MCQ examination undertaken via the Internet using Type A questions (single best response with 4-5 options). Multiple choice question examinations are a reliable, valid, efficient and acceptable assessment method because they use a controlled standardised environment, allow assessment of a large range of applied clinical knowledge, and are relatively easy to administer. The exam will be undertaken during the second half of training.
  2. StAMPS examination - Structured Assessment using Multiple Patient Scenarios - an innovative new assessment method, developed specifically for ACRRM, consisting of a two-hour, eight-station assessment of clinical reasoning undertaken via videoconference. This approach was developed to meet two purposes. First, there was a desire for an interactive assessment tool that candidates could undertake in their own practice, that is, making use of distance technology to minimise the time, cost and inconvenience of candidates having to travel away from their rural and remote community. Second, there was a desire to develop an adaptive assessment method that would allow an examiner to explore a variety of options in a clinical scenario, including how a candidate responds to changing circumstances such as variations in a patient's condition or resource availability. Technical advice was sought, with an objective structured clinical examination (OSCE)-style format developed using multiple examiners who saw all candidates via multipoint videoconference. The pilot demonstrated this to be a very reliable and valid assessment tool24,25.
  3. Clinical skills log book - An electronic logbook was already in existence and was revised to link with the assessment blueprint, to record achievement of specified psychomotor clinical skills and to serve as a means of monitoring practice experiences and competency throughout the training program.
  4. Multi-source feedback - Also known as 360 degree feedback, this method is used to assess the candidate's interpersonal and professional attributes in health professional settings in undergraduate and postgraduate training26,27. This practice-based assessment will be undertaken throughout the entire training program by a variety of assessors, such as supervisors, specialists, practice managers and patients, who have working relationships with the candidate.
  5. Mini-CEX - Mini-clinical evaluation exercises assess clinical and interpersonal skills. The mini-CEX involves a series of 'snapshots' of 15-20 min clinical encounters, allowing assessment across multiple patients and problems, in a wide range of clinical settings, requiring (real-life) prioritisation. It has been found to be reliable and to provide an opportunity for assessment and feedback28,29. It is a well-accepted approach that was developed from the traditional long case, involving a realistic clinical challenge requiring a comprehensive history and examination.
  6. Portfolio - Learning plans, the results of other formative and summative assessment items, including accredited courses, completed modules and other activities, are recorded electronically via the Learning Planner Management System found in RRMEO, to assist in demonstrating the meeting of the Fellowship of ACRRM requirements throughout the entire training program.

Table 1: Summary of the summative assessment program for FACRRM candidates



Step 4: Writing the assessment items

An assessment writing workshop was hosted by the School of Medicine at James Cook University in Townsville, Queensland in September 2006, to develop the assessment items. The workshop participants included 21 experienced rural and remote doctors, and ACRRM staff and academics from Australia and New Zealand, supported by a medical editor and administrative staff. The week-long workshop resulted in the development of: 208 MCQ; 9 StAMPS examination scenarios; the criteria and guidelines for the two evidence based practice-based assessments - the mini-clinical evaluation exercise (mini-CEX) and the multi-source feedback; and a draft regulatory framework, including a policy for recognition of prior learning and a secure assessment item bank management system.

Workshop participants were attracted through advertising via ACRRM networks and shortlisted. Writers with a range of experience were provided with written information about each assessment tool prior to the workshop. An MCQ writing workshop was held on the first day. The participants worked to develop a bank of MCQ and participated in the development of one of the other five assessment tools that were streamed throughout the week.

The MCQ review process included on-site editing and reviewing by three external reviewers, and determining the appropriate pass standard using the Angoff method30. Guidelines were developed to ensure all participants and reviewers maintained item identification systems and security. Once the editing, review and classification of the MCQ were complete, the MCQ were entered into an Excel database, using the classification system to enable subsequent blueprinting. A more specialised database is currently under construction, because no suitable existing postgraduate database could be identified.
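
For readers unfamiliar with the Angoff method30, the arithmetic is straightforward: each judge estimates the probability that a borderline (just-passing) candidate would answer each item correctly, and the cut score is the sum over items of the judges' mean estimates. A minimal sketch, with invented ratings:

    # Hypothetical Angoff ratings: each row is one judge's estimated probability
    # that a borderline candidate answers each of four items correctly.
    ratings = [
        [0.6, 0.8, 0.5, 0.7],   # judge 1
        [0.5, 0.9, 0.4, 0.6],   # judge 2
        [0.7, 0.7, 0.5, 0.8],   # judge 3
    ]

    n_judges = len(ratings)
    n_items = len(ratings[0])

    # Mean rating per item, then sum across items = expected borderline score
    item_means = [sum(judge[i] for judge in ratings) / n_judges for i in range(n_items)]
    cut_score = sum(item_means)
    print(f'Pass mark: {cut_score:.1f} of {n_items}')   # prints 'Pass mark: 2.6 of 4'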

Step 5: Investigating the feasibility of purchasing assessment items

To save time and cost, we also investigated the feasibility of purchasing assessment items from eleven other medical colleges and boards in Australia and overseas. The American Board of Family Medicine (ABFM) in-training examination was identified as the most suitable examination, as its items incorporate both hospital and general practice assessment areas, examine across several disciplines, and are aligned more closely with ACRRM's curriculum than the other postgraduate examinations. An additional advantage was that the examination enables candidates to determine their ranking against international benchmarks and to gain experience in undertaking an internet-based examination, which provides candidates with feedback and references to all questions to guide their learning. As well as allowing candidates to benchmark their performance, the aggregated performance of all candidates will help ACRRM gauge the international standard of its candidates. The disadvantage is that approximately 20% of the questions had a North American focus and were not necessarily relevant to rural and remote medicine in Australia. It was, however, still considered useful as a formative process for ACRRM candidates, to guide learning and help them prepare for the final summative assessment.

Step 6: Piloting the chosen assessment methods

Three pilots were undertaken during November 2006 to evaluate the validity, reliability, acceptability, technology and educational impact of the tools. The pilots included over 100 people (48 candidates, 10 examiners, and 45 others) who were involved in the organisational and academic aspects of the piloted tools. The three tools piloted were the:

  1. StAMPS examination
  2. MCQ examination for rural and remote medical practice
  3. American Board of Family Medicine web-based in-training assessment examination.

Detailed evaluations were conducted on each of these tools with recommendations incorporated in the final report to ACRRM25.

Results: Pilot

StAMPS examination


The StAMPS pilot was undertaken by 14 registrars, using 9 examiners and one standardised patient. It occurred over a weekend at James Cook University in Cairns, Queensland, which was the only suitable site identified in Australia with at least eight videoconferencing facilities. The pilot results indicate that this assessment tool is reliable, with a generalisability coefficient of 0.76, which places it internationally as one of the more reliable assessments of clinical reasoning and decision-making24. It was technically feasible, and most reliable if a central hub of eight videoconference rooms could be used, with all eight examiners rotating around these rooms. Candidates and examiners found the process very acceptable, as they could undertake the examination from their own rural environments and did not have to travel long distances. It was also fiscally viable compared with the existing option, which requires candidates to leave their communities, often for several days, and incur costs for a locum and travel in order to undertake an examination in a central city. The candidates and examiners both found that the examination assessed a good range of important and relevant situations that are commonly encountered in rural and remote practice.

I could name a patient I've seen with each of the scenarios (Examiner)

I thought it was very good, very fair, and excellent really. I thought they covered the breadth of things really well. (Candidate 1)

The pilot has shown that StAMPS is a valid, reliable, acceptable, and feasible summative assessment tool with a desirable educational impact24. It is suited as an exit assessment for the Fellowship of ACRRM, and it also has the potential to be used by other colleges or disciplines.
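
For readers unfamiliar with generalisability coefficients, the following is a sketch of the standard formulation, assuming for illustration a simple design in which every candidate rotates through the same n_s stations; the pilot's actual variance-component analysis24 may have modelled further facets, such as examiners. With \sigma^2_p the variance attributable to true differences between candidates and \sigma^2_{ps} the residual candidate-by-station variance, the coefficient is:

    E\rho^2 = \frac{\sigma^2_p}{\sigma^2_p + \sigma^2_{ps}/n_s}

A coefficient of 0.76 therefore means that roughly three-quarters of the variance in observed scores reflects genuine differences between candidates rather than measurement error, and the n_s in the denominator shows why adding stations improves reliability.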

Multiple choice questions pilot

The MCQ pilot was undertaken face to face with 22 candidates at various levels of training and experience. The intended ACRRM web-based infrastructure was still under development and therefore was not used in this pilot. The examination paper consisted of 50 of the newly developed type A MCQs. A post-pilot examination focus group was conducted with all candidates. This found that the applied clinical nature of the questions and their wide range of scenarios made the examination very acceptable to the profession. The content was found to be valid, because it assessed areas of importance to rural and remote doctors. The statistical analysis identified 15 poorly performing questions, which were removed, and a second reliability estimate was undertaken. The remaining 35 items had a Cronbach's alpha reliability estimate of 0.64. Given that the MCQ pilot was conducted across several curriculum areas and domains, with a small number of items, by a small number of candidates with mixed abilities, a reliability estimate of 0.64 is considered relatively acceptable (an alpha level of >0.7 is considered ideal and >0.8 is considered to be the gold standard). Significant item development, inclusion of more items, and further reviewing will be required to further develop this unique item bank.
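
For reference, Cronbach's alpha for a k-item examination is computed from the individual item score variances \sigma^2_i and the variance of candidates' total scores \sigma^2_t (this is the standard formula, not a detail specific to this pilot):

    \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^2_i}{\sigma^2_t}\right)

With k = 35 retained items here, the formula also makes clear why more items should raise the estimate: adding positively correlated items increases the total-score variance (through inter-item covariance) faster than the sum of item variances, pushing alpha upward.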

American Board of Family Medicine - web-based in-training assessment examination pilot

In November 2006, 12 candidates in five different Australian states, using the required ABFM web-based technology, undertook the ABFM pilot, which was sat by over 9000 candidates worldwide at that time. The ACRRM candidates felt the examination was relevant as a formative assessment process in their early years of training, although they estimated that approximately 20% of the content was not relevant to the Australian context, due to the need to understand the US system, practice guidelines, culture and units of measure. Despite this, ACRRM candidates' performance was at a reasonable level, with 75% (n = 9) of the candidates performing at a standard equivalent to, or above, first-year US residency level. Clearly, the issue of the American content and context of questions will always be a factor with this type of examination, though it will serve candidates well as a formative process until a suitable number of ACRRM-developed MCQ are available.

Discussion

Developing an assessment program for rural and remote medical practitioners that meets their unique needs and enables them to undertake it in their own communities, while also meeting the AMC guidelines for accreditation, presented ACRRM with specific challenges. This assessment strategy achieved a number of key goals. The programmatic approach described enabled a number of assessment methods to be used, each with particular strengths and weaknesses, which maximises flexibility and achieves synergies between formative and summative assessment. Defining the learning outcomes to be assessed enabled the development of a unique assessment blueprint based on ACRRM's domains of rural and remote medical practice and their existing 22 curriculum statements. This provides candidates with explicit guidance to plan their learning. The suite of assessment formats chosen maximises flexibility for candidates, in particular allowing them to undertake all assessment processes in, or close to, their rural or remote location. Finally, the process and outcomes demonstrated by ACRRM in this project can serve as a model of assessment program development and implementation that other colleges could emulate. The result is an assessment program tailored, for the first time, to the Australian rural and remote medical practice context.

The use of multiple test formats with substantial testing time, a high number of items and multiple assessors is expected to provide high reliability18. The validity of the assessment should also be high, because it is based on ACRRM's decade of experience in curriculum development for rural and remote medicine, with assessment items purpose-designed by experienced rural doctors and rural academics. A number of approaches, such as StAMPS, also had high face validity. Acceptability and feasibility of the assessment approach have been demonstrated in the pilot, with good evidence from the experiences of the candidates and examiners involved and the various approaches trialled. Educational impact should also be high, given the synergies and logical progression from formative to summative assessment and the emphasis on the higher-order assessment approaches in the upper levels of Miller's Pyramid.

Conclusion

This assessment program breaks new ground for assessing rural and remote doctors in Australia. It also provides new evidence regarding how a comprehensive and contemporary assessment system within a postgraduate medical setting can work, because it was recognised by the accrediting body, the AMC, with very few queries or concerns.

ACRRM's future challenges will be to evaluate the process to ensure these early successes are maintained, and to address issues of feasibility that may arise. Above all, the goal must be to ensure ACRRM certifies only graduates of the highest quality, who are safe practitioners and reflect the unique attributes of advanced rural and remote medical practitioners.

Acknowledgements

The authors acknowledge the 100+ rural and remote medical practitioners, registrars and academics who freely contributed their time and expertise in writing assessment items and participating in the pilot assessments. Also acknowledged are ACRRM's SEAC Committee and the American Board of Family Medicine, which kindly included Australian candidates in its international pilot. This project was funded by the Australian College of Rural and Remote Medicine as part of a grant from the Australian Government Department of Health and Ageing.

References

1. Wakerman J. Defining remote health. Australian Journal of Rural Health 2004; 12: 215-219.

2. Smith JD. Australia's rural and remote health: A social justice perspective, 2nd edn. Melbourne: Tertiary Press, 2007.

3. Smith JD, Hays R. Is rural medicine a separate discipline? Australian Journal of Rural Health 2004; 12: 67-72.

4. Strasser R, Hays R, Kamien M, Carson D. Is Australian rural practice changing? Findings from the national rural general practice study. Australian Journal of Rural Health 2000; 8: 222-226.

5. Australian Bureau of Statistics. National Aboriginal and Torres Strait Islander Health Survey 2004-05. (Online) 2006. Available: www.abs.gov.au (Accessed 3 October 2007).

6. Strong K, Trickett P, Titulaer I, Bhatia K. Health in rural and remote Australia: The first report of the Australian Institute of Health and Welfare on rural health. Cat no PHE 6. Canberra, ACT: Australian Institute of Health and Welfare, 1998.

7. Australian College of Rural and Remote Medicine. Australian Medical Council submission. Brisbane, QLD: ACRRM, 2004.

8. Humphreys J, Jones J, Jones M, Mildenhall D, Mara P, Chater A et al. The influence of geographical location on the complexity of rural general practice activities. Medical Journal of Australia 2002; 179: 416-421.

9. Wise A, Hays RB, Adkins P, Craig M, Mahoney P, Sheehan M et al. Training for rural general practice. Medical Journal of Australia 1994; 161: 314-318.

10. Britt H, Miller G, Valenti L. It's different in the bush - A comparison of general practice activity in metropolitan and rural areas of Australia 1998 - 2000. Cat no GEP 6. Sydney: Australian Institute of Health and Welfare, 2001.

11. Department of Health and Aged Care. General practice in Australia: 2000, 1st edn. Canberra: Department of Health and Aged Care, 2000.

12. Carter R. Training for rural medical practice: what's needed? Canadian Family Physician 1987; 33: 1713-1715.

13. Chan B. Atlas Reports: use of health services. Report 1: supply of physicians services in Ontario. Toronto, ON: Institute for Clinical Evaluative Sciences, 1999.

14. Hays RB. Pathways to Fellowship of the Australian College of Rural and Remote Medicine. Brisbane: Australian College of Rural and Remote Medicine, 2001. [Discussion paper]

15. Hays RB, Davis H, Beard J, Caldon L, Farmer E, Finucane P et al. Selecting performance assessment methods for experienced physicians: Papers from the 10th Cambridge conference. Medical Education 2002; 36: 910-917.

16. Australian College of Rural and Remote Medicine. ACRRM website. (Online) 2006. Available: www.acrrm.org.au (Accessed 3 October 2007).

17. Newble DI, Swanson DB. Psychometric characteristics of the objective structured clinical examination. Medical Education 1988; 22: 325-334.

18. van der Vleuten CPM, Schuwirth L. Assessing professional competence: from methods to programs. Medical Education 2005; 39: 309-317.

19. Van der Vleuten CPM. Guidelines for assessing clinical competence. Teaching and Learning in Medicine 1994; 6: 213-220.

20. Australian Medical Council. Accreditation of Medical Specialist Education and Training and Professional Development Programs: Standards and Processes. (Online) 2006. Canberra: Australian Medical Council. Available: http://www.amc.org.au/specguide.asp (Accessed 3 October 2007).

21. Miller G. The assessment of clinical skills/competence/performance. Academic Medicine 1990; 65 Suppl: S63-S67.

22. Hays R, Strasser R, Wallace A. Development of a national training program for rural medicine in Australia. Education for Health 1997; 10: 275-285.

23. Bashook P. Best practices for assessing competence and performance of the behavioural health workforce. (Online) 2005. Available: http://www.annapoliscoalition.org/pdfs/Best%20Practices%20in%20Workforce%20Assessment.pdf#search=%22miller's%20assessment%20triangle%201990%22 (Accessed 3 October 2007).

24. Wilkinson T, Smith JD, Margolis S, SenGupta T, Prideaux D. Structured Assessment using Multiple Patient Scenarios by Videoconference in Rural Settings. Medical Education 2007; (in press).

25. RhED Consulting. An Assessment Program for the Fellowship of ACRRM, Final Report Dec. Ocean Shores, NSW: Ruralhealth Education Development Consulting Pty Ltd, 2006.

26. Rees C, Shepherd M. The acceptability of 360-degree judgements as a method of assessing undergraduate medical students' personal and professional behaviours. Medical Education 2005; 39: 49-57.

27. Lockyer J, Blackmore D, Fidler H, Crutcher R, Salte B, Shaw K et al. A study of a multi-source feedback system for international medical graduates holding defined licences. Medical Education 2006; 40: 340-347.

28. Norcini JJ. The Mini Clinical Evaluation Exercise (miniCEX). The Clinical Teacher 2005; 2: 25-30.

29. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: a method for assessing clinical skills. Annals of Internal Medicine 2003; 138: 476-481.

30. Angoff W. Scales, norms, and equivalent scores. In: R Thorndike (Ed.). Educational measurement. Washington, DC: American Council on Education; 1971.
