WPBA Tools

The RCEM utilises standard and specialty-specific WPBA tools, comprising:

  • RCEM assessment app
  • Mini-Clinical Evaluation Exercise (Mini-CEX; in anaesthesia, Anaes-CEX)
  • Direct Observation of Procedural Skills (DOPS)
  • Multi-Source Feedback (MSF)
  • Case-Based Discussions (CbD)
  • Extended Supervised Learning Event (ESLE) Tool
  • Patient Survey
  • Acute Care Assessment Tool (ACAT)
  • Audit Assessment
  • Teaching Observation
  • Structured Teaching Assessment Tool (STAT)
  • Journal Club Form (JCF)
  • Applied Critical Appraisal Form (ACAF)
  • Quality Improvement Assessment Tool (QIAT)
  • Leadership Assessment Tool

Details of these are given below and further information is available on the e-portfolio trainee section.

The following methods of assessment will provide evidence of progress in the integrated Programme of Assessment.

All individual assessments in the workplace are formative (assessment for learning) and therefore developmental in nature; they cannot be failed. These episodes are an opportunity for learners to receive feedback about progress towards key progression points, and are designed for that purpose.

WPBAs are anchored to the same entrustment scale that is used for summative decision-making. In this way, each episode provides an opportunity for clear developmental feedback to be given across the Clinical SLOs.

Assessment in the workplace should start right at the beginning of the training post and continue regularly thereafter. It is the responsibility of the learner to seek out, with the full support of the training faculty, learning opportunities that allow progress against each of the relevant Clinical SLOs to be reflected and recorded.

The collation of a range of evidence in formative assessment from the start of each placement is a clear indication of engagement in training and helps ensure the trainee gets full benefit from the learning opportunities in their placement. The formative WPBA tools in ACCS are listed below.

WPBA tools

Multi-source feedback (MSF)

This tool is a method of assessing generic skills such as communication, leadership, team working and reliability across the domains of Good Medical Practice. It provides systematic collection and feedback of performance data on a trainee, derived from a number of colleagues. ‘Raters’ are individuals with whom the trainee works, and include doctors, administrative staff and other allied professionals. The trainee will not see the individual responses by raters. Feedback is given to the trainee by the Educational Supervisor.

Mini-Clinical Evaluation Exercise (Mini-CEX)

This tool evaluates a clinical encounter with a patient to provide an indication of competence in skills essential for good clinical care such as history taking, examination and clinical reasoning. The trainee receives immediate feedback to aid learning. The mini-CEX can be used at any time and in any setting when there is a trainee and patient interaction and an assessor is available.

Direct Observation of Procedural Skills (DOPS)

A DOPS is an assessment tool designed to evaluate the performance of a trainee in undertaking a practical procedure, against a structured checklist. The trainee receives immediate feedback to identify strengths and areas for development.

Case-based Discussion (CbD)

The CbD assesses the performance of a trainee in their management of a patient to provide an indication of competence in areas such as clinical reasoning, decision-making and application of medical knowledge in relation to patient care. It also serves as a method to document conversations about, and presentations of, cases by trainees. The CbD should focus on a written record (such as written case notes, out-patient letter, discharge summary).

Extended Supervised Learning Event (ESLE)

The ESLE is an extended event of observation in the workplace across cases. It covers interactions, decision-making, management and leadership, as well as the trainee’s individual caseload. The event will characteristically be three hours in length, with around two hours of observation followed by around one hour of feedback. The trainee will be observed during their usual work on shift, but the consultant observer will be supernumerary, i.e. ‘not in the clinical numbers’. Feedback will take place in a debrief using the RCEM non-technical skills feedback tool. These events will be completed by the educational/clinical supervisor and at least one other consultant or equivalent. Each will yield an educational prescription to facilitate development across the academic year. The first must be completed within the first three months of the training year.

Acute Care Assessment Tool (ACAT(GIM), ACAT(EM))

The ACAT is designed to assess and facilitate feedback on a doctor’s performance during their practice on the acute medical take, and is used in AM (ACCS). Any doctor who has been responsible for the supervision of the acute medical take can be the assessor for an ACAT. This tool can also be used to assess other situations where a trainee is interacting with a number of different patients (e.g. in a day hospital or on a business ward round).

Patient Survey (PS)

The PS addresses issues, including the behaviour of the doctor and effectiveness of the consultation, which are important to patients. It is intended to assess the trainee’s performance in areas such as interpersonal skills, communication skills and professionalism by concentrating solely on their performance during one consultation.

Audit Assessment Tool (AA)

The Audit Assessment Tool is designed to assess a trainee’s competence in completing an audit. The Audit Assessment can be based on review of audit documentation or on a presentation of the audit at a meeting. If possible, the trainee should be assessed on the same audit by more than one assessor.

The Structured Teaching Assessment Tool (STAT)

The assessment schedule has an expectation that trainees develop their teaching throughout their training, and teaching should be viewed as a core part of the emergency physician’s role. The STAT is available on the e-portfolio to guide trainees through such an exercise and for this to be reviewed by their clinical or educational supervisor. The tool can be used for both face-to-face and online teaching and is adaptable to all types of teaching episode.

Applied Critical Appraisal Form (ACAF)

The assessment schedule has an expectation that trainees develop the ability to reflect on their clinical work and identify questions they would like to answer by seeking further evidence from the medical literature. These may come from clinical encounters in the workplace, or from workplace-based assessment discussions. The ACAF is available on the e-portfolio to guide trainees through such an exercise and for this to be reviewed by their clinical or educational supervisor.

Quality Improvement Assessment Tool (QIAT) 

The QIAT is designed to assess a trainee’s competence in completing a quality improvement project. The QIAT can be based on review of quality improvement project documentation or on a presentation of the project at a meeting. If possible, the trainee should be assessed on the same quality improvement project by more than one assessor.

Teaching Observation (TO)

The TO form is designed to provide structured, formative feedback to trainees on their competence at teaching. The TO can be based on any instance of formalised teaching by the trainee which has been observed by the assessor. The process should be trainee-led (identifying appropriate teaching sessions and assessors).

Decisions on progress (ARCP)

ARCP and progression decision making

The ARCP is the formal process where training progress is reviewed, usually on an annual basis. This process should be used to collate and systematically review evidence about a trainee’s performance and progress in a holistic way and make decisions about their achievement of expected outcomes and subsequent progression in training.

Throughout training, ACCS trainees should engage with the learning process by using the RCEM e-portfolio to demonstrate that they are meeting the requirements of the curriculum.

The evidence collected on the RCEM e-portfolio includes:

  • Faculty Governance Statements
  • placements in programme
  • personal development plans
  • logbook data
  • evidence of supervisory meetings
  • workplace-based assessments
  • MSFs
  • evidence of reflection
  • evidence of interaction with the Programme of Learning

This evidence should form the basis of the Educational Supervisor’s Structured Training Report (ES STR) that is reviewed at the ARCP and considered when awarding an ARCP outcome. A satisfactory outcome at the ARCP is required in order to progress through the training programme. The ARCP process is described in the ‘Gold Guide’, and the LETBs/deaneries are responsible for organising and conducting ARCPs. The evidence to be reviewed by ARCP panels should be collected in the trainee’s RCEM e-portfolio.

The requirements for each of the RCEM SLOs are listed in the ARCP decision guide (Appendix 1). There is no absolute number of WPBAs required, but there are recommendations. Each trainee is different, and a bespoke programme will develop as supervisors learn more about their strengths and areas to work on.

Trainees will be expected to seek opportunities to learn in each of the RCEM SLOs from the outset of training, and to build a body of evidence that reflects their growing clinical ability and confidence. The inclusions should be meaningful and reflect episodes where the trainee learnt something important about acute care or about themselves as a practitioner. Their training record should be a matter of professional pride and reflect the wide-ranging experience of EM training.
