Programme of Assessment

Provisionally approved, subject to final GMC approval

The purpose of assessment

The purposes of the RCEM Programme of Assessment fall into three broad categories:

Assurance

• demonstrate trainees have acquired the Generic Professional Capabilities and meet the requirements of Good Medical Practice

• ensure that trainees possess the essential underlying knowledge required for their specialty

• provide robust, summative evidence that trainees are meeting the curriculum standards during the training programme

Regulating progression & targeting remediation:

• assess trainees’ actual performance in the workplace

• inform the ARCP, identifying any requirements for targeted or additional training where necessary and facilitating decisions regarding progression through the training programme

• identify performance concerns and ultimately trainees who should be advised to consider changes of career direction

Fostering self-regulated learners:

• enhance learning by providing formative assessment, enabling trainees to receive immediate feedback, understand their own performance and identify areas for development

• drive learning and enhance the training process by making it clear what is required of trainees and motivating them to ensure they receive suitable training and experience

• identify and encourage excellence

The purposes above have driven the design of the RCEM assessment strategy from start to finish. We have sought to define a fully integrated and complementary programme of assessment that recognises the strengths and limitations of its constituent parts to deliver a programme as a whole. The programme of assessment is made up of three major elements.

1. A suite of formal RCEM examinations

2. A programme of work place based assessments (WPBAs)

3. A programme of regular, panel-based, information-rich, individualised judgements that regulate each trainee’s progression and remediation (where necessary).

Formal examinations

The formal RCEM examinations prioritise assurance for all stakeholders, including trainees. Consequently, the design focus is on reliability and a nationally consistent hurdle with transparent standards. The suite of examinations provides assurance that a standard of knowledge and basic technique has been attained. The MRCEM, in particular, aims to assess not only the attainment of learning, but also the potential to develop and thrive as an independent leader and practitioner following training.

Work Place Based Assessments (WPBAs)

The WPBA is designed to foster self-regulated learners and to provide the all-important information that will regulate trainees’ progression through the programme.

WPBA provides a structure for observing the individualised and contextualised application of learning. By providing feedback and encouraging reflection, it also helps trainees develop self-regulated learning skills. The transparent links between the WPBA judgements, the entrustability judgements made by Faculty Entrustment Group (FEG) panels, and the levels of independence expected at each of the thresholds are critical for orienting learners to what is expected of them; this gives them both the stimulus and the data that they need to regulate their own learning.

Despite compromises in reliability, WPBA offers a better prediction of day-to-day performance, with all the complexity that EM work includes, than formal examinations. In particular, the RCEM instruments have been designed to make it easy for supervisors and others to flag concerns about any given trainee. Conventional WPBA questions allow clear concerns to remain unshared, and this would create problems for patients, trainees and services alike.

The WPBA programme is designed to be used throughout training, and so offers the opportunity for pertinent developmental feedback and the highlighting of concerns at regular intervals through training when there is a chance to define plans to support learning.

Panel-based judgements

FEG statements work with the ARCP process to provide regular, panel-based, information-rich, individualised judgements that regulate each trainee’s progression and remediation (where necessary). Like the WPBA programme, they are designed to foster self-regulated learners and to regulate trainees’ progression through the programme. The faculty will consider the trainee’s workplace performance and provide a summative recommendation about whether a trainee has met the standard in the SLOs relevant to their stage of training. This information is combined with other evidence in a Structured Training Report (STR) that is completed by the trainee’s Educational Supervisor at the end of a block of training. This, in turn, is reviewed by the ARCP panel, which will make a decision regarding progression.

These elements are phased, reflecting the growing knowledge and experience of trainees.   At key thresholds in training the work place based assessment and RCEM examinations are co-ordinated to enable ARCP panels to adjudge readiness to cross a threshold in training.  This approach acknowledges the complementary nature of the component parts of the assessment programme.

The flow of information in the new programme of assessment is shown in Figure 1.

a. The Training Faculty will deliver a summative recommendation on each of the Clinical SLOs that are relevant to the trainee’s stage of training, i.e. whether they have met the standard for entrustment. This is summarised within a FEG Statement.

b. The Educational Supervisor reviews the evidence collated for each of the Generic/Supporting SLOs and offers a judgement on progress in these. A matrix providing guidance for Educational Supervisors in these SLOs is available (Appendix 2 below).

c. The Educational Supervisor also reviews WPBAs, Multi-Source Feedback and other relevant data, such as case load, critical incidents, reflections and log books, and considers and offers insight on flags of concern. This allows for an integrated and individualised collation of diverse evidence.

These three elements form the basis of the Educational Supervisor’s STR.  This, in turn, is reviewed by the ARCP panel. The panel will have access to all the relevant source material and will be able to provide oversight and ensure a nationally consistent approach and standard. The ARCP panel will make the final summative decision about progression.

When an ARCP occurs at a threshold in training (Threshold ARCP), the data held within the Educational Supervisor’s report will be combined with RCEM examination data to arbitrate on whether a trainee can cross the threshold, either into Higher Training or to complete training.

Figure 1. Information flow in RCEM Programme of Assessment

 

The RCEM assessment blueprint

The blueprint maps the Programme of Assessment to the curriculum (table 3).  It shows that each of the SLOs is assessed in a number of ways. 

The SLOs provide the structure for the formal RCEM examinations.  The RCEM examinations are also tagged to the RCEM Clinical Syllabus.  In essence, anything that is in the Clinical Syllabus can appear in the formal examinations, and each of the relevant SLOs will be tested.

For the WPBA programme, it is not necessary to use each of the tools shown in the blueprint table for each of the SLOs.  These are examples of tools that might be used to provide evidence of learning in each of these.  The ‘summative’ element of the WPBA programme is the entrustment decision for the Clinical SLOs and the Educational Supervisor’s review of the Generic/Supporting SLOs. 

Although we have moved away from a tick list, it is important for the trainee to show their development as a self-regulating learner by recording and reflecting on evidence in each of the Key Capabilities in the SLOs relevant to their stage of training, throughout each training attachment from start to finish. Engagement in training is very important: it is a marker of a trainee who is seeking to develop beyond their current capabilities, and it is a key principle underpinning the ethos of assessment in the workplace.

RCEM Examinations

The RCEM examinations are an integral part of RCEM’s programme of assessment. They provide trainees with the opportunity to demonstrate, at critical progression points, the required outcomes of their training programme. The RCEM exams comprise a programme of summative assessments in two parts: the Membership of the Royal College of Emergency Medicine (MRCEM) and Fellowship of the Royal College of Emergency Medicine (FRCEM) examinations. Each part uses validated assessment methods to test a broad spectrum of knowledge, understanding, skills, behaviours and attitudes, as defined by the RCEM curriculum.

The MRCEM examination consists of three components: two Single Best Answer (SBA) multiple-choice papers and an Objective Structured Clinical Examination (OSCE). All MRCEM components are blueprinted to the RCEM curriculum, focussing on SLOs 1-7 and 9. Questions used in each component are tagged to the RCEM basic science and clinical syllabi. The SLOs and the capabilities relevant to each SLO are set out in the assessment blueprint. Successful completion of all three MRCEM components is required to complete intermediate training, as part of a programme of assessment designed to ensure readiness for Higher Training.

The FRCEM examination consists of two components: a Single Best Answer (SBA) test of knowledge and a Multi Station Oral (MSO) exam. The examinations used in the FRCEM are blueprinted to SLOs 1-12. The questions used in the FRCEM examinations are also tagged to the RCEM Clinical Syllabus. The SLOs and the relevant capabilities are set out in the assessment blueprint. Successful completion of the FRCEM examination is the part of the Programme of Assessment that signifies readiness for independent practice at completion of training.

The MRCEM examinations: 

MRCEM Primary – Single Best Answer paper. This examination samples the basic science syllabus, ensuring a sound background knowledge in the basic science underpinning EM care. It can be undertaken at any point post registration as a medical practitioner. It comprises 180 questions to be answered in three hours.

MRCEM Intermediate – Single Best Answer paper. This examination samples the clinical syllabus and ensures a sound understanding of the full range of conditions and presentations that may present to the ED. The exam samples how the trainee synthesises and interprets data and clinical findings to make decisions and inform management plans. It comprises 180 questions with four hours’ testing time, sat as two papers on the same day.

MRCEM OSCE – Objective Structured Clinical Examination. This examination objectively samples the clinical skills of trainees and ensures they are those of someone ready for Higher Training. It includes history taking, examination, communication, decision making, dealing with challenging situations and resuscitation scenarios, using medium-fidelity simulation. There are 18 stations, blueprinted to SLOs 1-7 and 9 and to the RCEM Clinical Syllabus.

The purpose of the MRCEM examinations

The MRCEM Primary ensures a high standard of knowledge of basic sciences relevant to EM at the outset of training. It functions at the base of Bloom’s revised taxonomy: remembering and understanding. It is an examination that can be sat during foundation training and does not require clinical experience to complete.

The MRCEM Intermediate examination tests whether the trainee can integrate and synthesise information. It functions at the levels of understanding and applying in Bloom’s revised taxonomy. This examination does require a measure of clinical experience, and is designed and timed to build on the Primary.

Together these examinations ensure a thorough grounding in clinical knowledge and the ability to interpret and utilise clinical information and diagnostic results to develop safe and effective management decisions – a fundamental requirement for higher specialist training.

The MRCEM OSCE is a summative assessment of a candidate’s clinical and communication skills, applied technical knowledge of equipment, time management, and decision making under time pressure.  It uses a simulated clinical environment, including communication with simulated patients.  The MRCEM OSCE functions at the applying and analysing levels of Bloom’s revised taxonomy.  A successful candidate will have demonstrated the clinical skills and applied technical knowledge across multiple clinical scenarios required of a clinician entering HST.

Candidates must pass all parts to be awarded the MRCEM; the components are designed to complement one another. The primary purpose of the OSCE is to ensure that trainees are ready to be safe practitioners during HST. The additional purpose of the Primary and the Intermediate is to ensure that trainees have foundations in place that will give them the potential to develop, in time, into independent and able leaders with the ability to work beyond protocols to solve unforeseen problems at the end of HST.

The FRCEM Examinations

FRCEM SBA

This is a Single Best Answer (SBA) paper, blueprinted to the advanced clinical elements within the SLOs and the RCEM Clinical Syllabus. In addition to specific knowledge-based competences, examination material may be developed from guidance or recommendations published by healthcare organisations such as the RCEM, NCEPOD, NICE, SIGN, NPSA, etc. The public expects doctors to keep up to date with important developments, and such material may be examined under the collective umbrella of ‘professionalism’. This paper can be sat from the start of ST5. It comprises 180 questions with four hours’ testing time, sat as two papers on the same day.

The purpose of the FRCEM SBA

The FRCEM SBA examination is a summative assessment, blueprinted to all 12 RCEM SLOs.  The exam assesses the knowledge required of a trainee readying for independent practice. It tests factual knowledge and understanding and the ability to prioritise information.

FRCEM OSCE exam 

This is a face-to-face, multi-station examination, blueprinted to the complex or challenging situations an EM clinician will face, including the requirements of leadership and support within the ED. These are found in SLOs 3 and 7-12. The examination involves simulation, using simulated patients and medium-fidelity simulation equipment. A key feature of this assessment is a structured viva exploring decision-making and data analysis. It is geared towards ‘evaluating and analysing’ in Bloom’s revised taxonomy. Here we are more concerned with the trainee’s reasoning than with whether they simply reach a reasonable conclusion or undertake an appropriate action. Reflection-in-action may be a more reliable predictor of future performance than an appropriate response to a single challenge. The examination consists of 16 stations with one examiner per station, and can be sat from the start of ST5.

3. Crossley, J. ‘Workplace Based Assessment’ in Delaney, C. and Molloy, E. (eds.) Learning and Teaching in Clinical Contexts: a Practical Guide (Chatsworth NSW: Elsevier, 2018) pp. 255-267

The purpose of the FRCEM OSCE

The FRCEM OSCE is a summative assessment of a candidate’s knowledge, understanding and decision-making abilities in EM and the applied clinical science underpinning it. It is also blueprinted to key elements of consultant practice within the curriculum, such as supporting other team members on the floor in the role of Emergency Physician in Charge, dealing with uncertainty and challenging emergent situations, understanding data and medical literature as relevant to decision-making in the ED, and communicating complex or sensitive issues expertly with patients and colleagues. It is the last component of the FRCEM examination suite to be taken, and successful candidates are awarded the Fellowship. The OSCE complements formative assessments undertaken in the workplace, such as the ESLE. Together, this programme of assessment in HST provides assurance that candidates have reached the accepted national standard required of an independent practitioner in EM.

Validity of the RCEM Examinations

As defined in the GMC standards, examinations must consist of “an integrated set of assessments […] which are blueprinted against, and support, the approved curriculum. It may comprise of different methods”. This standard refers to face and content validity. In this regard, the components that make up the MRCEM and FRCEM are not only blueprinted to the curriculum, but have also been selected to assess three of the four levels of Miller’s pyramid[4]: knows, knows how [and why], and shows. The fourth level, ‘does’, is assessed by workplace-based assessments and relates to behaviour in real-life situations.


 

Component | Miller’s level | Testing aims
MRCEM | Knows; Knows how | Breadth of factual knowledge (Primary); application of knowledge (Intermediate)
FRCEM SBA | Knows; Knows how | Depth of knowledge, understanding and application of knowledge
MRCEM OSCE | Shows | Skills (procedural and cognitive), underpinned by knowledge
FRCEM OSCE | Shows | Communication, prioritisation, option generation, scholarship, teaching

Whilst the MRCEM OSCE and FRCEM OSCE are aimed at the same level of Miller’s pyramid (shows), the FRCEM OSCE is designed to test a candidate’s knowledge beyond recall and recognition of facts. The structured interactions in the FRCEM OSCE allow examiners to explore a number of higher-order domains within Bloom’s revised taxonomy[5] (see figure 2 below), such as understanding, application of knowledge, analysing and evaluating.

[4] Miller, G. (1990).  The assessment of clinical skills/competence/performance. Academic Medicine, 65.

[5] Anderson, L. W. & Krathwohl, D.R., et al. (2001). A taxonomy for learning, teaching and assessing

Figure 2. Bloom’s revised taxonomy (BRT)

 

 

Standard setting

The MRCEM and FRCEM examinations are high-stakes summative assessments that have the potential to impact on trainee careers and patient safety.  The processes that underpin pass/fail decisions must be robust, consistent and fair. 

Each element of the examination schedule has changed in either content or format for this current Programme of Assessment and the standard-setting schemes for these new components are set out below.

Whole programme

The standards across the programme are absolute (criterion-referenced) rather than relative (norm-referenced).  This means that the standard required to pass each stage of the assessment does not vary according to the ability of the cohort either nationally or locally.

The separate elements of the assessment programme (MRCEM Primary, MRCEM Intermediate, MRCEM OSCE, FEG judgements, STR, ARCP, FRCEM SBA and FRCEM OSCE) are treated in a conjunctive rather than compensatory manner.  In other words, each trainee must reach the required standard in every separate component in order to pass.  A good performance in one element cannot compensate for a poor performance in another element.
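The conjunctive rule described above can be illustrated with a short sketch. This is a hypothetical illustration, not RCEM marking code: the component names and pass marks are invented for the example.

```python
# Hypothetical illustration of conjunctive vs compensatory standards.
# Component names and pass marks are invented for this example.

def conjunctive_pass(scores: dict, pass_marks: dict) -> bool:
    """Pass only if every component meets its own standard."""
    return all(scores[c] >= pass_marks[c] for c in pass_marks)

def compensatory_pass(scores: dict, pass_marks: dict) -> bool:
    """Pass if the overall total compensates across components."""
    return sum(scores.values()) >= sum(pass_marks.values())

pass_marks = {"Component A": 60, "Component B": 62, "Component C": 65}
scores = {"Component A": 80, "Component B": 90, "Component C": 64}

print(compensatory_pass(scores, pass_marks))  # strong total passes
print(conjunctive_pass(scores, pass_marks))   # one weak component fails
```

The candidate here comfortably exceeds the combined pass marks but falls one mark short in one component, so a conjunctive scheme fails them where a compensatory scheme would not.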

The way in which the standard is defined within each element is described below.

Written papers

All written papers (MRCEM Primary, MRCEM Intermediate, FRCEM SBA) are now in the Single Best Answer (SBA) format. Each question has only one appropriate response, making marking objective. The format is appropriate for a question-centred (item-centred) standard-setting method; we will use the Angoff approach.

Following best practice, a dedicated Angoff referencing group of examiners uses the Angoff process to determine a cut score, which is then adjusted by 1 Standard Error of Measurement (SEM) to arrive at the pass mark. Training is given to all members of the MRCEM and FRCEM Angoff reference groups to develop a collective understanding of the ‘minimally competent’ candidate, as defined below:

“For the purposes of the SBA examination, a ‘minimally competent’ candidate is one who has only just enough depth and breadth of the knowledge stipulated within the (Core/Intermediate level) curriculum to underpin their current clinical practice and equip them for the next phase of training.”

In determining the ‘minimally competent’ candidate, members of the Angoff referencing groups are encouraged to use personal experience of trainees sitting the exam at the particular stage of training.
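The mechanics described above can be sketched in a few lines: each judge estimates, per item, the probability that a minimally competent candidate answers correctly; the cut score is the mean across judges of their summed probabilities, adjusted by 1 SEM. This is purely illustrative, not RCEM's implementation; the ratings, reliability and score SD are invented, and subtracting (rather than adding) the SEM is an assumption.

```python
import statistics

def angoff_pass_mark(judge_ratings, reliability, score_sd):
    """judge_ratings[j][i] = judge j's estimated probability that a
    'minimally competent' candidate answers item i correctly."""
    # Each judge's cut score is the sum of their item probabilities.
    judge_cuts = [sum(ratings) for ratings in judge_ratings]
    cut_score = statistics.mean(judge_cuts)
    # Classical test theory: SEM = SD * sqrt(1 - reliability).
    sem = score_sd * (1 - reliability) ** 0.5
    # Assumed here: pass mark = cut score minus 1 SEM.
    return cut_score - sem

# Three hypothetical judges rating a five-item paper
ratings = [
    [0.6, 0.7, 0.5, 0.8, 0.6],
    [0.5, 0.6, 0.6, 0.7, 0.5],
    [0.7, 0.8, 0.5, 0.9, 0.6],
]
print(round(angoff_pass_mark(ratings, reliability=0.9, score_sd=1.0), 2))
```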

After each examination, item analysis provides the exam board with data on items with unexpected performance statistics.  Highlighted items are reviewed, and if the item itself is problematic it is removed from the paper before scores are finalised.

Face-to-face examinations

The MRCEM OSCE and FRCEM OSCE differ from the SBAs in that it is not possible to define the only appropriate response or responses ahead of time. This means that a candidate-centred (performance-centred) method of standard setting is required; we will use the borderline regression method to set the standard for these examinations.
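The borderline regression method can be sketched as follows: each candidate's station checklist score is regressed on the examiner's global rating, and the station pass mark is the regression's predicted score at the "borderline" rating. This is an illustrative outline, not the RCEM implementation; the rating scale and the value treated as borderline are assumptions.

```python
def borderline_regression_cut(checklist_scores, global_ratings, borderline=2.0):
    """Pass mark for one OSCE station: the least-squares prediction of
    checklist score at the 'borderline' global rating (assumed = 2)."""
    n = len(checklist_scores)
    mean_x = sum(global_ratings) / n
    mean_y = sum(checklist_scores) / n
    # Ordinary least squares fit of checklist score (y) on rating (x).
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(global_ratings, checklist_scores))
    sxx = sum((x - mean_x) ** 2 for x in global_ratings)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept + slope * borderline

# Hypothetical station: global ratings from 1 (fail) to 5 (excellent)
ratings = [1, 2, 2, 3, 3, 4, 4, 5]
scores = [8, 10, 11, 14, 15, 17, 18, 20]
print(round(borderline_regression_cut(scores, ratings), 1))
```

Because the pass mark is derived from the whole candidate cohort's scores and ratings at that sitting, it adapts to station difficulty while still anchoring the standard to the examiners' concept of a borderline performance.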

We have also chosen to extend the conjunctive approach used for the programme and to treat some areas of performance within the face-to-face examinations in a conjunctive manner. This is because we consider that there are some areas of practice in which a weak candidate should not be able to progress to the next stage of training (or exit training), however strong their performance elsewhere in the examination. This is, in essence, a standard-setting position that raises the required standard beyond a purely compensatory approach to stations.

Which domains, stations and combinations of stations will be treated conjunctively will be determined at the blueprinting stage on a cycle-by-cycle basis, and will be reviewed at exam board in the light of station analysis data from the sitting.  However, examples may include:

 

  • Candidates who perform poorly in any one domain across the whole examination (e.g. communication).
  • Candidates who perform poorly in more than one station blueprinted to the same SLO (e.g. PEM – SLO 5).
  • Candidates who fail outright a station that performs well and is regarded as being of non-negotiable importance in any given examination (e.g. resus – SLO 3).

Fairness

The fairness of an exam refers to its freedom from bias.  To this end and in the exercising of its duties under the Equality Act 2010, the RCEM must give due regard to:

• eliminating discrimination, harassment, victimisation and any other unacceptable conduct;

• advancing equality of opportunity between persons who share a relevant protected characteristic and persons who do not share it; 

• fostering good relations between persons who share a relevant protected characteristic and persons who do not share it. 

The RCEM aims to ensure that everyone has an equal opportunity to demonstrate their ability in the examinations and that no candidate is treated less favourably than another on grounds of race, disability, sex, gender reassignment, sexual orientation, age, religion, or pregnancy and maternity.

To ensure RCEM exams do not disadvantage any candidate, or group of candidates, on any basis other than the candidate’s lack of knowledge and skills, all RCEM exam item developers take particular care in the wording of questions to avoid ambiguity or offence across cultures. Additionally, examiners seek advice from specific experts, organisations and associations, or draw on the experience of individuals in the workplace, on topics which may particularly affect certain ethnicities or cultures.

RCEM encourages feedback from candidates on the examination process and its content, and this provides access to a viewpoint on some protected characteristics not reflected in our examiner groups. 

Reasonable adjustments

RCEM considers reasonable adjustments for exam candidates with disability, as set out in the examination regulations. Special arrangements for pregnancy and temporary medical conditions are also provided where necessary. Full details are available within the RCEM examination regulations.

Equality analysis 

Equality analysis is an integral part of examination policy, content and practice. RCEM carries out objective, evidence-based equality analysis when making decisions relating to exam changes, policies, question writing and practices. This ensures that full consideration is given to the effect such decisions may have on the fairness of the exam, and aims to prevent discrimination, promote diversity and inclusivity for all groups of people.

Quality assurance

A full Person Specification and Job Description is provided for examiners.  The Examiner Regulations give full details of all parts of the selection and appointment process used by the RCEM. 

Examiner training is mandatory for new examiners, which includes the principles of assessment, standard setting, examiner practice and calibration, and known and unknown bias.  After attendance at an Examiner Workshop, examiners are required to observe an OSCE day prior to examining.  This gives them the opportunity to assess candidates in a real examination.  Examiners are required to attend the Examiners’ Workshop every five years.

RCEM exams do not currently use lay examiners, although lay input is encouraged through attendance at committees and working groups. 

Feedback

RCEM believes it is important to provide feedback to candidates beyond a standard pass-fail result, to assist them in understanding and interpreting their overall result. RCEM does not attempt to justify the result given or the marks awarded, whether overall or for specific sections or skill domains. Marks are awarded using strict guidelines. The decision on marks awarded is final and therefore papers cannot be remarked. 

The following feedback is provided in all exam results letters/feedback enclosures:

 

  • Confirmation of the candidate’s pass-fail result 
  • Confirmation of the number of attempts used/maximum number of attempts 
  • The examination pass mark as a raw score in relation to the maximum achievable test score (e.g. 315/420) and/or the percentage value (e.g. 75%) 
  • The candidate’s overall score as a raw score and/or as a percentage 

Entrustment Decisions

Transitions and the crossing of thresholds are about taking on new responsibilities with a higher degree of independence. Knowing whether a trainee is ready to do so is complex. It requires a clear working knowledge of what the responsibilities involve, and the ability to predict how a trainee will respond when given responsibility. An example is the care of patients in the resuscitation room (SLO 3) when the on-call consultant is at home.

This kind of assessment is an example of ‘judgement-based’ assessment. Scholarship in this field has seen a major transition from reductionism (breaking the assessment down into multiple ‘objective’ elements and assessing these) to entrustment (making the most of the sophisticated, contextual, individualised global judgements of which clinician trainers are capable). Key features of good judgement-based assessment are asking the right people and asking the right questions.3 The FEG panels are composed of staff who know the trainee well and know the responsibilities of the job well. This gives us the best chance of meaningful FEG judgements. Critically, the judgements are framed in terms of entrustment and independence. This aligns with the natural decision-making heuristics of clinician supervisors, and there is good empirical evidence that such ‘construct-aligned’ judgements are significantly more dependable than judgements framed in terms of training stage or merit (e.g. poor, satisfactory, or good).4

The WPBA approach is built around preparing trainees for thresholds in training. To that end, assessments in the workplace are also aligned to entrustment and independence. The RCEM entrustment scale is shown in table 4.

4. Crossley J, Jolly B Making sense of work-based assessment: ask the right questions, in the right way, about the right things, of the right people. Medical Education 2012 46(1):28-37

5. Learning and Teaching in Clinical Contexts: A Practical Guide Delaney and Molloy 2018 ISBN 9780729542722

Table 4. RCEM entrustment scale

 

1 Direct supervisor observation/involvement, able to provide immediate direction/ assistance
2a Supervisor on the ‘shop-floor’ (e.g. ED, theatres, AMU, ICU), monitoring at regular intervals
2b Supervisor within hospital for queries, able to provide prompt direction or assistance and trainee knows reliably when to ask for help
3 Supervisor ‘on call’ from home for queries, able to provide directions via phone and able to attend the bedside if required to provide direct supervision
4 Would be able to manage with no supervisor involvement (all trainees practise with a consultant taking overall clinical responsibility)

The expectations of EM trainees at each of the key thresholds in training are shown in figure 3. This ensures that the requirements are transparent and explicit for all – trainers, trainees and the public. Making these expectations transparent for trainees is one of the ways our assessment scheme is designed to foster self-regulating learners. By providing a common and transparent map of what is expected over the training journey, from start to finish, we give trainees the best chance of orienting themselves in terms of their progress so far and their next steps. We also ensure consistency of feedback across the whole learning journey, making it more credible to learners.

FEG decisions are extremely important for trainees and should not come as a surprise at the end of a period of training. The design of the WPBAs, with an entrustment score offered in feedback, means that this should not be the case, provided trainees engage with the training opportunities available.

 

 

Figure 3. RCEM Entrustment requirements

 

6. Ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157–158

7. James G. M. Crossley (2014) Addressing learner disorientation: Give them a roadmap, Medical Teacher, 36:8, 685-691

 

ACCS: 6 months covering each of anaesthetics, acute medicine, intensive care medicine and emergency medicine, which can be worked in any order; SLOs are reviewed in each post, with an entrustment level required by the end of ACCS.

Intermediate: paediatric emergency medicine, leadership roles, supporting the wider EM team.

Higher Specialty Training: leading, EPIC, delivering challenging cases, management and administration toolkit, research, supervision and teaching.

 
Clinical SLOs | EM | IM | An | ICM | Intermediate | ST4 | ST5 | ST6
Care for physiologically stable patients attending the ED across the full range of complexity | 2b | 2b | – | – | 3 | * | * | 4
Answer clinical questions | 2a | 2a | – | – | 3 | * | * | 4
Resuscitate and stabilise | 2b | 2b | 2b | 2b | 3 | * | * | 4
Care for an injured patient | 2b | – | – | – | 3 | * | * | 4
Care for children in the ED | – | – | – | – | 3 | * | * | 4
Deliver key procedural skills | refer to Clinical ACCS LO 5 table** | 3 | * | * | 4
Deal with complex situations in the workplace | 2a | 2a | 2a | 2a | 3 | * | * | 4
Lead the ED shift | – | – | – | – | 3 | * | * | 4
Provide basic anaesthetic care (ACCS) | – | – | 2b | – | – | – | – | –
Manage patients with organ dysfunction and failure (ACCS) | – | – | – | 2a | – | – | – | –

Supporting SLOs (Teach and supervise; Participate in research; Patient safety & quality improvement; Lead, manage, administer) are evidenced through the Educational Supervisor’s (ES) report at Intermediate and in each year of HST.
 

 

* Progress in each of the clinical SLOs towards independence will be assessed in each year of HST.

** Entrustment level for procedural skills varies in ACCS; please see Section 5.5.1 for details.

Faculty Entrustment Group (FEG) Statement

What is it?

This is a statement that summarises the collated views of the training faculty as to the progress of a trainee – specifically, their suitability to move to the next stage of training.  This judgement is based on observation of the trainee in the workplace, on feedback from staff and patients, and on what faculty members have learned about the trainee's performance in conducting WPBAs.  (Individual WPBAs and reflections need not be reviewed by the training faculty at each FEG meeting, but they are available for review if the faculty judges that more data are needed to make the judgement.)  Within this statement, the strengths of the trainee are summarised as well as areas to develop, giving the opportunity to reflect and to encourage excellence. The FEG panel can also offer a suggestion for how the trainee might address any ongoing training needs, potentially making the FEG an 'adaptive' or individualised assessment.

The FEG Statement was introduced in RCEM training in 2015, with a decision relating to the training year as a whole.  The evolution in this programme of assessment is that the decision is now linked explicitly to progress in the relevant Specialty Learning Outcomes.  Anchoring this decision to independence, with a clear description of what is required, will be a significant benefit to trainees and trainers in making these decisions fairer and more transparent.

The FEG Statement serves a summative purpose within our assessment programme.  It is then triangulated with other information in the Educational Supervisor's report to inform ARCP decision making.  The FEG Statement is held on the e-portfolio and is accessed only by the trainee, the Educational/Clinical Supervisor and the Training Programme Director.

The FEG process provides the opportunity for deeper, more timely and more information-rich scrutiny of progress towards the key workplace SLOs than the old Supervisor Report was able to deliver.

How is it done?

The FEG Statement can be made in different ways according to local arrangements.  However, the key feature of the FEG is that it includes the views of the right people – those who know the trainee and know the responsibilities of the job.  It must represent the collated views of the training faculty as to whether they believe a trainee has met the requirement for practice in each of the relevant SLOs at the level of independence specified for their stage of training.  The decision will relate to the Key Capabilities for each SLO that are relevant to the trainee's stage of training.

The faculty is bound by the requirements on them of the GMC’s Good Medical Practice guidance, by the requirements for fairness and transparency, the requirement that equality and diversity is respected and by the personal ethics and probity of individual members.

Good practice in a number of centres has been to make 'educational governance' a standing agenda item at consultant meetings, with discussion of all trainees at regular (e.g. two-monthly) intervals.  This approach ensures that concerns are documented and shared early and that trainees can be better supported.  It facilitates encouragement of trainees and feedback on excellence.  It is also fair to trainees, who receive a summative decision from a panel whose views on their progress in each of the relevant SLOs they are already fully aware of.

The final meeting is for the purposes of FEG Statement completion.  A quorate meeting includes at least three consultants, who must be trained Educational Supervisors.

Other centres have a designated training faculty from among their consultant body that performs this function at a formal Educational Governance meeting, comprising the College Tutor (or equivalent), the Educational/Clinical Supervisor and at least two other consultant trainers.  At this meeting the progress of each trainee against each SLO is discussed, and the output is the Faculty Education Governance Statement.

Example:

SLO 1: Care for physiologically stable patients attending the ED across the full range of complexity. Core (ACCS) trainee:

'We believe this trainee can be trusted to take a history, examine the patient and elicit key clinical signs, construct a differential diagnosis that considers a realistic worst-case scenario, and describe an appropriate management plan with senior help available but not directly overseeing their work. The trainee can be relied upon to seek help when required.'

This is the Key Capability for SLO 1 and describes entrustment level 2b.

The panel's view is sought.  Panellists are asked to reflect on their experience of the trainee across the full spectrum of cases.  The decision is a statement about the confidence of the team that a learner can be relied upon to make a safe assessment and seek help as needed.  A yes/no answer is required.

This process is repeated for the other SLOs that are relevant to the current phase of training. 

The FEG Statement is recorded in the trainee’s e-portfolio by their Educational or Clinical Supervisor. The FEG Statement also includes general feedback on trainee strengths and areas to develop.

When is it done?

Final FEG statements are made towards the end of a given block of training in an Emergency Medicine placement.  This is typically six months (whole time equivalent) during ACCS and Intermediate Training, and yearly in Higher Training.  However, with most approaches to FEG, it should be possible for the faculty to indicate to the trainee their general progress towards the final FEG statement at regular intervals ahead of time.  WPBA performance should also give a strong indication of progress.

What if a trainee is deemed not ready to progress?

For the large majority of trainees these decisions will be positive.  However, if problems or concerns are raised about a trainee in departmental education governance meetings, or by other means, these can be fed back with learning needs identified and a plan to remediate put in place.  If these persist throughout an entire block of training this will be reflected in the FEG Statement and the subsequent ARCP panel will outline an appropriate training plan.

An opinion that a trainee is not ready to progress should not come as a surprise at the end of a placement, and should not be seen as punitive by the trainee or trainers.  It is a formal recording of the opinion of the faculty on progress at the end of that training block and reflects support and deliberation throughout the block.

Indeed, the fact that not all trainees will reach the threshold after the same duration of training is a realistic reflection of the variation that exists between learners and their experience, and of the complex and highly responsible roles that higher trainees and EM consultants embody.

Assessment of Specialty Learning Outcome 6 – Procedural skills

Procedural skills in ACCS

 

Procedures required by the end of ACCS

Pleural aspiration of air

Entrustment requirement: 2b

Programme of learning

e-learning module

Simulated practice or supervised practice on patient

Programme of assessment

DOPS assessment

Chest drain: Seldinger technique

Entrustment requirement: 2b

Programme of learning

e-learning module

Simulated practice and/or supervised practice on patient

Programme of assessment

DOPS assessment

Chest drain: open technique

Entrustment requirement: 1

Programme of learning

e-learning module

Simulated practice and/or supervised practice on patient

National Safety Standards for Invasive Procedures (NatSSIPs) checklist

Programme of assessment

DOPS assessment OR

Supervised practice on patient with reflection recorded OR

Simulated practice with reflection recorded

Establish invasive monitoring (CVP and arterial line)

Entrustment requirement: 2b

Programme of learning

Simulated practice and/or supervised practice

Programme of assessment

DOPS assessment for CVP line AND

DOPS assessment for arterial line

Vascular access in emergency (IO and femoral vein)

Entrustment requirement: 2b

Programme of learning

Simulated practice and/or supervised practice

Programme of assessment

DOPS assessment on either OR

Supervised practice on patient with reflection recorded OR

Simulated practice with reflection recorded

Fracture/dislocation manipulation

Entrustment requirement: 1

Programme of learning

Supervised practice on patient

Programme of assessment

DOPS assessment OR

Supervised practice with reflection recorded

External pacing

Entrustment requirement: 2b

Programme of learning

e-learning module on bradyarrhythmias

Simulated practice and/or supervised practice on patient

Programme of assessment

DOPS assessment OR

Supervised practice on patient with reflection recorded OR

Simulated practice with reflection recorded

DC cardioversion

Entrustment requirement: 2b

Programme of learning

e-learning module on broad and narrow complex tachycardias

Simulated practice and/or supervised practice

Programme of assessment

DOPS assessment OR

Supervised practice on patient with reflection recorded OR

Simulated practice with reflection recorded

Point of care ultrasound-guided vascular access and fascia iliaca nerve block

Entrustment requirement: 2b

Programme of learning

Simulated practice and/or supervised practice

Programme of assessment

DOPS assessment for vascular access AND

DOPS assessment for fascia iliaca nerve block

Lumbar puncture

Entrustment requirement: 2b

Programme of learning

e-learning module

Simulated practice and/or supervised practice on patient

Programme of assessment

DOPS assessment

Continued performance of ACCS procedural skills in intermediate and higher training will be recorded in the RCEM logbook. This record includes any complications and the level of support received, if relevant. Episodes where the trainee supervised others in these skills will also be recorded.

Procedural Skills in Intermediate and Higher Training

Paediatric sedation

End of intermediate training
Programme of learning: completion of RCEM e-learning module on paediatric sedation; attendance at paediatric sedation simulation session; performance of observed procedural sedation on patients under the direct supervision of an ED Consultant.
Programme of assessment: certificate of completion of RCEM e-learning module; certificate of attendance at paediatric sedation simulation day; DOPS for paediatric sedation (simulation).

End of higher training
Programme of learning: performance of observed procedural sedation on patients under the direct supervision of an ED Consultant.
Programme of assessment: logbook record; DOPS for sedation (not on a simulated patient), signed off as competent for independent practice.

Advanced airway management (emergency surgical airway, RSI)

End of intermediate training
Programme of learning: simulated skills practice including surgical airway training.
Programme of assessment: DOPS evidence of simulated practice; logbook record.

End of higher training
Programme of learning: 10 intubations a year on patients, which may include RSI activity in the ED or theatres (including as part of a multi-disciplinary team); simulated skills practice session including surgical airway training.
Programme of assessment: logbook record.

Non-invasive ventilation

End of intermediate training
Programme of learning: RCEM e-learning module on NIV; simulated practice of NIV initiation.
Programme of assessment: certificate of completion of RCEM e-learning module; DOPS for NIV initiation.

End of higher training
Programme of assessment: logbook record of skill maintenance.

Open chest drain

End of intermediate training
Programme of learning: understanding of the National Safety Standards for Invasive Procedures (NatSSIPs) checklist; supervised practice of chest drain insertion on patient (open).
Programme of assessment: DOPS for simulated chest drain insertion, or satisfactory supervised practice (open); logbook record.

End of higher training
Programme of assessment: logbook record of skill maintenance.

Resuscitative thoracotomy

End of intermediate training
Programme of learning: RCEM learning module on resuscitative thoracotomy; simulated practice (ideally cadaveric).
Programme of assessment: certificate of completion of RCEM e-learning module.

End of higher training
Programme of learning: simulated practice.
Programme of assessment: certificate of completion of simulated practice.

Lateral canthotomy

End of intermediate training
Programme of learning: RCEM learning module; simulated practice.
Programme of assessment: certificate of completion of RCEM e-learning module.

End of higher training
Programme of learning: simulated practice.
Programme of assessment: certificate of completion of simulated practice.

DC cardioversion

End of intermediate training
Programme of learning: RCEM learning module on broad and narrow complex tachycardias; simulated practice.
Programme of assessment: evidence of theoretical knowledge on the management of arrhythmias; DOPS for DC cardioversion; logbook record.

End of higher training
Programme of learning: maintenance of skills throughout HST.
Programme of assessment: logbook record.

External pacing

End of intermediate training
Programme of learning: RCEM learning module on bradyarrhythmias; ALS course; simulated practice.
Programme of assessment: completion of e-learning module; DOPS for external pacing; logbook record.

End of higher training
Programme of learning: maintenance of skills throughout HST.
Programme of assessment: logbook record.

Pericardiocentesis

End of intermediate training
Programme of learning: simulated practice.
Programme of assessment: evidence of theoretical knowledge on the technique; DOPS for pericardiocentesis in a simulated environment; logbook record.

End of higher training
Programme of learning: simulated practice.
Programme of assessment: evidence of theoretical knowledge on the technique; evidence of practical knowledge on the technique (e.g. OSCE); DOPS for pericardiocentesis; logbook record.

ED management of life-threatening haemorrhage (nasal packing; splints, e.g. pelvic sling, traction splint; tourniquet; haemostatic agents)

End of intermediate training
Programme of learning: simulated practice; supervised practice.
Programme of assessment: logbook record.

End of higher training
Programme of assessment: maintenance of skills in HST; logbook record.

Emergency delivery

End of intermediate training
Programme of learning: e-learning module; simulated practice.
Programme of assessment: logbook record.

End of higher training
Programme of learning: simulated practice.
Programme of assessment: logbook record.

Peri-mortem c-section

End of intermediate training
Programme of learning: theoretical training, e.g. RCEM learning module; simulated practice.
Programme of assessment: logbook record.

End of higher training
Programme of learning: practice of the skill in a simulated environment, including decision making and human factors; prepared to act independently.
Programme of assessment: logbook record.

Fracture/dislocation manipulation

End of intermediate training
Programme of learning: supervised practice.
Programme of assessment: DOPS for fracture reduction/joint manipulation; logbook record.

End of higher training
Programme of learning: maintenance of skills throughout HST.
Programme of assessment: logbook record.

Large joint aspiration

End of intermediate training
Programme of learning: simulated practice; supervised practice.
Programme of assessment: DOPS for large joint aspiration (e.g. knee); logbook record.

End of higher training
Programme of learning: maintenance of skills throughout HST.
Programme of assessment: logbook record.

Suprapubic catheter re-insertion

End of intermediate training
Programme of learning: simulated practice; supervised practice.
Programme of assessment: DOPS for suprapubic catheter re-insertion; logbook record.

End of higher training
Programme of learning: maintenance of skills throughout HST.
Programme of assessment: logbook record of skills maintenance.

Point of care ultrasound (diagnostic) – modular level 1 theory training completed in ACCS

End of intermediate training
Basic Echo in Life Support (BELS) – is the heart beating? Is there cardiac tamponade? Is there right ventricular dilatation?; AAA; eFAST/Focussed Assessment for Free Fluid (FAFF).
Programme of learning: RCEM learning ultrasound resources on data interpretation; observed practice.
Programme of assessment: logbook record; DOPS; Educational Supervisor review of logbook regarding progress towards sign-off when competent.

End of higher training
Echo in Life Support (ELS), including IVC measurement, global contractility and assessment of fluid overload; AAA; eFAST/FAFF.
Programme of learning: RCEM learning ultrasound resources on data interpretation; observed practice.
Programme of assessment: logbook record; DOPS; Educational Supervisor review of logbook and sign-off for BELS, ELS, AAA and eFAST/FAFF by the end of HST. This is entrustment based; for guidance, the approximate numbers of scans expected are BELS 10, AAA 25, ELS 25 and eFAST/FAFF 25. Scans are recorded in the logbook throughout training, and a logbook is maintained for each modality when scanning independently.
