Programme of Assessment

The purpose of assessment

The purposes of the RCEM Programme of Assessment fall into three broad categories:

Assurance:

• demonstrate trainees have acquired the Generic Professional Capabilities and meet the requirements of Good Medical Practice

• ensure that trainees possess the essential underlying knowledge required for their specialty

• provide robust, summative evidence that trainees are meeting the curriculum standards during the training programme

Regulating progression & targeting remediation:

• assess trainees’ actual performance in the work place

• inform the ARCP, identifying any requirements for targeted or additional training where necessary and facilitating decisions regarding progression through the training programme

• identify performance concerns and ultimately trainees who should be advised to consider changes of career direction

Fostering self-regulated learners:

• enhance learning by providing formative assessment, enabling trainees to receive immediate feedback, understand their own performance and identify areas for development

• drive learning and enhance the training process by making it clear what is required of trainees and motivating them to ensure they receive suitable training and experience

• identify and encourage excellence

The purposes above have driven the design of the RCEM assessment strategy from start to finish. We have sought to define a fully integrated and complementary programme of assessment that recognises the strengths and limitations of its constituent parts so that the programme works as a whole. The programme of assessment is made up of three major elements.

1. A suite of formal RCEM examinations

2. A programme of work place based assessments (WPBAs)

3. A programme of regular, panel-based, information-rich, individualised judgements that regulate each trainee’s progression and remediation (where necessary).

Formal examinations

The formal RCEM examinations prioritise assurance for all stakeholders including trainees.  Consequently, the design focus is on reliability and a nationally consistent hurdle with transparent standards.  The suite of examinations provides assurance that a standard of knowledge and basic technique has been demonstrated by the trainee.

The MRCEM, in particular, aims to assess not only the attainment of learning, but also the potential to develop and thrive as an independent leader and practitioner following training.

Work Place Based Assessments (WPBAs)

The WPBA is designed to foster self-regulated learners and to provide the all-important information that will regulate trainees’ progression through the programme.

WPBA provides a structure for observing the individualised and contextualised application of learning.  By providing feedback and encouraging reflection it also helps trainees develop self-regulated learning skills.  The transparent links between the WPBA judgements, the judgements made by Faculty Entrustment Group panels and the levels of independence expected at each of the thresholds are critical for orienting learners to what is expected of them; this gives them both the stimulus and the data that they need to regulate their own learning.

Despite compromises in reliability, WPBA offers a better prediction of day-to-day performance, with all the complexity that EM work includes, than formal examinations do.  In particular, the RCEM instruments have been designed to make it easy for supervisors and others to flag concerns about any given trainee.  Conventional WPBA questions allow clear concerns to remain unshared, and this would create problems for patients, trainees and services.

The WPBA programme is designed to be used throughout training, and so offers the opportunity for pertinent developmental feedback and the highlighting of concerns at regular intervals through training when there is a chance to define plans to support learning.

Panel-based judgements

FEG statements work with the ARCP process to provide regular, panel-based, information-rich, individualised judgements that regulate each trainee’s progression and remediation (where necessary).  Like the WPBA programme, they are designed to foster self-regulated learners and to regulate trainees’ progression through the programme.  The faculty will consider the trainee’s work place performance and provide a summative recommendation about whether a trainee has met the standard in the SLOs relevant to their stage of training.  This information is combined with other evidence in a Structured Training Report (STR) that is completed by the trainee’s Educational Supervisor at the end of a block of training.  This, in turn, is reviewed by the ARCP panel who will make a decision regarding progression.

These elements are phased, reflecting the growing knowledge and experience of trainees.   At key thresholds in training the work place based assessment and RCEM examinations are co-ordinated to enable ARCP panels to adjudge readiness to cross a threshold in training.  This approach acknowledges the complementary nature of the component parts of the assessment programme.

The flow of information in the new programme of assessment is shown in Figure 1.

a. The Training Faculty will deliver a summative recommendation on each of the Clinical SLOs that are relevant to the trainee’s stage of training, i.e. whether they have met the standard for entrustment.  This is summarised within a FEG Statement.

b. The Educational Supervisor reviews the evidence collated for each of the Generic/Supporting SLOs and offers a judgement on progress in these.  A matrix providing guidance for Educational Supervisors in these SLOs is available (Appendix 2 below).

c. The Educational Supervisor also reviews WPBAs, Multi-Source Feedback and other relevant data, such as case load, critical incidents, reflections, log books and considers and offers insight on flags of concern.  This allows for an integrated and individualised collation of diverse evidence.

These three elements form the basis of the Educational Supervisor’s STR.  This, in turn, is reviewed by the ARCP panel. The panel will have access to all the relevant source material and will be able to provide oversight and ensure a nationally consistent approach and standard. The ARCP panel will make the final summative decision about progression.

When an ARCP occurs at a threshold in training (Threshold ARCP), the data held within the Educational Supervisor’s report will be combined with RCEM examination data to arbitrate on whether a trainee can cross the threshold, either into Higher Training or to complete training.

Figure 1. Information flow in RCEM Programme of Assessment

The RCEM assessment blueprint

The blueprint maps the Programme of Assessment to the curriculum (table 3).  It shows that each of the SLOs is assessed in a number of ways.

The SLOs provide the structure for the formal RCEM examinations.  The RCEM examinations are also tagged to the RCEM Clinical Syllabus.  In essence, anything that is in the Clinical Syllabus can appear in the formal examinations, and each of the relevant SLOs will be tested.

For the WPBA programme, it is not necessary to use each of the tools shown in the blueprint table for each of the SLOs.  These are examples of tools that might be used to provide evidence of learning in each of these.  The ‘summative’ element of the WPBA programme is the entrustment decision for the Clinical SLOs and the Educational Supervisor’s review of the Generic/Supporting SLOs.

Bloom’s revised taxonomy (BRT)

Standard setting

The MRCEM and FRCEM examinations are high-stakes summative assessments that have the potential to impact on trainee careers and patient safety.  The processes that underpin pass/fail decisions must be robust, consistent and fair.

Each element of the examination schedule has changed in either content or format for this current Programme of Assessment and the standard-setting schemes for these new components are set out below.

Whole programme

The standards across the programme are absolute (criterion-referenced) rather than relative (norm-referenced).  This means that the standard required to pass each stage of the assessment does not vary according to the ability of the cohort either nationally or locally.

The separate elements of the assessment programme (MRCEM Primary, MRCEM Intermediate, MRCEM OSCE, FEG judgements, STR, ARCP, FRCEM SBA and FRCEM OSCE) are treated in a conjunctive rather than compensatory manner.  In other words, each trainee must reach the required standard in every separate component in order to pass.  A good performance in one element cannot compensate for a poor performance in another element.
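As an illustration only (not part of the regulations), the conjunctive rule can be sketched in a few lines of Python; the component names and results here are hypothetical:

```python
# Hypothetical sketch of a conjunctive pass decision: every component
# must be passed, and a strong score in one component cannot offset a
# failure in another (unlike a compensatory, averaged scheme).

def conjunctive_result(component_passes: dict) -> bool:
    """Pass overall only if every separate component is passed."""
    return all(component_passes.values())

results = {
    "MRCEM Primary": True,
    "MRCEM Intermediate": True,
    "MRCEM OSCE": False,  # a single failed component...
}
print(conjunctive_result(results))  # → False: ...fails the programme
```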

The way in which the standard is defined within each element is described below.

Written papers

All written papers (MRCEM Primary, MRCEM Intermediate, FRCEM SBA) are now in the Single Best Answer (SBA) format.  Each item has only one appropriate response, making the format objective and suitable for a question-centred (item-centred) standard-setting method.  We will use the Angoff approach.

In line with best practice, a dedicated Angoff referencing group of examiners uses the Angoff process to determine a cut-off score and makes an adjustment of 1 Standard Error of Measurement (SEM) to arrive at the pass mark.  Training is given to all members of the MRCEM and FRCEM Angoff reference groups to develop a collective understanding of the ‘minimally competent’ candidate, as defined below:

“For the purposes of the SBA examination, a ‘minimally competent’ candidate is one who has only just enough depth and breadth of the knowledge stipulated within the (Core/Intermediate level) curriculum to underpin their current clinical practice and equip them for the next phase of training.”

In determining the ‘minimally competent’ candidate, members of the Angoff referencing groups are encouraged to use personal experience of trainees sitting the exam at the particular stage of training.
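For readers unfamiliar with the mechanics, the arithmetic of an Angoff cut-off with a 1 SEM adjustment can be sketched as follows.  All numbers (judge estimates, score SD, reliability) are invented for illustration, and the direction of the SEM adjustment is a standard-setting policy choice, not something this sketch decides:

```python
import math
import statistics as st

# Illustrative Angoff sketch (invented numbers, not real exam data).
# Each judge estimates, per item, the probability that a 'minimally
# competent' candidate answers correctly; the cut-off score is the sum
# of the mean item estimates, then adjusted by 1 SEM.

judge_estimates = [           # rows: items; columns: judges
    [0.60, 0.70, 0.65],
    [0.80, 0.75, 0.80],
    [0.50, 0.55, 0.60],
]
cut_score = sum(st.mean(item) for item in judge_estimates)

score_sd = 1.8                # SD of candidate total scores (assumed)
reliability = 0.9             # e.g. Cronbach's alpha (assumed)
sem = score_sd * math.sqrt(1 - reliability)
pass_mark = cut_score + sem   # adjustment direction is a policy choice

print(round(cut_score, 2), round(sem, 2))  # → 1.98 0.57
```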

After each examination, item analysis provides the exam board with data on items with unexpected performance statistics.  Highlighted items are reviewed, and if the item itself is problematic it is removed from the paper before scores are finalised.

Face-to-face examinations

The MRCEM OSCE and FRCEM OSCE are different from the SBAs in that it is not possible to define the only appropriate response or responses ahead of time.  This means that a candidate-centred (performance-centred) method of standard setting is required.  We will use the borderline regression method for setting the standard for the examination.

We have also chosen to extend the conjunctive approach used for the programme and treat some areas of performance within the face-to-face examinations in a conjunctive manner.  This is because we consider that there are some areas of practice where a weak candidate should not be able to progress to the next stage of training (or exit training) however strong their performance elsewhere in the examination.  This is, in essence, a standard-setting position that raises the required standard beyond the purely compensatory approach to stations.


Which domains, stations and combinations of stations will be treated conjunctively will be determined at the blueprinting stage on a cycle-by-cycle basis, and will be reviewed at exam board in the light of station analysis data from the sitting.  However, examples may include:


  • Candidates who perform poorly in any one domain across the whole examination (e.g. communication).
  • Candidates who perform poorly in more than one station blueprinted to the same SLO (e.g. PEM – SLO 5).
  • Candidates who fail outright a station that performs well and is regarded as being of non-negotiable importance in any given examination (e.g. resus – SLO 3).
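By way of illustration (the stations, SLO mappings and rules here are hypothetical, echoing the examples above), such conjunctive rules could be applied on top of the compensatory total as follows:

```python
# Hypothetical sketch: conjunctive OSCE rules applied in addition to
# the compensatory total score. Stations, SLO tags and rules are
# invented for illustration.

stations = {                  # station -> (SLO tag, passed?)
    "resus":  ("SLO3", True),
    "pem_1":  ("SLO5", False),
    "pem_2":  ("SLO5", False),
    "comms":  ("SLO7", True),
}

def conjunctive_fail(stations: dict) -> bool:
    # Rule 1: outright failure of a non-negotiable station (e.g. resus)
    if not stations["resus"][1]:
        return True
    # Rule 2: failing more than one station mapped to the same SLO
    fails_per_slo = {}
    for slo, passed in stations.values():
        if not passed:
            fails_per_slo[slo] = fails_per_slo.get(slo, 0) + 1
    return any(n > 1 for n in fails_per_slo.values())

print(conjunctive_fail(stations))  # → True: two SLO5 stations failed
```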

Fairness

The fairness of an exam refers to its freedom from bias.  To this end and in the exercising of its duties under the Equality Act 2010, the RCEM must give due regard to:

• eliminating discrimination, harassment, victimisation and any other unacceptable conduct;

• advancing equality of opportunity between persons who share a relevant protected characteristic and persons who do not share it; 

• fostering good relations between persons who share a relevant protected characteristic and persons who do not share it. 

The RCEM aims to ensure that everyone has equal opportunity to demonstrate their ability in the examinations and that no candidate is treated less favourably than another on grounds of race, disability, sex, gender reassignment, sexual orientation, age, religion or belief, or pregnancy and maternity. 

To ensure RCEM exams do not disadvantage any candidate, or group of candidates, on any basis other than a lack of the required knowledge and skills, all RCEM exam item developers take particular care in the wording of questions to avoid ambiguity or offence across cultures.  Additionally, examiners seek advice from relevant experts, organisations and associations, or draw on the experience of individuals in the work place, on topics which may potentially affect certain ethnicities or cultures. 

RCEM encourages feedback from candidates on the examination process and its content, and this provides access to a viewpoint on some protected characteristics not reflected in our examiner groups. 

Reasonable adjustments

RCEM considers reasonable adjustments for exam candidates with disability, as set out in the examination regulations. Special arrangements for pregnancy and temporary medical conditions are also provided where necessary.  Full details are available within the RCEM examinations regulations. 

Equality analysis 

Equality analysis is an integral part of examination policy, content and practice. RCEM carries out objective, evidence-based equality analysis when making decisions relating to exam changes, policies, question writing and practices. This ensures that full consideration is given to the effect such decisions may have on the fairness of the exam, and aims to prevent discrimination, promote diversity and inclusivity for all groups of people.

Quality assurance

A full Person Specification and Job Description is provided for examiners.  The Examiner Regulations give full details of all parts of the selection and appointment process used by the RCEM. 

Examiner training is mandatory for new examiners and includes the principles of assessment, standard setting, examiner practice and calibration, and known and unknown bias.  After attendance at an Examiner Workshop, examiners are required to observe an OSCE day prior to examining, which gives them the opportunity to see candidates being assessed in a real examination.  Examiners are required to attend the Examiners’ Workshop every five years.

RCEM exams do not currently use lay examiners, although lay input is encouraged through attendance at committees and working groups. 

Feedback

RCEM believes it is important to provide feedback to candidates beyond a standard pass-fail result, to assist them in understanding and interpreting their overall result. RCEM does not attempt to justify the result given or the marks awarded, whether overall or for specific sections or skill domains. Marks are awarded using strict guidelines. The decision on marks awarded is final and therefore papers cannot be remarked. 

The following feedback is provided in all exam results letters/feedback enclosures:

  • Confirmation of the candidate’s pass-fail result 
  • Confirmation of the number of attempts used/maximum number of attempts 
  • The examination pass mark as a raw score in relation to the maximum achievable test score (e.g. 315/420) and/or the percentage value (e.g. 75%) 
  • The candidate’s overall score as a raw score and/or as a percentage 

Entrustment Decisions

Transitions and the crossing of thresholds are about taking on new responsibilities with a higher degree of independence.  Knowing whether a trainee is ready to do so is complex.  It requires a clear working knowledge of what the responsibilities involve, and the ability to predict how a trainee will respond when given responsibility.  An example is the care of patients in the resuscitation room (SLO3) while the on-call consultant is at home.

This kind of assessment is an example of ‘judgement-based’ assessment.  Scholarship in this field has seen a major transition from reductionism (breaking the assessment down to multiple ‘objective’ elements and assessing these) to entrustment (making the most of the sophisticated, contextual, individualised global judgements of which clinician trainers are capable).  Key features of good judgement-based assessment are asking the right people and asking the right questions.  The FEG panels are composed of staff who know the trainee well and know the responsibilities of the job well.  This provides us with the best chance of meaningful FEG judgements.  Critically, the judgements are framed in terms of entrustment and independence.  This aligns with the natural decision-making heuristics of clinician supervisors, and there is good empirical evidence that such ‘construct aligned’ judgements are significantly more dependable than judgements framed in terms of training stage or merit (e.g. poor, satisfactory, or good).

The WPBA approach is built around preparing trainees for thresholds in training.  To that end, assessments in the work place are also aligned to entrustment/independence.  The RCEM entrustment scale is shown in table 4.

Table 4. RCEM entrustment scale

1 Direct supervisor observation/involvement, able to provide immediate direction/ assistance
2a Supervisor on the ‘shop-floor’ (e.g. ED, theatres, AMU, ICU), monitoring at regular intervals
2b Supervisor within hospital for queries, able to provide prompt direction or assistance and trainee knows reliably when to ask for help
3 Supervisor ‘on call’ from home for queries, able to provide directions via phone and able to attend the bedside if required to provide direct supervision
4 Would be able to manage with no supervisor involvement (all trainees practise with a consultant taking overall clinical responsibility) 

The expectations of EM trainees at each of the key thresholds in training are shown in figure 3.  This ensures that the requirements are transparent and explicit for all – trainers, trainees and the public.  Making these expectations transparent for trainees is one of the ways our assessment scheme is designed to foster self-regulated learners.  By providing a common and transparent map of what is expected from start to finish over the training journey, we give trainees the best chance of orienting themselves in terms of their progress so far and their next steps.  We also ensure consistency of feedback across the whole learning journey, making it more credible to learners.

FEG decisions are extremely important for trainees and should not come as a surprise at the end of a period of training.  The design of WPBAs, with an entrustment score offered in feedback, means that this should not be the case if trainees engage with the training opportunities available.

Figure 3. RCEM Entrustment requirements

ACCS: six months covering each of anaesthetics, acute medicine, intensive care medicine and emergency medicine, which can be worked in any order.  SLOs are reviewed in each post, with the entrustment level required by the end of ACCS shown below.

Intermediate: paediatric emergency medicine, leadership roles, supporting the wider EM team.

Higher Specialty Training: leading, EPIC, delivering challenging cases, management and administration toolkit, research, supervision and teaching.

 
Clinical SLOs | EM | IM | An | ICM | Intermediate | ST4 | ST5 | ST6
Care for physiologically stable patients attending the ED across the full range of complexity | 2b | 2b | – | – | 3 | * | * | 4
Answer clinical questions | 2a | 2a | – | – | 3 | * | * | 4
Resuscitate and stabilise | 2b | 2b | 2b | 2b | 3 | * | * | 4
Care for an injured patient | 2b | – | – | – | 3 | * | * | 4
Care for children in the ED | – | – | – | – | 3 | * | * | 4
Deliver key procedural skills | refer to Clinical ACCS LO 5 table | 3 | * | * | 4
Deal with complex situations in the workplace | 2a | 2a | 2a | 2a | 3 | * | * | 4
Lead the ED shift | – | – | – | – | 3 | * | * | 4
Provide basic anaesthetic care (ACCS) | – | – | 2b | – | – | – | – | –
Manage patients with organ dysfunction and failure (ACCS) | – | – | – | 2a | – | – | – | –
Supporting SLOs (Teach and supervise; Participate in research; Patient safety & quality improvement; Lead, manage, administer) | reviewed via the Educational Supervisor’s (ES) report at each stage

*Progress in each of the clinical SLOs towards independence will be assessed in each year in HST

Entrustment level for procedural skills varies in ACCS. Please see Section 5.5.1 for details

Faculty Educational Governance Group (FEG) Statement

What is it?

This is a statement that summarises the collated views of the training faculty about the progress of a trainee, specifically their suitability to move to the next stage of training.  This judgement is based on observation of the trainee in the work place, on feedback from staff and patients, and on what faculty members have learned about the trainee’s performance from WPBAs.  (Individual WPBAs and reflections need not be reviewed by the training faculty at each FEG meeting, but they are available for review if the faculty judges that more data are needed.)  Within this statement, the strengths of the trainee are summarised as well as areas to develop, giving the opportunity to reflect and to encourage excellence.  The FEG panel can also offer a suggestion for how the trainee might address any ongoing training needs, potentially making the FEG an ‘adaptive’ or individualised assessment.

The FEG Statement was introduced in RCEM training in 2015, with a decision relating to the whole training year in general.  The evolution in this current programme of assessment is that the decision is now linked explicitly to progress in the relevant Specialty Learning Outcomes.  Anchoring this decision to independence with a clear description of what is required will be a significant benefit to trainees and trainers in making these decisions fairer and more transparent. 

The FEG Statement serves a summative purpose within our assessment programme.  It is then triangulated with other information in the Educational Supervisor’s report, to inform ARCP decision making.  The FEG Statement is held on the e-portfolio and is accessed only by the trainee, the Educational/Clinical supervisor and the Training Programme Director.

The FEG process provides the opportunity for deeper, more timely, and more information-rich scrutiny of progress towards the key work place SLOs than the old Supervisor Report was able to deliver.

How is it done?

The FEG Statement can be made in different ways according to local arrangements.  However, the key feature of the FEG is that it includes the views of the right people – those who know the trainee and know the responsibilities of the job.  It must represent the collated views of the training faculty as to whether they believe a trainee has met the requirement for practice in each of the relevant SLOs at the level of independence specified for their stage of training.  The decision will relate to the Key Capabilities for each SLO that are relevant to the trainee’s stage of training.

The faculty is bound by the requirements on them of the GMC’s Good Medical Practice guidance, by the requirements for fairness and transparency, the requirement that equality and diversity is respected and by the personal ethics and probity of individual members.

Good practice from a number of centres has been that ‘educational governance’ is a standing agenda item at consultant meetings and that all trainees are discussed at regular (e.g. two-monthly) intervals.  This approach ensures that concerns are documented and shared early and that trainees can be better supported.  It facilitates encouragement of trainees and feedback on excellence.  It is also fair to trainees: they receive a summative decision from the same panel, and are aware throughout of how that group is minded towards their progress in each of the relevant SLOs.

The final meeting is for the purposes of FEG Statement completion.  A quorate meeting would include at least three consultants, who must be trained Educational Supervisors. 

Other centres have a designated training faculty from among their consultant body that performs this function at a formal Educational Governance meeting comprising the College Tutor (or equivalent), the Educational/Clinical supervisor and at least two other consultant trainers.  At this meeting the progress of each trainee against each SLO is discussed, and the output of this meeting is the Faculty Entrustment Group Statement.

Example:

SLO1: Care for a physiologically stable patient across the full range of complexity. Core (ACCS) trainee:

‘We believe this trainee can be trusted to take a history, examine the patient and elicit key clinical signs, construct a differential diagnosis that considers a realistic worst-case scenario, and describe an appropriate management plan with senior help available but not directly overlooking their work. The trainee can be relied upon to seek help when required.’ 

This is the Key Capability for SLO 1 and describes entrustment level 2b.

The panel’s view is sought.  Panellists will be asked to reflect on their experience of trainees across the full spectrum of cases.  This decision is a statement about the confidence of the team that a learner can be relied upon to make a safe assessment and seek help as needed.  A yes/no answer is required. 

This process is repeated for the other SLOs that are relevant to the current phase of training. 

The FEG Statement is recorded in the trainee’s e-portfolio by their Educational or Clinical Supervisor. The FEG Statement also includes general feedback on trainee strengths and areas to develop.

When is it done?

Final FEG statements are made towards the end of a given block of training in an Emergency Medicine placement.  This is typically six months (whole time equivalent) during ACCS and Intermediate Training, and yearly in Higher Training.  However, with most approaches to FEG, it should be possible for the faculty to indicate to the trainee their general progress towards the final FEG statement at regular intervals ahead of time.  WPBA performance should also give a strong indication of progress.

What if a trainee is deemed not ready to progress?

For the large majority of trainees these decisions will be positive.  However, if problems or concerns are raised about a trainee in departmental education governance meetings, or by other means, these can be fed back with learning needs identified and a plan to remediate put in place.  If these persist throughout an entire block of training this will be reflected in the FEG Statement and the subsequent ARCP panel will outline an appropriate training plan.

An opinion that a trainee is not ready to progress should not come as a surprise at the end of a placement, and should not be seen as punitive by the trainee or trainers.  It is a formal recording of the opinion of the faculty on progress at the end of that training block and reflects support and deliberation throughout the block.

Indeed, the fact that not all trainees will reach the threshold after the same duration of training is a realistic reflection of the variation that exists between learners and their experience, and of the complex and highly responsible roles that a higher trainee and an EM consultant embody. 

Assessment of Specialty Learning Outcome 6 Procedural skills

Procedural skills in ACCS (ACCS Learning Outcome 5)

There are a number of procedural skills in which a trainee must become proficient to the level expected by the end of ACCS.

ACCS trainees must be able to outline the indications for these procedures and recognise the importance of valid consent, aseptic technique, safe use of analgesia and local anaesthetics, minimisation of patient discomfort, and asking for help when appropriate. For all practical procedures, the trainee must be able to recognise complications and respond appropriately if they arise, including calling for help from colleagues in other specialties when necessary.

ACCS trainees should ideally receive training in procedural skills in a clinical skills lab before performing these procedures clinically, but this is not mandatory. Assessment of procedural skills will be made using the direct observation of procedural skills (DOPS) tool on simulated or actual patients.

The table below sets out the minimum competency level expected for each of the practical procedures at the end of ACCS.

When an ACCS trainee has been signed off as being able to perform a procedure independently, they are not required to have any further assessment (DOPS) of that procedure, unless they or their educational supervisor think that this is required (in line with standard professional conduct). This also applies to procedures that have been signed off during other training programmes. They would be expected to continue to record activity in their logbook.


Pleural aspiration of air

Entrustment requirement: 2b

Programme of learning

e-learning module

Simulated practice or supervised practice on patient

Programme of assessment

DOPS assessment

Chest drain: Seldinger technique

Entrustment requirement: 2b

Programme of learning

e-learning module

Simulated practice and/or supervised practice on patient

Programme of assessment

DOPS assessment

Chest drain: open technique

Entrustment requirement: 1

Programme of learning

e-learning module

Simulated practice and/or supervised practice on patient

National Safety Standards for Invasive Procedures (NatSSIPs) checklist

ATLS or equivalent trauma course

Programme of assessment

DOPS assessment OR

Supervised practice on patient with reflection recorded

Simulated practice with reflection recorded OR

ATLS certificate

Establish invasive monitoring (CVP and arterial line)

Entrustment requirement: 2b

Programme of learning

Simulated practice and/or supervised practice

Programme of assessment

DOPS assessment for CVP line AND

DOPS assessment for arterial line

Vascular access in emergency (IO and femoral vein)

Entrustment requirement: 1

Programme of learning

Simulated practice and/or supervised practice

ATLS or similar trauma course

Programme of assessment

DOPS assessment OR

Supervised practice on patient with reflection recorded OR

Simulated practice with reflection recorded

Fracture/dislocation manipulation

Entrustment requirement: 1

Programme of learning

Supervised practice on patient

Programme of assessment

DOPS assessment OR

Supervised practice with reflection recorded

External pacing

Entrustment requirement: 2b

Programme of learning

e-learning module on bradyarrhythmias

Simulated practice and/or supervised practice on patient

ALS course

Programme of assessment

DOPS assessment OR

Supervised practice on patient with reflection recorded OR

Simulated practice with reflection recorded

DC cardioversion

Entrustment requirement: 2b

Programme of learning

e-learning module on broad and narrow complex tachycardias

Simulated practice and/or supervised practice

ALS course

Programme of assessment

DOPS assessment OR

Supervised practice on patient with reflection recorded OR

Simulated practice with reflection recorded

Point of care ultrasound-guided vascular access and fascia iliaca nerve block

Entrustment requirement: 2b

Programme of learning

Simulated practice and/or supervised practice on patient

Modular level 1 theory training

Programme of assessment

DOPS assessment for peripheral and central vascular access AND

DOPS assessment for fascia iliaca nerve block

Lumbar puncture

Entrustment requirement: 2b

Programme of learning

e-learning module

Simulated practice and/or supervised practice on patient

Programme of assessment

DOPS assessment


Procedural Skills in Intermediate and Higher Training

During Intermediate and Higher training learners will be expected to become more expert in all the practical procedures previously undertaken.

Continued performance of ACCS procedural skills in intermediate and higher training will be recorded in the RCEM log book. This record should include any complications and the level of support received, if relevant. Episodes where the trainee was supervising others in these skills should also be recorded as learning events. The primary tool for the assessment of procedural skills is the direct observation of procedural skills (DOPS) tool, which can be used on simulated or actual patients unless otherwise specified in the table below.

The purpose of this document is not to provide an exhaustive list of the medical procedures that an Emergency Medicine consultant may need or choose to perform over the course of their career; accordingly, some common medical procedures are not included in the following list. It is intended to ensure that Emergency Medicine trainees are trained in the emergency procedures typically required to preserve life or limb, or to treat painful emergency conditions, in a timely and effective manner when no more expert or appropriately trained practitioner is immediately available.

It should be noted that a number of the life-saving skills covered in this curriculum, such as resuscitative thoracotomy and resuscitative hysterotomy, are used only rarely. These skills are included because they may conceivably be required by a “day one” consultant in Emergency Medicine. Because these interventions are rare, it is not expected that every trainee will have observed practice in these areas; however, higher trainees are required to be able to outline the indications for these procedures, know where to request help, and have had observed practice of the skill in a simulated environment, including decision making and human factors. Whilst not essential or mandatory, it is recommended that this includes cadaveric experience, where appropriate, prior to completion of training.

We have indicated that by the end of intermediate training trainees should have progressed to the level 3 entrustment grade for all emergency procedures, and to level 4 by the end of higher training (see table 4 for the RCEM entrustment scale). This does not imply that the trainee would be entrusted to perform any or all procedures independently, without assistance or supervision, at the start of higher training. Rather, it means that the trainee is entrusted to reliably recognise when a particular procedure is indicated and to begin preparation while waiting for appropriate assistance to attend, so that the procedure can proceed safely. This assistance is likely to include the Emergency Medicine consultant on duty and other members of the multidisciplinary team, e.g. an anaesthetist or the relevant surgeon. After intermediate training the trainee would be expected to have reached a standard at which they can play a valuable role in this team. This role would develop through higher training, on completion of which they would be entrusted to lead this multidisciplinary team or to perform any role within it.

Minimum standards for progression from intermediate and higher training

For each procedure below, the first programme of learning and assessment describes the minimum standard for the end of intermediate training, and the second the minimum standard for the end of higher training.
Paediatric sedation

Programme of learning

Completion of RCEM e-learning module on paediatric sedation

Attendance at paediatric sedation simulation session

Performance of observed procedural sedation on patients under the direct supervision of an ED Consultant

Programme of assessment

Certificate of completion of RCEM e-learning module

Certificate of attendance at paediatric sedation simulation day

DOPS assessment for paediatric sedation (simulation)

Programme of learning

Performance of observed procedural sedation on patients under the direct supervision of an ED Consultant

Programme of assessment

Logbook record

DOPS assessment of observed practice on a patient if the intermediate DOPS was a simulation
Advanced airway management

Emergency surgical airway

RSI

Programme of learning

IAC training during ACCS

Ongoing observed or simulated practice

Simulated skills practice including surgical airway training

ATLS or similar trauma course

Programme of assessment

IAC certificate

DOPS assessment

Log book record

Emergency surgical airway

RSI

Programme of learning

10 intubations per year on patients, which may include:

RSI activity in ED/theatres (including as part of a multidisciplinary team)

Simulated skills practice session including surgical airway training

Programme of assessment

Logbook record

Non-invasive ventilation

Programme of learning

e-learning module on NIV

Observed or simulated practice of NIV initiation

Programme of assessment

Certificate of completion of e-learning module

DOPS assessment

 

Log book record of skill maintenance

Open Chest drain

Programme of learning

Understanding of National Safety Standards for Invasive Procedures (NatSSIPs) checklist

Observed or simulated practice

Programme of assessment

DOPS assessment

Log book record

Programme of learning

Observed practice

Observed instruction of technique

Programme of Assessment

DOPS assessment of observed practice on a patient if the intermediate DOPS was a simulation

Logbook record of skill maintenance
Resuscitative thoracotomy

Programme of learning

e-learning module

Simulated practice

Programme of assessment

Certificate of completion of e-learning module

Programme of learning

Simulated practice

Programme of assessment

DOPS assessment
Lateral Canthotomy

Programme of learning

e-learning module

Simulated practice

Programme of assessment

Certificate of completion of e-learning module

Programme of learning

Simulated practice

Programme of assessment

DOPS assessment
DC cardioversion

Programme of learning

e-learning module on broad and narrow complex tachycardias 

OR ALS course

Observed or simulated practice

Programme of assessment

DOPS assessment

Logbook record

Programme of learning

Maintenance of skills throughout HST

Observed instruction of management of tachydysrhythmias

Programme of assessment

Logbook record
External pacing

Programme of learning

e-learning module on bradyarrhythmias

OR ALS course

Observed or simulated practice

Programme of assessment

Certificate of completion of e-learning module OR

ALS certificate

DOPS assessment

Logbook record

Programme of learning

Maintenance of skills throughout HST

Observed instruction of management of bradydysrhythmias

Programme of assessment

Logbook record
Pericardiocentesis

Programme of learning

e-learning module

 

Simulated practice

Programme of assessment

Certificate of completion of e-learning module

Programme of learning

Simulated practice

Programme of assessment

DOPS assessment

ED management of life-threatening haemorrhage

Programme of learning

e-learning modules

ATLS or similar trauma course

Observed or simulated practice of direct and indirect haemorrhage control techniques, including but not limited to:

• Wound management

• Bleeding varicose veins

• Nasal packing

• Splints (e.g. pelvic sling, traction splint)

• Tourniquet

• Use of haemostatic agents

Programme of assessment

Certificates of completion of e-learning modules

DOPS assessments of relevant techniques

Logbook record

 

Programme of learning

Observed practice

Observed instruction of technique

Programme of Assessment

DOPS assessments for observed practice on patients of relevant techniques and teaching

Logbook record of skill maintenance
Emergency delivery

Programme of learning

e-learning module

Simulated practice

Programme of assessment

DOPS assessment

Logbook record

Programme of learning

Observed or simulated practice

Programme of assessment

DOPS assessment

Logbook record
Resuscitative hysterotomy

Programme of learning

e-learning module

Simulated practice

Programme of assessment

Certificate of completion of e-learning module

Programme of learning

Simulated practice

Programme of assessment

DOPS assessment
Fracture / Dislocation manipulation

Programme of learning

Supervised practice of fracture and dislocation manipulation and splinting techniques, including but not limited to:

• Shoulder

• Elbow

• Wrist

• Finger

• Hip

• Femur

• Lower leg

• Ankle

• Toes

• Mandible

Programme of assessment

DOPS assessments for various fracture reduction and joint manipulation techniques

Logbook record

Programme of learning

Observed practice

Observed instruction of technique

Programme of assessment

DOPS assessments for relevant observed practice of various techniques and teaching

Logbook record of skill development and maintenance
Large joint aspiration

Programme of learning

e-learning module

Observed or simulated practice

Programme of assessment

DOPS assessment

Logbook record

Programme of learning

Maintenance of skills throughout HST

Programme of assessment

Logbook record
Suprapubic catheter re-insertion

Programme of learning

Simulated practice

Supervised practice

Programme of assessment

DOPS for suprapubic catheter re-insertion

Logbook record

Programme of learning

Maintenance of skills throughout HST

Programme of assessment

Logbook record of skills maintenance

Point of care Ultrasound (Diagnostic)

Programme of Learning

e-learning USS resources on data interpretation

Ongoing observed and simulated practice of ACCS skills

Observed and simulated practice of;

• Echo in Life Support (ELS): Is the heart contracting? Is there a pericardial effusion causing cardiac tamponade? Is there evidence of right ventricular strain?

• AAA

• eFAST / Focussed Assessment for Free Fluid (FAFF)

Programme of Assessment

Log Book record

DOPS assessments

Educational supervisor review of logbook regarding progress towards sign-off when competent

Programme of Learning

e-learning USS resources on data interpretation

Ongoing observed and simulated practice of ACCS and intermediate skills

Observed and simulated practice of;

  • Shock assessment – including IVC measurement, global contractility and assessment of fluid status (including overload and hypovolaemia).

Programme of Assessment

Log Book record

DOPS assessments

Educational supervisor review of logbook and sign-off by the end of HST for:

  • ELS
  • Shock assessment
  • AAA
  • eFAST/ FAFF

Entrustment based. For guidance, the approximate number of scans expected is: ELS 10; AAA 25; Shock assessment 25; eFAST/FAFF 25. Scans should be recorded in the logbook throughout training, and a logbook should be maintained for each modality when scanning independently.



 
