Checklists used in the diagnostic reasoning process can be divided into two categories. Content-specific checklists provide clinicians with relevant knowledge during the diagnostic process or trigger them to activate their knowledge. For example, these may list specific diagnostic steps or suggestions of possible diagnoses that should be considered for a specific patient.
One example is a checklist for interpreting electrocardiograms (ECGs) that included content-specific features, such as calculation of heart rate. Sibbald and colleagues found in several studies that the use of this checklist reduced diagnostic error in ECG interpretation.7–9 Other studies also showed an effect of content-specific checklists, but the effects are often small or apply only to a subgroup of clinicians.10,11
Process-focused checklists aim to trigger more deliberate thinking when diagnosing a patient. An example is a "debiasing" checklist that aims to reduce errors that occur due to shortcuts in the diagnostic reasoning process (i.e., cognitive biases).12 These checklists often contain items such as "what else can it be?"
A recent study by O'Sullivan and Schofield evaluated the effect of a cognitive forcing mnemonic, called "SLOW", on diagnostic accuracy. "SLOW" is an acronym for four questions: (1) "Sure about that? Why?" (2) "Look at the data, what is lacking? Does it all link together?" (3) "Opposite: what if the opposite is true?" and (4) Worst case scenario: "What else could this be?" A randomized trial found no effect of the SLOW intervention (compared with no intervention) on diagnostic accuracy based on clinical vignettes.13 Similarly, most studies that evaluated process-focused checklists found no significant effects on accuracy.14,15
Two studies have directly compared content-specific checklists and process-focused checklists. In a study by Sibbald and colleagues on ECG interpretation, the content-specific (knowledge-based) checklist described above was compared with a process-focused (debiasing) checklist and a control group. The overall results did not show significantly improved performance on ECG interpretation with either checklist.14 This finding contrasts with several earlier studies by Sibbald et al., in which the content-specific checklist showed an effect.7,8
A study by Shimizu and colleagues compared medical students' intuitive process (i.e., listing the three most likely diagnoses) with one of two checklists: (1) a content-specific checklist that suggested differential diagnoses for the case at hand, or (2) a process-focused checklist, i.e., a general debiasing checklist developed by Ely et al.,5 with items such as "Did I obtain a complete medical history?" and taking a "diagnostic time out."
The authors exposed the participants to both simple and difficult clinical case vignettes based on actual patient encounters. Overall, they found that the use of a checklist did not improve accuracy in the simple cases; on the contrary, diagnostic accuracy was reduced by the use of checklists. For difficult cases, the content-specific checklist improved diagnostic accuracy, but the debiasing checklist was not effective in either simple or difficult cases.16
Taking all this research into account, content-specific checklists seem more promising than process-focused checklists, but the evidence base remains thin, with few studies to date.