1. Computerized clinical decision support (CDS) systems are currently unable to identify relevant appropriateness criteria for roughly two-thirds of advanced imaging studies ordered.
2. When CDS systems identified an ordered imaging study as inappropriate, they effected a change or cancellation of the order roughly 6% of the time, and roughly twice as often when able to suggest an alternative study.
Evidence Rating Level: 2 (Good)
Study Rundown: Clinical decision support (CDS) systems attempt to partially automate improvements in patient care by fostering adherence to algorithmic treatment recommendations. They function by cross-referencing patient characteristics, often entered by the ordering physician, against established appropriateness criteria relevant to each diagnostic study ordered. When these criteria are not met, the system ideally recognizes the discrepancy and notifies the ordering physician while offering suitable alternatives. The current study sought to determine whether CDS systems are able to match imaging studies to given appropriateness criteria and, after implementation, what proportion of orders are changed or cancelled. This large, multicenter observational study found that CDS systems were unable to identify relevant appropriateness criteria for roughly 65% of imaging orders. During the baseline period, the CDS system only tracked order appropriateness, identifying 11.1% of final orders as inappropriate, versus 6.4% during the intervention period, when the system could feed appropriateness data back to the ordering clinician. Of those studies initially identified as inappropriate, roughly twice as many orders were changed or cancelled when the CDS was able to suggest an alternative versus baseline. The study was strengthened by its scale and diversity, but limited primarily by poor physician uptake and by the variation and potential imprecision of user-entered patient data. Overall, the study suggests that CDS systems are primarily limited by an inability to reliably categorize orders within given appropriateness criteria, likely due to a paucity of patient data entered by clinicians. Future studies should focus on improving data input methods to increase clinician uptake and provide higher-quality datasets to CDS systems.
Click to read the study in JAMA
Relevant Reading: Effects of computerized clinical decision support systems on practitioner performance and patient outcomes
In-Depth [cross-sectional study]: A total of 117,348 diagnostic imaging studies were examined in this multicenter, observational study spanning 22 months. Data were acquired from the orders of 3,340 clinicians in varied practice environments, ranging from large academic medical centers to small independent practices. Six months of baseline data were acquired, during which CDS systems tracked the appropriateness of advanced imaging orders but did not provide any feedback to the ordering clinicians. This was followed by an 18-month intervention period in which feedback was permitted and alternative studies were suggested when possible. During the baseline period, no relevant appropriateness criteria could be identified for 63.3% of orders, and 11.1% of final orders were determined to be inappropriate. Of those rated as inappropriate, 4.8% were changed and 1.9% were cancelled without any CDS system feedback. In the intervention period, 66.5% of studies had no relevant appropriateness criteria attached, and 6.4% of final orders were deemed inappropriate. Of those rated as inappropriate for which the CDS system suggested an alternative, 9.9% were changed and 0.4% were cancelled, versus 1.4% changed and 2.8% cancelled when no alternative could be suggested.
Image: PD
©2015 2 Minute Medicine, Inc. All rights reserved. No works may be reproduced without expressed written consent from 2 Minute Medicine, Inc. Inquire about licensing here. No article should be construed as medical advice and is not intended as such by the authors or by 2 Minute Medicine, Inc.