Comparison of a Prototype for Indications-Based Prescribing with 2 Commercial Prescribing Systems

Pamela M. Garabedian, Adam Wright, Isabella Newbury, Lynn A. Volk, Alejandra Salazar, Mary G. Amato, Aaron W. Nathan, Katherine J. Forsythe, William L. Galanter, Kevin Kron, Sara Myers, Joanna Abraham, Sarah K. McCord, Tewodros Eguale, David W. Bates, Gordon D. Schiff

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

Importance: The indication (reason for use) for a medication is rarely included on prescriptions despite repeated recommendations to do so. One barrier has been the design of existing electronic prescribing systems.

Objective: To evaluate the efficiency, error rate, and satisfaction associated with a new computerized provider order entry prototype for the outpatient setting that allows clinicians to initiate prescribing with the indication, in comparison with the prescribing modules of 2 leading electronic health record systems.

Design, Setting, and Participants: This quality improvement study used usability tests requiring internal medicine physicians, residents, and physician assistants to enter prescriptions electronically, including the indication, for 8 clinical scenarios. Tool order assignments were randomized, and prescribers were asked to use the prototype for 4 of the scenarios and their usual system for the other 4. Time on task, number of clicks, and order details were captured. User satisfaction was measured using posttask ratings and a validated system usability scale. The study participants practiced in the outpatient practices of 2 health systems. Usability tests were conducted between April and October 2017.

Main Outcomes and Measures: Usability (efficiency, error rate, and satisfaction) of the indications-based computerized provider order entry prototype vs the electronic prescribing interfaces of 2 electronic health record vendors.

Results: Thirty-two participants (17 attending physicians, 13 residents, and 2 physician assistants) used the prototype to complete 256 usability test scenarios. The mean (SD) time on task with the prototype was 1.78 (1.17) minutes. The 20 participants who used vendor 1's system took a mean (SD) of 3.37 (1.90) minutes to complete a prescription, and the 12 participants using vendor 2's system took a mean (SD) of 2.93 (1.52) minutes. Across all scenarios, participants using the prototype needed significantly fewer clicks than those using vendor 1's system (mean [SD], 18.39 [12.62] vs 46.50 [27.29]; difference, 28.11; 95% CI, 21.47-34.75; P < .001). The prototype also required significantly fewer clicks than vendor 2's system (20.10 [11.52] vs 38.25 [19.77]; difference, 18.14; 95% CI, 11.59-24.70; P < .001). A blinded review of the order details revealed medication errors (eg, drug-allergy interactions) in 38 of 128 prescribing sessions using a vendor system vs 7 of 128 with the prototype.

Conclusions and Relevance: Reengineering prescribing to start with the drug indication allowed indications to be captured in an easy and useful way, which may be associated with saved time and effort, reduced medication errors, and increased clinician satisfaction.
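The click-count comparison for vendor 1 can be reproduced approximately from the summary statistics in the abstract alone. The sketch below is an assumption-laden illustration, not the paper's actual analysis: it assumes 80 sessions per arm (20 participants × 4 scenarios each) and uses a simple normal-approximation interval, whereas the published model may have accounted for repeated measures within prescribers.

```python
import math

# Summary statistics reported in the abstract (mean [SD] clicks per session).
# Session counts are an assumption: 20 participants x 4 scenarios = 80 sessions per arm.
mean_proto, sd_proto, n_proto = 18.39, 12.62, 80
mean_v1, sd_v1, n_v1 = 46.50, 27.29, 80

# Difference in means and its standard error (two independent samples).
diff = mean_v1 - mean_proto
se = math.sqrt(sd_proto**2 / n_proto + sd_v1**2 / n_v1)

# Normal-approximation 95% confidence interval for the difference.
lo, hi = diff - 1.96 * se, diff + 1.96 * se

print(f"difference = {diff:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

Under these assumptions the computed interval lands close to the reported difference of 28.11 (95% CI, 21.47-34.75), which is consistent with the stated sample sizes.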

Original language: English
Article number: e191514
Journal: JAMA Network Open
Volume: 2
Issue number: 3
DOIs
State: Published - Mar 2019
