Health system guidance appraisal-concept evaluation and usability testing

Cited by: 4
Authors
Ako-Arrey, Denis E. [1 ]
Brouwers, Melissa C. [1 ]
Lavis, John N. [2 ]
Giacomini, Mita K. [3 ]
Affiliations
[1] McMaster Univ, Hamilton, ON L8V 1C3, Canada
[2] McMaster Univ, Hamilton, ON L8S 4L6, Canada
[3] McMaster Univ, Hamilton, ON L8S 4K1, Canada
Source
IMPLEMENTATION SCIENCE | 2016 / Vol. 11
Funding
Canadian Institutes of Health Research;
Keywords
Health system guidance; Guidance development; Guidance appraisal; Guidance reporting; Health system challenges; Health system arrangements; AGREE-HS; AGREE II; CARE;
DOI
10.1186/s13012-015-0365-3
Chinese Library Classification (CLC)
R19 [Health care organization and services (health service administration)];
Abstract
Background: Health system guidance (HSG) provides recommendations aimed at addressing health system challenges. However, there is a paucity of methods to direct, appraise, and report HSG. Earlier research identified 30 candidate criteria (concepts) that can be used to evaluate the quality of HSG and to guide development and reporting requirements. The objective of this paper was to describe two studies aimed at evaluating the importance of these 30 criteria, designing a draft HSG appraisal tool, and testing its usability.
Methods: This study involved a two-step survey process. In step 1, respondents rated the 30 concepts for their appropriateness to, relevance to, and priority for health system decisions and HSG. This led to a draft tool. In step 2, respondents reviewed HSG documents, appraised them using the tool, and answered a series of questions. Descriptive analyses were computed.
Results: Fifty participants were invited in step 1, and we had a response rate of 82%. The mean ratings for each concept within each survey question were universally favorable. There was also overall agreement about the need for a high-quality tool to systematically direct the development, appraisal, and reporting of HSG. Qualitative feedback and a consensus process by the team led to refinements of some of the concepts and the creation of a beta (draft) version of the HSG tool. In step 2, 35 participants were invited, and we had a response rate of 74%. Exploratory analyses showed that the quality of the HSGs reviewed varied as a function of the HSG item and the specific document assessed. A favorable consensus was reached, with participants agreeing that the HSG items were easy to understand and easy to apply. Moreover, overall agreement was high on the usability of the tool to systematically direct the development (85%), appraisal (92%), and reporting (81%) of HSG. From this process, version 1.0 of the HSG appraisal tool was generated, complete with 32 items (and their descriptions) and 4 domains.
Conclusions: The final tool, named the Appraisal of Guidelines for Research and Evaluation for Health Systems (AGREE-HS) (version 1), defines expectations of HSG and facilitates informed decisions among policymakers on health system delivery, financial, and governance arrangements.
Pages: 13