Evaluating collaborative information seeking - synthesis, suggestions, and structure

Cited by: 15
Author
Shah, Chirag [1 ]
Affiliation
[1] Rutgers State Univ, New Brunswick, NJ 08873 USA
Keywords
Collaboration; evaluation; framework; information seeking; awareness; validation
DOI
10.1177/0165551514530651
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Subject classification
0812
Abstract
Evaluating the performance of collaborative information seeking (CIS) systems and users can be challenging, often more so than individual information-seeking environments. This can be attributed to the complex and dynamic interactions that take place among various users and systems processes in a CIS environment. While some of the aspects of a CIS system or user could be measured by typical assessment techniques from single-user information retrieval/seeking (IR/IS), one often needs to go beyond them to provide a meaningful evaluation, helping to provide not only a sense of performance, but also insights into design decisions (regarding systems) and behavioural trends (regarding users). This article first provides an overview of existing methods and techniques for evaluating CIS (synthesis). It then extracts valuable directives and advice from the literature that inform evaluation choices (suggestions). Finally, the article presents a framework for CIS evaluation with two major parts: system-based and user-based (structure). The proposed framework incorporates various instruments taken from computer and social sciences literature as applicable to CIS evaluations. The lessons from the literature and the framework could serve as important starting points for designing experiments and systems, as well as evaluating system and user performances in CIS and related research areas.
Pages: 460-475 (16 pages)