Mass digitization and the garbage dump: The conflicting needs of quantitative and qualitative methods

Cited by: 13
Author
Gooding, Paul [1 ]
Affiliation
[1] UCL, London WC1E 6BT, England
Source
LITERARY AND LINGUISTIC COMPUTING | 2013 / Vol. 28 / Iss. 3
Keywords
GOOGLE BOOKS;
DOI
10.1093/llc/fqs054
Chinese Library Classification (CLC)
H0 [Linguistics];
Subject Classification Codes
030303 ; 0501 ; 050102 ;
Abstract
There has been widespread excitement in recent years about the emergence of large-scale digital initiatives (LSDIs) such as Google Book Search. Although many welcome the prospect of a digital recreation of the Library of Alexandria, these projects have also attracted great controversy. This article examines one such controversy: the suggestion that mass digitization is creating a virtual rubbish dump of our cultural heritage. It discusses some of the quantitative methods being used to analyse the big data these initiatives have created, and two major concerns that have arisen as a result. First, there is the concern that quantitative analysis has inadvertently fed a culture that favours information over traditional research methods. Second, little information exists about how LSDIs are used for any research other than quantitative analysis. These problems have helped to fuel the idea that digitization is destroying the print medium, when in many respects it still closely remediates the bibliographic codes of the Gutenberg era. The article concludes that more work must be done to understand the impact of mass digitization on all researchers in the humanities, rather than just the early adopters, and briefly describes the work the author is undertaking in this area.
Pages: 425-431
Page count: 7