ACRONYM: Context Metrics for Linking People to User-Generated Media Content

Cited by: 1
Authors
Monaghan, Fergal [1 ]
Handschuh, Siegfried [1 ]
O'Sullivan, David [1 ]
Affiliations
[1] National University of Ireland, School of Engineering & Informatics, Galway, Ireland
Funding
Science Foundation Ireland
Keywords
Context Mining; Linked Data; Mobile Devices; Recommender Algorithms; Semantic Annotation; Social Media
DOI
10.4018/jswis.2011100101
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
With the advent of online social networks and User-Generated Content (UGC), the social Web is experiencing an explosion of audio-visual data. However, the usefulness of the collected data is in doubt, given that the means of retrieval are limited by the semantic gap between the raw data and people's understanding of the memories it represents. Whereas machines interpret UGC media as a series of binary audio-visual data, humans perceive the context under which the content is captured and the people, places, and events represented. The Annotation CReatiON for Your Media (ACRONYM) framework addresses the semantic gap by supporting the creation of a layer of explicit, machine-interpretable meaning describing UGC context. This paper presents an overview of a use case of ACRONYM for the semantic annotation of personal photographs. The authors define a set of recommendation algorithms employed by ACRONYM to support the annotation of generic UGC multimedia. The paper introduces the context metrics and combination methods that form the recommendation algorithms used by ACRONYM to determine the people represented in multimedia resources. For the photograph annotation use case, these metrics increase recommendation accuracy. Context-based algorithms provide a cheap and robust means of UGC media annotation that is compatible with and complementary to content-recognition techniques.
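The abstract describes combining several context metrics into a single ranking of candidate people for a photograph. The general idea can be sketched as a weighted combination of per-candidate metric scores; note that the metric names, weights, and data below are hypothetical illustrations, not ACRONYM's actual metrics or combination method.

```python
# Hypothetical sketch: combine per-candidate context-metric scores
# (each assumed normalised to [0, 1]) into one ranking via a weighted sum.
# Metric names and weights are illustrative only, not ACRONYM's own.

def rank_candidates(scores, weights):
    """Rank candidate people by the weighted sum of their metric scores.

    scores  : {person: {metric: value in [0, 1]}}
    weights : {metric: weight}, assumed to sum to 1
    """
    combined = {
        person: sum(weights[m] * metrics.get(m, 0.0) for m in weights)
        for person, metrics in scores.items()
    }
    # Highest combined score first
    return sorted(combined, key=combined.get, reverse=True)

# Illustrative context scores for two candidate subjects of a photo
scores = {
    "Alice": {"temporal": 0.9, "spatial": 0.8, "social": 0.6},
    "Bob":   {"temporal": 0.2, "spatial": 0.4, "social": 0.9},
}
weights = {"temporal": 0.4, "spatial": 0.4, "social": 0.2}

print(rank_candidates(scores, weights))  # ['Alice', 'Bob']
```

A weighted sum is only one possible combination method; the paper evaluates several combination methods, and in practice the weights would be tuned against annotated ground truth.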
Pages: 1 - 35
Page count: 35