Discovering attractive segments in the user-generated video streams

Cited: 27
Authors
Wang, Zheng [1 ]
Zhou, Jie [1 ]
Ma, Jing [2 ]
Li, Jingjing [1 ]
Ai, Jiangbo [1 ]
Yang, Yang [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu, Sichuan, Peoples R China
[2] Chinese Univ Hong Kong, Hong Kong, Peoples R China
Keywords
Time-sync comment; User-generated video stream; Segment popularity prediction; Video-to-text transfer
DOI
10.1016/j.ipm.2019.102130
CLC classification number
TP [Automation technology, computer technology]
Subject classification code
0812
Abstract
With the rapid development of digital equipment and the continuous upgrading of online media, a growing number of people post videos on the web to share their daily lives (Jelodar, Paulius, & Sun, 2019). In general, not all segments of a video are popular with audiences; some may be boring. If we can predict which segment of a newly generated video stream will be popular, audiences can enjoy just that segment rather than watching the whole video to find the highlights. Likewise, if we can predict the emotions a video will induce in its audience, this will benefit video analysis and help video makers improve their work. In recent years, crowd-sourced time-sync video comments have emerged worldwide, enabling further research on temporal video labeling. In this paper, we propose a novel framework to predict which segment of a newly generated video stream (one that has not yet received time-sync comments) will be popular among audiences. Finally, experimental results on real-world data demonstrate the effectiveness of the proposed framework and justify the idea of predicting the popularity of video segments by exploiting crowd-sourced time-sync comments as a bridge for video analysis.
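The bridge idea above — treating the temporal density of crowd-sourced time-sync comments as a proxy signal for segment popularity — can be illustrated with a minimal sketch. The fixed segment length and the simple count-based signal here are illustrative assumptions, not the paper's actual segmentation or model:

```python
from collections import Counter

def segment_popularity(comment_times, video_length, seg_len=10.0):
    """Bucket time-sync comment timestamps (in seconds) into fixed-length
    segments and return the per-segment comment counts as a crude
    popularity signal. seg_len is an illustrative choice, not the
    paper's actual parameter."""
    # Number of fixed-length segments covering the video.
    n_segs = max(1, int(video_length // seg_len) + (video_length % seg_len > 0))
    # Count comments falling into each segment (clamp to the last one).
    counts = Counter(min(int(t // seg_len), n_segs - 1) for t in comment_times)
    return [counts.get(i, 0) for i in range(n_segs)]

# Example: a 60-second clip whose comments cluster around t ≈ 25 s.
times = [3.1, 24.0, 25.5, 26.2, 27.9, 41.0]
density = segment_popularity(times, video_length=60.0, seg_len=10.0)
most_popular = density.index(max(density))  # the segment with the most comments
```

For a new video with no comments yet, such density labels derived from already-commented videos would serve as training targets for a predictive model, which is the setting the paper addresses.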
Pages: 14