Learning Visual Storylines with Skipping Recurrent Neural Networks

Cited by: 15
Authors: Sigurdsson, Gunnar A. [1]; Chen, Xinlei [1]; Gupta, Abhinav [1]
Affiliation: [1] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
Source:
Keywords:
DOI: 10.1007/978-3-319-46454-1_5
CLC number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
What does a typical visit to Paris look like? Do people first take photos of the Louvre and then the Eiffel Tower? Can we visually model a temporal event like "Paris Vacation" using current frameworks? In this paper, we explore how we can automatically learn the temporal aspects, or storylines of visual concepts from web data. Previous attempts focus on consecutive image-to-image transitions and are unsuccessful at recovering the long-term underlying story. Our novel Skipping Recurrent Neural Network (S-RNN) model does not attempt to predict each and every data point in the sequence, like classic RNNs. Rather, S-RNN uses a framework that skips through the images in the photo stream to explore the space of all ordered subsets of the albums via an efficient sampling procedure. This approach reduces the negative impact of strong short-term correlations, and recovers the latent story more accurately. We show how our learned storylines can be used to analyze, predict, and summarize photo albums from Flickr. Our experimental results provide strong qualitative and quantitative evidence that S-RNN is significantly better than other candidate methods such as LSTMs on learning long-term correlations and recovering latent storylines. Moreover, we show how storylines can help machines better understand and summarize photo streams by inferring a brief personalized story of each individual album.
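The abstract's idea of "skipping through the images in the photo stream" to pick an ordered subset can be illustrated with a toy greedy selection loop. Everything below (the feature and hidden sizes, the random weights, the greedy scoring rule, the `select_storyline` helper) is a hypothetical sketch for intuition only, not the paper's actual S-RNN architecture, sampling procedure, or training code.

```python
import numpy as np

# Hypothetical sketch (not the authors' S-RNN): instead of consuming every
# photo in an album, a small RNN greedily picks the next image from the
# strictly later positions, producing an ordered subset of the stream.
# Features and weights are random stand-ins for real image features and
# trained parameters.
rng = np.random.default_rng(0)
D, H = 8, 16                                # feature / hidden sizes (made up)
W_xh = rng.normal(scale=0.1, size=(H, D))   # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(H, H))   # hidden-to-hidden weights
W_out = rng.normal(scale=0.1, size=(D, H))  # hidden-to-feature projection

def select_storyline(features, k):
    """Greedily choose up to k images, always moving forward in time."""
    h = np.zeros(H)
    pred = W_out @ h            # predicted next-image feature (zero at start)
    pos, chosen = -1, []
    for _ in range(k):
        later = list(range(pos + 1, len(features)))
        if not later:           # no later images left to skip to
            break
        # score each remaining image against the RNN's current prediction
        scores = [float(features[j] @ pred) for j in later]
        pos = later[int(np.argmax(scores))]
        chosen.append(pos)
        h = np.tanh(W_xh @ features[pos] + W_hh @ h)  # vanilla RNN update
        pred = W_out @ h
    return chosen

album = rng.normal(size=(20, D))   # 20 synthetic image "features"
story = select_storyline(album, 5)
```

The selected indices are strictly increasing, so the loop traverses the album in temporal order while skipping intermediate frames; the paper instead learns which subsets to favor via an efficient sampling procedure during training, which this inference-only sketch omits.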
Pages: 71-88 (18 pages)
Related papers
50 records in total
  • [21] Learning to Adaptively Scale Recurrent Neural Networks
    Hu, Hao
    Wang, Liqiang
    Qi, Guo-Jun
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 3822 - 3829
  • [22] Learning Question Similarity with Recurrent Neural Networks
    Ye, Borui
    Feng, Guangyu
    Cheriton, David R.
    Cui, Anqi
    Li, Ming
    2017 IEEE INTERNATIONAL CONFERENCE ON BIG KNOWLEDGE (IEEE ICBK 2017), 2017, : 111 - 118
  • [23] A learning algorithm for improved recurrent neural networks
    Chen, CH
    Yu, LW
    1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, 1997, : 2198 - 2202
  • [24] Learning and bifurcation diagram of recurrent neural networks
    Sato, S
    Gohara, K
    PROGRESS IN CONNECTIONIST-BASED INFORMATION SYSTEMS, VOLS 1 AND 2, 1998, : 363 - 366
  • [25] Inaccessibility in online learning of recurrent neural networks
    Saito, A
    Taiji, M
    Ikegami, T
    PHYSICAL REVIEW LETTERS, 2004, 93 (16) : 168101 - 1
  • [26] Learning Morphological Transformations with Recurrent Neural Networks
    Biswas, Saurav
    Breuel, Thomas
    INNS CONFERENCE ON BIG DATA 2015 PROGRAM, 2015, 53 : 335 - 344
  • [27] Learning Multiple Timescales in Recurrent Neural Networks
    Alpay, Tayfun
    Heinrich, Stefan
    Wermter, Stefan
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2016, PT I, 2016, 9886 : 132 - 139
  • [28] Quantum recurrent neural networks for sequential learning
    Li, Yanan
    Wang, Zhimin
    Han, Rongbing
    Shi, Shangshang
    Li, Jiaxin
    Shang, Ruimin
    Zheng, Haiyong
    Zhong, Guoqiang
    Gu, Yongjian
    NEURAL NETWORKS, 2023, 166 : 148 - 161
  • [29] Learning minimal automata with recurrent neural networks
    Aichernig, Bernhard K.
    Koenig, Sandra
    Mateis, Cristinel
    Pferscher, Andrea
    Tappler, Martin
    SOFTWARE AND SYSTEMS MODELING, 2024, 23 (03): : 625 - 655