USAR: An Interactive User-specific Aesthetic Ranking Framework for Images

Cited by: 24
Authors
Lv, Pei [1 ]
Wang, Meng [1 ]
Xu, Yongbo [1 ]
Peng, Ze [1 ]
Sun, Junyi [1 ]
Su, Shimei [1 ]
Zhou, Bing [1 ]
Xu, Mingliang [1 ]
Affiliations
[1] Zhengzhou Univ, Zhengzhou, Henan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
user-specific; aesthetic assessment; ranking model; deep learning;
DOI
10.1145/3240508.3240635
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
When assessing whether an image is of high or low quality, it is indispensable to take personal preference into account. Existing aesthetic models emphasize hand-crafted features or deep features commonly shared by high-quality images, but give limited or no consideration to personal preference and user interaction. To this end, we propose a novel and user-friendly aesthetic ranking framework built on a powerful deep neural network and a small amount of user interaction, which automatically estimates and ranks the aesthetic quality of images in accordance with a user's preferences. Our framework takes as input a series of photos that the user prefers, and produces as output a reliable, user-specific aesthetic ranking model matching that user's preferences. Considering the subjectivity of personal preference and the uncertainty of a user's single selection, a unique and exclusive dataset is constructed interactively to describe the preference of one individual, by retrieving the images most similar to those specified by the user. Based on this user-specific dataset and a set of well-designed aesthetic attributes, a customized aesthetic distribution model can be learned, which combines both personalized preference and general aesthetic rules. We conduct extensive experiments and user studies on two large-scale public datasets, and demonstrate that our framework outperforms approaches based on conventional aesthetic assessment or ranking models.
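The sketch below is only an illustration of the interactive scheme described in the abstract, not the authors' implementation: it assumes images are already encoded as fixed-length deep feature vectors, uses cosine-similarity retrieval to build the user-specific set from the user's selected photos, and substitutes a simple linear pairwise (RankNet-style) ranker for the paper's deep aesthetic distribution model. All function names and parameters here are hypothetical.

```python
# Minimal sketch of a user-specific aesthetic ranking pipeline (illustrative only).
import numpy as np

def retrieve_user_specific_set(user_feats, gallery_feats, k=50):
    """Build the user-specific dataset: for each image the user selected,
    retrieve the k most similar gallery images by cosine similarity."""
    u = user_feats / np.linalg.norm(user_feats, axis=1, keepdims=True)
    g = gallery_feats / np.linalg.norm(gallery_feats, axis=1, keepdims=True)
    sims = u @ g.T                             # (num_selected, num_gallery)
    topk = np.argsort(-sims, axis=1)[:, :k]    # top-k neighbours per selected image
    return np.unique(topk.ravel())             # gallery indices of the user-specific set

def fit_pairwise_ranker(feats, prefer_idx, other_idx, lr=0.1, steps=2000):
    """Learn a linear scoring vector w so preferred (user-specific) images score
    higher than the rest, via a pairwise logistic (RankNet-style) loss."""
    w = np.zeros(feats.shape[1])
    for _ in range(steps):
        i = np.random.choice(prefer_idx)
        j = np.random.choice(other_idx)
        diff = feats[i] - feats[j]
        p = 1.0 / (1.0 + np.exp(-w @ diff))    # P(image i ranked above image j)
        w += lr * (1.0 - p) * diff             # gradient ascent on the log-likelihood
    return w

# Toy usage: random 128-d vectors stand in for real deep aesthetic features.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))
user_selected = gallery[rng.choice(1000, size=5, replace=False)] + 0.05 * rng.normal(size=(5, 128))

specific = retrieve_user_specific_set(user_selected, gallery, k=20)
others = np.setdiff1d(np.arange(len(gallery)), specific)
w = fit_pairwise_ranker(gallery, specific, others)
ranking = np.argsort(-(gallery @ w))           # user-specific aesthetic ranking
print("Top-5 images for this user:", ranking[:5])
```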
Pages: 1328 - 1336
Page count: 9
Related Papers
50 records
  • [1] Interactive Recommendation with User-Specific Deep Reinforcement Learning
    Lei, Yu
    Li, Wenjie
    [J]. ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2019, 13 (06)
  • [2] A framework supporting user-specific services in RFID systems
    Chen, Chin-Ling
    [J]. FIFTH IEEE INTERNATIONAL CONFERENCE ON WIRELESS, MOBILE AND UBIQUITOUS TECHNOLOGIES IN EDUCATION, PROCEEDINGS, 2008, : 182 - 184
  • [3] A user-specific and selective multimodal biometric fusion strategy by ranking subjects
    Poh, Norman
    Ross, Arun
    Lee, Weifeng
    Kittler, Josef
    [J]. PATTERN RECOGNITION, 2013, 46 (12) : 3341 - 3357
  • [4] Averting Man In The Browser Attack using User-Specific Personal Images
    Goyal, Puneet
    Bansal, Naman
    Gupta, Neeraj
    [J]. PROCEEDINGS OF THE 2013 3RD IEEE INTERNATIONAL ADVANCE COMPUTING CONFERENCE (IACC), 2013, : 1283 - 1286
  • [5] User-Specific Perspectives on Ontologies
    Brochhausen, Mathias
    Slaughter, Laura
    Stenzhorn, Holger
    Graf, Norbert
    [J]. MEDICAL AND CARE COMPUNETICS 6, 2010, 156 : 114 - 121
  • [6] USER-SPECIFIC WATER DEMAND ELASTICITIES
    SCHNEIDER, ML
    WHITLATCH, EE
    [J]. JOURNAL OF WATER RESOURCES PLANNING AND MANAGEMENT-ASCE, 1991, 117 (01): : 52 - 73
  • [7] REPUTATIONAL RATING AND USER-SPECIFIC CONSENT
    Toscano, Gabriele
    [J]. REVISTA BOLIVIANA DE DERECHO, 2023, (36) : 46 - 55
  • [8] User-specific tool for project management
    [Anonymous]
    [J]. HYDROCARBON PROCESSING, 1998, 77 (12): : 29 - 29
  • [9] Assessing user-specific difficulty of documents
    Paukkeri, Mari-Sanna
    Ollikainen, Marja
    Honkela, Timo
    [J]. INFORMATION PROCESSING & MANAGEMENT, 2013, 49 (01) : 198 - 212