Currency Recognition App for Visually Impaired Users in India

Cited by: 0
Authors
Bhutkar, Ganesh [1 ]
Patil, Mansi [1 ]
Patil, Deepak [1 ]
Mukunde, Shivani [1 ]
Shinde, Rajdeep [1 ]
Rathod, Anamika [1 ]
Affiliations
[1] Vishwakarma Institute of Technology, Centre of Excellence in HCI, Pune, Maharashtra, India
Keywords
Intelligent system; Indian paper currency; Currency recognition; Currency model; Convolutional neural network; Visually impaired users; Audio output;
DOI
10.1007/978-3-031-02904-2_10
Chinese Library Classification (CLC) number
TP [Automation technology; computer technology]
Discipline classification code
0812
Abstract
A currency-note recognition system is an intelligent system that meets a pressing need of visually impaired and blind users today. In this paper, we present a currency recognition app for Indian currency notes. The proposed system is based on salient features of, and correlations between, note images and uses a Convolutional Neural Network (CNN) for classification. The Indian rupee paper currency of the current series (1996-2020) issued by the Reserve Bank of India (RBI) is used as the model currency. The system handles images of all currency-note denominations; some images are tilted by an angle of less than 15°, while the rest are mixed, noisy, and normal images. The app achieves a recognition accuracy of 94.38% and delivers the result to the user as audio output. The proposed technique produces satisfactory results in terms of recognition accuracy and efficiency. In the future, the app can be extended with datasets of other currencies of the world.
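The abstract describes a CNN classifier over Indian rupee note images followed by spoken output, but gives no implementation detail. The following Python sketch illustrates one plausible pipeline under stated assumptions: a small Keras CNN trained on a hypothetical "dataset/" folder of note photographs (one sub-folder per denomination), with the recognised denomination announced through the pyttsx3 text-to-speech library. The layer choices, image size, and hyperparameters are illustrative and are not taken from the paper.

# Minimal sketch of a CNN-based denomination classifier with audio output.
# This is NOT the authors' implementation; the architecture, image size,
# dataset folder "dataset/", and the pyttsx3 text-to-speech engine are all
# illustrative assumptions.
import tensorflow as tf
import pyttsx3

IMG_SIZE = (224, 224)  # assumed input resolution

def build_model(num_classes):
    # Compact convolutional classifier; the paper does not specify its exact layers.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(*IMG_SIZE, 3)),
        tf.keras.layers.Rescaling(1.0 / 255),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(128, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

def announce(denomination):
    # Speak the recognised denomination aloud (offline TTS).
    engine = pyttsx3.init()
    engine.say(f"This is a {denomination} rupee note")
    engine.runAndWait()

if __name__ == "__main__":
    # "dataset/" is a hypothetical folder with one sub-folder per denomination
    # (e.g. 10/, 20/, 50/, 100/, 200/, 500/, 2000/) holding note photographs.
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "dataset/", image_size=IMG_SIZE, batch_size=32)
    class_names = train_ds.class_names

    model = build_model(num_classes=len(class_names))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_ds, epochs=10)

    # Classify one image and announce the predicted denomination.
    for images, _ in train_ds.take(1):
        probs = model.predict(images[:1])
        announce(class_names[int(probs.argmax())])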
Pages: 201 - 216
Number of pages: 16
Related papers (50 records in total)
  • [41] An improved Web search engine for visually impaired users
    Yang, Yi-Fan
    Hwang, Sheue-Ling
    Schenkman, Bo
    Universal Access in the Information Society, 2012, 11 : 113 - 124
  • [42] Analyzing Visual Questions from Visually Impaired Users
    Brady, Erin
    ASSETS 11: PROCEEDINGS OF THE 13TH INTERNATIONAL ACM SIGACCESS CONFERENCE ON COMPUTERS AND ACCESSIBILITY, 2011, : 309 - 310
  • [43] Speech and touch enhanced interface for visually impaired users
    Oppenheim, Matthew
    JOURNAL OF ASSISTIVE TECHNOLOGIES, 2013, 7 (03) : 149 - 159
  • [44] Familiarity and understanding of assistive technology among visually impaired users of libraries of special institutes of Haryana, India
    Singh, Dalip
    Gupta, Dinesh K.
    COLLECTION AND CURATION, 2023, 42 (04) : 137 - 147
  • [45] Improving Mobility of Pedestrian Visually-Impaired Users
    Fanucci, Luca
    Roncella, Roberto
    Iacopetti, Fabrizio
    Donati, Massimiliano
    Calabro, Antonello
    Leporini, Barbara
    Santoro, Carmen
    EVERYDAY TECHNOLOGY FOR INDEPENDENCE AND CARE, 2011, 29 : 595 - 603
  • [46] Mobile Application Accessibility in the Context of Visually Impaired Users
    da Silva, Claudia Ferreira
    Ferreira, Simone B. Leal
    Sacramento, Carolina
    PROCEEDINGS OF THE 17TH BRAZILIAN SYMPOSIUM ON HUMAN FACTORS IN COMPUTING SYSTEMS (IHC 2018), 2018,
  • [47] Content Authoring with Markdown for Visually Impaired and Blind Users
    Oelen, Allard
    Auer, Soeren
    2019 IEEE INTERNATIONAL SYMPOSIUM ON MULTIMEDIA (ISM 2019), 2019, : 285 - 290
  • [48] ENVISION: Assisted Navigation of Visually Impaired Smartphone Users
    Khenkar, Shoroog
    Alsulaiman, Hanan
    Ismail, Shahad
    Fairaq, Alaa
    Jarraya, Salma Kammoun
    Ben-Abdallah, Hanene
    INTERNATIONAL CONFERENCE ON ENTERPRISE INFORMATION SYSTEMS/INTERNATIONAL CONFERENCE ON PROJECT MANAGEMENT/INTERNATIONAL CONFERENCE ON HEALTH AND SOCIAL CARE INFORMATION SYSTEMS AND TECHNOLOGIES, CENTERIS/PROJMAN / HCIST 2016, 2016, 100 : 128 - 135
  • [49] ACCESSIBILITY AND SOME EDUCATIONAL BARRIERS FOR VISUALLY IMPAIRED USERS
    Bogdanova, G.
    Sabev, N.
    Noev, N.
    13TH INTERNATIONAL TECHNOLOGY, EDUCATION AND DEVELOPMENT CONFERENCE (INTED2019), 2019, : 9416 - 9421
  • [50] A Mobile Phone Wayfinding System for Visually Impaired Users
    Coughlan, J.
    Manduchi, R.
    ASSISTIVE TECHNOLOGY FROM ADAPTED EQUIPMENT TO INCLUSIVE ENVIRONMENTS, 2009, 25 : 849 - 849