Visiomode: An open-source platform for building rodent touchscreen-based behavioral assays

Cited by: 0
Authors
Eleftheriou, Constantinos [1,2,3]
Clarke, Thomas [2,3]
Poon, V. [2,3]
Zechner, Marie [4]
Duguid, Ian [1,2,3,5]
Affiliations
[1] Univ Edinburgh, Simons Initiat Developing Brain, Edinburgh EH8 9XD, Scotland
[2] Univ Edinburgh, Ctr Discovery Brain Sci, Edinburgh EH8 9XD, Scotland
[3] Univ Edinburgh, Patrick Wild Ctr, Edinburgh Med Sch, Biomed Sci, Edinburgh EH8 9XD, Scotland
[4] Univ Edinburgh, Roslin Inst, Midlothian EH25 9RG, Scotland
[5] Univ Edinburgh, Ctr Discovery Brain Sci, Edinburgh Med Sch, Biomed Sci, Hugh Robson Bldg,George Sq, Edinburgh EH8 9XD, Scotland
Funding
Wellcome Trust (UK);
Keywords
Touchscreen; Open-source; Behavior; Sensorimotor; Visiomode; Rodent; COGNITIVE DEFICITS; TOUCH; MICE; LESIONS; SCREEN; CORTEX; MODEL; TASK;
DOI
10.1016/j.jneumeth.2022.109779
Chinese Library Classification (CLC)
Q5 [Biochemistry];
Subject classification codes
071010; 081704;
Abstract
Background: Touchscreen-based behavioral assays provide a robust method for assessing cognitive behavior in rodents, offering great flexibility and translational potential. The development of touchscreen assays presents a significant programming and mechanical engineering challenge, where commercial solutions can be prohibitively expensive and open-source solutions are underdeveloped, with limited adaptability.
New method: Here, we present Visiomode (www.visiomode.org), an open-source platform for building rodent touchscreen-based behavioral tasks. Visiomode leverages the inherent flexibility of touchscreens to offer a simple yet adaptable software and hardware platform. The platform is built on the Raspberry Pi computer, combining a web-based interface and a powerful plug-in system with an operant chamber that can be adapted to generate a wide range of behavioral tasks.
Results: As a proof of concept, we use Visiomode to build both simple stimulus-response and more complex visual discrimination tasks, showing that mice display rapid sensorimotor learning, including switching between different motor responses (i.e., nose poke versus reaching).
Comparison with existing methods: Commercial solutions are the 'go to' for rodent touchscreen behaviors, but the associated costs can be prohibitive, limiting their uptake by the wider neuroscience community. While several open-source solutions have been developed, efforts so far have focused on reducing cost rather than promoting ease of use and adaptability. Visiomode addresses these unmet needs, providing a low-cost, extensible platform for creating touchscreen tasks.
Conclusions: Developing an open-source, rapidly scalable and low-cost platform for building touchscreen-based behavioral assays should increase uptake across the science community and accelerate the investigation of cognition, decision-making and sensorimotor behaviors in both health and disease.
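The abstract describes an extensible, plug-in-based architecture in which new touchscreen tasks are added as modular components on top of a Raspberry Pi and a web interface. As a rough illustration of how such a plug-in system can be organised, the Python sketch below registers task classes in a lookup table and scores touches against a target region. All class, function and parameter names here are illustrative assumptions for exposition, not Visiomode's documented API.

```python
# Minimal, self-contained sketch of a plug-in style task registry, illustrating
# the kind of extensibility the abstract describes. Names are hypothetical,
# not Visiomode's actual API.
from abc import ABC, abstractmethod

TASK_REGISTRY = {}  # maps task identifiers to task classes


def register_task(cls):
    """Class decorator that makes a task discoverable by its identifier."""
    TASK_REGISTRY[cls.identifier] = cls
    return cls


class Task(ABC):
    """Base class for a touchscreen task: present a stimulus, score the touch."""
    identifier = "base"

    @abstractmethod
    def on_touch(self, x, y):
        """Return True for a correct (rewarded) response at screen position (x, y)."""


@register_task
class SingleTarget(Task):
    """Simple stimulus-response task: any touch inside the target is rewarded."""
    identifier = "single-target"

    def __init__(self, target_x=400, target_y=240, radius=100):
        self.target_x, self.target_y, self.radius = target_x, target_y, radius

    def on_touch(self, x, y):
        # Reward touches that land within the circular target region.
        return (x - self.target_x) ** 2 + (y - self.target_y) ** 2 <= self.radius ** 2


if __name__ == "__main__":
    task = TASK_REGISTRY["single-target"]()
    print(task.on_touch(410, 250))  # touch near the target centre -> True
    print(task.on_touch(10, 10))    # touch far from the target    -> False
```

In a registry pattern like this, a web front end (such as the one the abstract describes) only needs to enumerate `TASK_REGISTRY` to expose newly installed task plug-ins, which is one common way to achieve the adaptability the authors emphasise.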
Pages: 8