Pre-Touch Sensing for Mobile Interaction

Cited by: 58
Authors
Hinckley, Ken [1]
Heo, Seongkook [1,2]
Pahud, Michel [1]
Holz, Christian [1]
Benko, Hrvoje [1]
Sellen, Abigail [3]
Banks, Richard [3]
O'Hara, Kenton [3]
Smyth, Gavin [3]
Buxton, Bill [1,3]
Affiliations
[1] Microsoft Research, Redmond, WA 98052, USA
[2] Korea Advanced Institute of Science & Technology, HCI Lab, Dept. of Computer Science, Daejeon, South Korea
[3] Microsoft Research, Cambridge, England
Keywords
Multi-touch; hover; grip; context sensing; mobile interaction
DOI
10.1145/2858036.2858095
Chinese Library Classification
TP18 (Artificial Intelligence Theory)
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Touchscreens continue to advance, including progress towards sensing fingers proximal to the display. We explore this emerging pre-touch modality via a self-capacitance touchscreen that can sense multiple fingers above a mobile device, as well as grip around the screen's edges. This capability opens up many possibilities for mobile interaction. For example, using pre-touch in an anticipatory role affords an "ad-lib interface" that fades in a different UI, appropriate to the context, as the user approaches one-handed with a thumb, two-handed with an index finger, or even with a pinch or two thumbs. Or we can interpret pre-touch in a retroactive manner that leverages the approach trajectory to discern whether the user made contact with a ballistic vs. a finely-targeted motion. Pre-touch also enables hybrid touch + hover gestures, such as selecting an icon with the thumb while bringing a second finger into range to invoke a context menu at a convenient location. Collectively these techniques illustrate how pre-touch sensing offers an intriguing new back-channel for mobile interaction.
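
The anticipatory "ad-lib interface" the abstract describes can be pictured as a small decision over the sensed hover and grip state. The following Python sketch is illustrative only and not from the paper: the data layout (HoverPoint, PreTouchFrame), the 30 mm approach threshold, and the posture heuristics are all assumptions made for this example.

    # Minimal sketch (assumptions, not the paper's method): choose a UI
    # variant from pre-touch data, in the spirit of the "ad-lib interface".
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class HoverPoint:
        x: float  # screen coordinates (mm); hypothetical units
        y: float
        z: float  # height above the display (mm)

    @dataclass
    class PreTouchFrame:
        hover: List[HoverPoint]  # fingers sensed above the screen
        left_grip: bool          # grip contact sensed on the left edge
        right_grip: bool         # grip contact sensed on the right edge

    def choose_ui_variant(frame: PreTouchFrame) -> str:
        """Pick a UI layout suited to how the hand is approaching.

        Heuristics (illustrative only): a single hover point plus a
        one-sided grip suggests one-handed thumb use; a single point with
        a two-sided (or no) grip suggests a two-handed index finger; two
        or more points suggest a pinch or two thumbs.
        """
        approaching = [p for p in frame.hover if p.z < 30.0]  # within ~30 mm
        if len(approaching) >= 2:
            return "two-finger UI (pinch or two thumbs)"
        if len(approaching) == 1:
            if frame.left_grip != frame.right_grip:  # one-sided grip
                return "one-handed thumb UI (controls near the gripping edge)"
            return "two-handed index-finger UI"
        return "no anticipatory UI (nothing in range)"

    if __name__ == "__main__":
        frame = PreTouchFrame(
            hover=[HoverPoint(x=60.0, y=110.0, z=12.0)],
            left_grip=False,
            right_grip=True,
        )
        print(choose_ui_variant(frame))  # -> one-handed thumb UI ...

The retroactive use the abstract mentions, discerning ballistic from finely targeted contact, could plausibly be layered onto the same frames by examining approach speed and curvature over the last few samples before touch-down.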
Pages: 2869-2881
Page count: 13
Related Papers
50 records in total
  • [1] SLURP! Spectroscopy of Liquids Using Robot Pre-Touch Sensing
    Hanson, Nathaniel; Lewis, Wesley; Puthuveetil, Kavya; Furline, Donelle; Padmanabha, Akhil; Padir, Taskin; Erickson, Zackory
    2023 IEEE International Conference on Robotics and Automation (ICRA), 2023: 3786-3792
  • [2] Improved Object Pose Estimation via Deep Pre-touch Sensing
    Lancaster, Patrick; Yang, Boling; Smith, Joshua R.
    2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017: 2448-2455
  • [3] Pre-touch reaction is preferred over post-touch reaction in interaction with displayed agent
    Shiomi, Masahiro
    PeerJ Computer Science, 2024, 10
  • [4] Design and Application of a Novel Radio Frequency Wireless Sensor for Pre-Touch Sensing and Grasping of Objects
    Gharibi, Armin; Costa, Filippo; Genovesi, Simone
    IEEE Sensors Journal, 2024, 24(6): 7573-7583
  • [5] Underwater pre-touch based on artificial electric sense
    Boyer, Frederic; Lebastard, Vincent; Ferrer, Steven Bruce; Geffard, Franck
    International Journal of Robotics Research, 2020, 39(6): 729-749
  • [6] Effects of appearance and gender on pre-touch proxemics in virtual reality
    Kimoto, Mitsuhiko; Otsuka, Yohei; Imai, Michita; Shiomi, Masahiro
    Frontiers in Psychology, 2023, 14
  • [7] Evaluating gaze behaviors as pre-touch reactions for virtual agents
    Mejia, Dario Alfonso Cuello; Sumioka, Hidenobu; Ishiguro, Hiroshi; Shiomi, Masahiro
    Frontiers in Psychology, 2023, 14
  • [8] Mouse, Touch, or Fich: Comparing Traditional Input Modalities to a Novel Pre-Touch Technique
    Mueller, Jonas; Rieger, Lea; Aslan, Ilhan; Anneser, Christoph; Sandstede, Malte; Schwarzmeier, Felix; Petrak, Bjoern; Andre, Elisabeth
    MUM 2019: 18th International Conference on Mobile and Ubiquitous Multimedia, 2019
  • [9] Preliminary Investigation of Pre-Touch Reaction Distances toward Virtual Agents
    Sato, Aoba; Kimoto, Mitsuhiko; Iio, Takamasa; Shimohara, Katsunori; Shiomi, Masahiro
    Proceedings of the 7th International Conference on Human-Agent Interaction (HAI'19), 2019: 292-293