Incorporating tilt-based interaction in multimodal user interfaces for mobile devices

Cited by: 0
Authors
Mantyjarvi, Jani [1]
Paterno, Fabio [2]
Santoro, Carmen [2]
Affiliations
[1] VTT Tech Res Ctr, Kaitovayla 1, Oulu 90571, Finland
[2] ISTI, CNR, I-56124 Pisa, Italy
Keywords
model-based design of user interfaces; gestural interfaces for mobile devices; tilt interfaces;
DOI
Not available
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Subject Classification Code
081202;
Abstract
Emerging ubiquitous environments raise the need to support multiple interaction modalities on diverse types of devices. Designing multimodal interfaces for ubiquitous environments with development tools is challenging because target platforms support different resources and interfaces. Model-based approaches have been recognized as useful for managing the increasing complexity that results from the many available interaction platforms, but they have usually focused on graphical and/or vocal modalities. This paper presents a solution for supporting the development of tilt-based hand-gesture and graphical modalities for mobile devices within a multimodal user interface development tool. The challenges of developing gesture-based applications for various types of devices, including mobile devices, are discussed in detail. The proposed solution is based on a logical description language for hand-gesture user interfaces, from which a user interface implementation can be obtained for the target mobile platform. The solution is illustrated with an example application that can be accessed both from the desktop and from a mobile device supporting tilt-based gesture interaction.
Pages: 230+
Number of pages: 3
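To make the approach described in the abstract more concrete, the following is a minimal illustrative sketch of how raw three-axis accelerometer samples could be classified into coarse tilt gestures and then bound to abstract interaction events, which is the level at which a logical user interface description would operate. The class names, thresholds, and event vocabulary are hypothetical assumptions for illustration only; they are not taken from the paper or from its development tool.

# Illustrative sketch only: maps raw three-axis accelerometer samples to
# coarse tilt gestures, then binds those gestures to abstract interaction
# events of the kind a logical user interface description could refer to.
# All names, thresholds, and the event vocabulary are hypothetical and are
# not taken from the paper or from any particular toolkit.

import math
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class TiltGesture(Enum):
    NONE = "none"
    TILT_LEFT = "tilt_left"
    TILT_RIGHT = "tilt_right"
    TILT_FORWARD = "tilt_forward"
    TILT_BACKWARD = "tilt_backward"


# Hypothetical binding from concrete tilt gestures to abstract events
# (e.g. list navigation) that a logical UI description could declare.
LOGICAL_EVENT_BINDING = {
    TiltGesture.TILT_LEFT: "previous_item",
    TiltGesture.TILT_RIGHT: "next_item",
    TiltGesture.TILT_FORWARD: "activate",
    TiltGesture.TILT_BACKWARD: "cancel",
}


@dataclass
class AccelSample:
    """One accelerometer reading, expressed in units of g (gravity)."""
    x: float
    y: float
    z: float


def classify_tilt(sample: AccelSample, threshold_deg: float = 25.0) -> TiltGesture:
    """Classify a single sample into a coarse tilt gesture.

    Roll (left/right) and pitch (forward/backward) are estimated from the
    gravity components; a gesture is reported only when the dominant angle
    exceeds threshold_deg (an arbitrary illustrative value).
    """
    roll = math.degrees(math.atan2(sample.x, math.hypot(sample.y, sample.z)))
    pitch = math.degrees(math.atan2(sample.y, math.hypot(sample.x, sample.z)))
    if abs(roll) >= abs(pitch) and abs(roll) > threshold_deg:
        return TiltGesture.TILT_RIGHT if roll > 0 else TiltGesture.TILT_LEFT
    if abs(pitch) > threshold_deg:
        return TiltGesture.TILT_FORWARD if pitch > 0 else TiltGesture.TILT_BACKWARD
    return TiltGesture.NONE


def to_logical_event(sample: AccelSample) -> Optional[str]:
    """Return the abstract event bound to the detected gesture, if any."""
    return LOGICAL_EVENT_BINDING.get(classify_tilt(sample))


if __name__ == "__main__":
    # A device held flat, then tilted clearly to the right.
    for s in (AccelSample(0.02, 0.01, 1.0), AccelSample(0.55, 0.05, 0.82)):
        print(classify_tilt(s).value, "->", to_logical_event(s))

In a model-based setting such as the one the paper describes, a concrete classifier like the one above would presumably sit in the platform-specific layer, while the abstract event names are what the logical description language manipulates when generating the interface for each target platform.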