Wednesday, January 20, 2010

SemFeel: A User Interface with Semantic Tactile Feedback for Mobile Touch-screen Devices, from UIST 2009


Summary
This paper, SemFeel: A User Interface with Semantic Tactile Feedback for Mobile Touch-screen Devices, by Yatani and Truong, presents a prototype of a new kind of vibration system for touch-screen devices like an iPod or a phone. The goal is to improve on previous vibration systems, which only vibrated the whole phone or vibrated in one fixed location, by introducing a “moving vibration” system whose patterns users can distinguish more easily. With touch-screen keyboards that don’t give any vibration feedback, it can be really hard to tell what you’re typing without constantly looking at the screen. SemFeel is designed to let a user operate their device without always being forced to watch the screen.

SemFeel uses five vibration motors that can be tuned to three different vibration strengths to produce distinct patterns, such as a top-to-bottom sweep or a clockwise circle. (The paper’s figure shows the prototype and the locations of the motors.) After an extensive user study, the researchers found that users could distinguish between the different patterns 83.3–93.3% of the time. The applications are far-reaching, including Braille on touch screens for the blind and faster response when typing on a touch-screen keyboard.
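To make the idea concrete, here is a minimal sketch (my own, not the authors’ code) of how SemFeel-style patterns could be represented in software: five motor positions, three assumed intensity labels, and a pattern as a timed sequence of steps. The names `Step`, `make_pattern`, and `total_duration` are hypothetical.

```python
# Hypothetical model of SemFeel-style spatial vibration patterns.
# Five motors (top, bottom, left, right, center) at three assumed
# intensity levels; a pattern is a timed sequence of (motor, intensity) steps.
from dataclasses import dataclass

MOTORS = ("top", "bottom", "left", "right", "center")
INTENSITIES = ("low", "medium", "high")  # assumed labels for the 3 strengths


@dataclass
class Step:
    motor: str
    intensity: str
    duration_ms: int


def make_pattern(steps):
    """Validate each (motor, intensity, duration) tuple and build a pattern."""
    pattern = []
    for motor, intensity, duration_ms in steps:
        assert motor in MOTORS and intensity in INTENSITIES
        pattern.append(Step(motor, intensity, duration_ms))
    return pattern


# Two patterns the paper describes: a top-to-bottom sweep and a clockwise circle.
top_to_bottom = make_pattern([("top", "medium", 150), ("bottom", "medium", 150)])
clockwise = make_pattern(
    [(m, "medium", 100) for m in ("top", "right", "bottom", "left")]
)


def total_duration(pattern):
    """Total playback time of a pattern in milliseconds."""
    return sum(step.duration_ms for step in pattern)
```

On real hardware, a driver would actuate each step’s motor at its intensity for `duration_ms` before moving to the next step, producing the perception of movement across the device.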

Discussion

I thought this technology seemed pretty interesting because having your phone vibrate from top to bottom to top when it’s ringing would be pretty cool! But I don’t really agree with the researchers’ reason for developing the prototype: that users want to know what they are pressing when they aren’t looking at the screen at all. One example they gave was a calendar program in which the user taps the top of the screen for the morning, the middle for the afternoon, and the bottom for the evening, and the phone vibrates in the corresponding area with a strength proportional to how busy the user is. To me, a calendar is supposed to be a list of things you have to do and the times you have to do them. How can you possibly get any important information about your schedule without even looking at the phone? The only application the researchers listed that made sense to me was Braille on a touch screen. I thought it was pretty cool that almost 90% of blind participants, without any training on the touch-screen system, could recognize the “vibration Braille” using only their previous knowledge of Braille on paper. So maybe this will be useful in the future, but I really don’t know whether many blind people are buying touch-screen phones.
