Gaze-based interaction allows people with severe motor disabilities to communicate through a computer using their eye movements. The most common method is text entry, or "typing", by gaze.
The most widely used techniques for text entry by gaze are based on virtual keyboards. The user performs a selection by fixating their gaze on the desired key for a given dwell time. This technique, though simple, forces the user to wait before each selection, resulting in low typing speed.
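To make the dwell mechanism concrete, the loop below is a minimal sketch of dwell-time selection. It is illustrative only: the function name, the sample format, and the 600 ms threshold are assumptions, not details of any system discussed in the seminar.

```python
DWELL_TIME = 0.6  # assumed threshold: seconds the gaze must stay on a key

def dwell_select(samples, dwell_time=DWELL_TIME):
    """samples: time-ordered list of (timestamp, key) gaze samples,
    where key is the key under the gaze (or None if off the keyboard).
    Returns the list of keys selected by dwell."""
    selected = []
    current_key = None  # key the gaze is currently on
    start = None        # when the gaze landed on it
    fired = False       # whether this fixation already produced a selection
    for t, key in samples:
        if key != current_key:
            # Gaze moved to a new key: restart the dwell timer.
            current_key, start, fired = key, t, False
        elif not fired and key is not None and t - start >= dwell_time:
            # Gaze stayed long enough on the same key: select it once.
            selected.append(key)
            fired = True
    return selected
```

For example, a gaze that rests on "h" for 0.7 s and then on "i" for 0.7 s yields the selections `['h', 'i']`, while a glance shorter than the threshold selects nothing; the forced wait on every key is exactly the cost the techniques below address.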
In this seminar we will present two recent advances proposed by our research group to improve the speed of text entry by gaze with virtual keyboards. The first, developed in collaboration with researchers from Boston University, eliminates the need to wait for each selection, as words are obtained from a dictionary according to the gaze pattern of the user on the keyboard.
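As a rough intuition for dictionary-based decoding, the sketch below ranks words by whether their letters appear, in order, along the sequence of keys the gaze passed over. This is a deliberate simplification under assumed names; the actual technique presented in the seminar is more sophisticated.

```python
def matches_path(word, keys):
    """True if the word's letters occur, in order, in the sequence of
    keys crossed by the gaze, anchored at the first and last key."""
    if not word or not keys:
        return False
    if word[0] != keys[0] or word[-1] != keys[-1]:
        return False
    it = iter(keys)
    # Subsequence test: each letter must appear after the previous match.
    return all(letter in it for letter in word)

def candidates(keys, dictionary):
    """Return the dictionary words compatible with the gaze path."""
    return [w for w in dictionary if matches_path(w, keys)]
```

Given a gaze path crossing the keys h, q, w, e, l, k, l, o, only "hello" survives from a small dictionary, so the user never has to dwell on individual keys to commit each letter.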
The second, instead of eliminating the wait time for each selection, shows augmented information on the keys, so the user can use the dwell time to detect typing errors and explore the list of most probable words without moving their gaze from the fixated key. Both techniques improve typing speed, as shown by user studies.
Eye gaze trackers are devices that estimate the gaze position on a known object, for example, a computer monitor. Until recently, these devices were used primarily for research. Reductions in their size and cost have been enabling more general applications, with particular success in assisting people with severe motor disabilities. In this talk we will present recent advances, current challenges, and research opportunities in eye gaze tracking, based on work presented at ETRA 2016. As this is the first HCI seminar of 2016, Prof. Morimoto will also give a brief overview of the projects being developed by the HCI research group at DCC-IME-USP, with applications in wearable computing, augmented reality, and gaze-based interaction.