hciLab

Human-Computer-Interaction

Over the last few days a number of interesting papers were presented, so it is not easy to pick a selection… Here is my random paper selection from Ubicomp 2008 that links to our work (the conference papers link into the Ubicomp 2008 proceedings in the ACM DL; our references are below):

Don Patterson presented a survey on using IM. One of the findings surprised me: people seem to ignore “busy” settings. In some work we did in 2000 on mobile availability and sharing context, users indicated that they would respect this or at least explain when…

Using electrodes to detect eye movement and to detect reading [1] – this relates to Heiko’s work but uses a different sensing technique. If the system can really be implemented in goggles, this would be a great technology for eye gestures as suggested in [2].
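To make the idea concrete, here is a minimal sketch of how reading might be detected from a horizontal electrooculography (EOG) channel. Everything here is an assumption: the sample data, the amplitude threshold, and the heuristic (reading shows several small rightward saccades followed by a large leftward return sweep) are illustrative, not taken from the paper.

```python
# Hypothetical EOG-based reading detection.
# Assumption: positive amplitude = gaze moving right; thresholds are invented.

def detect_saccades(signal, threshold=50.0):
    """Return (sample_index, direction) for jumps larger than the threshold."""
    saccades = []
    for i in range(1, len(signal)):
        delta = signal[i] - signal[i - 1]
        if abs(delta) > threshold:
            saccades.append((i, "right" if delta > 0 else "left"))
    return saccades

def looks_like_reading(saccades, min_forward=3):
    """Reading heuristic: several small rightward saccades, then a return sweep."""
    forward = 0
    for _, direction in saccades:
        if direction == "right":
            forward += 1
        elif forward >= min_forward:
            return True  # leftward return sweep after enough forward saccades
        else:
            forward = 0  # stray leftward movement resets the count
    return False
```

A synthetic trace of four rightward steps followed by one large leftward sweep would be classified as reading; random back-and-forth movement would not.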

Utilizing infrastructure that is already in place for activity sensing – the example is a heating/ventilation/air-conditioning (HVAC) system [3]. I wondered, and put forward the question, how well this would work in an active mode, where you actively create an airflow (using the already installed system) to probe the state of an environment.
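Purely as speculation on that active-mode question, one could imagine pulsing the installed ventilation and classifying the room from the pressure response: an open door or window lets the airflow escape, so the measured pressure rise stays below the closed-room baseline. The function, its parameters, and the threshold below are all hypothetical; no real HVAC system exposes such an interface.

```python
# Speculative sketch of "active" airflow sensing. All names and the
# threshold are invented for illustration.

def classify_room_state(pressure_response, baseline, open_threshold=0.7):
    """Compare the measured pressure rise to a closed-room baseline.

    If the rise is only a fraction of the baseline, air is escaping,
    which suggests an open door or window.
    """
    ratio = pressure_response / baseline
    return "open" if ratio < open_threshold else "closed"
```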

Further interesting ideas:

Heiko Drewes and Richard Atterer, colleagues from the University of Munich, have travelled to Interact 2007. Their emails indicate that the conference is at a most interesting place this year: Rio de Janeiro, directly at the Copacabana. The conference was highly competitive and we are happy to have two papers to present there.

Heiko presents a paper that shows that eye gestures can be used to interact with a computer. In his experiments he shows that users can learn gestures with their eyes (basically moving the eyes in a certain pattern, e.g. following the outline of a…
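A minimal sketch of the underlying idea: quantize the gaze trace into compass directions and match the resulting string against stored gesture templates, much like stroke-gesture recognizers. The gesture name, the template (a square outline), and the jitter threshold are illustrative assumptions, not Heiko's actual algorithm.

```python
# Hypothetical eye-gesture recognizer. Screen coordinates: y grows downwards.

def quantize(points, min_move=10.0):
    """Turn a gaze trace [(x, y), ...] into a direction string like 'RDLU'."""
    directions = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) < min_move and abs(dy) < min_move:
            continue  # fixation jitter, not a deliberate stroke
        d = ("R" if dx > 0 else "L") if abs(dx) >= abs(dy) else ("D" if dy > 0 else "U")
        if not directions or directions[-1] != d:
            directions.append(d)  # collapse repeated directions into one stroke
    return "".join(directions)

GESTURES = {"RDLU": "square"}  # e.g. tracing the outline of a box

def recognize(points):
    return GESTURES.get(quantize(points))
```

Tracing a box clockwise from the top-left corner yields the string "RDLU" and is recognized as the (hypothetical) "square" gesture, while small fixation jitter produces no strokes at all.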