Gestural Input on a Touch Screen Steering Wheel in the Media

At CHI 2011 we presented initial work on how to use gestural input on a multi-touch steering wheel [1]; a 20-second video is also available [2]. The paper describes a prototype: a steering wheel whose entire surface is a display and can recognize touch input. The study had two parts. In the first part we identified a natural gesture set for interaction, and in the second part we looked at how such interaction affects the visual demand on the driver. The results, in short: using gestural input on the steering wheel reduces the visual demand on the driver.

Shortly after the conference a journalist from Discovery News picked up the topic and did some interviews. This resulted in the article “Touch-Screen Steering Wheel Keeps Eyes on Road” (Discovery News, June 6, 2011).

ACM Tech News mentioned the Discovery News article (ACM Tech News, June 8, 2011).
After this the story made its way around and appeared more widely than expected :-) Examples include:

There was also a German article, “Touchscreen-Lenkrad Wischen wechselt Radiosender” (“Touchscreen steering wheel: swiping changes the radio station”; sp-x, June 14, 2011), found for example in:

[1] Tanja Döring, Dagmar Kern, Paul Marshall, Max Pfeiffer, Johannes Schöning, Volker Gruhn, and Albrecht Schmidt. 2011. Gestural interaction on the steering wheel: reducing the visual demand. In Proceedings of the 2011 annual conference on Human factors in computing systems (CHI ’11). ACM, New York, NY, USA, 483-492. DOI=10.1145/1978942.1979010 http://doi.acm.org/10.1145/1978942.1979010

[2] Gestural Interaction on the Steering Wheel – Reducing the Visual Demand. chi2011madness video. http://www.youtube.com/watch?v=R_32jOlQY7E

Poker Surface on YouTube – 5000 hits in a day :-)

A video describing the Poker Surface is available on YouTube. It is an implementation of a poker game that combines a multi-touch table and mobile phones; for details see [1].

It is amazing how quickly it was picked up. It gained about 5000 views in a single day and has already been featured on engadget.com, gizmodo.com, ubergizmo.com, and recombu.com. But as the comments on pokerolymp.com show, the real poker players are hard to impress…

This really makes me think about how research, publishing, and the public perception of research are changing – rapidly…

[1] Shirazi, A. S., Döring, T., Parvahan, P., Ahrens, B., and Schmidt, A. 2009. Poker surface: combining a multi-touch table and mobile phones in interactive card games. In Proceedings of the 11th international Conference on Human-Computer interaction with Mobile Devices and Services (Bonn, Germany, September 15 – 18, 2009). MobileHCI ’09. ACM, New York, NY, 1-2. DOI= http://doi.acm.org/10.1145/1613858.1613945

The computer mouse – next generation?

In my lecture on user interface engineering I start out with a short history of human-computer interaction. I like to discuss ideas and inventions in the context of the people behind them; among others, I talk about Vannevar Bush and his vision of information processing [1], Ivan Sutherland’s Sketchpad [2], Doug Engelbart’s CSCW demo (including the mouse) [3], and Alan Kay’s vision of the Dynabook [4].

One reason for looking at the history is to better understand the future of interaction with computers. One typical question I ask in class is “what is the ultimate user interface?” Typical answers are “a direct interface to my brain – the computer will do what I think” and “mouse and keyboard” – both answers showing some insight…

As the mouse is still a very important input device (and will probably remain one for some time to come), there is a recent paper that I find really interesting. It looks at how the mouse could be enhanced; Nicolas Villar and his colleagues put together a remarkable set of ideas [5]. The paper is well worth reading – but if you don’t have time, at least watch the video on YouTube.

[1] Vannevar Bush. As We May Think. The Atlantic Monthly, July 1945.
[2] Ivan Sutherland. “Sketchpad: A Man-Machine Graphical Communication System.” Technical Report No. 296, Lincoln Laboratory, Massachusetts Institute of Technology, via Defense Technical Information Center, January 1963. (PDF, YouTube)
[3] Douglas Engelbart. The demo, 1968. (Overview, YouTube)
[4] John Lees. The World In Your Own Notebook (Alan Kay’s Dynabook project at Xerox PARC). The Best of Creative Computing, Volume 3 (1980).
[5] Villar, N., Izadi, S., Rosenfeld, D., Benko, H., Helmes, J., Westhues, J., Hodges, S., Ofek, E., Butler, A., Cao, X., and Chen, B. 2009. Mouse 2.0: multi-touch meets the mouse. In Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology (Victoria, BC, Canada, October 4-7, 2009). UIST ’09. ACM, New York, NY, 33-42. DOI= http://doi.acm.org/10.1145/1622176.1622184

Keynote at Pervasive 2009 – Toshio Iwai

Toshio Iwai gave the keynote at Pervasive 2009 on expanding media art. He introduced us to the basics of moving images and films. The examples were fun, and I think I will copy some for my introductory class on user interfaces to explain the visual system (afterimages from a black-and-white negative image; the concept of combining images on the two sides of a spinning disk; moving images in a flip book).
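
As a side note, the negative image for the afterimage demo is easy to produce yourself. Here is a minimal sketch using the Pillow library; the file names are placeholders:

```python
from PIL import Image, ImageOps

# Build the negative used in the classic afterimage demo: stare at the
# inverted image for about 30 seconds, then look at a white wall, and
# the original (positive) image appears as an afterimage.
image = Image.open("portrait.jpg").convert("L")  # grayscale
negative = ImageOps.invert(image)
negative.save("portrait_negative.png")
```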

In his introduction he also went back to what he learned as a child, and I found this very interesting; it is encouraging to expose small children to technology more than we usually tend to do (especially in Germany, I think, children do not get much chance to explore technologies while they are in kindergarten and primary school). I hope to go with Vivien to the space center in Florida in a few weeks :-)

Following up on the basic visual effects, he showed some really funny live video effects. He introduced a delay to some parts (lines) of the picture when displaying it, which led to ghostly movements. Everything that is not moving appears in its real shape, and everything that is in motion is deformed.
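
The principle behind this effect is simple: keep a short history of frames and, for each output row, read from an older frame. Static pixels look identical in every buffered frame, so only moving objects smear. Here is a minimal sketch in Python with NumPy; the class name, the linear delay profile, and the grayscale frames are my assumptions, not details from the talk:

```python
import numpy as np

class LineDelayEffect:
    """Per-line video delay: each row of the output frame is taken from
    an older frame, so static regions look normal while moving objects
    deform into ghostly shapes."""

    def __init__(self, height, width, max_delay=30):
        self.max_delay = max_delay
        # Ring buffer holding the last `max_delay` grayscale frames.
        self.buffer = np.zeros((max_delay, height, width), dtype=np.uint8)
        self.pos = 0
        # Delay (in frames) applied to each row; here it grows linearly
        # from 0 at the top of the image to max_delay-1 at the bottom.
        self.row_delay = np.linspace(0, max_delay - 1, height).astype(int)

    def process(self, frame):
        # Store the newest frame, then assemble the output row by row,
        # reading each row from a correspondingly older buffered frame.
        self.pos = (self.pos + 1) % self.max_delay
        self.buffer[self.pos] = frame
        out = np.empty_like(frame)
        for y, d in enumerate(self.row_delay):
            src = (self.pos - d) % self.max_delay
            out[y] = self.buffer[src, y]
        return out
```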

In the final part of his talk he argued that the theremin is the only electronic instrument that was newly invented in the 20th century. For him an instrument has to have a unique interaction, a unique shape, and a unique sound. Additionally, it is essential that the interaction can be perceived by the audience (you can see how one plays a violin, but not how one makes digital music with a laptop computer). Based on this he showed a new musical instrument he developed that is inspired by a music box: the TENORI-ON [1]. It has a surface with 16×16 switches (each including an LED) and 16×16 LEDs on the back. It has a unique interaction, its shape and sound are unique, and it supports visibility of interaction, as the sound is combined with light patterns. The basic idea is that the horizontal direction is the time line and the vertical direction the pitch (similar to a music box).
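
To make the music-box principle concrete, here is a minimal step-sequencer sketch in Python. The pentatonic note mapping and the function names are my assumptions for illustration, not details of the TENORI-ON itself:

```python
GRID = 16  # 16x16 switches, as on the TENORI-ON surface

# Map each of the 16 rows to a MIDI note. A pentatonic layout is an
# assumption here, chosen so that arbitrary patterns sound consonant.
PENTATONIC = [0, 2, 4, 7, 9]
NOTES = [48 + PENTATONIC[row % 5] + 12 * (row // 5) for row in range(GRID)]

# grid[row][col]: True means the cell is lit, like a pin on a music-box drum.
grid = [[False] * GRID for _ in range(GRID)]

def toggle(row, col):
    """A touch on a cell switches it on or off."""
    grid[row][col] = not grid[row][col]

def notes_at(step):
    """The playhead sweeps the columns left to right (the time line);
    every lit cell in the current column sounds its row's pitch."""
    return [NOTES[row] for row in range(GRID) if grid[row][step]]

# Usage: an ascending diagonal pattern, one note per step.
for i in range(GRID):
    toggle(GRID - 1 - i, i)
for step in range(GRID):
    print(step, notes_at(step))
```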

[1] Yu Nishibori, Toshio Iwai. TENORI-ON. Proceedings of the 2006 International Conference on New Interfaces for Musical Expression (NIME06), Paris, France. http://www.nime.org/2006/proc/nime2006_172.pdf