Paper and demo in Salzburg at Auto-UI-2011

At the automotive user interface conference in Salzburg we presented some of our research. Salzburg is a really nice place and Manfred and his team did a great job organizing the conference!

Based on the bachelor's thesis of Stefan Schneegaß and some follow-up work we published a full paper [1] that describes a KLM model for the car and a prototyping tool that makes use of the model. In the model we look at the specific needs in the car, model rotary controllers, and cater for the limited attention while driving. The prototyping tool provides means to quickly estimate interaction times. It supports visual prototyping using images of the UI and tangible prototyping using Nic Villar's VoodooIO. Looking forward to having Stefan on our team full-time 🙂
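The basic mechanics of such a model are easy to sketch: a task is predicted as a sequence of operators, and the time estimate is the sum of their empirically calibrated times. A minimal Python sketch; the operator set and timing values below are illustrative placeholders, not the calibrated in-car values from the paper:

```python
# Minimal Keystroke-Level-Model-style estimator: interaction time is the
# sum of per-operator times along the predicted operator sequence.
# Operator names and timings are made up for illustration.
OPERATOR_TIMES = {
    "reach": 0.45,        # move hand to the control
    "turn_rotary": 0.35,  # one detent of a rotary controller
    "press": 0.20,        # press a button
    "glance": 0.50,       # visual attention shift to the display
}

def estimate_time(sequence):
    """Estimate total interaction time (seconds) for a list of operators."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

# e.g. selecting a menu item via a rotary controller:
task = ["reach", "glance", "turn_rotary", "turn_rotary", "press"]
print(round(estimate_time(task), 2))
```

A real in-car KLM additionally has to account for interleaving with the driving task (the limited-attention point above), which a plain sum does not capture.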

We additionally had a demo of a recently completed thesis by Michael Kienast. Here we looked at how speech and gestures can be combined for controlling functions in the car, such as mirror adjustments or windscreen wipers. This multimodal approach combines the strengths of gestural interaction and speech interaction [2].

The evening event of the conference was at Festung Hohensalzburg – with a magnificent view over the town!

[1] Stefan Schneegaß, Bastian Pfleging, Dagmar Kern, Albrecht Schmidt. Support for modeling interaction with in-vehicle interfaces. Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications 2011 (http://auto-ui.org). Salzburg. 30.11.-2.12.2011

[2] Bastian Pfleging, Michael Kienast, Albrecht Schmidt. DEMO: A Multimodal Interaction Style Combining Speech and Touch Interaction in Automotive Environments. Adjunct Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications 2011 (http://auto-ui.org). Salzburg. 30.11.-2.12.2011

Ephemeral User Interfaces – nothing lasts forever, and some things even less

Robustness and durability are typical qualities we aim for when building interactive prototypes and systems. Tanja and Axel explored what user experience we can create when we deliberately design something to be ephemeral (= not lasting; there is a good German word for it, "vergänglich"). Ephemeral user interfaces are user interface elements and technologies that are designed to be engaging but fragile [1]. In the prototype we showed at TEI 2010 in Cambridge, the user can interact with soap bubbles to control a computer. Axel has some additional photos on his web page.


There is a short video of the installation on the technology review blog: http://www.technologyreview.com/blog/editors/24729

[1] Sylvester, A., Döring, T., and Schmidt, A. 2010. Liquids, smoke, and soap bubbles: reflections on materials for ephemeral user interfaces. In Proceedings of the Fourth international Conference on Tangible, Embedded, and Embodied interaction(Cambridge, Massachusetts, USA, January 24 – 27, 2010). TEI ’10. ACM, New York, NY, 269-270. DOI= http://doi.acm.org/10.1145/1709886.1709941

Ubicomp Spring School in Nottingham – prototyping user interfaces

On Tuesday and Wednesday afternoon I ran practical workshops on creating novel user interfaces, complementing the tutorial on Wednesday morning. The aim of the practical was to motivate people to question more fundamentally the user interface decisions we make in our research projects.

On a very simple level, an input user interface can be seen as a sensor, a transfer function or mapping, and an action in the system being controlled. To illustrate this I showed two simple JavaScript programs that let participants play with the mapping from mouse movement to the movement of a button on the screen, and with moving through a set of images. If you twist the mapping functions, really simple tasks (like moving one button on top of another) may become complicated. Similarly, if you change the way the sensor is used (e.g. instead of moving the mouse on a surface, having several people move a surface over the mouse), such simple tasks may become really difficult, too.
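The sensor/mapping/action decomposition can be sketched in a few lines; swapping out only the transfer function is exactly the "twist" described above. The mappings here are made-up examples (the originals were JavaScript; this sketch uses Python):

```python
# Sketch of the "sensor -> transfer function -> action" view of an input
# device. Only the mapping changes; sensor deltas and the cursor action
# stay the same, yet the task difficulty changes completely.
def identity(dx, dy):
    return dx, dy

def swapped(dx, dy):   # horizontal motion moves the cursor vertically
    return dy, dx

def inverted(dx, dy):  # mirror both axes
    return -dx, -dy

def apply_mapping(pos, delta, mapping):
    """One input event: raw sensor delta -> transfer function -> new cursor position."""
    mx, my = mapping(*delta)
    return (pos[0] + mx, pos[1] + my)

pos = (100, 100)
pos = apply_mapping(pos, (5, -2), swapped)  # a rightward move ends up as vertical motion
print(pos)
```

Letting people feel the difference between `identity` and `swapped` in a pointing task makes the point far more vividly than explaining it.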

With this initial experience, an optical mouse, a lot of materials (e.g. fabrics, cardboard boxes, picture frames, toys, etc.), some tools, and two hours of time, the groups started to create their novel interactive experiences. The results included a string puppet interface, a frog interface, an interface to the (computer) recycling, a scarf, and a close-contact dancing interface (the music only plays if bodies touch and move).

The final demos of the workshop were shown before dinner. Seeing the whole set of new interface ideas, one wonders why so little of this happens beyond the labs in the real world, and why people are happy to live with current efficient but rather boring user interfaces – especially in the home context…

Demo day at TEI in Cambridge

What is a simple and cheap way to get from Saarbrücken to Linz? It's not really obvious, but going via Stansted/Cambridge makes sense – especially when the conference on Tangible and Embedded Interaction (www.tei-conf.org) is there and Ryanair offers €10 flights (not sure about the sustainability, though). Sustainability, from a different perspective, was also at the center of the Monday keynote by Tom Igoe, which I missed.

Nicolas and Shahram did a great job, and the choice to do a full day of demos worked out great. The large set of interactive demos presented captures and communicates a lot of the spirit of the community. To get an overview of the demos one has to read through the proceedings (I will post a link as soon as they are online in the ACM DL), as there are too many to discuss here.
Nevertheless here is my random pick:
One big topic is tangible interaction on surfaces. Several examples showed how interactive surfaces can be combined with physical artifacts to make interaction more graspable. Jan Borchers' group showed a table with passive controls (e.g. keyboard keys, knobs, etc.) that are recognized when placed on the table and provide tangible means for interaction. An interesting effect is that the labeling of the controls can be done dynamically.
Microsoft Research showed an impressive novel tabletop display that allows two images to be projected – one on the interactive surface and one on the objects above it [1]. It was presented at last year's UIST, but I have now tried it out for the first time – and it is a stunning effect. Have a look at the paper (and before you read the details, make a guess how it is implemented – at the demo most people guessed wrong 😉).
Embedding sensing into artifacts to create a digital representation has always been a topic in tangible interaction – going back to the early work of Hiroshi Ishii on Triangles [2]. One interesting example in this year's demos was a set of cardboard pieces held together by hinges. Each hinge is technically realized as a potentiometer, and by measuring the positions the structure can be determined. It is really interesting to think this further.
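Recovering the structure from the hinge readings amounts to walking the chain and accumulating angles. A simplified 2D sketch under the assumption of equal-length rigid segments (the actual demo may well work differently):

```python
import math

# Sketch: a chain of cardboard pieces joined by hinges, each hinge read as a
# potentiometer angle. Walking the chain and accumulating the angles recovers
# the pose of every segment -- the digital representation of the physical shape.
def chain_pose(segment_length, hinge_angles_deg):
    """Return the endpoint of each segment for a flat chain of equal segments."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for angle in hinge_angles_deg:
        heading += math.radians(angle)
        x += segment_length * math.cos(heading)
        y += segment_length * math.sin(heading)
        points.append((round(x, 3), round(y, 3)))
    return points

# three segments folded by 90 degrees at each hinge form three sides of a square
print(chain_pose(1.0, [0, 90, 90]))
```

Extending the same idea to 3D (two angles per joint) is what makes arbitrary foldable structures self-describing.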
Conferences like TEI inevitably make you think about the feasibility of programmable matter – and there is ongoing work on this in the robotics community. The idea is to create micro-robots that can form arbitrary shapes – for a starting point see the work at CMU on Claytronics.
[1] Izadi, S., Hodges, S., Taylor, S., Rosenfeld, D., Villar, N., Butler, A., and Westhues, J. 2008. Going beyond the display: a surface technology with an electronically switchable diffuser. In Proceedings of the 21st Annual ACM Symposium on User interface Software and Technology (Monterey, CA, USA, October 19 – 22, 2008). UIST ’08. ACM, New York, NY, 269-278. DOI= http://doi.acm.org/10.1145/1449715.1449760
[2] Gorbet, M. G., Orth, M., and Ishii, H. 1998. Triangles: tangible interface for manipulation and exploration of digital information topography. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (Los Angeles, California, United States, April 18 – 23, 1998). C. Karat, A. Lund, J. Coutaz, and J. Karat, Eds. Conference on Human Factors in Computing Systems. ACM Press/Addison-Wesley Publishing Co., New York, NY, 49-56. DOI= http://doi.acm.org/10.1145/274644.274652

Design Ideas and Demos at FH Potsdam

During the workshop last week in Potsdam we got to see demos from students of the Design of Physical and Virtual Interfaces class taught by Reto Wettach and Jenny LC Chowdhury. The students had to design a working prototype of an interactive system. As base technology most of them used the Arduino board with some custom-made extensions. For a set of pictures see my photo gallery and the photos on Flickr. It would take pages to describe all of the projects, so I picked a few…

The project "Navel" (by Juan Avellanosa, Florian Schulz and Michael Härtel) is a belt with tactile output, similar to [1], [2] and [3]. The first idea along these lines that I tried out was GentleGuide [4] at Mobile HCI 2003 – it seemed quite compelling. The student project proposed one novel application idea: to use it in sports. That is quite interesting and could complement ideas proposed in [5].

Vivien's favorite was the vibrating doormat, a system where a foot mat is constructed of three vibrating tiles that can be controlled individually so that different vibration patterns can be presented. It was built by Lionel Michel, and he has several ideas for research questions this could address. I found especially interesting the question of whether and how one can induce feelings and emotions with such a system. In the same application context (doormat) another prototype looked at emotions, too: if you stroke or pat this mat, it comes out of its hiding place (Roll-o-mat by Bastian Schulz).

There were several projects on giving everyday objects more personality (e.g. a talking trash bin by Gerd-Hinnerk Winck) and making them emotionally reactive (e.g. lights that react to proximity). Firefly (by Marc Tiedemann) is one example of how reactiveness and hard-to-predict motion can lead to an interesting user experience. The movement appears really similar to that of a real firefly.

Embedding information has been an important topic in our research over the last years [6], and the demos provided several interesting examples: a cable that visualizes energy consumption and a keyboard for leaving messages. I also learned of an idea/patent application where information is included in the object itself – in this case in a tea bag. This is an extreme case, but looking into the future (and assuming we get sustainable and bio-degradable electronics), it indicates an interesting direction and pushes the idea of "Information at your fingertips" (Bill Gates' keynote in 1994) much further than originally intended.

For more photos see my photo gallery and the photos on flickr.

[1] Tsukada, K. and Yasumura, M.: ActiveBelt: Belt-Type Wearable Tactile Display for Directional Navigation. Proceedings of UbiComp 2004, Springer LNCS 3205, pp. 384-399 (2004).

[2] Alois Ferscha et al. Vibro-Tactile Space-Awareness. Video paper, adjunct proceedings of UbiComp 2008.

[3] Heuten, W., Henze, N., Boll, S., and Pielot, M. 2008. Tactile wayfinder: a non-visual support system for wayfinding. In Proceedings of the 5th Nordic Conference on Human-Computer interaction: Building Bridges (Lund, Sweden, October 20 – 22, 2008). NordiCHI ’08, vol. 358. ACM, New York, NY, 172-181. DOI= http://doi.acm.org/10.1145/1463160.1463179

[4] S. Bosman, B. Groenendaal, J. W. Findlater, T. Visser, M. de Graaf and P. Markopoulos. GentleGuide: An exploration of haptic output for indoors pedestrian guidance. Mobile HCI 2003.

[5] Mitchell Page, Andrew Vande Moere: Evaluating a Wearable Display Jersey for Augmenting Team Sports Awareness. Pervasive 2007, pp. 91-108.

[6] Albrecht Schmidt, Matthias Kranz, Paul Holleis. Embedded Information. UbiComp 2004, Workshop ‘Ubiquitous Display Environments’, September 2004

GIST, Gwangju, Korea

Yesterday I arrived in Gwangju for ISUVR 2008. It is my first time in Korea and it is an amazing place. Together with some of the other invited speakers and PhD students, we went for a Korean-style dinner (photos from the dinner). The campus (photos from the campus) is large and very new.

This morning we had the opportunity to see several demos from Woontack’s students in the U-VR lab. There is a lot of work on haptics and mobile augmented reality going on. See the pictures of the open lab demo for yourself…

In the afternoon we had some time for culture and sightseeing – the countryside parks are very different from those in Europe. Here are some photos of the trip around Gwangju; see also http://www.damyang.go.kr/

In 2005 Yoosoo Oh, a PhD student with Woontack Woo at GIST, was a visiting student in our lab in Munich. We worked together on issues related to context awareness and published a joint paper discussing the whole design cycle and in particular the evaluation (based on a heuristic approach) of context-aware systems [1].

[1] Yoosoo Oh, Albrecht Schmidt, Woontack Woo: Designing, Developing, and Evaluating Context-Aware Systems. MUE 2007: 1158-1163

Photos – ISUVR2008 – GIST – Korea

Talk by Florian Michahelles, RFID showcase at Kaufhof Essen

Florian Michahelles, associate director of the Auto-ID Labs in Zürich, visited our group and gave a presentation in my course on Pervasive Computing. He introduced the vision of using RFID in businesses, gave a brief technology overview, and discussed the potential impact – in a very interactive session.

Florian and I worked together in the Smart-Its project, and during his PhD studies he and Stavros were well known as the experts on the Ikea PAX [1], [2]. In 2006 and 2007 we ran workshops on RFID technologies and published the results and a discussion of emerging trends in RFID together [3], [4].

At Kaufhof in Essen you can see a showcase of using RFID tags in garment retail. The installation includes augmented shelves, an augmented mirror, and contextual information displays in the changing rooms. The showcase is related to the European BRIDGE project. It was fun playing with the system – it seems to be well engineered for a prototype.

PS: Florian told me that Vlad Coroama finished his PhD. In a different context we talked earlier about his paper discussing the use of sensors to assess costs for insurance [5] – he did it with cars, but there are other domains where this makes sense, too.

[1] S. Antifakos, F. Michahelles, and B. Schiele. Proactive Instructions for Furniture Assembly. In UbiComp, Gothenburg, Sweden, 2002.
http://www.viktoria.se/fal/exhibitions/smart-its-s2003/furniture.pdf

[2] Florian Michahelles, Stavros Antifakos, Jani Boutellier, Albrecht Schmidt, and Bernt Schiele. Instructions Immersed into the Real World: How Your Furniture Can Teach You. Poster at the Fifth International Conference on Ubiquitous Computing, Seattle, USA, October 2003. http://www.mis.informatik.tu-darmstadt.de/Publications/ubipost03.pdf

[3] Florian Michahelles, Frédéric Thiesse, Albrecht Schmidt, John R. Williams: Pervasive RFID and Near Field Communication Technology. IEEE Pervasive Computing 6(3): 94-96 (2007) http://www.alexandria.unisg.ch/EXPORT/PDF/publication/38445.pdf

[4] Schmidt, A., Spiekermann, S., Gershman, A., and Michahelles, F. 2006. Real-World Challenges of Pervasive Computing. IEEE Pervasive Computing 5, 3 (Jul. 2006), 91-93 http://www.hcilab.org/events/pta2006/IEEE-PvM-b3091.pdf

[5] Vlad Coroama: The Smart Tachograph – Individual Accounting of Traffic Costs and Its Implications. Pervasive 2006: 135-152. http://www.vs.inf.ethz.ch/res/papers/coroama_pervasive2006.pdf

Impressions from Pervasive 2008

Using electrodes to detect eye movement and to detect reading [1] – this relates to Heiko's work but uses a different sensing technique. If the system can really be implemented in goggles, this would be a great technology for eye gestures as suggested in [2].

Utilizing infrastructures already in place for activity sensing – the example is a heating/air-conditioning/ventilation system [3]. I wondered, and put forward the question, how well this would work in an active mode – where you actively create an airflow (using the already installed system) to detect the state of an environment.

Further interesting ideas:

  • Communicate while you sleep? Air pillow communication… Vivien loves the idea [4].
  • A camera with additional sensors [5] – really interesting! In Munich we had a student project that looked at something similar [6].
  • A cool vision video of the future is S-ROOM – everything gets a digital counterpart. It communicates the idea of ubicomp in a great and fun way [7] – not sure if the video is online; it is on the conference DVD.

[1] Robust Recognition of Reading Activity in Transit Using Wearable Electrooculography. Andreas Bulling, Jamie A. Ward, Hans-W. Gellersen and Gerhard Tröster. Proc. of the 6th International Conference on Pervasive Computing (Pervasive 2008), pp. 19-37, Sydney, Australia, May 2008. http://dx.doi.org/10.1007/978-3-540-79576-6_2

[2] Heiko Drewes, Albrecht Schmidt. Interacting with the Computer using Gaze Gestures. Proceedings of INTERACT 2007. http://murx.medien.ifi.lmu.de/~albrecht/pdf/interact2007-gazegestures.pdf

[3] Shwetak N. Patel, Matthew S. Reynolds, Gregory D. Abowd: Detecting Human Movement by Differential Air Pressure Sensing in HVAC System Ductwork: An Exploration in Infrastructure Mediated Sensing. Proc. of the 6th International Conference on Pervasive Computing (Pervasive 2008), pp. 1-18, Sydney, Australia, May 2008. http://shwetak.com/papers/air_ims_pervasive2008.pdf

[4] Satoshi Iwaki et al. Air-pillow telephone: A pillow-shaped haptic device using a pneumatic actuator (Poster). Advances in Pervasive Computing. Adjunct proceedings of the 6th International Conference on Pervasive Computing (Pervasive 2008). http://www.pervasive2008.org/Papers/LBR/lbr11.pdf

[5] Katsuya Hashizume, Kazunori Takashio, Hideyuki Tokuda. exPhoto: a Novel Digital Photo Media for Conveying Experiences and Emotions. Advances in Pervasive Computing. Adjunct proceedings of the 6th International Conference on Pervasive Computing (Pervasive 2008). http://www.pervasive2008.org/Papers/Demo/d4.pdf

[6] P. Holleis, M. Kranz, M. Gall, A. Schmidt. Adding Context Information to Digital Photos. IWSAWC 2005. http://www.hcilab.org/documents/AddingContextInformationtoDigitalPhotos-HolleisKranzGallSchmidt-IWSAWC2005.pdf

[7] S-ROOM: Real-time content creation about the physical world using sensor network. Takeshi Okadome, Yasue Kishino, Takuya Maekawa, Kouji Kamei, Yutaka Yanagisawa, and Yasushi Sakurai. Advances in Pervasive Computing. Adjunct proceedings of the 6th International Conference on Pervasive Computing (Pervasive 2008). http://www.pervasive2008.org/Papers/Video/v2.pdf

Gregor showed the potential of multi-tag interaction in a Demo

Gregor, a colleague from LMU Munich, presented work done in the context of the PERCI project, which started while I was in Munich. The demo showed several applications (e.g. buying tickets) that exploit the potential of interaction with multiple NFC tags. The basic idea is to include several NFC tags in a printed poster, with which the user can interact using a phone. By touching the tags in a certain order, a selection can be made. For more details see the paper accompanying the demo [1].
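The touch-in-order selection can be sketched as a small lookup over tag-ID sequences; the tag names and actions here are invented for illustration, not taken from the PERCI demo:

```python
# Sketch of multi-tag selection: a poster exposes several NFC tags, and the
# order in which the user touches them encodes the selection. Tag IDs and
# the ticket vocabulary are made up for illustration.
ACTIONS = {
    ("origin:MUC", "dest:TXL", "confirm"): "ticket Munich -> Berlin",
    ("origin:TXL", "dest:MUC", "confirm"): "ticket Berlin -> Munich",
}

class TouchSequence:
    def __init__(self):
        self.touched = []

    def touch(self, tag_id):
        """Record a tag touch; return the action once a known sequence completes."""
        self.touched.append(tag_id)
        action = ACTIONS.get(tuple(self.touched))
        if action:
            self.touched.clear()  # sequence consumed, ready for the next selection
        return action

s = TouchSequence()
s.touch("origin:MUC")
s.touch("dest:TXL")
print(s.touch("confirm"))
```

A production system would also need timeouts and error recovery for abandoned or mistyped sequences; the sketch only shows the happy path.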

[1] Gregor Broll, Markus Haarländer, Massimo Paolucci, Matthias Wagner, Enrico Rukzio, Albrecht Schmidt. Collect & Drop: A Technique for Physical Mobile Interaction. Demo at Pervasive 2008. Sydney. http://www.pervasive2008.org/Papers/Demo/d1.pdf

CeBIT Demo – always last minute…

Yesterday afternoon I was in Hannover at CeBIT to set up our part of a demo at the Fraunhofer stand (Hall 9, Stand B36). The overall topic of the Fraunhofer presence is "Researching for the people".

After some difficulties with our implementation on the phone, the server, and the network (wired and wireless – my laptop showed more than 30 Wi-Fi access points and a Bluetooth scan showed 12 devices), we got the demo going. The demo is related to outdoor advertisement; together with Fraunhofer IAIS we provide an approach to estimate the number of viewers/visitors. On Wednesday I will give a talk at CeBIT to explain some more details.

It seems demos are always finished last minute…

Talks, Demos and Poster at TEI’08

The first day of the conference went well – thanks to many helping hands. The time we had for the demos seemed really long in the program but was too short to engage with every exhibit (for next year we should make sure to allocate even more time).

People made really last-minute efforts to make their demos work. We even went to Conrad Electronic to get some parts (which had burned out earlier while setting up the demos). Demos are, in my eyes, an extremely efficient means of communication between scientists and of sharing ideas.

Central mechanical workshop

Currently we are working in one of our courses on a specific multi-touch table. The students have already created a first version of an interesting application – and there are ideas for many more. However, so far our prototype does not look like a table.

Having learned that our university has central workshops, we went there to talk about our project and to get the mechanical parts built. Our first meeting was really interesting – we got a tour and saw drilling and milling machines as well as a cutter that works with water (it can cut glass precisely – extremely impressive). Best of all, it seems that (as they are at a university) they do not find the strange requirements of our prototypes odd 😉

Our initial design is a welded metal table frame, which leaves us a lot of options for experimenting with the camera, projection, and surface. Really looking forward to seeing the first version!

Visit at the University of Hamburg

Yesterday we visited the computer science department at the University of Hamburg. Prof. Oberquelle and Prof. Beckhaus had invited me at the Mensch & Computer conference to visit them and give a talk about our work.

Before the seminar we had a chance to see the lab of Steffi Beckhaus. I tried the ChairIO – and it was fun. The sound floor creates a really interesting experience (similar to a butt-kicker, just more intense). We could also play with GranulatSynthese and try the smell user interface (the apple smell is absolutely convincing; I am not sure about some of the others).

We had some discussion on emotions and capturing physiological parameters. Thinking about emotions and senses with regard to a community sharing them opens up a lot of potential for new experiences and, potentially, applications. We discussed this topic to some extent a few weeks ago at the Human Computer Confluence workshop in Brussels. I really think a small-scale experiment in sharing emotions could move us forward and provide some more insight. In Hamburg they have the NeXus system (perhaps we should get this too and create a networked application).

In my talk (on creating novel user interfaces) I focused on the PhD work of Paul Holleis (KLM for mobile phones, his CHI paper from last year) and of Heiko Drewes (eye gestures, his Interact '07 paper). The discussion was quite interesting.

CardioViz Demo at Ubicomp 2007

Alireza Sahami presented our CardioViz project at the demo session at Ubicomp. We were very happy that this project, the result of our IPEC course on developing mobile applications, was accepted as a demo.

For more details see:
Alireza Sahami Shirazi, Diana Cheng, Oliver Kroell, Dagmar Kern, Albrecht Schmidt. CardioViz: Contextual Capture and Visualization for Long-term ECG Data. Adjunct Proceedings of Ubicomp 2007 (Demo).

Jonna Häkkilä, Anind Dey, Kari Hjelt, and I organized the UbiWell workshop (Interaction with Ubiquitous Wellness and Healthcare Applications) at this year's Ubicomp. Alireza presented another paper on heartbeat monitoring there:
Florian Alt, Alireza Sahami Shirazi, Albrecht Schmidt. Monitoring Heartbeat per Day to Motivate Increasing Physical Activity. UbiWell workshop@Ubicomp 2007.

bi-t Student demo lab results at Fraunhofer IAIS

This morning we presented selected demos from the lab on location and context awareness to people at Fraunhofer IAIS. Apart from the fact that our main infrastructure component (the Ubisense indoor location system) did not work, the demos went well. It was very strange – the infrastructure had worked for the last six weeks (including several reboots), and this morning, after rebooting the server, it did not find the sensors anymore for several hours.

The majority of the demos were based on the second assignment, which was to create a novel application that makes use of an indoor location system. The applications implemented by the students included a heat map (showing where a room is mainly used), co-location-dependent displays (enabling minimal setup and administration effort), a museum information system (time- and location-dependent display of different levels of information), and a security system (allowing a functionality only inside a perimeter dynamically defined by tags). Overall it was very interesting to see what the students created in four weeks of hard work.
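The heat-map idea boils down to binning position samples from the location system into grid cells; a minimal sketch (the coordinates and the cell size are assumptions for illustration, not from the student project):

```python
from collections import Counter

# Sketch of the heat-map demo idea: bucket a stream of (x, y) position
# samples from an indoor location system into grid cells and count the
# samples per cell -- cells with high counts are where the room is used.
def heatmap(samples, cell_size=1.0):
    """Map (x, y) samples (e.g. meters) to per-cell dwell counts."""
    counts = Counter()
    for x, y in samples:
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] += 1
    return counts

samples = [(0.2, 0.3), (0.8, 0.1), (3.4, 2.9), (0.5, 0.5)]
hm = heatmap(samples)
print(hm[(0, 0)])  # number of samples in the cell at the origin
```

With time-stamped samples the same structure yields dwell time instead of raw counts, which is probably the more meaningful quantity for room usage.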

We also briefly showed the location Post-its, which were based on GPS and were done for the first group assignment, the CardioViz prototype (from the lab in the winter term), and the web annotation tool that is now nearly ready.

Even though there were some difficulties in running some of the demos, I am still convinced that in a research environment we need to show live demos and not just PPT slideware 😉 We probably have to demo more often to get more professional at handling non-working components.

More pictures are online at http://foto.ubisys.org/iais_presentation/