- Matt Jones: Mobile Search
- Luca Chittaro: Information Visualization and Visual Interfaces for Mobile Devices
- Chris Kray: Mobile Guides
- Marc Langheinrich: Mobile Privacy
- Enrico Rukzio: Mobile Interaction with the Real World
- Paul Holleis: Modelling and Developing Mobile Applications
I did a tutorial on Mobile Human-Computer Interaction at Pervasive 2009. The tutorial tried to give an overview of the challenges of mobile HCI and was partly based on last year’s tutorial day at MobileHCI 2008 in Amsterdam. For the slides from last year, have a look at: http://albrecht-schmidt.blogspot.com/2008/09/mobilehci-2008-tutorial.html
Listening to Marc Langheinrich‘s tutorial on privacy, I remembered that I still have the photos of his HCI library – and so as not to forget them, I am uploading them here. Marc highlighted the risk of data analysis with the AOL Stalker example (some comments about the AOL Stalker). His tutorial is always worth hearing and raises many inspiring issues – even though I do not agree with all of his conclusions 😉
For me, seeing the books my colleagues use on a certain topic still works better than the Amazon recommendations I get 😉 perhaps people (or we?) should work harder on social-network-based product recommendation systems…
On Tuesday and Wednesday afternoon I ran practical workshops on creating novel user interfaces complementing the tutorial on Wednesday morning. The aim of the practical was to motivate people to more fundamentally question user interface decisions that we make in our research projects.
With this initial experience, an optical mouse, a lot of materials (e.g. fabrics, cardboard boxes, picture frames, toys, etc.), some tools, and two hours of time, the groups started to create their novel interactive experiences. The results included a string puppet interface, a frog interface, an interface for (computer) recycling, a scarf interface, and a close-contact dancing interface (the music only plays if bodies touch and move).
The final demos of the workshop were shown before dinner. Seeing the whole set of the new interface ideas one wonders why there is so little of this happening beyond the labs in the real world and why people are happy to live with current efficient but rather boring user interfaces – especially in the home context…
The ubicomp spring school in Nottingham had an interesting set of lectures and practical sessions, including a talk by Turing Award winner Robin Milner on a theoretical approach to ubicomp. When I arrived on Tuesday I had the chance to see Chris Baber‘s tutorial on wearable computing. He provided really good examples of wearable computing and its distinct qualities (also in relation to the wearable use of mobile phones). One example that captures a lot about wearable computing is an adaptive bra. The bra is one example of a class of interesting future garments. The basic idea is that these garments detect the user’s activity and change their properties accordingly. A different example in this class is a shirt/jacket/pullover/trousers that can change its insulation properties (e.g. by storing and releasing air) according to the external temperature and the user’s body temperature.
My tutorial was on user interface engineering, and I discussed what is different about creating ubicomp UIs compared to traditional user interfaces. I showed some trends (including technologies as well as a new view on privacy) that open up the design space for new user interfaces. Furthermore, we discussed the idea of creating magical experiences in the world and the dilemma of user creativity versus user needs.
There were about 100 people at the spring school from around the UK – it is really exciting how much ubicomp research (somehow in the tradition of Equator) is going on in the UK.
The conference on mobile human computer interaction (MobileHCI 2008) started today in Amsterdam with the tutorial and workshop day.
I am chairing the tutorials, and we tried a new approach: six sessions/chapters that together make up an introduction to mobile HCI. After 10 years of MobileHCI, it seems important to help new members of the community quickly learn about the field. The presentations were given by experts in the field, who each had one hour for their topic. We had unexpectedly high attendance (the room with 100 seats was nearly always full). Have a look at the slides:
Text input for mobile devices by Scott MacKenzie
Scott gave an overview of different input means (e.g. key-based, stylus, predictive, virtual keyboard), parameters relevant for designing and assessing mobile text input (e.g., writing speed, cognitive load) and issues related to the context of use (e.g., walking/standing).
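To make the assessment parameters concrete: writing speed is commonly reported in words per minute (WPM), where a word is standardized to five characters, and input efficiency is often captured as keystrokes per character (KSPC). A minimal sketch of both metrics (the function names are illustrative, not taken from the tutorial materials):

```python
def words_per_minute(transcribed_chars: int, seconds: float) -> float:
    """WPM with the standard convention of 5 characters per word."""
    return (transcribed_chars / 5.0) / (seconds / 60.0)

def keystrokes_per_char(keystrokes: int, transcribed_chars: int) -> float:
    """KSPC: 1.0 means every keystroke produced a character;
    values above 1 indicate corrections or multi-tap entry."""
    return keystrokes / transcribed_chars

# Example: 125 characters entered in 60 seconds using 150 keystrokes
print(words_per_minute(125, 60.0))    # 25.0 WPM
print(keystrokes_per_char(150, 125))  # 1.2
```

Such measures make it possible to compare, say, multi-tap against predictive input under the same conditions (walking vs. standing).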
Mobile GUIs and Mobile Visualization by Patrick Baudisch
Patrick introduced input and output options for mobile devices. He talked about the design process, prototyping and assessment of user interfaces, trade-offs related to the design of mobile GUIs, and different possible interaction styles.
Understanding Mobile User Experience by Mirjana Spasojevic
Mirjana discussed different means for studying mobile user needs and evaluating the user experience. This included explorative studies and formal evaluations (in the lab vs. in the field), as well as longitudinal pilot deployments. The lecture covered traditional HCI methods of user research and how they need to be adapted for different mobile contexts and products.
Context-Aware Communication and Interaction by Albrecht Schmidt
Albrecht gave an overview of work in context-awareness and activity recognition that is related to mobile HCI. He discussed how sharing context in communication applications can improve the user experience. The lecture explained how perception and sensing can be used to acquire context and activity information and showed examples of how such information can be exploited.
Haptics, audio output and sensor input in mobile HCI by Stephen Brewster
Stephen discussed the design space for haptics, audio output as well as sensor and gesture input in mobile HCI. Furthermore he assessed resulting interaction methods and implications for the interactive experience.
Camera-based interaction and interaction with public displays by Michael Rohs
Michael introduced camera-based interaction with mobile devices; this included an assessment of optical markers, 2D barcodes, and optical flow, as well as techniques related to augmented reality. In this context he also addressed interaction with public displays.
You can also download the complete tutorial including all 6 chapters in a single PDF file (16MB).
Pervasive 2007 introduced a new form of tutorial – a number of experts each talking for one hour about their specialist topic. I attended last year as a participant and liked it a lot. This year Pervasive 2008 repeated the approach, and I contributed a tutorial on how to get context and activity from sensors (tutorial slides in PDF).
Abstract. Intelligent environments, sensor networks and smart objects are inherently connected to building systems that sense phenomena in the real world and make the perceived information available to applications. The first part of the tutorial gives an overview of sensors and sensor systems commonly used in pervasive computing applications. In addition to sensor properties, means for connecting sensors to systems (e.g. ADC, PWM, I2C, serial line) are explained. The second part discusses how to create meaningful information in the application domain. Some basic features, calculated in the time and frequency domains, are introduced to provide basic means for processing and abstracting raw sensor data. This part is complemented by a brief overview of mechanisms and methods for relating (abstracted) sensor information to context, activities and situations. Additionally, general problems associated with sensing context and activity are addressed in the tutorial.
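As a rough illustration of the second part, basic time- and frequency-domain features over a window of raw sensor samples might be computed as follows (a minimal sketch for a single accelerometer axis; the function names and window are illustrative, not from the tutorial):

```python
import math

def time_domain_features(window):
    """Mean and standard deviation of one sensor axis over a window."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    return {"mean": mean, "std": math.sqrt(var)}

def frequency_domain_energy(window):
    """Signal energy from a naive DFT (O(n^2)); in practice an FFT
    over a sliding window would be used."""
    n = len(window)
    energy = 0.0
    for k in range(n):
        re = sum(window[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(window[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        energy += re * re + im * im
    return energy / n

# Example: a synthetic window of accelerometer readings
w = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]
print(time_domain_features(w))   # mean 0.0, std ~0.707
print(frequency_domain_energy(w))
```

Features like these would then be related to contexts or activities (e.g. walking vs. standing) by a classifier, which is the abstraction step the tutorial describes.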