{"id":202,"date":"2024-02-12T14:22:01","date_gmt":"2024-02-12T13:22:01","guid":{"rendered":"https:\/\/www.hcilab.org\/physiochi24\/?post_type=mp-event&#038;p=202"},"modified":"2025-03-13T10:30:45","modified_gmt":"2025-03-13T09:30:45","slug":"keynote","status":"publish","type":"mp-event","link":"https:\/\/www.hcilab.org\/physiochi24\/timetable\/event\/keynote\/","title":{"rendered":"Pedro Lopes &#8211; Keynote"},"content":{"rendered":"\n<p>Pedro Lopes, University of Chicago<\/p>\n\n\n\n<p><strong>Keynote title:<\/strong> Integrating interactive devices with the user\u2019s body &amp; brain<\/p>\n\n\n\n<div class=\"wp-block-media-text is-stacked-on-mobile is-vertically-aligned-center\" style=\"grid-template-columns:34% auto\"><figure class=\"wp-block-media-text__media\"><img loading=\"lazy\" decoding=\"async\" width=\"959\" height=\"1024\" src=\"https:\/\/www.hcilab.org\/physiochi24\/wp-content\/uploads\/sites\/7\/2024\/03\/Picture1-1-959x1024.jpg\" alt=\"\" class=\"wp-image-234 size-full\" srcset=\"https:\/\/www.hcilab.org\/physiochi24\/wp-content\/uploads\/sites\/7\/2024\/03\/Picture1-1-959x1024.jpg 959w, https:\/\/www.hcilab.org\/physiochi24\/wp-content\/uploads\/sites\/7\/2024\/03\/Picture1-1-281x300.jpg 281w, https:\/\/www.hcilab.org\/physiochi24\/wp-content\/uploads\/sites\/7\/2024\/03\/Picture1-1-768x820.jpg 768w, https:\/\/www.hcilab.org\/physiochi24\/wp-content\/uploads\/sites\/7\/2024\/03\/Picture1-1-1200x1281.jpg 1200w, https:\/\/www.hcilab.org\/physiochi24\/wp-content\/uploads\/sites\/7\/2024\/03\/Picture1-1.jpg 1430w\" sizes=\"auto, (max-width: 959px) 100vw, 959px\" \/><\/figure><div class=\"wp-block-media-text__content\">\n<p>Pedro Lopes, University of Chicago<\/p>\n\n\n\n<p><strong>bio: <\/strong><a href=\"https:\/\/lab.plopes.org\/\" target=\"_blank\" rel=\"noreferrer noopener\">Pedro Lopes<\/a> is an Associate Professor in Computer Science at the University of Chicago. 
Pedro focuses on <strong>integrating interfaces with the human body\u2014exploring the interface paradigm that supersedes wearables<\/strong>. These include: muscle stimulation wearables that allow&nbsp;<a href=\"https:\/\/www.youtube.com\/watch?v=Gz4dphzBb6I\" target=\"_blank\" rel=\"noreferrer noopener\">users to manipulate tools they have never seen before<\/a>&nbsp;or that&nbsp;<a href=\"https:\/\/www.youtube.com\/watch?v=1BT8REEJibM\" target=\"_blank\" rel=\"noreferrer noopener\">accelerate reaction time<\/a>, or a device that&nbsp;<a href=\"https:\/\/www.youtube.com\/watch?v=pH68GNkb_fA&amp;feature=youtu.be\" target=\"_blank\" rel=\"noreferrer noopener\">leverages smell to create an illusion of temperature<\/a>. All these examples leverage computers to augment the user\u2019s body, not just cognitively, but also <em>physically<\/em> (e.g., our wearable that accelerates one\u2019s reaction time made it to the Guinness Book of World Records). Pedro\u2019s work has received several academic awards, such as six CHI\/UIST Best Papers, the Sloan Fellowship, and the NSF CAREER award, and has captured the interest of the public (e.g., the New York Times, exhibited at Ars Electronica; more: <a href=\"https:\/\/lab.plopes.org\">https:\/\/lab.plopes.org<\/a>).<\/p>\n<\/div><\/div>\n\n\n\n<p><strong>Keynote Synopsis<\/strong><\/p>\n\n\n\n<p>When we look back to the early days of computing, user and device were distant, often located in separate rooms. Then, in the \u201970s, personal computers \u201cmoved in\u201d with users. In the \u201990s, mobile devices moved computing into users\u2019 pockets. Recently, wearables brought computing into constant physical contact with the user\u2019s skin. These transitions proved useful: moving closer to users allowed interactive devices to sense more of their users and become more personal. 
The main question that drives my research is: <strong>what is the next interface paradigm that supersedes wearable devices?<\/strong><\/p>\n\n\n\n<p>I propose that the next generation of interfaces will be defined by how devices <strong>integrate<\/strong>&nbsp;with the user\u2019s biological senses and actuators. For the past several years, my lab has been exploring how this body-device integration allows us to <strong>engineer interactive devices that intentionally borrow parts of the body for input and output, rather than adding more technology to the body<\/strong>.<\/p>\n\n\n\n<p>The first key advantage of body-device integration is that it puts forward a&nbsp;<strong>new<\/strong>&nbsp;<strong>generation of miniaturized devices<\/strong>, allowing us to circumvent traditional physical constraints. For instance, our devices based on electrical muscle stimulation illustrate how to create realistic haptic feedback (e.g., forces in VR\/AR) while avoiding the constraints imposed by robotic exoskeletons, which need to balance their output power against the size of their motors and batteries. Taking this further, we successfully applied this body-device integration approach to other sensory modalities. For instance, we engineered a device that delivers chemicals to the user to render temperature sensations without relying on cumbersome thermal actuators. Our approach to miniaturizing devices is especially useful for advancing mobile interactions, such as in virtual or augmented reality, where users desire to remain untethered.<\/p>\n\n\n\n<p>A second key advantage is that integrating devices with the user\u2019s body allows <strong>for new interactions to emerge without<\/strong> <strong>encumbering the user\u2019s hands<\/strong>. 
Using our approach, we demonstrated how to create tactile sensations in the user\u2019s fingerpads without putting anything directly on the fingerpads\u2014instead, we intercept the fingerpad nerves from the back of the user\u2019s hand. This allows users to benefit from haptic feedback (e.g., for guidance in VR\/AR) without encumbering their dexterity. Taking this further, we demonstrated that, using brain stimulation, we can achieve haptic feedback on all four limbs <strong>without wearing any hardware<\/strong> (e.g., feeling forces &amp; tactile sensations on both hands and feet)\u2014opening up a new way to achieve haptics by directly stimulating the source (the brain) rather than the endpoints (the limbs).<\/p>\n\n\n\n<p>A third facet is that our integrated devices <strong>enable new physical modes of reasoning with computers<\/strong>, going beyond just symbolic thinking. For example, we have engineered a set of devices that control the user\u2019s muscles to provide tacit information to the user, such as muscle-stimulation devices for learning new skills (piano or sign language), or wearable devices that allow users to control information with their bodies, without the need for screens (e.g., using their lips, muscles, or feet for <em>both<\/em> input and output).<\/p>\n\n\n\n<p>A fourth key aspect we found while integrating devices with the user\u2019s body is that we can endow users with <em>new<\/em> physical abilities. 
We engineered a device that allows users to locate odor sources by \u201csmelling in stereo,\u201d as well as a device that <strong>physically accelerates one\u2019s reaction time using muscle stimulation<\/strong>, which can steer users to safety or even enable them to catch a falling object that they would normally miss.<\/p>\n\n\n\n<p>While this integration between human and computer is beneficial (e.g., faster reaction time, realistic simulations in VR\/AR, or faster skill acquisition), it also requires tackling new challenges, such as improving the precision with which we safely stimulate the body, and the question of agency: <strong>do we feel in control when our body is integrated with an interface?<\/strong> Together with our colleagues in neuroscience, we have been measuring how the brain encodes agency in order to improve the design of this new type of integrated interface. We found that, even in the extreme case of our interfaces that electrically control the user\u2019s muscles, it is possible to improve the sense of agency. More importantly, we found that it is only by preserving the user\u2019s sense of agency that these integrated devices <strong>provide benefits even after the user takes them off<\/strong>.<\/p>\n\n\n\n<p>Finally, I believe that these bodily-integrated devices are <strong>the natural successor to wearable interfaces<\/strong> and allow us to investigate how interfaces will connect to our bodies in a more direct and personal way.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Pedro Lopes, University of Chicago Keynote title: Integrating interactive devices with the user\u2019s body &amp; brain Pedro Lopes, University of Chicago bio: Pedro Lopes is an Associate Professor in Computer Science at the University of Chicago. Pedro focuses on integrating interfaces with the human body\u2014exploring the interface paradigm that supersedes wearables. 
These include: muscle stimulation [&hellip;]<\/p>\n","protected":false},"author":21,"featured_media":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","mp-event_category":[],"mp-event_tag":[],"class_list":["post-202","mp-event","type-mp-event","status-publish","hentry","mp-event-item"],"_links":{"self":[{"href":"https:\/\/www.hcilab.org\/physiochi24\/wp-json\/wp\/v2\/mp-event\/202","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hcilab.org\/physiochi24\/wp-json\/wp\/v2\/mp-event"}],"about":[{"href":"https:\/\/www.hcilab.org\/physiochi24\/wp-json\/wp\/v2\/types\/mp-event"}],"author":[{"embeddable":true,"href":"https:\/\/www.hcilab.org\/physiochi24\/wp-json\/wp\/v2\/users\/21"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hcilab.org\/physiochi24\/wp-json\/wp\/v2\/comments?post=202"}],"wp:attachment":[{"href":"https:\/\/www.hcilab.org\/physiochi24\/wp-json\/wp\/v2\/media?parent=202"}],"wp:term":[{"taxonomy":"mp-event_category","embeddable":true,"href":"https:\/\/www.hcilab.org\/physiochi24\/wp-json\/wp\/v2\/mp-event_category?post=202"},{"taxonomy":"mp-event_tag","embeddable":true,"href":"https:\/\/www.hcilab.org\/physiochi24\/wp-json\/wp\/v2\/mp-event_tag?post=202"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}