Our research is focused on understanding how to integrate computer interfaces with the human body; we believe this is the interface paradigm that supersedes wearable computing. We explore this by engineering interactive systems that intentionally borrow parts of the user’s body for input and output. We have used our wearable muscle-stimulation devices, for example, to: make a user's muscles properly manipulate a tool they have never used before, computationally accelerate a user's reaction time so they are able to photograph a high-speed moving target, read and write information without using a screen, and transform someone's arm into a plotter so they can solve computationally complex problems with just pen and paper.
We think these types of interactive devices are beneficial because they afford new emotional and physical modes of reasoning with computers, going beyond purely symbolic thinking (reasoning by typing and reading language on a screen). While this physical integration between human and computer is beneficial in many ways, it also requires tackling a series of new philosophical challenges, such as the question of agency: how do I feel when my body is integrated with an interface? Do I still feel in control? We explore these questions, together with neuroscientists, by measuring and improving how our brain encodes the feeling of agency under this new kind of integrated interface.
The Human Computer Integration research lab is led by Prof. Pedro Lopes at the Computer Science Department of the University of Chicago.
Our lab is a welcoming environment that does not discriminate. We are an LGBTQ+ ally lab. If you are emailing lab members, do not assume pronouns; just ask first.
Jun Nishida, Soichiro Matsuda, Hiroshi Matsui, Shan-Yuan Teng, Ziwei Liu, Kenji Suzuki, Pedro Lopes, In Proc. UIST’20 (full paper)
UIST best paper award
We engineered HandMorph, an exoskeleton that approximates the experience of having a smaller grasping range. It uses mechanical links to transmit motion from the wearer’s fingers to a smaller hand with five anatomically correct fingers. The result is that HandMorph miniaturizes a wearer’s grasping range while transmitting haptic feedback. Unlike other size illusions based on virtual reality, HandMorph achieves this in the user’s real environment, preserving the user’s physical and social contexts. As such, our device can be integrated into the user’s workflow, e.g., allowing a product designer to momentarily change their grasping range into that of a child while evaluating a toy prototype.
UIST'20 paper video 3D files (print your exoskeleton)
Jas Brooks, Steven Nagels, Pedro Lopes, In Proc. CHI’20 (full paper)
CHI best paper award (top 1%)
We explore a temperature illusion that uses low-powered electronics and enables the miniaturization of simple warm and cool sensations. Our illusion relies on the properties of certain scents, such as the coolness of mint or hotness of peppers. These odors trigger not only the olfactory bulb, but also the nose’s trigeminal nerve, which has receptors that respond to both temperature and chemicals. To exploit this, we engineered a wearable device that emits up to three custom-made “thermal” scents directly to the user’s nose. Breathing in these scents causes the user to feel warmer or cooler.
CHI'20 paper video CHI talk video hardware schematics
Yuxin Chen*, Huiying Li*, Shan-Yuan Teng*, Steven Nagels, Pedro Lopes, Ben Y. Zhao and Heather Zheng, In Proc. CHI’20 (full paper)
* authors contributed equally
CHI honorable mention for best paper award (top 5%)
We engineered a wearable microphone jammer that is capable of disabling microphones in its user’s surroundings, including hidden microphones. Our device is based on a recent exploit that leverages the fact that, when exposed to ultrasonic noise, commodity microphones will leak the noise into the audible range. Because it is worn, our jammer is also more effective than stationary jammers: the wearer's natural movements help cover the jammer's blind spots. This work was a collaboration led by Heather Zheng (who runs the SAND Lab) at UChicago.
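For intuition, here is a minimal numerical sketch of that exploit (the tone frequencies and the quadratic nonlinearity coefficient are illustrative assumptions, not our hardware's parameters): a weakly nonlinear microphone front end folds two inaudible ultrasonic tones into an audible difference tone.

```python
import numpy as np

fs = 192_000                       # sample rate high enough to carry ultrasound
t = np.arange(0, 0.1, 1 / fs)

# Two inaudible ultrasonic tones, as a jammer might emit (frequencies illustrative).
f1, f2 = 25_000, 26_000
jam = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# Commodity microphone front ends are weakly nonlinear; a small quadratic
# term is a simple first-order model of that nonlinearity.
recorded = jam + 0.1 * jam**2

# The quadratic term creates an intermodulation product at |f2 - f1| = 1 kHz,
# i.e., the ultrasonic jamming "leaks" into the audible range.
spectrum = np.abs(np.fft.rfft(recorded))
freqs = np.fft.rfftfreq(len(recorded), 1 / fs)
audible = (freqs > 20) & (freqs < 20_000)
peak = freqs[audible][np.argmax(spectrum[audible])]
print(f"strongest audible component: {peak:.0f} Hz")   # ~1000 Hz
```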
CHI'20 paper video CHI talk video code/hardware
Floyd Mueller*, Pedro Lopes*, Paul Strohmeier, Wendy Ju, Caitlyn Seim, Martin Weigel, Suranga Nanayakkara, Marianna Obrist, Zhuying Li, Joseph Delfa, Jun Nishida, Elizabeth Gerber, Dag Svanaes, Jonathan Grudin, Stefan Greuter, Kai Kunze, Thomas Erickson, Steven Greenspan, Masahiko Inami, Joe Marshall, Harald Reiterer, Katrin Wolf, Jochen Meyer, Thecla Schiphorst, Dakuo Wang, Pattie Maes. In Proc. CHI’20 (full paper)
* authors contributed equally
Human-computer integration (HInt) is an emerging paradigm in which computational and human systems are closely interwoven; with rapid technological advancements and growing implications, it is critical to identify an agenda for future research in HInt.
CHI'20 paper CHI talk video
Michelle Carr, Adam Haar*, Judith Amores*, Pedro Lopes*, Guillermo Bernal, Tomás Vega, Oscar Rosello, Abhinandan Jain, Pattie Maes. In Consciousness and Cognition (Vol. 83, 2020) (journal paper)
* authors contributed equally
We draw a parallel between recent VR haptic/sensory devices, which stimulate ever more senses during virtual interactions, and the work of sleep/dream researchers, who are exploring how the senses are integrated and influence the sleeping mind. We survey recent developments in HCI technologies and analyze which might provide a useful hardware platform to manipulate dream content by sensory manipulation, i.e., to engineer dreams. This work was led by Michelle Carr (University of Rochester) in collaboration with the Fluid Interfaces group (MIT Media Lab).
Consciousness and Cognition'20 paper
Seungwoo Je, Myung Jin Kim, Woojin Lee, Byungjoo Lee, Xing-Dong Yang, Pedro Lopes, Andrea Bianchi. In Proc. UIST’19 (full paper)
We engineered Aero-plane, a force-feedback handheld controller based on two miniature jet propellers that can render shifting weights of up to 14 N within 0.3 seconds. Unlike other ungrounded haptic devices, our prototype realistically simulates weight changes over 2D surfaces. This work was a collaboration led by Andrea Bianchi, who runs the MAKinteract group at KAIST.
Jakub Limanowski, Pedro Lopes, Janis Keck, Patrick Baudisch, Karl Friston, and Felix Blankenburg. In Cerebral Cortex (journal), to appear.
Tactile input generated by one’s own agency is generally attenuated. Conversely, externally caused tactile input is enhanced; e.g., during haptic exploration. We used functional magnetic resonance imaging (fMRI) to understand how the brain accomplishes this weighting. Our results suggest an agency-dependent somatosensory processing in the parietal operculum. Read more at our project's page.
Shunichi Kasahara, Jun Nishida and Pedro Lopes. In Proc. CHI’19, Paper 643 (full paper) and demonstration at SIGGRAPH'19 eTech.
Grand Prize, awarded by Laval Virtual in partnership with SIGGRAPH'19 eTech
We found that it is possible to optimize the timing of haptic systems to accelerate human reaction time without fully compromising the user's sense of agency. This work was done in cooperation with Shunichi Kasahara from Sony CSL. Read more at our project's page.
CHI'19 paper video SIGGRAPH'19 etech CHI'19 talk (slides) CHI talk video
Lukas Gehrke, Sezen Akman, Pedro Lopes, Albert Chen, ..., Klaus Gramann. In Proc. CHI’19, Paper 427 (full paper)
We detect visuo-haptic mismatches in VR by analyzing the user's event-related potentials (ERP). In our EEG study, participants touched VR objects and received either no haptics, vibration, or vibration and EMS. We found that the negativity component (prediction error) was more pronounced in unrealistic VR situations, indicating visuo-haptic mismatches. Read more at our project's page.
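The gist of the ERP analysis can be sketched as follows (synthetic data stands in for real EEG; the sampling rate, epoch window, and injected deflection are illustrative assumptions):

```python
import numpy as np

fs = 500                                   # EEG sample rate in Hz (illustrative)
rng = np.random.default_rng(0)

def epoch(eeg, onsets, pre=0.1, post=0.5):
    """Cut fixed windows (in seconds) around touch-event onsets (in samples)."""
    a, b = int(pre * fs), int(post * fs)
    return np.stack([eeg[o - a : o + b] for o in onsets])

# Synthetic single-channel EEG with a negative deflection ~100-200 ms after
# each touch onset, standing in for the prediction-error negativity.
n_events, length = 40, 5 * 60 * fs
eeg = rng.normal(0, 5, length)
onsets = rng.integers(fs, length - fs, n_events)
for o in onsets:
    eeg[o + int(0.1 * fs) : o + int(0.2 * fs)] -= 3.0

erp = epoch(eeg, onsets).mean(axis=0)      # averaging trials yields the ERP
times = np.linspace(-0.1, 0.5, erp.size)
print(f"peak negativity {erp.min():.1f} at {times[erp.argmin()] * 1000:.0f} ms")
```

Comparing this peak across the no-haptics, vibration, and EMS conditions is what lets us rank how well each condition matches what the user sees.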
Pedro Lopes, Sijing You, Alexandra Ion, and Patrick Baudisch. In Proc. CHI’18. (full paper)
We present a mobile system that enhances mixed-reality experiences, displayed on a Microsoft HoloLens, with force feedback by means of electrical muscle stimulation (EMS). The benefit of our approach is that it adds physical forces while keeping the users’ hands free to interact unencumbered, not only with virtual objects but also with physical objects, such as props and appliances that are an integral part of both virtual and real worlds.
Pedro Lopes, Sijing You, Alexandra Ion, and Patrick Baudisch. In Proc. CHI’17 (full paper) and demonstration at SIGGRAPH'17 studios
We explored how to add haptics to walls and other heavy objects in virtual reality. Our contribution is that we prevent the user’s hands from penetrating virtual objects by means of electrical muscle stimulation (EMS). As the user shown here lifts a virtual cube, our system lets them feel the weight and resistance of the cube: the heavier the cube and the harder the user presses on it, the stronger the counterforce the system generates.
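A minimal sketch of such a control law (the function name, gains, and units are illustrative assumptions, not our calibrated per-user values):

```python
def ems_intensity(penetration_m, object_mass_kg,
                  gain=10.0, mass_gain=2.0, max_intensity=1.0):
    """Map penetration depth and virtual mass to a normalized EMS level.

    A proportional law: the heavier the object and the harder the user
    presses into it, the stronger the stimulation pulling the arm back.
    """
    level = gain * penetration_m * (1.0 + mass_gain * object_mass_kg)
    return min(max(level, 0.0), max_intensity)   # clamp to a safe range

# e.g., pressing 1 cm into a 2 kg virtual cube:
print(ems_intensity(0.01, 2.0))   # -> 0.5 (half of the calibrated maximum)
```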
Pedro Lopes, Doga Yueksel, François Guimbretière, and Patrick Baudisch. In Proc. UIST’16 (full paper).
We explore how to create more expressive EMS-based systems. Muscle-plotter achieves this by persisting its EMS output on paper, allowing the system to build up a larger whole. More specifically, it spreads the 1D signal produced by EMS over a 2D surface by steering the user’s wrist as the hand sweeps across the paper. Rather than repeatedly updating a single value, this renders many values into curves.
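A simplified sketch of that mapping (names and parameters are hypothetical; the real system also tracks the pen and corrects the wrist in a closed loop):

```python
import math

def plot_function(values, x_step_mm=2.0, y_gain_mm=20.0):
    """Turn sampled values into wrist-steering targets.

    As the user sweeps their hand in +x across the paper, EMS deflects
    the wrist in y so the pen traces the curve: one value per x position,
    persisted as ink rather than overwritten.
    """
    return [(i * x_step_mm, y_gain_mm * v) for i, v in enumerate(values)]

# e.g., render a decaying oscillation as a pen stroke:
samples = [math.exp(-0.1 * i) * math.sin(0.5 * i) for i in range(40)]
for x, y in plot_function(samples)[:3]:
    print(f"steer wrist to y={y:+.1f} mm at x={x:.1f} mm")
```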
Pedro Lopes, Alexandra Ion, and Patrick Baudisch. In Proc. UIST’15 (full paper). UIST best demo nomination
We present impacto, a device designed to render the haptic sensation of hitting and being hit in virtual reality. The key idea that allows the small and light impacto device to simulate a strong hit is that it decomposes the stimulus: it renders the tactile aspect of being hit by tapping the skin using a solenoid; it adds impulse to the hit by thrusting the user’s arm backwards using electrical muscle stimulation. The device is self-contained, wireless, and small enough for wearable use.
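In sketch form, the decomposition looks like this (the actuator callbacks are hypothetical stand-ins for the real solenoid and EMS drivers):

```python
import time

def render_hit(tap_solenoid, set_ems, ems_burst_s=0.15):
    """Render 'being hit' as two decomposed stimuli: a solenoid tap for
    the tactile sensation on the skin, plus a short EMS burst that
    thrusts the arm backwards to supply the impulse."""
    tap_solenoid()            # tactile component: tap the skin
    set_ems(True)             # impulse component: contract the muscles
    time.sleep(ems_burst_s)
    set_ems(False)            # release the contraction

# dry run with print stand-ins for the actuators:
render_hit(lambda: print("tap"), lambda on: print("EMS", "on" if on else "off"))
```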
Pedro Lopes, Patrik Jonell, and Patrick Baudisch. In Proc. CHI’15 (full paper). CHI best paper award (top 1%)
We propose extending the affordance of objects by allowing them to communicate dynamic use, such as (1) motion (e.g., spray can shakes when touched), (2) multi-step processes (e.g., spray can sprays only after shaking), and (3) behaviors that change over time (e.g., empty spray can does not allow spraying anymore). Rather than enhancing objects directly, however, we implement this concept by enhancing the user with electrical muscle stimulation. We call this affordance++.
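One way to picture affordance++ is as a small state machine attached to the object that decides when to actuate the user; the sketch below uses the spray can example (the states, messages, and EMS hooks are illustrative):

```python
class SprayCan:
    """Dynamic affordances for a spray can: it communicates that it must be
    shaken before spraying, and an empty can refuses to spray. In the real
    system, the returned actions would drive EMS on the user's arm."""

    def __init__(self, charge=3):
        self.shaken = False
        self.charge = charge                  # remaining paint (illustrative)

    def on_grasp(self):
        return "no actuation" if self.shaken else "EMS: shake the user's hand"

    def on_shake(self):
        self.shaken = True

    def on_spray_attempt(self):
        if self.charge == 0:
            return "EMS: block the spraying gesture (can is empty)"
        if not self.shaken:
            return "EMS: shake the user's hand first"
        self.charge -= 1
        self.shaken = False                   # must shake again between sprays
        return "allow spraying"

can = SprayCan()
print(can.on_grasp())           # -> EMS: shake the user's hand
can.on_shake()
print(can.on_spray_attempt())   # -> allow spraying
```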
Pedro Lopes, Alexandra Ion, Willi Mueller, Daniel Hoffmann, Patrik Jonell, and Patrick Baudisch. In Proc. CHI’15 (full paper). CHI best talk award
We propose a new way of eyes-free interaction for wearables. It is based on the user’s proprioceptive sense, i.e., users feel the pose of their own body. We implemented a wearable device, Pose-IO, that offers input and output based on proprioception. Users communicate with Pose-IO through the pose of their wrists: they enter information by flexing their wrist into an input gesture, which the device senses using an accelerometer, and they receive output by finding their wrist posed in an output gesture, which Pose-IO actuates using electrical muscle stimulation.
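A minimal sketch of the input side (axis conventions and thresholds are illustrative assumptions):

```python
import math

def wrist_angle_deg(ax, ay, az):
    """Estimate wrist flexion/extension from a wrist-worn accelerometer at
    rest: the tilt of the gravity vector approximates the pose angle."""
    return math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))

def decode_input(angle_deg):
    """Quantize the sensed pose into discrete input symbols."""
    if angle_deg > 25:
        return "flexed"
    if angle_deg < -25:
        return "extended"
    return "neutral"

# Output works in reverse: EMS poses the wrist at a target angle, and the
# user reads the value by feeling the pose of their own wrist.
print(decode_input(wrist_angle_deg(0.6, 0.0, 0.8)))   # -> flexed (~37°)
```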
Pedro Lopes and Patrick Baudisch. In Proc. CHI’13 (short paper). IEEE World Haptics, People’s Choice Nomination for Best Demo
Force feedback devices resist miniaturization, because they require physical motors and mechanics. We propose mobile force feedback by eliminating motors and instead actuating the user’s muscles using electrical stimulation. Without the motors, we obtain substantially smaller and more energy-efficient devices. Our prototype fits on the back of a mobile phone. It actuates users’ forearm muscles via four electrodes, which causes users’ muscles to contract involuntarily, so that they tilt the device sideways. As users resist this motion using their other arm, they perceive force feedback.
The publications above are core to our lab's mission. If you are interested in more of Pedro's publications on other topics, see here.
1. Introduction to Human-Computer Interaction (CMSC 20300; Fall quarter)
Synopsis: An introduction to the field of Human-Computer Interaction (HCI), with a particular emphasis on understanding and designing user-facing software and hardware systems. This class covers the core concepts of HCI: affordances and mental models, input techniques (cursors, touch, text entry, voice, etc.), output techniques (visual menus and widgets, sound, haptics), conducting user studies, and so forth. It also includes a project in which students design, build and study a user-facing interactive system. See here for the class website. (This class is required for our Undergraduate Specialization in HCI; see here for details.)
2. Inventing, Engineering and Understanding Interactive Devices (CMSC 23220; Spring quarter)
Synopsis: In this class we build I/O devices, typically wearable or haptic devices. These are user-facing hardware devices engineered to enable new ways to interact with computers. In order for you to be successful in building your own I/O device we will: (1) study and program 8-bit microcontrollers, (2) explore different analog and digital sensors and actuators, (3) write control loops and filters, (4) explore stretchable and fabric-based electronics, (5) learn how to approach invention, and (6) apply I/O devices to novel contexts such as Virtual Reality. See here for the class website. (This class is part of our Undergraduate Specialization in HCI; see here for details.)
3. Engineering Interactive Electronics onto Printed Circuit Boards (CMSC 23230; Winter quarter)
Synopsis: In this class we will engineer electronics onto Printed Circuit Boards (PCBs). We will focus on designing and laying out the circuit and PCB for our own custom-made I/O devices, such as wearable or haptic devices. In order for you to be successful in engineering a functional PCB we will: (1) review digital circuits and three microcontrollers (ATMEGA, NRF, SAMD); (2) use KiCad to build circuit schematics; (3) learn how to wire analog/digital sensors and actuators to our microcontroller, including the SPI and I2C protocols; (4) use KiCad to lay out the PCB; (5) manufacture our designs; (6) receive our PCBs from the factory; and (7) learn how to debug our custom-made PCBs. See here for the class website. (This class is part of our Undergraduate Specialization in HCI; see here for details.)
4. Emerging Interface Technologies (CMSC 33240 and CMSC 23240; Winter quarter)
Synopsis: In this class, we examine emerging technologies that might impact future generations of computing interfaces. These include: physiological I/O (e.g., brain and muscle computer interfaces), tangible computing (giving shape and form to interfaces), wearable computing (I/O devices closer to the user's body), rendering new realities (e.g., virtual and augmented reality), and haptics (giving computers the ability to generate touch and forces). (Note that this class supersedes our former "HCI Topics" graduate seminar; it is a hands-on class with more projects and assignments, not a typical graduate seminar.) See here for the class website. (This class is part of our Undergraduate Specialization in HCI; see here for details.)
While we do our best to increase class capacity, our HCI classes fill up quickly. If that happens and you still want to register, please use the CS waiting list.
5. Creative Machines (PHYS 21400; CMSC 21400; ASTR 31400; PSMS 31400; CHEM 21400; ASTR 21400)
Note: This class is taught by Stephan Meyer (Astrophysics), Scott Wakely (Physics), and Erik Shirokoff (Astrophysics). While this class is not taught by Pedro Lopes, it was co-created by Pedro together with Scott Wakely (Physics), Stephan Meyer (Astrophysics), Aaron Dinner (Chemistry), Benjamin Stillwell, and Zack Siegel. We highly recommend that students interested in HCI or in working with us take this class. Synopsis: The ability to build creative machines is essential to a range of fields, from the physical sciences to the arts. In this hands-on course, you will engineer and build functional devices using, e.g., mechanical design and machining, CAD, rapid prototyping, and circuitry; no previous experience is expected. Open to undergraduates in all majors, as well as Master's and Ph.D. students.
Synopsis: In this class, we review developments in HCI technologies that might impact future generations of computing interfaces. These include: physiological I/O (e.g., brain and muscle computer interfaces), tangible computing (giving shape and form to interfaces), wearable computing (I/O devices closer to the user's body), rendering new realities (e.g., virtual and augmented reality), and haptics (giving computers the ability to generate touch and forces). This is a graduate seminar with an emphasis on reading, discussing, and writing papers. (This class is paused; we recommend you take the more hands-on Emerging Interface Technologies instead.)
We are always looking for exceptional students at the intersection of Human Computer Interaction, Electrical Engineering, Materials Science and Mechanical Engineering.
If you are considering applying to our lab for any position, do the following:
P.S.: UChicago students who want to learn more about HCI and meet HCI faculty and students should consider joining the "HCI Club".
Our lab is supported by the following sponsor organizations:
See here for a complete list of press articles about our work.