We engineer interactive devices that integrate directly with the user's body—we believe these devices are the natural successor to wearable interfaces.
One example of our human-computer integration is our exploration of interactive devices based on electrical muscle stimulation (EMS). These devices electrically actuate the user's body, enabling touch and forces in VR, or allowing everyday objects to teach their users how they should be operated—our wearables achieve this without the weight and bulkiness of conventional robotic exoskeletons. These devices gain their advantages not by adding more technology to the body, but by borrowing parts of the user's body as input/output hardware. The result is devices that are not only exceptionally small, but that also implement a novel interaction model, in which the device integrates directly with the user's body. Recently, we generalized this concept to new modalities, including novel ways to interface with a user's sense of temperature, smell, and rich touch sensations.
We think bodily-integrated interactive devices are beneficial because they enable new modes of reasoning with computers, going beyond purely symbolic thinking (reasoning by typing and reading language on a screen). While this physical integration between human and computer is beneficial in many ways (e.g., faster reaction time, realistic simulations in VR/AR, faster skill acquisition), it also requires tackling new challenges, such as improving the precision of muscle stimulation, or the question of agency: do we feel in control when our body is integrated with an interface? We explore these questions together with neuroscientists, by understanding how our brain encodes the feeling of agency, to improve the design of this new type of integrated interface.
The Human Computer Integration research lab is led by Prof. Pedro Lopes at the Computer Science Department of the University of Chicago.
Our lab is a welcoming environment that does not discriminate. We are an LGBTQ+ ally lab. If you are emailing lab members, do not assume pronouns; just ask first.
Jasmine Lu, Pedro Lopes. In Proc. UIST’22 (full paper)
We explore how embedding a living organism as a functional component of a device changes the user-device relationship. In our concept, the user is responsible for providing an environment in which the organism can thrive, by caring for it. We instantiated this concept as a slime-mold-integrated smartwatch. The slime mold grows to form an electrical wire that enables a heart-rate sensor. The availability of this sensing depends on the slime mold's growth, which the user encourages through care. If the user does not care for the slime mold, it enters a dormant stage and is not conductive. The user can resuscitate it by resuming care.
Shan-Yuan Teng, K. D. Wu, Jacqueline Chen, Pedro Lopes. In Proc. UIST’22 (full paper)
UIST honorable mention award
We propose a new technical approach to implement VR haptic devices that contain no battery, yet can render on-demand haptic feedback. The key is that the haptic device charges itself by harvesting the user's kinetic energy (i.e., movement)—even without the user needing to realize this. This is achieved by integrating the energy harvesting with the virtual experience, in a responsive manner. Whenever our batteryless haptic device is about to lose power, it switches to harvesting mode (by engaging its clutch to a generator) and, simultaneously, the VR headset renders an alternative version of the current experience that depicts resistive forces (e.g., rowing a boat in VR). As a result, the user feels realistic haptics that correspond to what they should be feeling in VR, while unknowingly charging the device via their movements. Once the supercapacitors are charged, they wake up the device's microcontroller, which communicates with the VR headset. The VR experience can now use the harvested power for on-demand haptics, including vibration, electrical or mechanical force-feedback; this process can be repeated, ad infinitum. UIST'22 paper video UIST talk video hardware files
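To illustrate the harvest-or-render logic, here is a minimal sketch in Python; the voltage thresholds and names are illustrative assumptions, not the device's actual firmware API.

```python
# Minimal sketch of the "harvest or render" control logic described above.
# Thresholds and names are illustrative assumptions, not the real firmware.

HARVEST_BELOW_V = 2.5   # assumed: switch to harvesting below this voltage
RESUME_ABOVE_V = 4.5    # assumed: resume on-demand haptics above this voltage

def power_step(voltage: float, harvesting: bool) -> tuple[bool, str]:
    """Decide the device mode for one control-loop tick."""
    if not harvesting and voltage < HARVEST_BELOW_V:
        # About to lose power: engage the generator clutch and have the VR
        # headset depict resistive forces (e.g., rowing a boat).
        return True, "engage_clutch + render_resistive_scene"
    if harvesting and voltage > RESUME_ABOVE_V:
        # Supercapacitors charged: release the clutch and let the
        # microcontroller spend harvested power on on-demand haptics.
        return False, "release_clutch + resume_on_demand_haptics"
    return harvesting, "no_change"

# Example tick: low voltage while rendering -> switch to harvesting mode.
print(power_step(voltage=2.1, harvesting=False))
```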
Yujie Tao, Pedro Lopes. In Proc. UIST’22 (full paper)
We explore a new concept: directly integrating distracting stimuli from the user's physical surroundings into the virtual reality experience to enhance presence. Using our approach, an otherwise distracting wind gust can be directly mapped to the sway of trees in a VR experience that already contains trees. We demonstrate how to integrate a range of distractive stimuli into the VR experience, such as haptics (temperature, vibrations, touch), sounds, and smells. Building on the results of three studies, we engineered a sensing module that detects a set of distractive signals during any VR experience (e.g., sounds, winds, and temperature shifts) and responds by triggering pre-made VR sequences that feel realistic to the user. UIST'22 paper video UIST talk video open source code
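As a rough illustration of the sensing module's mapping, here is a sketch in Python; the event names and sequences are invented for illustration, not the module's actual vocabulary.

```python
# Sketch of mapping detected real-world stimuli to pre-made VR sequences.
# Event and sequence names are invented for illustration.

PREMADE_SEQUENCES = {
    "wind_gust": "sway_trees",        # map a real gust to swaying virtual trees
    "loud_sound": "distant_thunder",  # explain an abrupt sound diegetically
    "temp_drop": "enter_shade",       # explain a temperature shift in-world
}

def on_sensor_event(event: str, vr_play) -> None:
    """Trigger the pre-made VR sequence that integrates the real stimulus."""
    sequence = PREMADE_SEQUENCES.get(event)
    if sequence is not None:
        vr_play(sequence)

on_sensor_event("wind_gust", vr_play=print)  # -> sway_trees
```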
Jun Nishida, Yudai Tanaka, Romain Nith, Pedro Lopes. In Proc. UIST’22 (full paper)
We engineered DigituSync, a passive exoskeleton that physically links two hands together, enabling two users to adaptively transmit finger movements in real time. It uses multiple four-bar linkages to transfer both motion and force, while still preserving congruent haptic feedback. Moreover, we implemented a variable-length linkage that allows adjusting the force-transmission ratio between the two users and regulates the amount of intervention, which enables users to customize their learning experience. DigituSync's benefits emerge from its passive design: unlike existing haptic devices (motor-based exoskeletons or electrical muscle stimulation), DigituSync has virtually no latency and does not require batteries/electronics to transmit or adjust movements, making it useful and safe to deploy in many settings, such as between students and teachers in a classroom. UIST'22 paper video UIST talk video
Yudai Tanaka, Jun Nishida, Pedro Lopes. In Proc. CHI’22 (full paper)
CHI best demo award (people's choice)
We propose a novel interface concept in which interactive systems directly manipulate the user's head orientation. We implement this using electrical muscle stimulation (EMS) of the neck muscles, which turns the head around its yaw (left/right) and pitch (up/down) axes. As the first exploration of EMS for head actuation, we characterized which muscles can be robustly actuated and demonstrated how it enables interactions not possible before by building a range of applications, such as (1) synchronizing the head orientations of two users, which enables a user to communicate head nods to another user while listening to music, and (2) directly changing the user's head orientation to locate objects in augmented reality. CHI'22 paper video CHI talk video CHI demo video source code
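As a toy illustration of steering the head around both axes, here is a sketch in Python; the channel names, deadband, and mapping are our invention for illustration, not the paper's calibration procedure.

```python
# Sketch: map a desired head-orientation change to neck-EMS channels.
# Channel names and the deadband value are illustrative assumptions.

def ems_channels(d_yaw_deg: float, d_pitch_deg: float, deadband: float = 2.0):
    """Pick which neck muscles to stimulate for a target yaw/pitch change."""
    channels = []
    if d_yaw_deg > deadband:
        channels.append("turn_right")
    elif d_yaw_deg < -deadband:
        channels.append("turn_left")
    if d_pitch_deg > deadband:
        channels.append("tilt_up")
    elif d_pitch_deg < -deadband:
        channels.append("tilt_down")
    return channels

print(ems_channels(d_yaw_deg=15.0, d_pitch_deg=-5.0))  # ['turn_right', 'tilt_down']
```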
Daisuke Tajima, Jun Nishida, Pedro Lopes, and Shunichi Kasahara. In ACM Transactions on Computer-Human Interaction (TOCHI), 2021 (full paper)
Force-feedback interfaces actuate the user's body to touch involuntarily (using exoskeletons or electrical muscle stimulation); we refer to this as computer-driven touch. Unfortunately, forcing users to touch causes a loss of their sense of agency. While we previously found that delaying the timing of computer-driven touch preserves agency, that work only considered the naive case in which user-driven touch is aligned with computer-driven touch. We argue this is unlikely, as it assumes we can perfectly predict user touches. But what about all the remaining situations: when the haptics force the user into an outcome they did not intend, or assist the user toward an outcome they would not achieve alone? We unveil, via an experiment, what happens in these novel situations. From our findings, we synthesize a framework that enables researchers of digital-touch systems to trade off haptic assistance against the sense of agency. Read more at our project's page (we have five other papers on this topic). TOCHI'21 paper video (presented at CHI'22)
Lukas Gehrke, Pedro Lopes, Marius Klug, Sezen Akman and Klaus Gramann. In Journal of Neural Engineering (full paper)
In VR, designing immersion is one key challenge. Subjective questionnaires are the established metric for assessing the effectiveness of immersive VR simulations. However, administering questionnaires requires breaking the immersive experience they are supposed to assess. We present a complementary metric based on event-related potentials (ERPs). For the metric to be robust, the neural signal employed must be reliable; hence, it is beneficial to target the neural signal's cortical origin directly, efficiently separating signal from noise. To test this new complementary metric, we designed a reach-to-tap paradigm in VR to probe EEG and movement adaptation to visuo-haptic glitches. Our working hypothesis was that these glitches, or violations of the predicted action outcome, may indicate a disrupted user experience. Using prediction error negativity features, we classified VR glitches with ~77% accuracy. This work was a collaboration led by Klaus Gramann and his team at the Neuroscience Department at TU Berlin. Journal of Neural Engineering 2022
Jasmine Lu, Ziwei Liu, Jas Brooks, Pedro Lopes, In Proc. UIST’21 (full paper)
We propose a new class of haptic devices that provide haptic sensations by delivering liquid stimulants to the user's skin; we call this chemical haptics. Upon absorbing these stimulants, which contain safe and small doses of key active ingredients, receptors in the user's skin are chemically triggered, rendering distinct haptic sensations. We identified five chemicals that can render lasting haptic sensations: tingling (sanshool), numbing (lidocaine), stinging (cinnamaldehyde), warming (capsaicin), and cooling (menthol). To enable the application of our novel approach in a variety of settings (such as VR), we engineered a self-contained wearable that can be worn anywhere on the user's skin (e.g., face, arms, legs). UIST'21 paper video UIST talk video hardware schematics
Yujie Tao, Shan-Yuan Teng, Pedro Lopes. In Proc. UIST'21 (full paper)
UIST best paper award
UIST best demo award (jury's choice)
We proposed a wearable haptic device that alters the perceived softness of everyday objects without instrumenting the object itself. Moreover, our device achieves this softness illusion while leaving most of the user's fingerpad free, allowing users to feel the texture of the object they touch. We demonstrate the robustness of this haptic illusion in various interactive applications, such as making the same VR prop display different softness states or allowing a rigid 3D-printed button to feel soft, like a real rubber button. UIST'21 paper video UIST talk video hardware schematics
Romain Nith, Shan-Yuan Teng, Pengyu Li, Yujie Tao, and Pedro Lopes, In Proc. UIST’21 (full paper)
UIST best demo award (people's choice)
DextrEMS is a haptic device designed to improve the dexterity of electrical muscle stimulation (EMS). It achieves this by combining EMS with mechanical brakes on all finger joints. These brakes allow us to solve two fundamental problems with current EMS devices: lack of independent actuation (i.e., when a target finger is actuated via EMS, it often also causes unwanted movements in other fingers) and unwanted oscillations (i.e., to stop a finger, EMS needs to continuously contract the opposing muscle). Using its brakes, dextrEMS achieves unprecedented dexterity in both EMS finger flexion and extension, enabling applications not possible with existing EMS-based interactive devices, such as actuating the user's fingers to pose simple letters in sign language, precise VR force feedback, or even playing the guitar. UIST'21 paper video UIST talk video hardware schematics
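To make the brake/EMS coordination concrete, here is a toy sketch in Python; the per-finger commands are our invented shorthand, not dextrEMS's actual control scheme.

```python
# Toy sketch: while EMS flexes the target finger, brakes lock the remaining
# joints so EMS co-activation cannot move them. Command names are invented.

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def actuation_plan(target: str) -> dict[str, str]:
    """One actuation step: EMS on the target finger, brakes elsewhere."""
    return {f: ("ems_flex" if f == target else "brake_lock") for f in FINGERS}

print(actuation_plan("index"))
```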
Qi Su, Q. Zou, Yang Li, Yuzhen Chen, Shan-Yuan Teng, Jane Tunde Kelleher, Romain Nith, Ping Cheng, Nan Li, Wei Liu, Shilei Dai, Youdi Liu, Alex Mazursky, Jie Xu, Lihua Jin, Pedro Lopes, Sihong Wang. In Science Advances (journal)
Existing stretchable pressure sensors have an inherent limitation: stretching interferes with pressure-sensing accuracy. We present a new design concept for a highly stretchable and highly sensitive pressure sensor that provides unaltered sensing performance under stretching, realized by creating locally and biaxially stiffened micro-pyramids made from an ionic elastomer. This work was a collaboration led by Sihong Wang and his team at the Molecular Engineering Department at the University of Chicago. Science Advances'21 paper video
Akifumi Takahashi, Jas Brooks, Hiroyuki Kajimoto, and Pedro Lopes, In Proc. CHI’21 (full paper)
CHI best paper award (top 1%)
CHI best demo award (people's choice)
We improved the dexterity of the finger flexion produced by interactive devices based on electrical muscle stimulation (EMS). The key is a new electrode layout we discovered on the back of the hand. Instead of the existing EMS electrode placement, which flexes the fingers via the flexor muscles in the forearm, we stimulate the interossei/lumbrical muscles in the palm. Our technique allows EMS to achieve greater dexterity around the metacarpophalangeal (MCP) joints, which we demonstrate in a series of applications, such as playing individual piano notes, doing a two-stroke drum roll, or pressing barred guitar frets. These examples were previously impossible with existing EMS electrode layouts. CHI'21 paper video CHI talk video
Shan-Yuan Teng, Pengyu Li, Romain Nith, Joshua Fonseca, and Pedro Lopes, In Proc. CHI’21 (full paper)
CHI honorable mention for best paper award (top 5%)
We propose a nail-mounted foldable haptic device that provides tactile feedback in mixed reality (MR) by pressing against the user's fingerpad when the user touches a virtual object. What is novel in our device is that it quickly tucks away when the user interacts with real-world objects. Its design allows it to fold back on top of the user's nail when not in use, keeping the user's fingerpad free to, for instance, manipulate handheld tools and other objects while in MR. To achieve this, we engineered a wireless and self-contained haptic device, which measures 24×24×41 mm and weighs 9.5 g. Furthermore, our foldable end-effector also features a linear resonant actuator, allowing it to render not only touch contacts (i.e., pressure) but also textures (i.e., vibrations). CHI'21 paper video CHI talk video hardware schematics
Alex Mazursky, Shan-Yuan Teng, Romain Nith, and Pedro Lopes, In Proc. CHI’21 (full paper)
We propose a new type of haptic actuator, which we call MagnetIO, composed of two parts: any number of soft interactive patches that can be applied anywhere, and one battery-powered voice coil worn on the user's fingernail. When the fingernail-worn device contacts any of the interactive patches, it detects the patch's magnetic signature and makes the patch vibrate. To allow these otherwise passive patches to vibrate, we make them from silicone doped in regions with neodymium powder, resulting in soft and stretchable magnets. This novel decoupling of the traditional vibration motor allows users to add interactive patches to their surroundings by attaching them to walls, objects, or even other devices or appliances, without instrumenting the object with electronics. CHI'21 paper video CHI talk video hardware schematics
Jas Brooks, Shan-Yuan Teng, Jingxuan Wen, Romain Nith, Jun Nishida, and Pedro Lopes. In Proc. CHI'21 (full paper)
Honorable Mention, Fast Company Innovation by Design Awards for Experimental Design
We engineered a device that creates a stereo-smell experience, i.e., directional information about the location of an odor, by rendering the readings of external odor sensors as trigeminal sensations using electrical stimulation of the user's nasal septum. The key is that the sensations from the trigeminal nerve, which arise from nerve endings in the nose, are perceptually fused with those of the olfactory bulb (the brain region that senses smells). We then use this sensation to allow participants to track a virtual smell source in a room without any previous training. CHI'21 paper video CHI talk video hardware schematics
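One plausible way to think about the direction encoding is sketched below in Python; this mapping (left/right sensor balance to a lateralized cue) is our simplification for illustration, not necessarily the paper's actual encoding.

```python
# Illustrative sketch: turn the difference between two odor-sensor readings
# into a lateralized intensity cue. The mapping is a simplification.

def stereo_smell_cue(left_ppm: float, right_ppm: float) -> tuple[str, float]:
    """Return which side to cue and a 0..1 intensity."""
    total = left_ppm + right_ppm
    if total == 0:
        return "none", 0.0
    balance = (right_ppm - left_ppm) / total  # -1 = fully left, +1 = fully right
    return ("right" if balance > 0 else "left"), min(1.0, abs(balance))

print(stereo_smell_cue(left_ppm=3.0, right_ppm=9.0))  # ('right', 0.5)
```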
Shunichi Kasahara, Kazuma Takada, Jun Nishida, Kazuhisa Shibata, Shinsuke Shimojo and Pedro Lopes, In Proc. CHI’21 (full paper)
We found that after training for reaction-time tasks with electrical muscle stimulation (EMS), users' reaction time is accelerated even after the EMS device is removed. What is remarkable is that the key to the optimal speedup is not applying EMS as soon as possible (the traditional view on EMS stimulus timing) but delaying the EMS stimulus closer to the user's own reaction time, while still delivering it faster than humanly possible; this preserves some of the user's sense of agency while still accelerating them to superhuman speeds (we call this Preemptive Action). This was done in cooperation with our colleagues from Sony CSL, the RIKEN Center for Brain Science, and Caltech. Read more at our project's page. CHI'21 paper video CHI talk video source code
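The timing idea boils down to a one-liner, sketched here in Python; the 40 ms preemption margin and the 220 ms reaction time are illustrative numbers, not the study's measured values.

```python
# Sketch of Preemptive Action timing: fire EMS slightly *before* the user's
# own predicted reaction, rather than as early as possible.

def ems_onset_ms(predicted_user_rt_ms: float, preempt_ms: float = 40.0) -> float:
    """EMS onset (ms after the go-stimulus): close to, but ahead of, the user."""
    # Delaying toward the user's own reaction time preserves more sense of
    # agency than stimulating at t = 0, while still beating human speed.
    return max(0.0, predicted_user_rt_ms - preempt_ms)

print(ems_onset_ms(220.0))  # e.g., stimulate at 180 ms for a 220 ms reaction time
```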
Yuxin Chen, Zhuolin Yang, Ruben Abbou, Pedro Lopes, Ben Y. Zhao and Haitao Zheng, In Proc. CHI'21 (full paper)
We propose a novel modality for active biometric authentication: electrical muscle stimulation (EMS). To explore this, we engineered ElectricAuth, a wearable that stimulates the user's forearm muscles with a sequence of electrical impulses (i.e., an EMS challenge) and measures the user's involuntary finger movements (i.e., the response to the challenge). ElectricAuth leverages EMS's intersubject variability, where the same electrical stimulation results in different movements in different users because everybody's physiology is unique (e.g., differences in bone and muscular structure, skin resistance and composition, etc.). As such, ElectricAuth allows users to log in without memorizing passwords or PINs. This work was a collaboration led by Heather Zheng (who runs the SAND Lab) at UChicago. CHI'21 paper video CHI talk video code
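The challenge-response idea can be sketched in a few lines of Python; the feature vectors, distance metric, and threshold below are placeholders, not ElectricAuth's actual pipeline.

```python
# Conceptual sketch of EMS challenge-response authentication: a stored template
# maps each impulse sequence to the finger motion it evokes in the enrolled
# user. Features, metric, and threshold are placeholders.

import math
import random

def distance(a: list[float], b: list[float]) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def authenticate(template: dict, measure_response, threshold: float = 0.2) -> bool:
    challenge = random.choice(list(template))  # pick a random EMS impulse sequence
    response = measure_response(challenge)     # measured involuntary finger motion
    return distance(response, template[challenge]) < threshold

# Toy usage: a perfect replay of the enrolled response authenticates.
template = {"seq_A": [0.1, 0.4], "seq_B": [0.7, 0.2]}
print(authenticate(template, measure_response=lambda c: template[c]))  # True
```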
Seungwoo Je, Hyunseung Lim, Kongpyung Moon, Shan-Yuan Teng, Jas Brooks, Pedro Lopes, and Andrea Bianchi, In Proc. CHI'21 (full paper)
Existing shape-changing floors are limited by their tabletop scale or the coarse resolution of the terrains they can display, due to the limited number of actuators and low vertical resolution. To tackle this, we engineered Elevate, a dynamic and walkable pin-array floor on which users can experience not only large variations in shapes but also the details of the underlying terrain. Our system achieves this by packing 1,200 pins arranged on a 1.80 × 0.60 m platform, in which each pin can be actuated to one of ten height levels (resolution: 15 mm/level). This work was a collaboration led by Andrea Bianchi, who runs the MAKinteract group at KAIST. CHI'21 paper video CHI talk video
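Given the ten levels at 15 mm/level stated above, quantizing a terrain heightmap to pin commands is straightforward; here is a sketch in Python (the example height is arbitrary).

```python
# Quantize a continuous terrain height to one of Elevate's ten pin levels
# (15 mm per level, as stated above). The example input is arbitrary.

LEVELS = 10
MM_PER_LEVEL = 15

def pin_level(height_mm: float) -> int:
    """Clamp and quantize a terrain height (mm) to a pin level in 0..9."""
    return max(0, min(LEVELS - 1, round(height_mm / MM_PER_LEVEL)))

print(pin_level(52.0))  # -> 3, i.e., the pin rises to 45 mm
```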
Jun Nishida, Soichiro Matsuda, Hiroshi Matsui, Shan-Yuan Teng, Ziwei Liu, Kenji Suzuki, Pedro Lopes, In Proc. UIST’20 (full paper)
UIST best paper award
We engineered HandMorph, an exoskeleton that approximates the experience of having a smaller grasping range. It uses mechanical links to transmit motion from the wearer's fingers to a smaller hand with five anatomically correct fingers. The result is that HandMorph miniaturizes a wearer's grasping range while transmitting haptic feedback. Unlike other size illusions based on virtual reality, HandMorph achieves this in the user's real environment, preserving the user's physical and social contexts. As such, our device can be integrated into the user's workflow, e.g., to allow product designers to momentarily change their grasping range into that of a child while evaluating a toy prototype. UIST'20 paper video UIST talk video 3D files (print your exoskeleton)
Jas Brooks, Steven Nagels, Pedro Lopes. In Proc. CHI'20 (full paper)
CHI best paper award (top 1%)
We explore a temperature illusion that uses low-powered electronics and enables the miniaturization of simple warm and cool sensations. Our illusion relies on the properties of certain scents, such as the coolness of mint or the hotness of peppers. These odors trigger not only the olfactory bulb but also the nose's trigeminal nerve, which has receptors that respond to both temperature and chemicals. To exploit this, we engineered a wearable device that emits up to three custom-made "thermal" scents directly to the user's nose. Breathing in these scents causes the user to feel warmer or cooler. CHI'20 paper video CHI talk video hardware schematics
Yuxin Chen*, Huiying Li∗, Shan-Yuan Teng∗, Steven Nagels, Pedro Lopes, Ben Y. Zhao and Heather Zheng, In Proc. CHI’20 (full paper)
* authors contributed equally
CHI honorable mention for best paper award (top 5%)
We engineered a wearable microphone jammer that is capable of disabling microphones in its user's surroundings, including hidden microphones. Our device is based on a recent exploit that leverages the fact that, when exposed to ultrasonic noise, commodity microphones will leak the noise into the audible range. Our jammer is more efficient than stationary jammers. This work was a collaboration led by Heather Zheng (who runs the SAND Lab) at UChicago. CHI'20 paper video CHI talk video code/hardware
Floyd Mueller*, Pedro Lopes*, Paul Strohmeier, Wendy Ju, Caitlyn Seim, Martin Weigel, Suranga Nanayakkara, Marianna Obrist, Zhuying Li, Joseph Delfa, Jun Nishida, Elizabeth Gerber, Dag Svanaes, Jonathan Grudin, Stefan Greuter, Kai Kunze, Thomas Erickson, Steven Greenspan, Masahiko Inami, Joe Marshall, Harald Reiterer, Katrin Wolf, Jochen Meyer, Thecla Schiphorst, Dakuo Wang, Pattie Maes. In Proc. CHI'20 (full paper) * authors contributed equally
Human-computer integration (HInt) is an emerging paradigm in which computational and human systems are closely interwoven; with rapid technological advancements and growing implications, it is critical to identify an agenda for future research in HInt. CHI'20 paper CHI talk video
Michelle Carr, Adam Haar*, Judith Amores*, Pedro Lopes*, Guillermo Bernal, Tomás Vega, Oscar Rosello, Abhinandan Jain, Pattie Maes. In Consciousness and Cognition (Vol. 83, 2020) (journal paper) * authors contributed equally
We draw a parallel between recent VR haptic/sensory devices, which stimulate ever more senses during virtual interactions, and the work of sleep/dream researchers, who are exploring how the senses are integrated and influence the sleeping mind. We survey recent developments in HCI technologies and analyze which might provide a useful hardware platform to manipulate dream content by sensory manipulation, i.e., to engineer dreams. This work was led by Michelle Carr (University of Rochester) in collaboration with the Fluid Interfaces group (MIT Media Lab). Consciousness and Cognition'20 paper
Seungwoo Je, Myung Jin Kim, Woojin Lee, Byungjoo Lee, Xing-Dong Yang, Pedro Lopes, Andrea Bianchi. In Proc. UIST’19 (full paper)
We engineered Aero-plane, a force-feedback handheld controller based on two miniature jet propellers that can render shifting weights of up to 14 N within 0.3 seconds. Unlike other ungrounded haptic devices, our prototype realistically simulates weight changes over 2D surfaces. This work was a collaboration led by Andrea Bianchi, who runs the MAKinteract group at KAIST.
Jakub Limanowski, Pedro Lopes, Janis Keck, Patrick Baudisch, Karl Friston, and Felix Blankenburg. In Cerebral Cortex (journal)
Tactile input generated by one’s own agency is generally attenuated. Conversely, externally caused tactile input is enhanced; e.g., during haptic exploration. We used functional magnetic resonance imaging (fMRI) to understand how the brain accomplishes this weighting. Our results suggest an agency-dependent somatosensory processing in the parietal operculum. Read more at our project's page.
Shunichi Kasahara, Jun Nishida and Pedro Lopes. In Proc. CHI’19, Paper 643 (full paper) and demonstration at SIGGRAPH'19 eTech.Grand Prize, awarded by Laval Virtual in partnership with SIGGRAPH'19 eTech.
We found that it is possible to optimize the timing of haptic systems to accelerate human reaction time without fully compromising the user's sense of agency. This work was done in cooperation with Shunichi Kasahara from Sony CSL. Read more at our project's page.
CHI'19 paper video SIGGRAPH'19 etech CHI'19 talk (slides) CHI talk video
Lukas Gehrke, Sezen Akman, Pedro Lopes, Albert Chen, ..., Klaus Gramann. In Proc. CHI'19, Paper 427 (full paper)
We detect visuo-haptic mismatches in VR by analyzing the user's event-related potentials (ERP). In our EEG study, participants touched VR objects and received either no haptics, vibration, or vibration and EMS. We found that the negativity component (prediction error) was more pronounced in unrealistic VR situations, indicating visuo-haptic mismatches. Read more at our project's page.
Pedro Lopes, Sijing You, Alexandra Ion, and Patrick Baudisch. In Proc. CHI’18. (full paper)
We present a mobile system that enhances mixed reality experiences, displayed on a Microsoft HoloLens, with force feedback by means of electrical muscle stimulation (EMS). The benefit of our approach is that it adds physical forces while keeping the users' hands free to interact unencumbered—not only with virtual objects, but also with physical objects, such as props and appliances that are an integral part of both virtual and real worlds.
Pedro Lopes, Sijing You, Alexandra Ion, and Patrick Baudisch. In Proc. CHI’17 (full paper) and demonstration at SIGGRAPH'17 studios
We explored how to add haptics to walls and other heavy objects in virtual reality. Our contribution is that we prevent the user's hands from penetrating virtual objects by means of electrical muscle stimulation (EMS). As the user lifts a virtual cube, our system lets them feel the weight and resistance of the cube. The heavier the cube and the harder the user presses on it, the stronger the counterforce the system generates.
Pedro Lopes, Doga Yueksel, François Guimbretière, and Patrick Baudisch. In Proc. UIST’16 (full paper).
We explore how to create more expressive EMS-based systems. Muscle-plotter achieves this by persisting EMS output, allowing the system to build up a larger whole. More specifically, it spreads out the 1D signal produced by EMS over a 2D surface by steering the user’s wrist. Rather than repeatedly updating a single value, this renders many values into curves.
Pedro Lopes, Alexandra Ion, and Patrick Baudisch. In Proc. UIST’15 (full paper). UIST best demo nomination
We present impacto, a device designed to render the haptic sensation of hitting and being hit in virtual reality. The key idea that allows the small and light impacto device to simulate a strong hit is that it decomposes the stimulus: it renders the tactile aspect of being hit by tapping the skin using a solenoid; it adds impulse to the hit by thrusting the user’s arm backwards using electrical muscle stimulation. The device is self-contained, wireless, and small enough for wearable use.
Pedro Lopes, Patrik Jonell, and Patrick Baudisch. In Proc. CHI’15 (full paper). CHI best paper award (top 1%)
We propose extending the affordance of objects by allowing them to communicate dynamic use, such as (1) motion (e.g., spray can shakes when touched), (2) multi-step processes (e.g., spray can sprays only after shaking), and (3) behaviors that change over time (e.g., empty spray can does not allow spraying anymore). Rather than enhancing objects directly, however, we implement this concept by enhancing the user with electrical muscle stimulation. We call this affordance++.
Pedro Lopes, Alexandra Ion, Willi Mueller, Daniel Hoffmann, Patrik Jonell, and Patrick Baudisch. In Proc. CHI’15 (full paper). CHI best talk award
We propose a new way of eyes-free interaction for wearables. It is based on the user’s proprioceptive sense, i.e., users feel the pose of their own body. We have implemented a wearable device, Pose-IO, that offers input and output based on proprioception. Users communicate with Pose-IO through the pose of their wrists. Users enter information by performing an input gesture by flexing their wrist, which the device senses using an accelerometer. Users receive output from Pose-IO by finding their wrist posed in an output gesture, which Pose-IO actuates using electrical muscle stimulation.
Pedro Lopes and Patrick Baudisch. In Proc. CHI’13 (short paper). IEEE World Haptics, People’s Choice Nomination for Best Demo
Force feedback devices resist miniaturization, because they require physical motors and mechanics. We propose mobile force feedback by eliminating motors and instead actuating the user’s muscles using electrical stimulation. Without the motors, we obtain substantially smaller and more energy-efficient devices. Our prototype fits on the back of a mobile phone. It actuates users’ forearm muscles via four electrodes, which causes users’ muscles to contract involuntarily, so that they tilt the device sideways. As users resist this motion using their other arm, they perceive force feedback.
The publications above are core to our lab's mission. If you are interested in more of Pedro's publications on other topics, see here.
1. Introduction to Human-Computer Interaction (CMSC 20300; Next: Fall quarter 2022)
Synopsis: An introduction to the field of Human-Computer Interaction (HCI), with an emphasis on understanding, designing and programming user-facing software and hardware systems. This class covers the core concepts of HCI: affordances, mental models, selection techniques (pointing, touch, menus, text entry, widgets, etc.), conducting user studies (psychophysics, basic statistics, etc.), rapid prototyping (3D printing, etc.), and the fundamentals of 3D interfaces (optics for VR, AR, etc.). We complement the lectures with weekly programming assignments and two larger projects, in which we build/program/test user-facing interactive systems. See here for the class website. (This class is required for our Undergraduate Specialization in HCI; see here for details.)
2. Inventing, Engineering and Understanding Interactive Devices (CMSC 23220; Next: Spring 2023!)
Synopsis: In this class we build I/O devices, typically wearable or haptic devices. These are user-facing hardware devices engineered to enable new ways to interact with computers. In order for you to be successful in building your own I/O device, we will: (1) study and program 8-bit microcontrollers, (2) explore different analog and digital sensors and actuators, (3) write control loops and filters, and (4) learn how to design simple bit-level protocols for Bluetooth communication. We complement the lectures with labs, weekly programming/circuit-building assignments, and one larger project, in which we build a complete circuit for a standalone wearable device. See here for the class website. (This class is part of our Undergraduate Specialization in HCI; see here for details.)
3. Engineering Interactive Electronics onto Printed Circuit Boards (CMSC 23230 and CMSC 33230; Next: Winter 2023!)
Synopsis: In our "PCB class" we engineer electronics from scratch onto Printed Circuit Boards (PCBs). We focus on designing and laying out the circuit and PCB for our own custom-made I/O devices, such as wearable or haptic devices. In order for you to be successful in engineering a functional PCB, we will: (1) review digital circuits and three microcontrollers (ATMEGA, NRF, SAMD); (2) use KiCad to build circuit schematics; (3) learn how to wire analog/digital sensors and actuators to our microcontroller, including via the SPI and I2C protocols; (4) use KiCad to lay out PCBs; (5) manufacture our designs in a real factory; (6) receive in our hands the PCBs we sent to the factory; and (7) finally, learn how to debug our custom-made PCBs. This class is the advanced version of CMSC 23220; while it is possible to take it without taking 23220, we do not recommend it unless you already have some experience with microcontroller programming, breadboarding, and simple circuit design. We complement the lectures with weekly labs, weekly circuit-design assignments, and one larger project, in which we design a complete PCB from scratch that is manufactured in a factory. See here for the class website. (This class is part of our Undergraduate Specialization in HCI; see here for details. Finally, this class is a systems class and covers a lot of ground in bit-level protocol design, low-level hardware, bootloaders, and more.)
While we do our best to increase our classes' capacity, our HCI classes fill up quickly. If that happens and you still want to register, please use the CS Waiting list.
4. Emerging Interface Technologies (CMSC 33240 and CMSC 23240; next: stay tuned!)
Synopsis: In this class, we examine emerging technologies that might impact future generations of computing interfaces, including: physiological I/O (e.g., brain and muscle computer interfaces), tangible computing (giving shape and form to interfaces), wearable computing (I/O devices closer to the user's body), rendering new realities (e.g., virtual and augmented reality), and haptics (giving computers the ability to generate touch and forces). (Note: this class supersedes our former "HCI Topics" graduate seminar; it is a hands-on class with more projects and assignments, not a typical graduate seminar.) See here for the class website. (This class is part of our Undergraduate Specialization in HCI; see here for details.)
5. Human-Computer Interaction and Neuroscience (CMSC 33231-1; Next: stay tuned!)
Synopsis: In this class we examine the field of HCI using Neuroscience as a lens to generate ideas. This is an advanced graduate level seminar that assumes expertise in HCI (e.g., especially in haptics and human actuation) and in basic neuroscience (e.g., sensory systems). The class is based on mini-challenges, paper discussion, interactions with neuroscience experts on-campus and paper writing/experiment design.
6. Creative Machines (PHYS 21400; CMSC 21400; ASTR 31400, PSMS 31400, CHEM 21400, ASTR 21400)
Note: This class is taught by Stephan Meyer (Astrophysics), Scott Wakely (Physics), and Erik Shirokoff (Astrophysics). While this class is not taught by Pedro Lopes, it was co-created by Pedro together with Scott Wakely (Physics), Stephan Meyer (Astrophysics), Aaron Dinner (Chemistry), Benjamin Stillwell, and Zack Siegel. We highly recommend that students interested in HCI or in working with us take this class. Synopsis: Techniques for building creative machines are essential for a range of fields, from the physical sciences to the arts. In this hands-on course, you will engineer and build functional devices using, e.g., mechanical design and machining, CAD, rapid prototyping, and circuitry; no previous experience is expected. Open to undergraduates in all majors, as well as Master's and Ph.D. students.
7. Topics in Human Computer Interaction (CMSC 33231; winter quarter; graduate seminar; currently inactive)
Synopsis: In this class, we review developments in HCI technologies that might impact future generations of computing interfaces, including: physiological I/O (e.g., brain and muscle computer interfaces), tangible computing (giving shape and form to interfaces), wearable computing (I/O devices closer to the user's body), rendering new realities (e.g., virtual and augmented reality), and haptics (giving computers the ability to generate touch and forces). This is a graduate seminar with an emphasis on paper reading, discussion, and paper writing. (This class is paused; we recommend you take the more hands-on Emerging Interface Technologies class instead.)
Click here to see even more of our lovely alumni!
We are always looking for exceptional students at the intersection of Computer Science / Human-Computer Interaction, but also Electrical Engineering, Neuroscience, Physics, Materials Science, and Mechanical Engineering. If you are considering applying to our lab for any position, do the following:
P.S.: UChicago students who want to learn more about HCI and meet HCI faculty and students should consider joining us for the "People & Tech Seminar".
Our lab is supported by the following sponsor organizations:
See here for a complete list of press articles about our work.