This project was accepted in the IEEE World Haptics Conference 2021 Student Innovation Challenge.
This project has been made possible through a collaboration of some amazing people and organizations!
Adil Rahman is a first-year CS PhD student at the University of Virginia (UVa) advised by Professor Seongkook Heo. Adil's research interest lies in the field of Human-Computer Interaction, particularly in making devices more accessible to everyone and creating new interaction techniques. Prior to joining the PhD program at UVa, Adil spent a year working on gesture recognition at the Indian Statistical Institute. Adil also holds a BTech (IT) from Heritage Institute of Technology, Kolkata, from where he graduated as a gold-medalist in 2020. In his spare time, Adil loves to play video games, make music, and then play some more video games!
Explore Adil's work on his GitHub page and additional information on his CV.
Md Aashikur Rahman Azim is a third-year computer science Ph.D. student at the University of Virginia (UVa). Aashik is currently working with Professor Seongkook Heo on wearable user interfaces in the context of Human-Computer Interaction (HCI). He received both Bachelor of Science and Master of Science degrees from the Department of Computer Science and Engineering at Bangladesh University of Engineering and Technology (BUET) in February 2013 and December 2016, respectively. His research areas include Human-Computer Interaction, Applied Machine Learning, and Embedded Systems.
Find more information on his website.
Wen Ying is a first-year CS PhD student at the University of Virginia. Her research involves experimenting with different types of haptic feedback; she has experience creating virtual haptic feedback using LRA actuators, sensors, driver boards, and more.
Check her portfolio for more information.
Archana Narayanan is an MS student in Computer Engineering at the University of Virginia, Charlottesville. She is also a part of the Human Computer Interaction Lab at UVA, where her current research focuses on the development of a multidimensional haptic feedback device based on asymmetric vibrations. Her research interests include Embedded Systems, Hardware Engineering, and Human Computer Interaction. Prior to joining the master’s program, she was a project assistant at the Indian Institute of Science, Bangalore, where she built data acquisition systems for spectroscopy using FPGAs. In the past, she has worked on multiple projects involving FPGAs, IoT, analog and digital devices/circuits, multiple sensors, microcontrollers, and actuators.
Find more information on her website.
Seongkook Heo is an Assistant Professor in the Department of Computer Science at the University of Virginia. His research interests span various areas of human-computer interaction, with an emphasis on interaction techniques and technologies. He was previously a postdoctoral researcher in the DGP Lab at the University of Toronto, where he worked with Prof. Daniel Wigdor. He received his PhD in Computer Science at KAIST in 2017, advised by Prof. Geehyuk Lee. He was also a research intern at industry research labs, including Samsung Advanced Institute of Technology, Microsoft Research (with Dr. Ken Hinckley), and Autodesk Research (with Dr. Tovi Grossman).
Find more information on his website.
Christian Frisson is an associate researcher at the Input Devices and Music Interaction Laboratory (IDMIL) (2021), previously postdoctoral researcher at McGill University with the IDMIL (2019-2020), at the University of Calgary with the Interactions Lab (2017-2018) and at Inria in France with the Mjolnir team (2016-2017). He obtained his PhD at the University of Mons, numediart Institute, in Belgium (2015); his MSc in “Art, Science, Technology” from Institut National Polytechnique de Grenoble with the Association for the Creation and Research on Expression Tools (ACROE), in France (2006); his Masters in Electrical (Metrology) and Mechanical (Acoustics) Engineering from ENSIM in Le Mans, France (2005). Christian Frisson is a researcher in Human-Computer Interaction, with expertise in Information Visualization, Multimedia Information Retrieval, and Tangible/Haptic Interaction. Christian creates and evaluates user interfaces for manipulating multimedia data. Christian favors obtaining replicable, reusable and sustainable results through open-source software, open hardware and open datasets. With his co-authors, Christian obtained the IEEE VIS 2019 Infovis Best Paper award and was selected among 4 finalists for IEEE Haptics Symposium 2020 Most Promising WIP.
Find more information on his website.
Jun Nishida is currently a Postdoctoral Fellow at the University of Chicago and a Research Fellow at the Japan Society for the Promotion of Science (JSPS PDRA). He was previously a JSPS Research Fellow (DC1), a Project Researcher under the Japanese Ministry of Internal Affairs and Communications SCOPE Innovation Program, and a PhD Fellow at Microsoft Research Asia. He graduated from the Empowerment Informatics Program at the University of Tsukuba, Japan.
I’m a postdoctoral fellow at the University of Chicago. I received my PhD in Human Informatics from the University of Tsukuba, Japan, in 2019. I am interested in designing experiences in which all people can maximize and share their physical and cognitive capabilities to support each other. I explore the possibility of this interaction in the fields of rehabilitation, education, and design. To this end, I design wearable cybernic interfaces that share one’s embodied and social perspectives among people by means of electrical muscle stimulation, exoskeletons, and virtual/augmented reality systems. I have received more than 40 awards, including the Microsoft Research Asia Fellowship Award, national grants, and three University Presidential Awards. I serve as a reviewer for ACM SIGCHI, SIGGRAPH, UIST, TEI, IEEE VR, and HRI.
Find more information on their website.
Heather Culbertson is a Gabilan Assistant Professor of Computer Science at the University of Southern California. Her research focuses on the design and control of haptic devices and rendering systems, human-robot interaction, and virtual reality. Particularly she is interested in creating haptic interactions that are natural and realistically mimic the touch sensations experienced during interactions with the physical world. Previously, she was a research scientist in the Department of Mechanical Engineering at Stanford University where she worked in the Collaborative Haptics and Robotics in Medicine (CHARM) Lab. She received her PhD in the Department of Mechanical Engineering and Applied Mechanics (MEAM) at the University of Pennsylvania in 2015 working in the Haptics Group, part of the General Robotics, Automation, Sensing and Perception (GRASP) Laboratory. She completed a Masters in MEAM at the University of Pennsylvania in 2013, and earned a BS degree in mechanical engineering at the University of Nevada, Reno in 2010. She is currently serving as the Vice-Chair for Information Dissemination for the IEEE Technical Committee on Haptics. Her awards include a citation for meritorious service as a reviewer for the IEEE Transactions on Haptics, Best Paper at UIST 2017, and the Best Hands-On Demonstration Award at IEEE World Haptics 2013.
Find more information on her website.
- Abstract
- Frequently Asked Questions
- Getting Started 📖
- A User's Guide to Wearing the On-Body Gym 🎽
- A Developer's Guide to the On-Body Gym API 🔌
- On-Body Jukebox 🎶
- On-Body VR 🎮
- Acknowledgements
- References
- License
Impacted by the COVID-19 pandemic, gyms across the globe have either shut down or restricted total capacity. As a result, many people have turned to exercising at home. To support people's mental and physical health, we propose a wearable on-body haptic device capable of providing a holistic workout experience at home. Using granular haptic feedback, our proposed system can simulate the feeling of using actual gym equipment. We leverage flex sensors, force sensors, and vibrotactile actuators, all working in tandem to detect workout activity and trigger a granular haptic response that mirrors the feedback we receive from various spring-and-pulley based gym machines. The generated granular haptic feedback also motivates free-hand workout routines. Additionally, our system simulates the rhythm of music using vibrotactile actuators to promote endurance and a positive mood during workouts, allowing users to select from a range of predefined soundtracks or load their own music. This will be especially useful for people with hearing impairments, who can now enjoy the rhythm of any music while working out. Overall, we have designed this system to encourage all types of people to pursue fitness and to enrich their home workout experience.
On-Body Gym is a wearable haptic device capable of detecting the user's workout activity and providing granular haptic feedback to simulate the feeling of using spring-and-pulley based gym equipment through the compliance illusion.
Imagine pressing a physical button or squeezing a hard spring. When the button is pressed or the spring is squeezed, we feel tactile feedback generated by the object's physical displacement and deformation. Generating this kind of feedback upon pressing a rigid object, so that the object is perceived as flexible, is the compliance illusion.
Granular haptic feedback, also referred to in our work as grain vibration, is a type of vibration generated by a very short-lasting damped sinusoidal wave burst. This feedback is similar to a tick felt when squeezing a hard hand-grip.
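As a rough illustration, such a grain can be synthesized as a short, exponentially damped sine burst. The sketch below is a minimal example; the 170 Hz frequency, 20 ms duration, and decay rate are illustrative assumptions, not the exact parameters used by On-Body Gym.

```python
import math

def grain(duration_s=0.02, freq_hz=170.0, decay=200.0, rate_hz=44100):
    """Sample a short damped sinusoidal burst (a "grain").

    Parameters are illustrative: 170 Hz is a typical LRA resonance,
    20 ms keeps the burst tick-like, and `decay` controls how fast
    the envelope dies out.
    """
    n = int(duration_s * rate_hz)
    return [
        math.exp(-decay * (i / rate_hz)) * math.sin(2 * math.pi * freq_hz * (i / rate_hz))
        for i in range(n)
    ]

samples = grain()
```

Played through a vibrotactile actuator, a burst like this is felt as a single brief "tick" rather than a continuous buzz, which is what makes chaining grains feel like the clicks of a spring-loaded mechanism.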
On-Body Gym consists of two force sensors, one placed on each palm, and four flex sensors, one placed on each elbow pit and knee pit. The force sensors detect whether the person is holding something or applying pressure on a surface with their hands, whereas the flex sensors detect whether the person is bending their arms or legs. On-Body Gym uses a simple thresholding-based algorithm that makes the intensity of the haptic feedback directly proportional to the force exerted on the palms: if the user is holding a dumbbell (or any other heavy object), then while doing exercises involving bending the knees and elbows, they receive stronger haptic feedback, whereas if the person is not holding anything, they receive no haptic feedback. This algorithm supports a variety of weight-based exercises such as bicep curls, bell squats, lunges, bench presses, shoulder presses, etc., without the need for any specialized gym hardware.
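The thresholding logic described above can be sketched as a small pure function. The threshold and scaling constants below are illustrative assumptions, not the calibrated values used in our build:

```python
# Illustrative raw ADC thresholds -- tune these against your own sensors.
FORCE_THRESHOLD = 300   # below this, the palm is considered empty
BEND_THRESHOLD = 500    # below this, the joint is considered straight
MAX_FORCE = 1600        # reading at which intensity saturates

def grain_intensity(force_raw, flex_raw):
    """Thresholding sketch: no grip or no bend -> no feedback;
    otherwise intensity scales with the force on the palm."""
    if force_raw < FORCE_THRESHOLD or flex_raw < BEND_THRESHOLD:
        return 0.0
    return min(1.0, (force_raw - FORCE_THRESHOLD) / (MAX_FORCE - FORCE_THRESHOLD))
```

For example, an empty hand yields zero intensity regardless of how far the elbow bends, while a firm grip during a curl drives the intensity toward its maximum.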
On-Body Gym supports free-hand exercises as well, as long as pressure is applied to the palms. Push-ups are a great example of this kind of interaction, since the force sensors get triggered when the user is supported on the ground by their hands. Similarly, other free-hand exercises like lunges, squats, bicycle crunches, etc., can also be done with the slight modification of holding a dumbbell.
While haptic feedback for exercises that require gym equipment makes obvious sense, haptic feedback in free-hand exercises can also significantly improve the overall workout experience. Imagine doing push-ups: haptic feedback allows users to directly feel the result of their actions, providing a richer workout experience.
NO. In no way can On-Body Gym substitute for the effort put in working out. On-Body Gym does not stimulate muscle growth. On-Body Gym only allows users to receive a haptic feedback on the limbs which they are using for exercising, making the workout more enjoyable.
We understand that music is an ultra-important component of working out for many of us. On-Body Gym has an Android companion application with which you can load and play the rhythm of any music you like, along with 5 of our personally curated tracks. The two actuators placed behind the earlobes allow users to feel the rhythm of any music, which can effectively allow even people with hearing impairments to enjoy their favorite rhythm. Also, users without hearing impairments can enjoy the music itself as a result of bone conduction!
The On-Body Gym system can also be used as a full-body haptic controller. In order to demonstrate this feature, we also created a virtual reality application for the Oculus Rift S users. This application strengthens the compliance illusion of squeezing a hand-grip by not only providing the haptic feedback via the actuators but also showing appropriate visuals in virtual reality!
You can build your own On-Body Gym system following our Getting Started guide.
While the entire system costs approximately $350 to build, most of the equipment used is either extremely common tech or household items, or should be fairly easy to get, reducing the effective build cost significantly. You can check the hardware requirements here.
Definitely! All contributions are welcome! You can open an issue to discuss the changes, and use pull requests to send in the updates. Also, be sure to check out the developer's guide to learn more about how you can craft your own interaction using our API!
Want to experience On-Body Gym for yourself? Follow our step-by-step guide to build your own On-Body Gym system!
The hardware requirements for building the On-Body Gym are categorised as follows:
- Raspberry Pi 4 Single-Board Computer | $35.00
- Audio Injector Octo 8-Channel Sound Card | $58.00
- Syntacts 8-Channel Amplifier Board | Contact manufacturer for price
- SparkFun Qwiic Hat for Raspberry Pi | $6.50
- SparkFun Qwiic 12-Bit 4-Channel ADC (2 count) | $21.00
- Force Sensitive Resistors 0.5" (2 count) | $13.90
- Flex Sensor 4.5" (4 count) | $63.80
- Coin Linear Resonant Actuator (8 count) | $25.36
- 2-RCA Male to 3.5mm Male (4 count) | $31.96
- Cable Length Recommendation: Short, approximately 6 inches in length.
- Qwiic Cable (2 count) | $1.90
- Cable Length Recommendation: Extremely short, approximately 50 mm in length.
- Ribbon Cables (2 count) | $9.90
- Cable Recommendation: A multi-colored wire strand setup will help with cable management.
- Cable Length Recommendation: Extremely long, at least 30 feet for a 10-wire pack.
- Female Jumper Wires (16 count) | $6.59
- Cable Length Recommendation: As short as can be, only the connector is required with minimal wire (will be soldered to ribbon cables).
- [Optional] Type-C Cable | $4.99
- Cable Recommendation: This cable is required only if using a power bank or USB outlet to power the Raspberry Pi.
- Cable Length Recommendation: Approximately 2 feet if using a power bank; otherwise, as long as needed if powering through a USB outlet.
- [Optional] Micro-HDMI to HDMI Cable | $5.00
- Cable Recommendation: This cable is required only if a monitor connection is needed.
- Cable Length Recommendation: As per monitor placement.
- [Optional] 2x AA Battery Holder + Jumper Headers | $1.95
- Recommendation: This is only required to power the Syntacts board. Alternatively, a USB cable with exposed VCC and GND wires can be used with a power bank or USB outlet.
- 16GB MicroSDHC Card | $6.80
- 1k Ohm Resistor (6 count) | $0.95
- [Optional] 20000mAh Power Bank | $24.25
- Recommendation: This is HIGHLY RECOMMENDED in order to allow for an untethered experience.
- Knee-Length Socks (at least 4 pairs) | $13.99
- Velcro Cable Ties (approximately 30 pieces) | $12.49
- Recommendation: Cable ties and socks are used to hold components in place. These are highly recommended; alternatively, any string with significant tensile strength can be used to tie the components to the user's body, though this is not recommended.
- Band-Aid (each setup needs approximately 10 band-aids per use, purchase accordingly) | $5.99
- Recommendation: Band-aids are used to hold components against the skin since they are skin-safe adhesives. Regular tape could be used instead, though this is not recommended.
- Cardboard Box (17 cm L x 11 cm W x 9 cm D) | Hopefully free of cost
Assembling the On-Body Gym will require the usual workshop tools, which include (but may not be limited to): scissors, screwdrivers, a blade, a soldering kit, shrink tubing (optional, for wrapping the soldered joints), and a heat gun (optional, for shrinking the tubing). A cheaper alternative to shrink tubing and a heat gun is electrical tape, which can also be used to wrap soldered joints.
The mandatory components required to build the On-Body Gym system are estimated to cost approximately $315, whereas the optional components total approximately $37. This puts the total estimated cost for the recommended build at approximately $352.
Note: The above build cost does not take into account the cost of the required tools and the Syntacts board.
Before we start connecting the hardware together, we need to prepare some components first.
The SparkFun 12-bit 4-channel ADC board has a default I2C address of 0x48. However, this address may conflict with other components on the Raspberry Pi board. Thankfully, the ADC board offers 3 additional I2C addresses to choose from (0x49, 0x4A, and 0x4B). Thus, in order to avoid an address conflict, we need to manually change the I2C address of both ADC boards. Follow this tutorial (under Jumpers > I2C Address) to learn how to change the I2C address of this ADC board. Based on this tutorial, first cut the connection to the 0x48 address on both ADC boards (remember we have 2 ADC boards), then make a connection to address 0x49 on one of the ADC boards, and a connection to address 0x4A on the other ADC board.
The ADC boards have a 10k ohm potentiometer connected to their AIN3 ports which disturbs the sensor readings. Thus, we need to disconnect the potentiometer. Follow this tutorial (under Jumpers > Potentiometer) to learn more about how to cut this potentiometer.
From the ribbon cable, take out 8 sets of 2-wired cables with length of at least 1 meter. Slightly peel off both ends of the cables to expose the wires. Solder one end of each of the 2-wired cable ends with a linear resonant actuator. Now, cut the female-to-female jumper wires in half, and peel off the ends of the cables to expose the wire. Now, for each of the 8 sets of 2-wired ribbon cables soldered to the linear resonant actuators, solder the other end of the cable with the jumper wires. This will result in 8 linear resonant actuator assemblies, with each actuator having two female jumpers at the end of a 1 meter long cable. If done correctly, the end result should look similar to the linear resonant actuator assembly shown here.
- Recommendation: While taking out 8 sets of 1 meter 2-wired cables, try to take out 4 sets of 2 meter 2-wired cables first, each pair having a different color combination, and then cut these 4 sets in half to make 8 sets of 1 meter 2-wired cables. Doing this will give each color combination 2 sets of 1 meter 2-wired cables, which will help immensely with wire management in the later part of the build.
- Recommendation: It is highly recommended to use shrink tubes at the solder joints and heat it with a heat gun. This will not only protect the system from short-circuits and strengthen the overall structure of the system but will also make the system more comfortable to mount at a later stage.
Similar to the previous step involving the linear resonant actuators, we need to add 1-meter long 2-wired cables to each of the 4 flex and 2 force sensors. However, unlike the previous step, we do not need to solder female jumpers on the other end of the cable.
- Recommendation: Similar to the previous stage, it is highly recommended to use shrink tubes at the solder joints and heat it with a heat gun. This will not only protect the system from short-circuits and strengthen the overall structure of the system but will also make the system more comfortable to mount at a later stage.
Once the hardware components are prepared, it is now finally time to assemble them!
We begin our assembly process by connecting the sound card and Qwiic Pi Hat to the Raspberry Pi 4 board.
- Stack the Audio Injector Octo 8-Channel sound card on the GPIO header of the Raspberry Pi 4 board. Push in the sound card gently on the board so that the GPIO header is perfectly connected to the sound card.
- Stack the SparkFun Qwiic Pi Hat on top of the GPIO header pins present on the sound card. Make sure to align all the pins properly before pushing it in the board.
- Connect the 8-channel output connector to the Out socket of the sound card.
If done properly, the setup should resemble the image above.
Once the sound card and the Qwiic Pi Hat are connected to the Raspberry Pi 4 board, we now move on to connecting the sensor system to the Qwiic Pi Hat. Before proceeding, make sure all the board and sensor wirings are prepared, as stated in the instructions above.
- Connect the two ADC boards to the Qwiic Pi Hat using Qwiic cables.
- Mark the two flex sensors which are to be connected to the ADC board with I2C address 0x4A with some sort of identification mark (e.g., colored marker) for later identification.
- Connect the sensors to the ADC boards using the following configuration:
I2C Address | Mounting Body Section | Mounting Body Location | Pin Number | Sensor Type |
---|---|---|---|---|
0x49 | Hands | Right Palm | A0 | Force |
0x49 | Hands | Left Palm | A1 | Force |
0x49 | Arms | Right Elbow | A2 | Flex |
0x49 | Arms | Left Elbow | A3 | Flex |
0x4A | Legs | Right Dorsal Knee | A0 | Flex |
0x4A | Legs | Left Dorsal Knee | A1 | Flex |
Please refer to the visual guide below to get a clearer idea of sensor connection. Also, ensure that all the sensors are connected to resistors, as described in the image above.
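In software (installed later in this guide), the wiring above can be read with the Adafruit-ADS1x15 Python library. The sketch below assumes both Qwiic ADC boards are ADS1015-compatible and sit on I2C bus 1 with gain 1; the labels mirror the wiring table, and the hardware reads only run on the Pi itself.

```python
try:
    import Adafruit_ADS1x15  # installed during the software setup below
except ImportError:          # allows inspecting the mapping off the Pi
    Adafruit_ADS1x15 = None

# (I2C address, ADC channel) -> sensor, mirroring the wiring table above
SENSOR_MAP = {
    (0x49, 0): "force, right palm",
    (0x49, 1): "force, left palm",
    (0x49, 2): "flex, right elbow",
    (0x49, 3): "flex, left elbow",
    (0x4A, 0): "flex, right dorsal knee",
    (0x4A, 1): "flex, left dorsal knee",
}

def read_all_sensors():
    """Return {sensor label: raw reading} from both ADC boards (Pi only)."""
    adcs = {addr: Adafruit_ADS1x15.ADS1015(address=addr, busnum=1)
            for addr in (0x49, 0x4A)}
    return {label: adcs[addr].read_adc(ch, gain=1)
            for (addr, ch), label in SENSOR_MAP.items()}
```

Calling `read_all_sensors()` on the Pi polls all six sensors in one pass, which is the shape of data the thresholding algorithm consumes.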
Once all the sensors are connected properly, it is time to assemble the haptic feedback system. Before proceeding, ensure that all the 8 linear resonant actuators are prepared, as stated in the instructions above.
- Connect all 8 output ports of the sound card to the 4 input ports of the Syntacts amplifier board. To do so, use the 2-RCA Male to 3.5 mm Male cable connectors. Ensure that the cables are connected serially, with the output channel numbers of the sound card matching the input channel numbers of the Syntacts board.
Tip: To ensure correct connections, make sure the output ports of the sound card face the input ports of the Syntacts board, then connect the ports facing each other.
- Connect the linear resonant actuators to the Syntacts board. During connection, ensure that each actuator is connected to one positive and one negative terminal of the same output channel on the Syntacts board.
Tip: On-Body Gym involves a lot of (lengthy) wired components. Thus, to keep usage smooth, we STRONGLY recommend following a color-based wiring pattern for the actuators, such as:
Syntacts Output Channel | Cable Color Pattern | Actuator Location |
---|---|---|
0 | Red | Right Dorsal Knee |
1 | Red | Left Dorsal Knee |
2 | Yellow | Right Wrist/Palm |
3 | Yellow | Left Wrist/Palm |
4 | White | Right Elbow Pit |
5 | White | Left Elbow Pit |
6 | Black | Right Mastoid/Behind Right Earlobe |
7 | Black | Left Mastoid/Behind Left Earlobe |
Note: While maintaining the exact same color combinations as defined in the table above is not mandatory, the idea is to maintain a similar pattern/grouping of wire colors, ensuring that no two wires of different color groups share the same color.
This concludes the primary hardware assembly of On-Body Gym. Now all that's left is loading the software and wrapping the setup in a portable box.
We now need to install the Raspbian OS on the Raspberry Pi 4, load all the necessary drivers, software, and libraries, and finally load the On-Body Gym software.
- Install the Raspbian OS and set up the Raspberry Pi 4 system. The following tutorial can serve as a great guide to get things started!
Note: Alternatively, Raspbian OS can also be set up headlessly. But for the initial setup, we strongly recommend using the Raspbian OS version that comes with a Desktop (and installing with a monitor connected) to make setup much more convenient.
- Connect the Raspberry Pi to WiFi.
- Enable VNC and SSH on the Raspberry Pi.
- Enable I2C on the Raspberry Pi.
- Install i2c-tools. To do so, open terminal and enter the following command:
sudo apt-get install -y i2c-tools
- Check the I2C connections on the Raspberry Pi to ensure that the modules have been connected properly. Open terminal and enter the following command:
i2cdetect -y 1
If the connections are made properly, the command above should yield the following output:
0 1 2 3 4 5 6 7 8 9 a b c d e f
00: -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- UU 49 4a -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: -- -- -- -- -- -- -- --
If you see `49` and `4a`, this means that the ADC boards with addresses 0x49 and 0x4A are connected properly to the Raspberry Pi.
- Install Octo sound card drivers. To do so, follow the manual setup provided here.
- Install the Python library dependencies. To do so, open the terminal and enter the following command:
pip install requests Flask Adafruit-ADS1x15
- Install PureData. To do so, open the terminal and enter the following command:
sudo apt-get install puredata
- Test the sound card installation. To do so, open PureData, then go to Media > Audio Settings. If the Octo sound card setup is done correctly, you should be able to see `audioinjector-octo-soundcard (hardware)` under the Output Devices dropdown menu. Select it, and then set the Channels to `8`.
- Test the actuator connection. To do so, in PureData, go to Media > Test Audio and Midi.... Then, in the testing window, select the button labeled `80` under the Test Tones section. If the actuators are installed correctly, all the actuators should now start vibrating. Now, test all the actuators individually by disabling all 8 output channels from the testing window, and then enabling each channel individually and checking the corresponding actuator.
Note: It is extremely important to ensure that the audio channels 1 through 8 in PureData directly correspond to the Syntacts board output channels 0 through 7, such that:
PureData Channel | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
---|---|---|---|---|---|---|---|---|
Syntacts Channel | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
Troubleshooting Guide: If the channel mapping does not match the mapping above, check the connection between the Syntacts board and the Octo sound card. If the connection is correct, go to Media > Audio Settings, reselect `audioinjector-octo-soundcard (hardware)` under the output devices, re-set the channels to `8`, and then click on Apply. This should fix the mapping.
- Download and extract the On-Body Gym software. To do so, navigate to the Releases page, and download the latest release of On-Body Gym (not to be confused with On-Body Jukebox and On-Body VR). Once downloaded, extract the contents of the archive.
- Add execution permission to the On-Body Gym software. To do so, navigate to the directory of the extracted archive in a terminal. Once you are in that directory, enter the following command:
sudo chmod +x app.sh
Congratulations! You are now done installing all the necessary software, drivers, and libraries on the Raspberry Pi.
In order to run the On-Body Gym server, all you need to do is navigate to the extracted archive folder using a terminal, and then enter the following command on the terminal: ./app.sh
Note: It is advisable to run the server after wearing the system on the body. Also, from this point onwards, you can safely access the Raspberry Pi over SSH and run the server using the instruction above, thus you won't have to connect the Raspberry Pi to any monitor as it can be operated headlessly.
Now that the hardware is assembled and the software is loaded, we can finally make the setup compact by putting it in a cardboard box. Before proceeding, ensure that all the components are connected to each other (except for the power connections on the Raspberry Pi and Syntacts board).
- For packing the system, we need a cardboard box (with a lid) that is 17 cm long, 11 cm wide, and 9 cm deep.
- Cut two thin, vertical slits of dimension `6cm x 0.5cm`, one on each side, on the ends of the sides which have the dimensions `11cm x 9cm`. Refer to the image above to get a clearer idea of the dimensions and locations of the slits.
- Insert two knee-length socks, one in each slit, and then tie the ends of both socks together inside the box to form one very long, dysfunctional sock (refer to the image above).
Note: Leave the outer end of the sock as is. We will use this to tie the device to our body over our waist.
- Write `R` with a marker on the right side of the box, which contains one slit. Similarly, write `L` on the left side of the box, which contains the other slit.
- [Ultra Important Step] Referring to the actuator and sensor tables above, separate all the sensors and actuators that go on the right side of the body from those that go on the left side of the body. Once the sensors and actuators are categorised into two groups, left and right, insert the sensors and actuators corresponding to the right side of the body through the box slit (from inside to outside of the box) on the side marked `R`. Similarly, insert the sensors and actuators corresponding to the left side of the body through the box slit (from inside to outside of the box) on the side marked `L`.
. - [Optional] Wrap the cables on each end with a cable tie to keep things clean and tidy.
- Plug in a Type-C USB on the Raspberry Pi, and a USB-based power cable on the Syntacts board, and then take the other ends of those wires out from either of the slit (from inside to outside of the box).
- Once the sensors/actuators of the left and right side are segregated, it is now time to put the hardware in the box.
- First place the Syntacts board on the base of the cardboard box.
- Then place the ADC boards.
Tip: You can try securing the ADC boards behind the sock joint inside the box.
- Then place the Raspberry Pi board along with the connected sound card and Qwiic Pi Hat modules.
- Finally, insert the 8-channel output connector.
- Close the box!
- [Optional, but Highly Recommended] If you're using a power bank to power the system, place the power bank on top of the closed box and then tie it to the box using a sock cutout or a cable tie (or literally anything that can hold the two things together).
Note: Do not plug in the USB cables to the power bank unless you want to start using the system since plugging the USB cable will power on the device.
If everything is done correctly, your build should resemble the images above.
Congratulations! You have successfully built your own On-Body Gym system!
In order to mount the On-Body Gym, we use sock cut-outs, cable ties, and band-aids (or skin-friendly adhesive tapes). Before proceeding to wear our system, sock cut-outs need to be prepared.
Sock Set # | Length | Location | Held Components | Notes |
---|---|---|---|---|
1 | 26 cm | Knees | Flex, LRA | Use at least 2 cable ties on each sock to hold the flex sensor. Sensors can also be taped to the sock with a band-aid. |
2 | 12 cm | Palms | Force, LRA | Ensure that this cut-out has the top elastic for better grip. Also, you need to cut holes for the thumbs. |
3 | 10 cm | Biceps | - | Used for cable management. |
4 | 8 cm | Forearms | - | Used for cable management. |
5 | 18 cm | Elbows | Flex, LRA | Use at least 2 cable ties on each sock to hold the flex sensor. Sensors can also be taped to the sock with a band-aid. |
6 | 14 cm | Thighs | - | Used for cable management. |
Based on the diagram and table information above, cut out the relevant sections from knee-length socks.
Once the sock cut-outs are prepared, it is now finally time to mount the system as per the images above. While you can wear this device however is most convenient for you as long as the sensors are perfectly in place, we have compiled a list of tips learned during our many trials of wearing the system.
- Tip #1: Having a friend assist you in putting on the system is super helpful and highly recommended.
- Tip #2: Before putting on any sensor, it is advisable to wear all the socks in the relevant places.
- Tip #3: Once the socks are worn, the next step is to strap the device box around the waist before placing any sensors.
- Tip #4: Once the device box is tied to the waist, it is advisable to place the sensors one side of the body at a time. This will help reduce wire clutter.
- Tip #5: Use band-aids to place the LRA behind the earlobes.
- Tip #6: Sensors can be held together to the socks using band-aids.
- Tip #7: Tie up all the excess wires together with the box on the back only after all the sensors are placed.
- Tip #8: Ensure that you are able to stretch both your arms and legs before tying up all the excess wires. It is advisable to give some leeway while tying up the extras.
Once you've mounted the system, you can now power on the device (by connecting the USB cables with the power bank) and then experience our On-Body Gym system by running the server.
Happy Exercising!
The On-Body Gym system can be deployed as a full-body haptic controller. To this end, we have developed a RESTful API to aid designers in crafting their own experiences using the On-Body Gym system. Using this API, designers can access real-time sensor values, trigger grain vibration on individual actuators, and even set the intensity of vibration on individual actuators.
We define the endpoints for our API functionality here.
- **Real-Time Sensor Values**

  Returns the real-time values of the 6 sensors at the instant of the call.

  API: `http://<server_ip>:<port>/get_sensor_values`

  Parameter: None

  Returns: String with 6 comma-delineated sensor values.

  Response Format: `RH_FSR,LH_FSR,RH_FLEX,LH_FLEX,RL_FLEX,LL_FLEX` [`RH`, `LH`, `RL`, and `LL` stand for Right Hand, Left Hand, Right Leg, and Left Leg, respectively]

- **Play Grain Vibration**

  Triggers the grain vibration feedback on the specified actuator.

  API: `http://<server_ip>:<port>/play_grain`

  Parameter: `channel`, defines which channel to play the grain vibration on. Takes values 0 through 7.

  Returns: `OK` if channel is provided, `NaN` otherwise.

- **Set Grain Vibration Intensity**

  Sets the intensity of the grain vibration on the specified actuator.

  API: `http://<server_ip>:<port>/set_grain_volume`

  Parameter 1: `channel`, defines which channel to play the grain vibration on. Takes values 0 through 7.

  Parameter 2: `volume`, defines the intensity of the grain vibration. Takes values 0 through 100, with 0 being no vibration and 100 being maximum vibration.

  Returns: `OK`
Additional endpoints can be found in the server file.
Note: It is strongly advisable to use `11996` as the port value (which is the default value). Also, the On-Body Gym server needs to be running in order to access the above API endpoints.
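As a sketch of how these endpoints might be called from a custom interaction, the client below uses only the Python standard library. It is not part of the project code, and it assumes parameters are passed as URL query strings; check the server file for the exact scheme.

```python
# Hypothetical client sketch for the On-Body Gym REST API (not part of the
# project code). Assumes parameters are passed as URL query strings.
import urllib.request

SENSOR_KEYS = ["RH_FSR", "LH_FSR", "RH_FLEX", "LH_FLEX", "RL_FLEX", "LL_FLEX"]

def parse_sensor_values(raw):
    """Split the comma-delineated response into a {sensor_name: value} dict."""
    return dict(zip(SENSOR_KEYS, raw.strip().split(",")))

class OnBodyGymClient:
    def __init__(self, server_ip, port=11996):   # 11996 is the default port
        self.base = f"http://{server_ip}:{port}"

    def _get(self, path):
        with urllib.request.urlopen(self.base + path, timeout=2) as resp:
            return resp.read().decode()

    def get_sensor_values(self):
        return parse_sensor_values(self._get("/get_sensor_values"))

    def play_grain(self, channel):
        # channel: 0 through 7; server returns "OK" or "NaN"
        return self._get(f"/play_grain?channel={channel}")

    def set_grain_volume(self, channel, volume):
        # volume: 0 (off) through 100 (maximum); server returns "OK"
        return self._get(f"/set_grain_volume?channel={channel}&volume={volume}")
```

For instance, `OnBodyGymClient("192.168.0.2").play_grain(3)` would trigger grain vibration on channel 3, assuming the server is reachable at that address.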
If you're developing a custom interaction for the On-Body Gym system, it might be desirable to disable the default interaction and haptic feedback generation system of the On-Body Gym. To do so, instead of running the server by launching `app.sh`, please run `app_dev.sh`, which launches the PureData system and the server without launching the default interaction system of the On-Body Gym.
Note: Don't forget to grant `app_dev.sh` execution permission using `chmod` before running it.
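The permission step looks like this; the snippet below demonstrates it on a stand-in script in a temporary directory, so in practice you would run the two `chmod`/`./app_dev.sh` commands on the real file inside the server directory.

```shell
# Demo with a stand-in script in a temp dir; in practice, run the chmod and
# launch commands on the real app_dev.sh in the server directory.
cd "$(mktemp -d)"
printf '#!/bin/sh\necho "dev server started"\n' > app_dev.sh
chmod +x app_dev.sh   # grant execute permission (one-time step)
./app_dev.sh          # without chmod this fails with "Permission denied"
```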
To demonstrate some of the capabilities of our API, we have developed On-Body Jukebox and On-Body VR, both of which are built over the API to communicate with the On-Body Gym system.
On-Body Jukebox is an Android companion application for our On-Body Gym system which allows users to play the rhythm of any music through haptic feedback (and also listen to the audio via bone conduction) on the On-Body Gym system to give them an additional push in reaching their workout goals. On-Body Jukebox comes with 5 pre-loaded beat-pumping tracks (see tracklisting here), and allows the user to add a custom audio track of their own. This system can effectively allow even people with hearing impairments to feel the rhythmic vibration of any music, even if they aren't able to enjoy the music itself! The features of On-Body Jukebox can be listed as follows:
- Jukebox
- Play rhythmic vibration of any music on the actuators placed behind the earlobes of the On-Body Gym system.
- Contains a personally curated tracklisting of 5 pre-loaded tracks.
- Allows users to load any custom music.
- Settings
- Allows easy connection to the Raspberry Pi running the On-Body Gym server by simply entering its IP address.
- Allows users to adjust the music rhythm vibration intensity.
- Allows users to adjust the workout grain vibration intensity.
- Allows access to developer console.
- Developer Console
- Allows testing individual LRA connections and also reveals the position where each LRA should be placed.
- Provides real-time force and flex sensor data to test whether the device is running properly.
- Allows users to hard-reset the Audio DSP on the Raspberry Pi for troubleshooting.
Note: On-Body Gym is, in itself, a standalone system, and does not require this application to function. However, this companion app adds a neat layer of convenience, allowing users to access all the said additional features wirelessly through a mobile phone, without having to connect the Raspberry Pi to a monitor and fiddle with terminals and code.
In this section, we have listed instructions to get started with the basic functionalities of the application.
Download the latest `on_body_jukebox_<version>.apk` file from releases and open it on your Android device to install On-Body Jukebox.
Note: The Android device must have appropriate permission to install applications from unknown sources in order to successfully install the application.
Before proceeding, ensure that the Raspberry Pi is connected to a network and that the On-Body Gym server is running on it. Note down the IP address of the Raspberry Pi device. Now, after installing the Android application, open it and tap `Settings`. Inside Settings, at the top, there should be a text box for the IP address of the system. Enter the IP address in that field and then tap `Set Connection`. If everything is done correctly, the bottom bar of the application should turn green and show the message `Connected`.
Follow the instructions on the image below to load custom music.
Note: The system currently supports only `*.wav` audio files; `*.mp3` is not supported.
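Since only `*.wav` files are accepted, a quick way to verify a file before loading it is to check its RIFF/WAVE header. This is a generic sketch, not part of On-Body Jukebox:

```python
# Generic sketch (not part of On-Body Jukebox): verify that a file really is
# a WAV by checking the 12-byte RIFF/WAVE header before loading it.
def is_wav(path):
    with open(path, "rb") as f:
        header = f.read(12)
    # A canonical WAV file starts with "RIFF", a 4-byte size, then "WAVE".
    return len(header) == 12 and header[:4] == b"RIFF" and header[8:12] == b"WAVE"
```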
To demonstrate the capabilities of the On-Body Gym system to be deployed as a full-body haptic controller, we have developed On-Body VR, a virtual reality application in which users can both feel and visualize the act of squeezing a hand-grip, without actually squeezing a hand-grip. This application intensifies the illusion of deformation by providing an immersive visual result alongside the haptic feedback which the user receives on their palm through our On-Body Gym system.
Note: On-Body Gym does not mandatorily require this application to function as a standalone system. This application is just an add-on to demonstrate additional features that can be achieved with the On-Body Gym.
In this section, we have listed instructions to install and use the On-Body VR application.
Download the latest `on_body_vr_<version>.zip` file from releases and then extract it on your Windows computer.
Before proceeding, ensure that the Raspberry Pi is connected to a network and that the On-Body Gym server is running on it. Note down the IP address of the Raspberry Pi device. Now, navigate to the folder where On-Body VR has been extracted. Then, navigate to `<extracted_dir>/OnBodyVr_Data/` and open `config.ini` in a text editor. In this file, you should see something like this:
[main]
ip = 192.168.0.2
port = 11996
Edit the `ip` and `port` parameters in this file to match the IP and port of the On-Body Gym server.
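If you prefer to script this step, Python's standard `configparser` handles the format shown above. The IP address below is only a placeholder; substitute your Raspberry Pi's actual address and the path to the extracted `config.ini`:

```python
# Sketch: update On-Body VR's config.ini programmatically with configparser.
# "10.0.0.5" is a placeholder; use your Raspberry Pi's actual IP address.
import configparser

cfg = configparser.ConfigParser()
cfg.read("config.ini")        # path to <extracted_dir>/OnBodyVr_Data/config.ini
if "main" not in cfg:
    cfg["main"] = {}
cfg["main"]["ip"] = "10.0.0.5"
cfg["main"]["port"] = "11996"   # default On-Body Gym server port
with open("config.ini", "w") as f:
    cfg.write(f)
```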
This application requires an Oculus Rift S to work. To run this application, simply navigate to the installation directory and execute `OnBodyVR.exe`. If the installation is done correctly and the system is connected to the On-Body Gym server, the message `On-Body VR: Connected To Server` should appear written on the sky, and you should now be able to squeeze the rigid VR controller and feel (and see!) yourself squeezing an actual hand-grip!
Note: In case you see the error message `ERROR: Request Timeout` on the sky, it means that the VR application could not connect to the server. In this case, ensure that the server is running, and that the VR application configuration file has the correct IP address and port.
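When troubleshooting such a timeout, a quick TCP reachability check against the server can rule out basic network problems before you dig into the configuration. This is a generic troubleshooting sketch, not part of On-Body VR:

```python
# Troubleshooting sketch (not part of On-Body VR): test whether the
# On-Body Gym server's TCP port is reachable from this machine.
import socket

def server_reachable(ip, port=11996, timeout=2.0):
    """Return True if a TCP connection to (ip, port) succeeds within timeout."""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns `False` for the IP and port in `config.ini`, the problem is network-level (wrong address, server not running, or a firewall) rather than the VR application itself.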
SIC chairs would like to thank Evan Pezent, Zane A. Zook and Marcia O'Malley from MAHI Lab at Rice University for having distributed to them 2 Syntacts kits for the IROS 2020 Intro to Haptics for XR Tutorial. SIC co-chair Christian Frisson would like to thank Edu Meneses and Johnty Wang from IDMIL at McGill University for their recommendations on Raspberry Pi hats for audio and sensors.
Team On-Body Gym would like to thank Kuntal Podder for making the hand-grip 3D model for the virtual reality application. We would also like to thank the WHC 2021 SIC chairs for providing us with the opportunity (and of course the amazing hardware kit)!
- Seongkook Heo, Jaeyeon Lee, and Daniel Wigdor. 2019. PseudoBend: Producing Haptic Illusions of Stretching, Bending, and Twisting Using Grain Vibrations. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST '19). Association for Computing Machinery, New York, NY, USA, 803–813. DOI:https://doi.org/10.1145/3332165.3347941
- Paul Strohmeier, Seref Güngör, Luis Herres, Dennis Gudea, Bruno Fruchard, and Jürgen Steimle. 2020. BARefoot: Generating Virtual Materials using Motion Coupled Vibration in Shoes. Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology. Association for Computing Machinery, New York, NY, USA, 579–593. DOI:https://doi.org/10.1145/3379337.3415828
This documentation is released under the terms of the Creative Commons Attribution Share Alike 4.0 International license (see LICENSE.txt).