
Team Vibrotactile Data Physicalization

Project repository of team Vibrotactile Data Physicalization for the IEEE World Haptics Conference 2021 Student Innovation Challenge

https://2021.worldhaptics.org/sic/

(Image: dataPhysic project logo)

Authors

Team

Franklin Bastidas

Franklin Bastidas's picture

Franklin Bastidas is currently an MSc student in Computer Science at the Federal University of Rio Grande do Sul. His recent research focuses on the simulation of physical movements, human-computer interaction, and the control of haptic devices. He received his bachelor's degree in Mechatronics Engineering from the Instituto Tecnológico Metropolitano, Medellín, in 2019, where he was involved in two research projects focused on meteorological data and on swarm robotics design and control.

Renan Guarese

Renan Guarese's picture

Renan Guarese is an HCI researcher and PhD student in Computer Science at RMIT, Australia. He obtained his MSc degree in Computer Science at UFRGS, studying Human-Computer Interaction, and has worked with Data Visualization in Augmented Reality, including a Situated Visualization of Electromagnetic Compatibility project at Halmstad University, in Sweden. He mainly uses Unity, Android and the HoloLens for his applications, and also holds a BSc in Computer Science from UFRGS, having spent one year at Radford University, in the USA.

Find more information on his page.

Yhonatan Iquiapaza

Yhonatan Iquiapaza's picture

Yhonatan Iquiapaza is currently an MSc student in Computer Science at the Federal University of Rio Grande do Sul. His research focuses on Data Visualization, Human-Computer Interaction, Immersive Analytics, Physicalization and Virtual/Augmented Reality. He received his bachelor's degree in Systems Engineering from the Universidad Nacional de San Agustín, in Arequipa, Peru. He usually works with Unity, JavaScript and the HoloLens.

Find more information on his page.

Carlos Johansson

Carlos Johansson's picture

Carlos Johansson is currently an MSc student in Computer Science at the Federal University of Rio Grande do Sul. His research focuses on Human-Computer Interaction, Physicalization and Virtual/Augmented Reality. He received a bachelor's degree in Biomedical Informatics from the Federal University of Health Sciences of Porto Alegre.

Mariane Giambastiani

Mariane Giambastiani's picture

Mariane Giambastiani is currently an MSc student in Computer Science at the Federal University of Rio Grande do Sul. Her research focuses on Data Visualization, Human-Computer Interaction, Immersive Analytics, Physicalization and Augmented Reality. She obtained her BSc in Computer Engineering from the Federal University of Rio Grande do Sul.

Chairs

Christian Frisson

Christian Frisson's picture

Christian Frisson is an associate researcher at the Input Devices and Music Interaction Laboratory (IDMIL) (2021), previously postdoctoral researcher at McGill University with the IDMIL (2019-2020), at the University of Calgary with the Interactions Lab (2017-2018) and at Inria in France with the Mjolnir team (2016-2017). He obtained his PhD at the University of Mons, numediart Institute, in Belgium (2015); his MSc in “Art, Science, Technology” from Institut National Polytechnique de Grenoble with the Association for the Creation and Research on Expression Tools (ACROE), in France (2006); his Masters in Electrical (Metrology) and Mechanical (Acoustics) Engineering from ENSIM in Le Mans, France (2005). Christian Frisson is a researcher in Human-Computer Interaction, with expertise in Information Visualization, Multimedia Information Retrieval, and Tangible/Haptic Interaction. Christian creates and evaluates user interfaces for manipulating multimedia data. Christian favors obtaining replicable, reusable and sustainable results through open-source software, open hardware and open datasets. With his co-authors, Christian obtained the IEEE VIS 2019 Infovis Best Paper award and was selected among 4 finalists for IEEE Haptics Symposium 2020 Most Promising WIP.

Find more information on his website.

Jun Nishida

Jun Nishida's picture

Jun Nishida is currently a Postdoctoral Fellow at the University of Chicago and a Research Fellow of the Japan Society for the Promotion of Science (JSPS PDRA). Previously, he was a JSPS Research Fellow (DC1), a Project Researcher in the Japanese Ministry of Internal Affairs and Communications SCOPE Innovation Program, and a PhD Fellow at Microsoft Research Asia. He graduated from the Empowerment Informatics Program at the University of Tsukuba, Japan.

He received his PhD in Human Informatics from the University of Tsukuba, Japan, in 2019. He is interested in designing experiences in which all people can maximize and share their physical and cognitive capabilities to support each other, and explores this interaction in the fields of rehabilitation, education, and design. To this end, he designs wearable cybernic interfaces that share one's embodied and social perspectives among people by means of electrical muscle stimulation, exoskeletons, and virtual/augmented reality systems. He has received more than 40 awards, including the Microsoft Research Asia Fellowship Award, national grants, and three university presidential awards, and has served as a reviewer for ACM SIGCHI, SIGGRAPH, UIST, TEI, IEEE VR, and HRI.

Find more information on their website.

Heather Culbertson

Heather Culbertson's picture

Heather Culbertson is a Gabilan Assistant Professor of Computer Science at the University of Southern California. Her research focuses on the design and control of haptic devices and rendering systems, human-robot interaction, and virtual reality. She is particularly interested in creating haptic interactions that are natural and realistically mimic the touch sensations experienced during interactions with the physical world. Previously, she was a research scientist in the Department of Mechanical Engineering at Stanford University, where she worked in the Collaborative Haptics and Robotics in Medicine (CHARM) Lab. She received her PhD in the Department of Mechanical Engineering and Applied Mechanics (MEAM) at the University of Pennsylvania in 2015, working in the Haptics Group, part of the General Robotics, Automation, Sensing and Perception (GRASP) Laboratory. She completed a Master's in MEAM at the University of Pennsylvania in 2013, and earned a BS degree in mechanical engineering at the University of Nevada, Reno in 2010. She is currently serving as the Vice-Chair for Information Dissemination for the IEEE Technical Committee on Haptics. Her awards include a citation for meritorious service as a reviewer for the IEEE Transactions on Haptics, Best Paper at UIST 2017, and the Best Hands-On Demonstration Award at IEEE World Haptics 2013.

Find more information on her website.

Contents

Generated with npm run toc, see INSTALL.md.

Once this documentation becomes very comprehensive, the main file can be split into multiple files that reference one another.

Abstract

Physicalization has been studied by the data visualization research community as an effective way to help people understand and communicate data through physical representations. This type of data representation is widely used in museums, as tangible surfaces and mockups, and in the medical field, as 3D prints for the study of human organs. However, building physical artifacts to represent data can be expensive and very time-consuming. Another way to build this kind of data visualization is to use augmented or virtual reality associated with tangible surfaces or haptic stimulation. In this work, we explore the use of vibrotactile actuators to physically convey information that complements the visual data representation. Likewise, in the context of helping people understand data visualizations with limited graphic resources, we propose an adaptive data physicalization surface.

Introduction

We present a visuo-haptic approach and system that augments the visual experience of graphic data visualization with meaningful vibrotactile information. The user of our system explores charts and graphs either on a video monitor or on printed paper. When touching the data visualization, they feel vibrotactile patterns on the back of their hand that communicate additional dimensions of the data at the point of contact, dimensions that are not printed but help in understanding the data. The system recognizes the interaction location on the chart by means of a set of force sensors placed at the corners of the supporting surface (either a plastic clipboard or a video monitor). An array of vibrotactile actuators placed at the joints on the back of the user's hand modulates frequency and amplitude to build signals that convey the appropriate stimuli, in a process of dynamic physicalization.
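To make the mapping concrete, here is a minimal, illustrative sketch in Python (not the repository's code): the touch position is estimated as the force-weighted centroid of the four corner sensors, and the data value at that position is mapped linearly to a vibration frequency and amplitude. The surface size and frequency range are assumptions.

```python
# Illustrative sketch only: estimate the touch point from the four corner
# force readings, then map a data value to vibration parameters.
import numpy as np

def estimate_touch_position(forces, width=15.0, height=15.0):
    """Estimate (x, y) as the force-weighted centroid of the four corner
    sensors, ordered [bottom-left, bottom-right, top-left, top-right]."""
    corners = np.array([[0, 0], [width, 0], [0, height], [width, height]])
    f = np.asarray(forces, dtype=float)
    return tuple(corners.T @ f / f.sum())

def data_to_vibration(value, vmin, vmax, f_range=(60.0, 250.0)):
    """Map a data value linearly to a vibration frequency (Hz) and a
    normalized amplitude in [0, 1]; the frequency range is an assumption."""
    t = np.clip((value - vmin) / (vmax - vmin), 0.0, 1.0)
    freq = f_range[0] + t * (f_range[1] - f_range[0])
    return freq, t
```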

Documentation

Hardware

1. Components:

2. Initial settings:

  • Actuators - cable - male jumpers: Each motor is soldered to two of the ribbon cables, with the motor at one end and male jumpers at the other end.

  • Force sensors: As with the previous item, each sensor is connected to two of the ribbon cables, with the sensor at one end and male jumpers at the other end.

  • Change the I2C ADC address: The default I2C address 0x48 is changed to 0x49 because the Octo sound card already uses this address as an audio input, causing a conflict. Follow the steps in the link: cut the jumper connecting the center pin to 0x48, then bridge the center pin to the 0x49 pin. A minimal sketch for reading the ADC at its new address follows the image below.

  • Disconnect potentiometer A3 from the ADC: The ADC has a potentiometer connected to AN3 by default. So that all sensor channels have a similar range of values and sensitivity, and to avoid unwanted readings, the potentiometer is disconnected: cut the jumper that connects the two TRIM A3 pins, as shown in this link or in the following image:

(Image: location of the TRIM A3 jumper on the ADC)
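Once the jumper is moved, the ADC should answer at 0x49. A minimal verification sketch using the Adafruit_ADS1x15 library listed under Software (the channel and bus number are assumptions):

```python
import Adafruit_ADS1x15

# The ADC now answers at 0x49; 0x48 stays free for the Octo sound card.
adc = Adafruit_ADS1x15.ADS1115(address=0x49, busnum=1)

GAIN = 1  # +/-4.096 V input range
value = adc.read_adc(0, gain=GAIN)  # force sensor assumed on channel A0
print(value)
```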

3. Assemble hardware

(Image: hardware assembly overview)

A. Raspberry Pi - Octo sound card - Sparkfun hat

(Image: Raspberry Pi with the Octo sound card and SparkFun hat assembled)

B. Glove and Actuators

Octo sound card - RCA to 3.5mm

(Image: Octo sound card wired to the RCA-to-3.5mm adapters)

RCA to 3.5mm - Amplifier - Male jumpers

(Image: RCA-to-3.5mm adapters wired to the amplifier and male jumpers)

Male jumpers - Actuators - Glove

Note that each motor must be placed on the glove according to the pin number it is connected to on the amplifier, so that the vibration is rendered at the correct location. That is, each number in the following image represents the pin connected to the amplifier:

(Image: actuator positions on the glove, numbered by amplifier pin)

C. Touch surface:

Sparkfun hat - Qwiic Cable - ADC: The following circuit is built for each of the four force sensors:

(Images: SparkFun hat, Qwiic cable, and ADC wiring for one force sensor)

Then tape each sensor to a corner of the glass surface, placing the rigid support (a metal washer in our case) on top of it:

(Image: force sensor taped to a corner of the glass with a rigid support on top)

Attach the remaining sensors to the surface in the same way (a short polling sketch to verify the wiring follows the image):

(Image: all four sensors attached to the surface)
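To check the wiring, a small polling loop can print the four corner readings; this is a sketch, and the channel-to-corner mapping is an assumption to adjust to your assembly:

```python
import time
import Adafruit_ADS1x15

adc = Adafruit_ADS1x15.ADS1115(address=0x49, busnum=1)
GAIN = 1

while True:
    # Channels A0..A3 are assumed to hold the four corner sensors.
    forces = [adc.read_adc(channel, gain=GAIN) for channel in range(4)]
    print(forces)
    time.sleep(0.05)
```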

Software

1. Main dependencies

  • Raspberry Pi 4: The download and installation of Raspberry Pi OS are covered on the official website. Since the second approach uses a Python GUI, we recommend the operating system version that comes with a desktop, which is also more convenient to use.

  • Octo sound card: For the motors to operate correctly (as independent sound channels), download and install the Octo sound card dependencies by following the official instructions on the website.

  • Syntacts: The open-source Syntacts library is used to control the actuators independently; it can be downloaded and installed by following the steps in the repository. A minimal playback sketch follows this list.
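As a reference, here is a minimal playback sketch based on Syntacts' published Python examples (verify the names against the version you install): it plays a 175 Hz sine shaped by an attack-sustain-release envelope on one output channel.

```python
from time import sleep
from syntacts import *

session = Session()
session.open()  # open the default audio device (the Octo sound card here)

# 175 Hz sine shaped by an attack-sustain-release envelope (seconds).
signal = Sine(175) * ASR(0.1, 0.2, 0.1)
session.play(0, signal)  # channel 0 drives one actuator on the glove
sleep(signal.length)
session.close()
```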

2. Other dependencies

Python 3

sudo apt install python3

Matplotlib

pip3 install matplotlib

Numpy

pip3 install numpy

Pandas

pip3 install pandas

Adafruit_ADS1x15

pip3 install Adafruit-ADS1x15

Tkinter (usually bundled with Python on Raspberry Pi OS; otherwise install it through apt)

sudo apt install python3-tk

tensorflow

pip3 install tensorflow
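For convenience, the pip packages above can be installed in a single command:

pip3 install matplotlib numpy pandas Adafruit-ADS1x15 tensorflow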

3. Artificial Neural Network:

For the first approach, an artificial neural network is used to detect the touch position on the glass surface. This was done with the following steps:

  1. A 15cm x 15cm grid is designed, where each square measures 1.5cm at real scale. This grid was placed on the glass surface and taped on; it can be found here.

  2. To create the dataset, the grid points were used: the grid is marked with a coordinate system 0 < x < 15 and 0 < y < 15. For each point, a set of 5,000 samples is recorded by varying the touch pressure without lifting the finger from the surface; the label of each sample is the coordinate of the pressed grid point. The code to acquire the data can be found here.

  3. Finally, the data is processed, the neural network model is built, and it is trained with the collected data: 250,000 samples in total, split into training (70%), validation (10%) and testing (20%). The training was performed with an NVIDIA GTX 1050 GPU. For more details on the model, see the code; for more details on the results, see the text. A hedged sketch of such a model follows this list.
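For orientation, here is a hedged sketch of such a position model (the actual architecture and preprocessing live in the linked code): four force-sensor readings in, an (x, y) grid coordinate out, trained as a regression.

```python
import tensorflow as tf

# Sketch only: layer sizes and hyperparameters are assumptions.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),            # four force-sensor readings
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(2),                     # regressed (x, y) in cm
])
model.compile(optimizer='adam', loss='mse', metrics=['mae'])

# X: (250000, 4) sensor samples; Y: (250000, 2) grid coordinates.
# model.fit(X_train, Y_train, validation_data=(X_val, Y_val),
#           epochs=50, batch_size=256)
```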

Getting started:

Before running the basic codes, consider the following recommendations:

  • Sensors and glass surface: In our tests, the sensors registered value changes indirectly: moving part of the cable near a sensor affected the captured values. We therefore fastened the cable to the table with tape, at a distance of 5cm from the sensor, as seen in the following image:

(Image: sensor cable fastened to the table with tape)

  • Glove and actuators: For a better experience and finer vibrotactile sensitivity, place the motors close to the interphalangeal and metacarpophalangeal joints, matching each actuator to its corresponding joint.

(Image: actuator placement on the joints of the hand)

First approach

This approach is only available when using the surface with the whole circuit properly wired; the Python code is here.

(Image: first approach in use)

Second approach

For this second option of the project, most of the software is kept, except for the SparkFun force-sensor dependency; on the hardware side, only the glove-related components are used. The code is here.

(Image: second approach in use)

This second approach also adds an option to detect lines, which can be run with this code.

(Image: line-detection mode)

Acknowledgements

The SIC chairs would like to thank Evan Pezent, Zane A. Zook and Marcia O'Malley from the MAHI Lab at Rice University for providing two Syntacts kits for the IROS 2020 Intro to Haptics for XR Tutorial. SIC co-chair Christian Frisson would like to thank Edu Meneses and Johnty Wang from IDMIL at McGill University for their recommendations on Raspberry Pi hats for audio and sensors.

License

This documentation is released under the terms of the Creative Commons Attribution Share Alike 4.0 International license (see LICENSE.txt).
