Fixing practical issues and improving clarity #2

Merged (2 commits, Mar 5, 2025)
docs/learn/tutorials/beginner/basics/ex-situ_analysis.md (3 additions, 0 deletions)

@@ -2,6 +2,9 @@

PLUME record files are fully decoupled from the Unity engine. As a result, they can be parsed from any external application. To further simplify the process, we provide a Python API to parse and extract data from the record files. This allows you to perform ex-situ analysis (i.e., analyzing the data outside its 3D context) using Python for more traditional analysis workflows (statistical analysis, machine learning, etc.). The package also comes with a set of utilities to simplify the conversion of the data to formats commonly used in data analysis, such as pandas DataFrames, CSV, and XDF for physiological signals to be analyzed in external software such as SigViewer, EEGLAB, or MoBILAB.

!!! tip
    You can run this notebook directly in [Google Colab](https://colab.research.google.com/drive/1fTzc9e6gS04L9SPHgd2IwZRqZSCd6TZH?usp=sharing).
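As a preview, here is a minimal sketch of the kind of ex-situ workflow this enables, assuming position samples have already been extracted from a record. The parsing step is left as a placeholder, since the actual plume-python entry points are described in the package documentation; only the pandas conversion is meant literally.

```python
import pandas as pd

# Placeholder: parsing a record file is done with the plume-python API
# (see the package documentation for the actual entry points).
# samples = <positions extracted from the record>

# Once samples are extracted, standard data-analysis tooling applies:
samples = [
    {"t": 0.00, "x": 0.00, "y": 1.60, "z": 0.00},
    {"t": 0.05, "x": 0.01, "y": 1.60, "z": 0.02},
]
df = pd.DataFrame(samples)                    # pandas DataFrame for stats/ML
df.to_csv("head_positions.csv", index=False)  # CSV export for external tools
```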

## Installing PLUME Python

!!! note
docs/learn/tutorials/beginner/basics/in-situ_analysis.md (59 additions, 48 deletions)

@@ -4,81 +4,92 @@ In-situ analysis is the process of analyzing data within its original 3D context
## User trajectory
After recording a user exploring the virtual environment in search of Easter eggs, we can analyze their movements to better understand navigation patterns and spatial behavior. One key aspect of this analysis is tracking the user’s trajectory within the 3D space. Let's compute the trajectory of the user’s head to visualize their movement path, identify frequently visited areas, and assess how they navigated the environment during the search.

![Trajectory Analysis Module](assets/in-situ-analysis/image-20.png){width=400}
/// caption
Trajectory Analysis Module Interface
///

The trajectory analysis module is located in the upper right of PLUME-Viewer; unfold it to access its parameters.

`Object ID` refers to the GUID of the object you want to compute the trajectory of.

1. Navigate within the tree view to `XR Origin (XR Rig) > Camera Offset > Main Camera`.
2. Select ``Main Camera`` and use ``CTRL+C`` to copy the Main Camera GUID.
3. Use ``CTRL+V`` to paste the Main Camera GUID within the `Object ID` field.

Event markers can be displayed over the trajectory to give them spatial context. We [created a marker](record_custom_data.md#recording-custom-data) named `Egg Pick Up` that indicates every time a user finds an egg.

* To show the marker on the trajectory, type `Egg Pick Up` in the `Markers` text field of the Trajectory Analysis Module.

Teleportation is a frequently used locomotion solution for navigating in VR. As the logging of teleportation can differ wildly from one project to another, we consider that a teleportation occurred when the position has changed by more than a defined amount (e.g., 10 cm) from one frame to the next.

* Teleportation Tolerance is the maximum distance in meters between two consecutive positions of the object above which we consider a teleportation occurred. It is set to 0.1 m by default.
* Teleportation Segments. When enabled, teleportations are displayed as dotted lines; otherwise, they break the trajectory.
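As a rough sketch of this detection heuristic (not the viewer's actual implementation), flagging teleportations from a sequence of positions amounts to thresholding the distance between consecutive frames:

```python
import numpy as np

def find_teleportations(positions: np.ndarray, tolerance: float = 0.1) -> np.ndarray:
    """Return the frame indices i where the jump from frame i to frame i+1
    exceeds `tolerance` meters. `positions` is an (N, 3) array."""
    jumps = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    return np.flatnonzero(jumps > tolerance)

# With the default 10 cm tolerance, only the ~2 m jump is flagged:
path = np.array([[0.0, 1.6, 0.0], [0.02, 1.6, 0.0], [2.0, 1.6, 0.0]])
print(find_teleportations(path))  # -> [1]
```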

You can change the rendering parameters of the trajectory:

* Decimation Tolerance. A parameter of the decimation algorithm: lower values preserve more detail in the trajectory, while higher values simplify it.
* Include Rotation. When enabled, rotation gizmos are placed over the trajectory to indicate the rotation of the object at that point in time.
* Time Range. By default, the time range encompasses the entire record. If you want a specific time range to compute the trajectory, adjust it here using the text fields or the adjustable scroll bar.
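The tutorial does not name the decimation algorithm, so as an illustration only, here is a sketch of Ramer-Douglas-Peucker simplification, a common polyline decimation technique, showing how the tolerance trades detail for simplicity:

```python
import numpy as np

def rdp(points: np.ndarray, tolerance: float) -> np.ndarray:
    """Ramer-Douglas-Peucker simplification of an (N, 3) polyline."""
    start, end = points[0], points[-1]
    chord = end - start
    chord_len = np.linalg.norm(chord)
    if chord_len == 0.0:
        dists = np.linalg.norm(points - start, axis=1)
    else:
        # Perpendicular distance of every point to the start-end chord.
        dists = np.linalg.norm(np.cross(points - start, chord), axis=-1) / chord_len
    idx = int(np.argmax(dists))
    if dists[idx] > tolerance:  # keep the farthest point, recurse on both halves
        left = rdp(points[: idx + 1], tolerance)
        right = rdp(points[idx:], tolerance)
        return np.vstack([left[:-1], right])
    return np.vstack([start, end])  # within tolerance: collapse to one segment

# A lower tolerance keeps more of the original points.
path = np.array([[0.0, 0.0, 0.0], [1.0, 0.2, 0.0], [2.0, 0.0, 0.0]])
print(len(rdp(path, tolerance=0.1)))  # -> 3 (the 0.2 m bump survives)
print(len(rdp(path, tolerance=0.5)))  # -> 2 (simplified to a single segment)
```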

Finally, click on `Generate` to create the trajectory with your chosen parameters.

![Generated trajectory settings](assets/in-situ-analysis/image-22.png){width=400}
/// caption
Generated Trajectory Settings
///

Once generated, the trajectory appears in the 3D view and a trajectory result panel appears under the trajectory module.

* You can hide the trajectory by clicking on the *eye symbol*.
* You can delete the trajectory by clicking on the *bin symbol*.

![Generated trajectory result](assets/in-situ-analysis/image.png){width=800}
/// caption
Trajectory of the user's head and `Egg Pick Up` markers. The color of the trajectory corresponds to time, from <span style="color:blue">blue</span> (beginning of the trajectory) to <span style="color:red">red</span> (end of the trajectory).
///

## User position heatmap
A position heatmap allows us to visualize the spatial distribution of the user’s presence by mapping their head position onto the floor of the virtual environment. Let's compute it to project the positions of the egg hunter's head, as this representation helps identify frequently visited areas, uncover search patterns, and highlight regions of interest where they paused or focused their attention.

![Position heatmap settings](assets/in-situ-analysis/image-23.png){width=400}
/// caption
Position Heatmap Module Interface
///

The heatmap projection module is located in the upper right of PLUME-Viewer; unfold it to access its parameters.

`Projection Caster` refers to the GUID of the object whose position you want to project.

1. Navigate within the tree view to `XR Origin (XR Rig) > Camera Offset > Main Camera`.
2. Select ``Main Camera`` and use ``CTRL+C`` to copy the Main Camera GUID.
3. Use ``CTRL+V`` to paste the Main Camera GUID within the `Projection Caster` field.

``Projection Receiver(s)`` refers to the GUID(s) of the object(s) you want to project the position on. Here we have to give the GUID of the floors.

``Floors`` is an empty GameObject at the root of the project that is the parent of every mesh that represents a floor in the scene.

1. Select ``Floors`` in the hierarchy.
2. Use ``CTRL+C`` to copy the ``Floors`` GUID.
3. Use ``CTRL+V`` to paste the ``Floors`` GUID within the ``Projection Receiver(s)`` field.
4. Be sure to enable ``Include Children``, as this will recursively pick up every child of every GUID in the Projection Receiver(s) list.

* Click on `Generate`. Heatmap generation starts, and dynamically evolves as the replay moves the user and the rest of the objects in the scene.

Once generated, shaders of objects in the 3D view are replaced with the heatmap shader. The heatmap is displayed on the selected receivers; its color encodes the time spent at each position, blue being 0 s and red being the maximum time for a single position.
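As an illustrative sketch (a simple linear ramp, not necessarily the viewer's exact color map), the mapping from accumulated time to color looks like this:

```python
def heat_color(seconds: float, max_seconds: float) -> tuple[float, float, float]:
    """Map time spent at a position to an RGB color from blue (0 s)
    to red (the maximum time observed at any single position)."""
    t = 0.0 if max_seconds <= 0 else min(seconds / max_seconds, 1.0)
    return (t, 0.0, 1.0 - t)  # linear blue-to-red ramp

print(heat_color(0.0, 12.0))   # -> (0.0, 0.0, 1.0), pure blue
print(heat_color(12.0, 12.0))  # -> (1.0, 0.0, 0.0), pure red
```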

![Generated position heatmap settings](assets/in-situ-analysis/image-24.png){width=400}
/// caption
Generated Position Heatmap Settings
///

* You can go back to the original environment by clicking on the *eye symbol*.
* You can delete the heatmap by clicking on the *bin symbol*.
* You can export the heatmaps as point clouds by clicking on the *download symbol*. Every object that contains heatmap values is exported as a `.ply` file.
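The exported point clouds can be inspected with any PLY reader, for example with the `plyfile` package. The file name and the name of the per-point heatmap property below are assumptions, so the sketch prints the header to discover the actual property names:

```python
# Requires: pip install plyfile
from plyfile import PlyData

ply = PlyData.read("position_heatmap_floor.ply")  # hypothetical export name
vertex = ply["vertex"]
x, y, z = vertex["x"], vertex["y"], vertex["z"]

# The exact name of the heatmap value property depends on the export,
# so list the properties declared in the file header:
print([p.name for p in vertex.properties])
```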

![Generated position heatmap result](assets/in-situ-analysis/position_heatmap_result.png){width=800}
/// caption
Heatmap of the projection of the user's head positions on the floor; the redder an area, the more time the user spent at that position.
///
@@ -91,41 +102,41 @@
!!! note
    This feature is only compatible with interactions emitted by [Unity's XR Interaction Toolkit](https://docs.unity3d.com/Packages/[email protected]/manual/index.html). [Pull Requests](https://github.com/liris-xr/PLUME-Recorder/pulls) to add compatibility for other VR packages are welcome.

As the user explored the virtual environment searching for Easter eggs, they interacted with various objects along the way. PLUME's Interaction Highlight feature provides a visual representation of these interactions by shading interacted objects in different intensities of red. This allows us to quickly identify which objects were engaged with the most, revealing key areas of user interest and interaction patterns within the scene.

![Interaction heatmap generation](assets/in-situ-analysis/image-29.png){width=400}
/// caption
Interaction Heatmap Module Interface
///

The interaction highlight module is located in the upper right of PLUME-Viewer; unfold it to access its parameters.

``Interactor(s)`` refers to the GUID of one or multiple interactors (e.g., Direct Interactor, Ray Interactor, Near-Far Interactor). In our example, we will compute the interactions the egg hunter made with their right hand.

1. Navigate within the tree view to ``XR Origin (XR Rig) > Camera Offset > Main Camera > Right Hand > Direct Interactor``.
2. Select ``Direct Interactor`` and use ``CTRL+C`` to copy the Right Hand Direct Interactor GUID.
3. Use ``CTRL+V`` to paste the Right Hand Direct Interactor GUID within the ``Interactor(s)`` field.

`Interactable(s)` refers to the GUID of one or multiple interactables (e.g., XR Grab Interactable). We take every interactable into account.

1. Leave the Interactable(s) field empty to consider every interactable in the scene.
2. Select the `Hover` Interaction Type. We consider three different interaction types: Hover (i.e., Touch), Select (i.e., Grab), and Activate (i.e., Fire).
3. Click on `Generate`.

![Generated interaction highlight settings](assets/in-situ-analysis/image-30.png){width=400}
/// caption
Generated Interaction Highlight Settings
///

Once generated, shaders of interactable objects in the 3D view are replaced with the highlight shader. The highlight's intensity is determined by comparing the number of interactions on a specific object to the most interacted-with object in the scene. The more interactions an object has relative to the maximum, the more intense its highlight appears.
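In other words, the intensity is an interaction count normalized by the scene-wide maximum. A minimal sketch (illustrative names, not the viewer's internals):

```python
def highlight_intensities(interaction_counts: dict[str, int]) -> dict[str, float]:
    """Normalize per-object interaction counts against the most
    interacted-with object in the scene."""
    max_count = max(interaction_counts.values(), default=0)
    if max_count == 0:
        return {obj: 0.0 for obj in interaction_counts}
    return {obj: n / max_count for obj, n in interaction_counts.items()}

print(highlight_intensities({"egg_1": 4, "egg_2": 1, "basket": 2}))
# -> {'egg_1': 1.0, 'egg_2': 0.25, 'basket': 0.5}
```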

* You can go back to the original environment by clicking on the *eye symbol*.
* You can delete the highlight by clicking on the *bin symbol*.

![Generated interaction highlight](assets/in-situ-analysis/interaction_highlight_result.png){width=800}
/// caption
Highlight of the Objects Hovered by the Right Hand of the User
///


docs/learn/tutorials/beginner/basics/setup.md (32 additions, 13 deletions)

@@ -77,25 +77,44 @@ For this tutorial, you can use your own physiological device and create an LSL stream

If you don't have access to a physiological device, we provide you with a Python script (`stream_eda.py`) that simulates one by streaming pre-recorded physiological signals using [PyLSL](https://github.com/labstreaminglayer/pylsl) and [PyXDF](https://github.com/xdf-modules/pyxdf). The script is adapted from a [PyLSL example](https://github.com/labstreaminglayer/pylsl/blob/main/src/pylsl/examples/SendData.py) and sends two EDA signals sampled at 8 Hz. To run the script, you will need a Python virtual environment with PyLSL and PyXDF installed.
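For reference, a minimal PyLSL outlet that streams two EDA channels at 8 Hz looks like the sketch below. The stream name and `source_id` are illustrative, and it pushes constant placeholder values rather than the pre-recorded signals the real `stream_eda.py` replays:

```python
import time
from pylsl import StreamInfo, StreamOutlet

SRATE = 8  # Hz, matching the tutorial's EDA stream

# Declare a 2-channel float stream; source_id is an arbitrary unique string.
info = StreamInfo(name="SimulatedEDA", type="EDA", channel_count=2,
                  nominal_srate=SRATE, channel_format="float32",
                  source_id="eda_sim_001")
outlet = StreamOutlet(info)

while True:
    outlet.push_sample([0.42, 0.43])  # placeholder values for the 2 channels
    time.sleep(1.0 / SRATE)
```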

Here is a step-by-step guide to creating a new Python environment using venv.

1. Open a command line and make sure **Python 3.12** is installed:
```bash
python3 --version
```
If you don’t have Python 3.12, you can download and install it from [Python Downloads](https://www.python.org/downloads/).

2. In the command line, navigate to the Python files within the project folder `UnityProject/EasterEggHunt/Assets/PythonScripts~`.
```bash
cd path/to/UnityProject/EasterEggHunt/Assets/PythonScripts~
```
3. Create a new virtual environment inside this folder:
```bash
python3 -m venv venv
```
This creates a folder named **`venv`** containing the isolated Python environment.

4. Activate the virtual environment.
<br/>
**On Windows:**
```powershell
venv\Scripts\activate
```
**On macOS/Linux:**
```bash
source venv/bin/activate
```

Once activated, your terminal will show `(venv)` at the beginning of the line, indicating you are now inside the virtual environment.

5. Install dependencies from `requirements.txt` using the following command:
```bash
pip install -r requirements.txt
```

6. Launch `stream_eda.py` using the following command. Physiological streams can now be picked up by PLUME.
```bash
python stream_eda.py
```
