Sync mobile EEG, breathing and Unity

Have you ever thought of running an experiment in VR? And did you wish you could better understand the behavioral reactions as well as the psychological states of your participants?

You could capture hand and head movements and breathing, along with EEG recordings. First, your EEG device needs to be mobile and thin enough to fit under the VR headset, e.g., the Smarting PRO mobile EEG. Then, you need to add triggers to your VR environment each time a stimulus of interest appears, and align them with all the above-mentioned continuous streams (EEG, breathing, hand/head movements).

Fig 1. Smarting PRO with HTC Vive and Alyzé breathing belt

Here is an example of how to stream the positions and rotations of both VR controllers (hands) and the head of an HTC Vive headset as continuous LSL streams (StreamOutlets) from Unity. At the same time, triggers are sent from Unity each time an event of interest occurs. In parallel, breathing is streamed from the Alyzé belt (Ullo) and EEG from the Smarting PRO. Finally, all the streams are recorded with the mbt Streamer, and all data is automatically synchronized by LSL.

Used here:

  1. Windows 10, 64-bit, everything on one computer; but you can use multiple computers, typically one for VR and another for the physiological recordings (EEG and breathing), in which case all computers must share the same network, e.g., a local Wi-Fi network.
  2. Breathing belt, Ullo, Alyzé;
  3. Smarting PRO;
  4. openViBE 3.2 (any recent version should work);
  5. Unity 2020;
  6. SteamVR plugin for HTC Vive in Unity;
  7. LSL4Unity (to send and receive LSL streams and markers);
  8. Python code from Ullo (to convert Bluetooth Low Energy (BLE) data into LSL)
    1. You need to have Python installed for it to run
    2. And the bleak and pylsl Python libraries (pip install bleak pylsl)
  9. mbt Streamer (to receive EEG from Smarting PRO and record all into one .xdf file)

Let’s start with LSL4Unity

If you have an HTC Vive VR headset

Use the existing scene from the SteamVR plugin called “Simple Sample”.

Fig 2. Simple Sample scene in Unity, example from SteamVR

Then add LSL4Unity to your project via Window -> Package Manager -> Add package from disk…

Fig 3. Open the package.json file.

Then import all the existing Samples (all 3 of them).

Fig 4. Importing Samples from LSL4Unity

Streaming continuous signals

To track the position and rotation of your controllers and send them as LSL streams to the Lab Recorder or the mbt Streamer, simply drag and drop “PoseOutlet.cs” onto both controllers as a new component in the Inspector.

Fig 5. Drag and drop the “PoseOutlet” script onto the controllers and camera

If you wish to track head movements as well, do the same: drag and drop the same script, but this time onto the Camera.
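If you are curious what such a script does, here is a minimal sketch of a pose outlet, assuming the liblsl C# API bundled with LSL4Unity (illustrative only; the class name and stream name are made up, and the actual PoseOutlet.cs in the samples may differ). It pushes the object’s position and rotation quaternion as a 7-channel float stream:

using UnityEngine;
using LSL;

// Illustrative sketch of a pose outlet (see PoseOutlet.cs in the LSL4Unity samples for the real one).
// Streams the GameObject's position (x, y, z) and rotation quaternion (x, y, z, w) over LSL.
public class PoseOutletSketch : MonoBehaviour
{
    public string StreamName = "Unity.PoseSketch"; // arbitrary name; use one stream per tracked object
    private StreamOutlet outlet;
    private readonly float[] sample = new float[7];

    void Start()
    {
        // 7 float channels; the nominal rate is declared irregular for simplicity
        StreamInfo info = new StreamInfo(StreamName, "Pose", 7, LSL.LSL.IRREGULAR_RATE,
            channel_format_t.cf_float32, gameObject.GetInstanceID().ToString());
        outlet = new StreamOutlet(info);
    }

    void FixedUpdate()
    {
        if (outlet == null) return;
        Vector3 p = transform.position;
        Quaternion q = transform.rotation;
        sample[0] = p.x; sample[1] = p.y; sample[2] = p.z;
        sample[3] = q.x; sample[4] = q.y; sample[5] = q.z; sample[6] = q.w;
        outlet.push_sample(sample); // timestamped by LSL on push
    }
}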

Streaming Triggers

Let’s send a trigger every time the ball touches the ground as it bounces in the Simple Sample scene. To do so, simply drag and drop “SimpleOutletTriggerEvent.cs” onto the Sphere (the game object bouncing on the floor).

Fig 6. Drag and drop script to the Sphere game object

Then open the script; you can see the stream name, which you can rename if you like:

string StreamName = "LSL4Unity.Samples.SimpleCollisionEvent";
string StreamType = "Markers";
private StreamOutlet outlet;
private string[] sample = {""};

Notice that we are sending strings with the “Markers” stream type and an irregular sampling rate (equal to zero):

StreamInfo streamInfo = new StreamInfo(StreamName, StreamType, 1, LSL.LSL.IRREGULAR_RATE,
    channel_format_t.cf_string, hash.ToString());
outlet = new StreamOutlet(streamInfo);

To send an LSL trigger each time the ball touches the ground, replace “void OnTriggerEnter(Collider other)” and “void OnTriggerExit(Collider other)” with:

private void OnCollisionEnter(Collision other)
{
    if (outlet != null)
    {
        sample[0] = "TriggerEnter " + gameObject.GetInstanceID();
        Debug.Log("Touching the ground " + sample[0]);
        outlet.push_sample(sample);
    }
}

Do the same for “OnCollisionExit()”.
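A mirrored exit handler might look like this (the marker text and log message are arbitrary, following the same pattern as above):

private void OnCollisionExit(Collision other)
{
    if (outlet != null)
    {
        // Push a string marker when the ball leaves the ground
        sample[0] = "TriggerExit " + gameObject.GetInstanceID();
        Debug.Log("Leaving the ground " + sample[0]);
        outlet.push_sample(sample);
    }
}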

Streaming Breathing (continuous signals)

To use the breathing belt and convert its Bluetooth Low Energy (BLE) data to LSL, run “stream_breathing_amp_multi.py -m <MAC address of your device>” in PowerShell or Command Prompt, as below:

PS C:\Users\jmladeno\Downloads> python .\stream_breathing_amp_multi.py -m EA:50:1C:45:3B:08

Obviously, you need to run the script from its location (in my case, Downloads) and find the right MAC address of your device (e.g., with a great app called nRF Connect).

Visualize in Real Time with openViBE

To visualize your streams in real time or perform some signal processing, you can use openViBE. Play your scene in Unity (leave it in Play mode) and open the openViBE Acquisition Server.

Fig 7. OpenViBE Driver Properties of the Acquisition Server

You can see three streams from Unity, carrying the rotation and position of the two controllers and the camera. We cannot see our markers, as they are “string” and not “int32” (the only type the Acquisition Server accepts).
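If you do want your triggers to appear in the Acquisition Server, one workaround is to push them from a second outlet as int32 event codes instead of strings. A minimal sketch, assuming the same liblsl C# API (the stream name, source id, and event code are made up for illustration):

// Hypothetical int32 marker outlet, so the Acquisition Server can pick the events up
StreamInfo intInfo = new StreamInfo("Unity.IntMarkers", "Markers", 1, LSL.LSL.IRREGULAR_RATE,
    channel_format_t.cf_int32, "unity-int-markers");
StreamOutlet intOutlet = new StreamOutlet(intInfo);

int[] intSample = { 1 }; // e.g., 1 = ball touched the ground
intOutlet.push_sample(intSample);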

Then, in the openViBE Designer, create the simplest scenario (an Acquisition client connected to a Signal display) to view the streams in real time (below, only the right-hand controller is displayed):

Fig 8. Real-time visualization of controller positions and rotations, using openViBE Designer

Record all streams at once

And finally, let’s record it all in one place with the mbt Streamer, and you’re good to go!

Fig 9. Summary of a 6-second recording by the mbt Streamer on one computer (kama): 3 Unity streams, 1 Smarting PRO EEG stream, breathing signals, and triggers from Unity.

As you can see in Fig 9, there are 3 continuous Unity signal streams, 1 marker stream, a breathing belt stream (ullo_bb), and EEG. When I stopped the recording after 6 seconds, we could see that the ball had bounced 14 times, as it produced 14 events.

Conclusion 

Let us know how it goes, and if you have any questions, feel free to reach out to us on our contact page.

*Conclusion written by mbt team
