
Brains Go Bigtime!

posted Sep 16, 2015, 9:19 AM by Ellen Pearlman

[Image: Muse headsets all lined up, ready for action - photo by William Wnekowicz]

In February there was a big party and exhibition at the Exploratorium in California, with people walking around controlling, or trying to control, various devices and mental states with their minds. The team behind one of the exhibits consisted of Marion Le Borgne, James Bobowski, David Silva, and William Wnekowicz.

Muse, a dry EEG headset (as opposed to gel or saline), allows real-time streaming of brainwave data and was the headset of choice for the exhibition. It also processes just a few frequency bands, such as alpha, beta, theta, and gamma. The brainwave data was stored, analysed, and visualised through an open-source project the team named CloudBrain, which is generating a lot of buzz in the BCI (Brain Computer Interface) world. For visitors who allowed it, anonymised brain data was stored in a cloud database built on Apache Cassandra.
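To give a sense of what those processing bands mean in practice, here is a minimal sketch, not the team's code, of reducing a raw EEG signal to alpha, beta, theta, and gamma band powers in Python with SciPy. The sampling rate and band edges are common conventions rather than values taken from the exhibit.

```python
import numpy as np
from scipy.signal import welch

# Conventional EEG frequency bands in Hz (exact band edges vary by author).
BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30), "gamma": (30, 44)}

def band_powers(samples, fs=220.0):
    """Reduce one channel of raw EEG to average power per frequency band.

    samples: 1-D array of raw EEG values
    fs: sampling rate in Hz (220 Hz is typical for the Muse of that era)
    """
    freqs, psd = welch(samples, fs=fs, nperseg=min(256, len(samples)))
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(np.mean(psd[mask]))
    return powers

# Example with synthetic data: one second of noise standing in for an EEG channel.
if __name__ == "__main__":
    fake_eeg = np.random.randn(220)
    print(band_powers(fake_eeg))
```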

[Image: People's real-time brain feeds projected on a viewer. Are you relaxed, calm, tense, thinking, or are your eyes closed? You can look up at a screen and tell right away.]

CloudBrain used an analysis module powered by a custom algorithm. The visualization was set up to look like a "radar" chart, projecting real-time EEG onto a monitor. The front end that powered it was written in JavaScript using the AngularJS framework.
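The exhibit's chart ran in the browser, but the idea is easy to sketch in Python with Matplotlib: each band's relative power gets its own spoke of a polar plot. This is an illustrative stand-in for, not a copy of, the AngularJS front end.

```python
import numpy as np
import matplotlib.pyplot as plt

def radar_chart(band_powers):
    """Draw one visitor's relative band powers as a simple 'radar' chart."""
    labels = list(band_powers.keys())
    values = list(band_powers.values())

    # Normalise so the spokes show relative, not absolute, power.
    total = sum(values) or 1.0
    values = [v / total for v in values]

    # Close the polygon by repeating the first point.
    angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
    values += values[:1]
    angles += angles[:1]

    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    ax.plot(angles, values, linewidth=2)
    ax.fill(angles, values, alpha=0.25)
    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(labels)
    plt.show()

# Example with made-up relative powers, not real exhibit data.
radar_chart({"alpha": 0.9, "beta": 0.5, "theta": 0.6, "gamma": 0.2})
```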

The tricky part of this exhibit was how to stream everyone's headset data while they were walking around. Consumer-grade BCI headsets are usually set up as one headset, one person, one display. The team chose a publisher/subscriber architecture so that all the EEG streams (the publishers) could be routed to a specific booth (the subscriber) for any one visitor. They did this with a Spark Core and an RFID tagger, along with an app called Spacebrew.

[Image: Spark Core device on the left and RFID taggers on the right]

The Spacebrew app was not meant to handle that much traffic, meaning that many people. Muse sends out a few thousand messages per second, and the exhibition used 20 Muse headsets, so thousands of messages per second times twenty. The team had to rewrite the code to work with RabbitMQ, which scales much better on the backend.
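Here is a minimal sketch of that publisher/subscriber pattern using RabbitMQ's Python client, pika. The exchange name and the one-routing-key-per-headset scheme are assumptions made for illustration, not details taken from the team's code.

```python
import json
import pika

# One connection per process; localhost stands in for the exhibit's broker.
connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# A topic exchange lets a booth subscribe to exactly one visitor's stream.
channel.exchange_declare(exchange="eeg", exchange_type="topic")

def publish_sample(headset_id, band_powers):
    """Publisher side: each headset pushes its samples under its own routing key."""
    channel.basic_publish(
        exchange="eeg",
        routing_key=f"muse.{headset_id}",
        body=json.dumps(band_powers),
    )

def subscribe_to_visitor(headset_id, callback):
    """Subscriber side: a booth binds a queue to just the visitor it scanned in."""
    result = channel.queue_declare(queue="", exclusive=True)
    channel.queue_bind(
        exchange="eeg", queue=result.method.queue, routing_key=f"muse.{headset_id}"
    )
    channel.basic_consume(
        queue=result.method.queue, on_message_callback=callback, auto_ack=True
    )
    channel.start_consuming()
```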

They needed huge server capacity (a machine they called BrainServer) and a lot of dedicated internet bandwidth. The team also deployed five machines to capture Bluetooth data from the Muse headsets and forward it to BrainServer.
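Each capture machine is essentially a small forwarder. A hedged sketch, assuming the headset data arrives locally as OSC messages over UDP (as the Muse tools of that era could provide) and that some publish callable pushes it onward to the broker; the OSC address and port here are assumptions, not documented constants.

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def make_handler(publish):
    """Wrap a publish callable (e.g. the RabbitMQ publisher sketched earlier)
    as an OSC message handler."""
    def on_eeg(address, *channel_values):
        # address is the OSC path; channel_values are the raw channel readings.
        publish({"address": address, "channels": list(channel_values)})
    return on_eeg

def run_forwarder(publish, host="0.0.0.0", port=5000):
    dispatcher = Dispatcher()
    # "/muse/eeg" and port 5000 are assumed values for illustration.
    dispatcher.map("/muse/eeg", make_handler(publish))
    server = BlockingOSCUDPServer((host, port), dispatcher)
    server.serve_forever()
```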

[Image: Lots of Bluetooth-enabled headsets at this opening party]

As of this writing, CloudBrain has demos for just two models of BCIs - Muse and OpenBCI - showing how they parse data in real time.

[Image: A screenshot of how the Muse parses real-time data in the 'radar' chart.]

The first screenshot shows 25 users, and the second shows "all", though I am not sure how many "all" is.

[Image: Muse "all" data; the squishier lines at the top are different channels.]

The CloudBrain data connects back to a repository of raw code on GitHub. On the repository they posted the averages of the aggregated data.
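Computing such averages is straightforward once each stored sample has been reduced to band powers. A small sketch with pandas, using made-up rows and hypothetical column names purely for illustration:

```python
import pandas as pd

# Hypothetical rows: one record per stored sample, already reduced to band powers.
records = pd.DataFrame(
    [
        {"user": "anon-01", "alpha": 0.42, "beta": 0.31, "theta": 0.19, "gamma": 0.08},
        {"user": "anon-02", "alpha": 0.55, "beta": 0.22, "theta": 0.15, "gamma": 0.08},
        {"user": "anon-03", "alpha": 0.38, "beta": 0.35, "theta": 0.20, "gamma": 0.07},
    ]
)

# Average each band across every stored sample from every visitor.
averages = records[["alpha", "beta", "theta", "gamma"]].mean()
print(averages.sort_values(ascending=False))
```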

[Image: Averages of the aggregated brainwave data posted on the repository.]

So it seems most people went into alpha states of relaxation, or closed their eyes; it is hard to say which. That was followed by beta, or alertness and/or agitation. Coming in third was theta, or calmness. So the Muse functioned as a kind of biofeedback device.

[Image: The best representation of the visitors and the 'radar' screens in real time.]

The developers think it is very interesting for participants to see real-time comparisons of how their brains stack up against others'.

However, this is the first time that data banks of users' brainwaves are being developed for use in the cloud, and it shows how the field of neurogaming is right on the cusp of cloud storage technology. The implications are far-reaching. Cognitive Technology understands the profound implications this will have for issues of bioprivacy, and recently launched the first workshop for the Center for Responsible Brainwave Technologies.

The question will no longer be “What’s In Your Wallet”, but more like “What’s In Your Brain?”
