BCI Ring

Using focus as a control trigger for interaction design.

Brain-Computer Interaction

While preparing for an alumni lecture at the Savannah College of Art and Design, I created a web app that receives data from an EEG headband to detect four states of focus. It builds on research published by Seoul National University of Science and Technology (SNUST).

When a user wears the headband, the app “waits” for the user to reach a state of concentration. Entering that state is accepted as a trigger, starting a feedback function. The trigger works as though the user had made eye contact with a friend, said “Hey, Siri,” or pressed a button on a website.

EEG Data

Opportunities and growth

Nissan’s B2V Interface

BCI may seem like a far-off future, but it has been in use for decades. In 1970, UCLA created an interface that allowed a user to guide a cursor through a maze (source). More recently, Nissan presented the first Brain-to-Vehicle (B2V) technology at CES 2018 in Las Vegas.

Market Summary (source)

New use cases are constantly emerging. BCI has the potential to transform our augmented and virtual reality experiences and to amplify accessible design to unprecedented levels. These opportunities help explain why the global market for BCI is estimated to grow from $1.9 billion in 2021 to $3.3 billion by 2026 (source).

How it works

When a user is wearing a Muse wireless EEG headband, its sensors report Alpha, Beta, and Theta brainwave signals to the web app. The app applies rules that compare the ratios of these brainwaves to detect concentration, and that detection is the trigger.
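
To make that concrete, here is a minimal sketch of what such a rule could look like, assuming relative band powers have already been computed. The BandPowers shape, the Beta / (Alpha + Theta) ratio, and the threshold value are my own illustrative choices, not the project's published rules.

```typescript
// Minimal sketch of a concentration check over precomputed band powers.
// The ratio and threshold below are illustrative assumptions.

interface BandPowers {
  alpha: number; // relative Alpha band power
  beta: number;  // relative Beta band power
  theta: number; // relative Theta band power
}

const CONCENTRATION_THRESHOLD = 0.75; // placeholder; would be tuned per user

function isConcentrating(bands: BandPowers): boolean {
  // A higher share of Beta relative to Alpha + Theta is a common
  // EEG proxy for focused attention.
  const ratio = bands.beta / (bands.alpha + bands.theta);
  return ratio > CONCENTRATION_THRESHOLD;
}
```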

The app loops as it receives new signal data, continuously checking how long the user has been concentrating. Once a minimum amount of time has passed, the app turns on the light inside the ring.
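
The dwell-time logic might look like the sketch below, which reuses isConcentrating and BandPowers from the sketch above. MIN_FOCUS_MS and turnOnRingLight are hypothetical names standing in for the app's real timing constant and feedback function.

```typescript
// Sketch of the dwell-time trigger: fire only after the user has
// stayed "concentrating" for a minimum continuous duration.

declare function turnOnRingLight(): void; // hardware feedback, provided elsewhere

const MIN_FOCUS_MS = 3000;       // hypothetical minimum focus duration (3 s)
let focusStartedAt: number | null = null;

function onNewSample(bands: BandPowers): void {
  if (!isConcentrating(bands)) {
    focusStartedAt = null;       // streak broken; reset the timer
    return;
  }
  focusStartedAt ??= Date.now(); // record when the streak began
  if (Date.now() - focusStartedAt >= MIN_FOCUS_MS) {
    turnOnRingLight();           // start the feedback function
    focusStartedAt = null;       // require a fresh streak to re-trigger
  }
}
```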

Creating the experience

The SNUST analysis of Concentration and Immersion Based on EEG provided a foundation for this project.

These researchers found that ratios between Alpha, Beta, and Theta brainwaves in the frontal cortex allowed them to detect a user’s level of concentration.

I believed I could create my own interface and algorithm to imitate their experiment, allowing me to craft my own experience around the function.

I used the eegedu.com repository to create a functioning connection between a Muse EEG headband and my preliminary web app.
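
For a sense of what that connection involves, here is a minimal sketch using muse-js, the Web Bluetooth library that, as I understand it, eegedu builds on. Web Bluetooth requires the call to start from a user gesture, such as a button click.

```typescript
// Minimal sketch of streaming EEG data from a Muse headband in the
// browser via muse-js. Must be triggered by a user gesture.

import { MuseClient } from 'muse-js';

async function connectMuse(): Promise<void> {
  const muse = new MuseClient();
  await muse.connect(); // opens the browser's Bluetooth pairing dialog
  await muse.start();   // begins streaming EEG readings

  muse.eegReadings.subscribe((reading) => {
    // Each reading carries raw samples from one electrode; band powers
    // (Alpha, Beta, Theta) are computed downstream, e.g., with an FFT.
    console.log(reading.electrode, reading.samples);
  });
}
```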

With the data flowing, I was able to record and then visualize brainwaves over time.
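
Recording can be as simple as appending timestamped band powers to a rolling buffer that a chart component reads from. The shape and size here are my own illustrative choices, reusing BandPowers from the earlier sketch.

```typescript
// Illustrative rolling buffer of timestamped band powers for a live chart.

interface TimedBands {
  t: number;         // timestamp in milliseconds
  bands: BandPowers; // from the earlier sketch
}

const MAX_POINTS = 1024; // cap memory held for the visualization
const history: TimedBands[] = [];

function record(bands: BandPowers): void {
  history.push({ t: Date.now(), bands });
  if (history.length > MAX_POINTS) {
    history.shift(); // drop the oldest point to keep a fixed window
  }
}
```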

I iterated on algorithms until I was producing results similar to those of the original experiment.

(Results have been tested, but not peer reviewed)

Final thoughts

I’m far from done, but I am excited. We are living in a time when science fiction is becoming the signature feature of tomorrow: jewelry powered by concentration, games built around focus, and everyday experiences augmented by real-time thoughts.

What if we could design table saws that shut off when the user is daydreaming? What if our phones switched to silent when we were concentrating? What type of world could we design?