Classification of Covert Visuospatial Attention Direction to Assist People with No Motor Function

Brooke Follansbee
Mentor: Roger O. Smith, Occupational Science & Technology

People with minimal to no motor function lack the means to communicate and control their environment. Our literature review revealed that Covert Visuospatial Attention (CVSA) classification, although thoroughly studied, has rarely been applied to the needs of these individuals (such as individuals with Locked-in Syndrome). CVSA is the process of attending to a target without overtly looking at it, and its direction can be classified through noninvasive Electroencephalography (EEG). We collected EEG data from 4 healthy participants (all female, ages 21-27) performing a CVSA task by attending to the left or right. We then preprocessed the data and applied machine learning algorithms to classify the direction of CVSA. Due to human error, we discarded one participant’s recorded data. Classification accuracy ranged from 70% to 74.07% for the remaining 3 participants, meeting the accuracy requirements set by pioneers in the Brain-Computer Interface (BCI) field. Therefore, in this pilot study, we conclude that CVSA direction classification through EEG is feasible and has potential to assist individuals with minimal to no motor function with communication and control tasks.
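The pipeline described above (preprocess EEG, extract features, classify left vs. right attention) can be sketched in miniature. The code below is a hypothetical illustration, not the study's actual method: it simulates band-power-like features with a lateralized shift between "attend left" and "attend right" trials (a stand-in for the alpha-band lateralization commonly reported in CVSA work) and classifies them with a simple nearest-centroid rule. The channel counts, effect sizes, and classifier choice are all assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels = 120, 8

# Synthetic trial labels: 0 = attend left, 1 = attend right.
labels = rng.integers(0, 2, n_trials)

# Synthetic "band power" features with unit noise; attending to one side
# boosts power in a hypothetical contralateral half of the channels.
X = rng.normal(0.0, 1.0, (n_trials, n_channels))
X[labels == 0, :4] += 0.8   # hypothetical left-attention signature
X[labels == 1, 4:] += 0.8   # hypothetical right-attention signature

# Hold out the last 40 trials for testing.
train, test = slice(0, 80), slice(80, None)

# Nearest-centroid classifier: assign each test trial to the class
# whose training-set mean feature vector is closer.
mu0 = X[train][labels[train] == 0].mean(axis=0)
mu1 = X[train][labels[train] == 1].mean(axis=0)
d0 = np.linalg.norm(X[test] - mu0, axis=1)
d1 = np.linalg.norm(X[test] - mu1, axis=1)
pred = (d1 < d0).astype(int)

acc = (pred == labels[test]).mean()
print(f"held-out accuracy: {acc:.2f}")
```

On real data, the feature extraction step (e.g., bandpass filtering and power estimation over parieto-occipital channels) and a stronger classifier would replace the synthetic features and centroid rule used here.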


Comments

  1. Great early work with lots of potential. Good explanation of the work's relevance to individuals with ALS, and at the same time, a deep understanding of the procedures and tools used.
    As mentioned at the end, it is essential to increase the sample size, improve the accuracy, and include ALS cases in future testing.
    Best of luck!

  2. Really great project! The project I participated in within my lab actually studied the size of the visuospatial attentional window. This is an awesome application of the type of attention that we deploy every day and is very helpful. Great job! It’s exciting to see its future application to ALS patients.
