For the first time, a research team at the GrapheneX-UTS Human-centric Artificial Intelligence Centre at the University of Technology Sydney (UTS) has developed a portable, non-invasive system that can decode thoughts and convert them into text.
This technology could be particularly useful for communication with people who are unable to speak owing to injury or illness. It could also allow seamless communication between humans and machines, such as the operation of a bionic limb.
In the study, led by Professor CT Lin with first author Yiqun Duan and doctoral candidate Jinzhou Zhou, participants read text passages while wearing a cap that recorded electrical brain activity through the scalp using an electroencephalogram (EEG).
The EEG signal is segmented into discrete units that capture specific patterns and characteristics of brain activity. This is done by ‘DeWave’, an AI model created by the researchers, which translates EEG signals into words and sentences after being trained on large quantities of EEG data.
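The discrete-unit step described above is similar in spirit to vector quantisation: continuous signal windows are mapped to the nearest entry in a learned ‘codebook’, turning each window into a discrete token. The toy sketch below illustrates only that general idea; the codebook values, window size, and signal are invented, and DeWave’s actual architecture is far more sophisticated.

```python
# Toy illustration of discrete encoding (vector quantisation), NOT DeWave itself.
# Each fixed-size window of a 1-D signal is replaced by the index of the
# closest prototype vector in a small, hypothetical codebook.

def quantize(window, codebook):
    """Return the index of the codebook vector closest to the window."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(codebook)), key=lambda i: dist(window, codebook[i]))

def encode(signal, codebook, window_size=4):
    """Split a 1-D signal into non-overlapping windows, one token per window."""
    windows = [signal[i:i + window_size]
               for i in range(0, len(signal) - window_size + 1, window_size)]
    return [quantize(w, codebook) for w in windows]

# Hypothetical "learned" codebook of three prototype windows.
codebook = [
    [0.0, 0.0, 0.0, 0.0],  # token 0: flat
    [1.0, 1.0, 1.0, 1.0],  # token 1: high
    [0.0, 1.0, 0.0, 1.0],  # token 2: oscillating
]

signal = [0.1, -0.1, 0.0, 0.1,   # roughly flat
          0.9, 1.1, 1.0, 0.9,    # roughly high
          0.1, 0.9, 0.0, 1.0]    # roughly oscillating

print(encode(signal, codebook))  # → [0, 1, 2]
```

In a real brain-to-text system, a discrete token sequence like this would then be fed to a language model, which is what makes the integration with large language models mentioned below possible.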
‘This research represents a pioneering effort in translating raw EEG waves directly into language, marking a significant breakthrough in the field,’ said Professor Lin.
‘It is the first to incorporate discrete encoding techniques in the brain-to-text translation process, introducing an innovative approach to neural decoding. The integration with large language models is also opening new frontiers in neuroscience and AI,’ he said.
Previous technologies, such as Elon Musk’s Neuralink, required either surgery to implant electrodes in the brain or scanning in an MRI machine. These methods also struggle to convert brain signals into words without additional aids such as eye-tracking, which limits their practical application.
Because the study tested 29 participants, the UTS research is likely to be more adaptable and robust than previous decoding technology tested on only a few individuals, since EEG waves differ from person to person.
Since the EEG signals are received through a cap, rather than from electrodes implanted in the brain through surgery, the signal is noisier.
‘The model is more adept at matching verbs than nouns. However, when it comes to nouns, we saw a tendency towards synonymous pairs rather than precise translations, such as “the man” instead of “the author”,’ said Duan.
‘We think this is because when the brain processes these words, semantically similar words might produce similar brain wave patterns. Despite the challenges, our model yields meaningful results, aligning keywords and forming similar sentence structures,’ he said.
By Marvellous Iwendi