AI has steadily expanded what machines can do: comprehending, interpreting, and responding to our goals with reasoning, planning, and creativity. It keeps opening the door to problems that once seemed out of reach.
Now, in a new study linking neuroimaging and AI, researchers at the University of Rochester Medical Center have mapped the complex network inside the brain that perceives the meaning of spoken sentences.
Have you ever wondered how you can hear a sentence and immediately understand its meaning, even though the very same words, arranged differently, would mean something else?
Word order makes all the difference. Compare, for example, "I take my dog for a walk" with "My dog takes me for a walk": nearly the same words, but their placement changes the meaning of the sentence.
The research shows how artificial neural networks are allowing researchers to unlock the extremely complex signalling in the brain that underlies language processing.
The brain activity data was gathered with fMRI. The scans showed activity spanning a network of regions: the anterior and posterior temporal lobes, the inferior parietal cortex, and the inferior frontal cortex.
To do this, the team turned to InferSent, an AI model developed by Facebook that produces unified semantic representations of sentences. With its help, the researchers were able to predict patterns of fMRI activity, revealing how sentence meaning is encoded across brain regions.
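To make this concrete, below is a minimal sketch of the general "encoding model" technique such studies rely on: a regularized linear regression is fit to map sentence embeddings (for instance, InferSent's 4096-dimensional vectors) onto voxel responses, then scored on held-out sentences. Everything here is an illustrative assumption, with synthetic data standing in for real embeddings and scans; it is not the paper's actual pipeline.

```python
# Sketch of an fMRI encoding model: predict voxel activity from sentence
# embeddings with ridge regression. All data is synthetic and all shapes
# are illustrative (e.g., 4096 matches InferSent's embedding size).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)

n_sentences, embed_dim, n_voxels = 240, 4096, 500
X = rng.normal(size=(n_sentences, embed_dim))      # one embedding per sentence
W = rng.normal(size=(embed_dim, n_voxels))         # hidden linear relationship
y = X @ W + rng.normal(scale=5.0, size=(n_sentences, n_voxels))  # "voxel" data

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One regularized linear model per voxel (Ridge handles multi-output y).
model = Ridge(alpha=1.0)
model.fit(X_train, y_train)

# Score the model by correlating predicted and observed held-out responses.
pred = model.predict(X_test)
r = [np.corrcoef(pred[:, v], y_test[:, v])[0, 1] for v in range(n_voxels)]
print(f"mean voxelwise correlation: {np.mean(r):.3f}")
```

Regularization is the key design choice here: there are far more embedding dimensions than sentences, so an unpenalized fit would simply memorize the training scans. In practice the penalty strength is usually chosen by cross-validation.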
The evidence shows that contextualized semantic representations are encoded across a shared language network, rather than in any single region of the brain.
The researchers are now working on methods to understand how language comprehension breaks down in the early stages of Alzheimer's disease.
They are also working on predicting brain activity as language is produced, that is, as people speak sentences rather than hear them.
Studies like this show how AI, used thoughtfully alongside conventional technology, can deepen our understanding of the human mind and body.