This method decodes language in real time. It works by feeding functional magnetic resonance imaging (fMRI) data to an algorithm that reconstructs the stimuli a person is hearing or thinking about into natural language. During the study, participants listened to stories while scientists scanned areas of the brain associated with natural language, and the scans were then fed to the AI-powered decoder, which returned a summary of what each person had been listening to.

Today, reading brain signals requires electrodes implanted in the brain, and even that falls short of a complete solution. The new model is the first non-invasive technique of its kind: it can capture the gist of a person's thoughts, though it cannot decode them word for word.

Describing the complexity involved, Popular Mechanics reported that human thoughts break down into smaller pieces to which the brain responds differently. These thoughts range from the simple, such as 'cat', to the complex, such as 'I should pick up the cat'. According to the report, the brain draws on 42 distinct elements, each referring to a specific concept such as place, color, or size, which combine into complex thoughts. Because each element is handled by a different part of the brain, it is only by combining the signals from all these parts that a person's mind can be read. While the system cannot decode the scans into the exact words an individual is thinking, it produces the general idea of the thought. It can also describe what a person is looking at in a picture while they are inside the MRI machine.

According to the researchers, none of this was easy. The team recorded fMRI data from three parts of the brain linked to natural language while a small group of people listened to 16 hours of podcasts. The three regions analyzed were the prefrontal network, the classical language network, and the parietal-temporal-occipital association network. The scans were then given to the algorithm, which transformed the recordings into story-like text.

The study, posted as a preprint on bioRxiv, gives an example. The original story read: 'Look for a message from my wife saying that she had changed her mind and that she was coming back.' The algorithm decoded it as: 'To see her for some reason I thought maybe she would come to me and say she misses me.' In short, the system cannot yet transcribe a person's thoughts word for word, but it can convey the idea behind them.
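The article does not spell out how the decoder turns scans into sentences, but systems of this kind are commonly built as a "propose and score" loop: a language model proposes candidate word sequences, an encoding model predicts the fMRI response each candidate would evoke, and a beam search keeps the candidates whose predictions best match the recorded scan. The Python sketch below is a toy illustration under that assumption; the tiny vocabulary, the linear encoding model, and every function name here are invented for the example and are not the study's actual code.

```python
import hashlib
import numpy as np

# Toy "propose and score" decoder. A real system would use a neural
# language model to propose candidates; here we exhaustively try a
# tiny vocabulary. All names, shapes, and models are illustrative
# assumptions, not the study's implementation.

N_VOXELS = 200    # assumed voxel count for the scanned language regions
DIM = 64          # assumed dimensionality of the text features
BEAM_WIDTH = 3
VOCAB = ["i", "she", "thought", "would", "come", "back",
         "message", "wife", "misses", "me", "to", "see"]

rng = np.random.default_rng(42)
# Toy linear encoding model: text features -> predicted voxel activity.
W_ENCODE = rng.standard_normal((N_VOXELS, DIM)) / np.sqrt(DIM)

def text_features(words):
    """Deterministic toy features for a word sequence (LM stand-in)."""
    feats = np.zeros(DIM)
    for i, w in enumerate(words):
        seed = int(hashlib.md5(w.encode()).hexdigest(), 16) % (2**32)
        feats += np.random.default_rng(seed).standard_normal(DIM) / (i + 1)
    return feats

def predict_fmri(words):
    """Encoding model: predicted voxel response for a candidate sequence."""
    return W_ENCODE @ text_features(words)

def score(candidate, recorded):
    """Correlation between the predicted and the recorded response."""
    return float(np.corrcoef(predict_fmri(candidate), recorded)[0, 1])

def decode(recorded, n_words=6):
    """Beam search over word sequences, scored against the recorded scan."""
    beams = [([], float("-inf"))]
    for _ in range(n_words):
        expanded = [(seq + [w], score(seq + [w], recorded))
                    for seq, _ in beams for w in VOCAB]
        expanded.sort(key=lambda b: b[1], reverse=True)
        beams = expanded[:BEAM_WIDTH]
    return beams[0]

# Simulate a noisy scan evoked by a "true" sentence, then decode it.
true_sentence = ["she", "would", "come", "back", "to", "me"]
recorded = predict_fmri(true_sentence) + 0.1 * rng.standard_normal(N_VOXELS)
words, s = decode(recorded)
print("decoded:", " ".join(words), f"(match={s:.2f})")
```

This toy also hints at why the real system recovers the gist rather than exact words: many different word sequences predict nearly identical voxel responses, so the best-scoring candidate tends to paraphrase the original rather than reproduce it verbatim.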