Posts

“I Know What You're Thinking, Dave”

Pardon me, but does anyone else find this a tad bit creepy? Scientists at Washington University in the US have created a computer program that can translate your thoughts into written language. Yes, computers are on the verge of being able to read our minds.

The researchers took a brain implant currently used by neurologists to determine which areas of the brain are responsible for seizures in people with epilepsy, and reprogrammed it to pick up the brainwaves produced when we think of certain sounds. For this study, just four vowel sounds were used. When test subjects thought of the sound, the appropriate letter would appear on the computer screen, no typing necessary.

Different Grammatical Structures Use Different Parts of the Brain

All languages have a vocabulary and a grammatical structure, but the type of grammatical structure varies from language to language. In some languages, like English, word order largely determines the meaning of a sentence. In others, like German, word order is more flexible because the language uses “tags,” such as prefixes or suffixes, to make the meaning of the sentence clear.

If trying to learn a language with a different grammatical structure than the one you grew up speaking makes your head feel like it’s going to explode at first, there may be a very good reason: you’re having to use a different part of your brain than you normally would.

In American Sign Language (ASL), the meaning of a sentence can be determined either by word order or by “tags.” So, the same sentence can be signed in two ways: using word order or using tags. In a study performed at Dalhousie University in Halifax, Canada, researchers found that individuals fluent in ASL used a different part of the brain to comprehend a sentence signed with tags than they did to understand the same sentence signed using word order.

The researchers showed 14 deaf individuals, all native ASL signers, a video of a study coauthor signing the same sentences in two different ways. While the study participants watched the video, the researchers used functional MRI scans to monitor their brain activity.

To the authors of the study, the fact that different areas of the brain were used to process the different types of syntax implies that we comprehend language using neural structures that originally evolved for other purposes. As coauthor Aaron Newman told Science News:

“We’re using and adapting the machinery we already have in our brains. Obviously we’re doing something different [from other animals], because we’re able to learn language. But it’s not because some little black box evolved specially in our brain that does only language, and nothing else.”

Learn a New Word in 15 Minutes

How long does it take for a word from a foreign language to permanently etch itself into your brain? About 15 minutes, according to researchers at Cambridge University. That’s all the time your brain needs to build a network of brain cells that will help you recall the word in question. The only catch is that you need to hear it repeated at least 160 times in that 15-minute window.

To perform the experiment, the scientists hooked volunteers up to a monitor to measure their brain activity. The volunteers were first presented with a word they were already familiar with. To simulate the experience of hearing a foreign word for the first time, they then listened to a made-up word. Then they listened to it again, and again, and again… Wow, that must have been annoying!

Fortunately, the volunteers were patient, and the data gleaned from the monitors attached to their skulls proved to be quite informative. Researchers watched the volunteers’ brain waves as they listened first to the familiar word and then to the unfamiliar one, and found that the brain’s reaction to the two words was almost identical after the unfamiliar word had been repeated 160 times in a 15-minute period.