
We May Never Use Audible Languages Again

February 12, 2024


Though some consider Ludwig Wittgenstein the greatest analytic philosopher of the twentieth century, his Philosophical Investigations began as a series of notes to himself, written while he lectured on the philosophy of language at the University of Cambridge. The question below is one the Austrian thinker posed there to his readers:

“Is it conceivable that people should never speak an audible language, but should nevertheless talk to themselves inwardly, in the imagination?”

Ludwig Wittgenstein, Philosophical Investigations

Analyzing inner speech was not a novel idea in neuropsychology. Soviet psychologist Lev Vygotsky had already speculated that childhood cognitive abilities were shaped by the use of language rather than biologically determined. Of note, Wittgenstein died more than seven decades before Elon Musk's Neuralink would implant a Bluetooth-enabled computer chip in a human brain, allowing someone to control a phone or computer just by thinking.

The beginning of modern neuroimaging...

The first radiograph, produced by physicist Wilhelm Röntgen in 1895, opened the door to remarkable advances in medical diagnosis and the imaging of human disease. Decades later, William Oldendorf, M.D., built a key instrument for neuroimaging in his basement, a prototype that led to computerized tomography (the CT scan) and, in turn, paved the way for positron emission tomography (the PET scan). Functional brain imaging then adopted electromagnetic methods, such as magnetoencephalography (MEG), as well as hemodynamic techniques, such as functional magnetic resonance imaging (fMRI), to track the effects of neural activity and the energy needs of brain tissue.

Neuralink: Brain Interface for Unmet Potential

Neuralink's fully implantable brain-computer interface, the N1, is surgically placed by the company's R1 robot; it records brain signals and transmits them wirelessly to an app that, in turn, decodes the user's mental intentions. This wireless brain-computer interface is being developed so that people with paralysis can eventually control external devices with their thoughts, such as a computer keyboard, restoring autonomy to those with unmet physical needs.
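For readers who think in code, here is a minimal, purely illustrative sketch of the general idea behind such a decoding pipeline: recorded signals are reduced to features, a decoder maps those features to an intended command, and the command drives a device. Every name, signal, and command in this example is hypothetical; it is not Neuralink's actual software or API.

    # Illustrative sketch of a brain-computer decoding pipeline (hypothetical).
    import numpy as np

    def extract_features(samples: np.ndarray) -> np.ndarray:
        """Reduce a window of raw electrode samples to a simple per-channel
        feature (mean power), standing in for real feature extraction."""
        return np.mean(samples ** 2, axis=1)

    def decode_intent(features: np.ndarray) -> str:
        """Toy decoder: picks the command whose (hypothetical) channel responds
        most strongly. A real decoder would be a trained statistical model."""
        commands = ["cursor_left", "cursor_right", "click"]
        return commands[int(np.argmax(features[: len(commands)]))]

    if __name__ == "__main__":
        # Simulated recording window: 8 channels x 100 samples of activity.
        rng = np.random.default_rng(0)
        window = rng.normal(size=(8, 100))
        print(decode_intent(extract_features(window)))

In practice the decoder is a statistical model trained on each user's own recorded activity, but the shape of the pipeline, signal in and intention out, is the one described above.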

Current clinical studies are actively recruiting participants with quadriplegia caused by spinal cord injury or by amyotrophic lateral sclerosis (ALS). However, beyond potentially allowing people with paralysis to move their limbs, Neuralink's long-term ambitions include implants meant to effectively cure blindness, mental illness, and neurological disorders such as Alzheimer's. Ironically, one of the drawbacks is the possibility of brain inflammation, which itself increases the risk of dementia.

As for potential linguistic uses, waves are among the purest means of transferring information from one person to another. Moreover, we know that language can be present in the absence of sound, as when we read or engage in endophasic activity, our internal dialogue. And scientists have already found that electric waves can preserve the shape of a corresponding sound wave in non-acoustic areas of the brain.

Silent Communication May Soon Be Possible

These recent findings shed important light on the relationship between sound waves and electric waves in the brain. Both can travel through us and leave us intact, and both let us interpret the messages carried by momentary waves, as long as our brain holds the key to decode them. It is no accident that the word information comes from the Latin informare: to "inform" is to impart a shape.

Waves also bear on an important neuropsychological concept: language in the brain is ultimately the end result of decoding an emitted signal, whether that signal is received from outside or generated inside a person's head. This simple fact raises a crucial question: what happens to the electric waves in our brain when we produce a linguistic expression by speaking to ourselves in our silent voice?

According to a recent article on brain behavior, our inner voice appears to be processed auditorily in the temporal cortex in much the same way as external voices. Beyond this temporal activity, inner speech and silent reading also engage the frontal motor cortex as well as Broca's area. Localizing inner and external voices therefore depends on emitting and deciphering signals along those specific auditory pathways.

Nonetheless, human communication could one day be effectively free of audible languages.

________________

There is no doubt that neural implants performing millions of tasks with remarkable efficiency could allow linguistic engineers to push the limits of intelligent technologies. To learn more about the latest language technologies for delivering your organization's or company's messaging in the most effective way, call ProLingo at 800-287-9755 and speak with a language specialist today.
