In a small room tucked away at the University of Toronto, Professor Dan Nemrodov is pulling thoughts right out of people’s brains.
He straps a hat with electrodes on someone’s head and then shows them pictures of faces. By reading brain activity with an electroencephalography (EEG) machine, he’s then able to reconstruct faces with almost perfect accuracy.
Student participants wearing the cap look at a collection of faces for two hours. At the same time, the EEG software recognizes patterns relating to certain facial features found in the photos. Machine-learning algorithms then recreate the images from the EEG data, in some cases with 98-per-cent accuracy.
Nemrodov and his colleague, Professor Adrian Nestor, say this is a big deal.
“Ultimately we are involved in a form of mind reading,” he says.
The technology has huge ramifications for medicine, law, government and business. But the ethical questions are just as huge. Here are some key questions:
If fully developed, the technology could help patients with serious neurological damage: people who are so incapacitated that they cannot express themselves or ask a question.
According to clinical ethicist Prof. Kerry Bowman and his students at the University of Toronto, this technology could get inside someone's mind and provide a channel of communication. It may give that person a chance to exercise their autonomy, particularly by giving informed consent to either continue or stop treatment.
In a courtroom, it may end up being used to acquit or convict those accused of a crime. Like lie-detector tests and DNA analysis, scanning memories from the brain may become a legal tool to help prove innocence or guilt.
It may even change our relationship with animals. If, as student Nipa Chauhan points out, we know what they understand and feel, we may act differently toward them.
A lot. Let’s start with the concept of memory. Our memories are never “pure” — nor are they ever complete.
And our brain often fills in the blank spots with biases and personal reflections. Researchers like Adrian Nestor and his colleague Dan Nemrodov agree it's still a bit like archaeology: digging beneath the layers to find the raw information. They haven't found it yet, but they believe it's just a matter of time.
That, according to Bowman and his students, raises the thorny issue of freedom, especially freedom of thought.
“Nobody can tell me what to think or when to think or how to think. This is the first time that freedom can be infringed upon,” says Bowman.
And from there it can take unpredictable turns. Could a person be compelled to undergo mind reading in order to apply for a job or to gather evidence for police? Would it ever be ethically acceptable to allow this without consent?
“How might we regulate that, especially since it's ripe for abuse with authoritarian regimes? Without consent, to do that would be very problematic,” says student Yusef Manialawy.
The prospect of mind reading also has commercial implications. Data mining could reach a whole new level if businesses could scan your mind for your product or even lifestyle preferences.
“From a marketing point of view, it would be a bonanza,” says Bowman.
Not yet. And that's because the possibilities of mind reading are so new that there has been little discussion about establishing guidelines.
However, that's changing. Marcello Ienca, a researcher with the Health Ethics and Policy Lab at the Swiss Federal Institute of Technology in Zurich, is part of a group proposing a set of neurorights: a way of protecting our thoughts from being extracted and interpreted without proper consent.
“I’m against any type of outright ban against this type of technological development because I think that the clinical benefits of this can be extremely important, but I also think that we have to try to minimize the risks before it becomes pervasively distributed in our society,” Ienca told The Current’s Anna-Maria Tremonti in a recent interview.
Researchers Nestor and Nemrodov insist ethics shouldn’t undermine discovery but evolve with it — because this is just the tip of the iceberg.
“We want to be able to reconstruct images based on what people think and not just what people see,” says Nemrodov.
Their next step? Trying to extract words directly from our brains. The idea may seem far-fetched now, but it is approaching faster than we think.