How DARPA mind-reading technology works: civilian research that duplicates DARPA research

The principles by which the DARPA mind-reading computer in the control center reads my thoughts through the microchip implant in my head can be elucidated by considering a few civilian research efforts into mind-reading which duplicate the DARPA work. First, Sharon Begley’s January 12, 2008 article “Mind Reading Is Now Possible” (http://www.thedailybeast.com/newsweek/2008/01/12/mind-reading-is-now-possible.html). I want to quote the entire short article here (emphasis mine):

“Crime investigators always have their ears open for information only a perpetrator could know – where a gun used in a murder was stashed, perhaps, or what wounds a stabbing inflicted. So imagine a detective asking a suspect about a killing, describing the crime scene to get the suspect to visualize the attack. The detective is careful not to mention the murder weapon. Once the suspect has conjured up the scene, the detective asks him to envision the weapon. Pay dirt: his pattern of brain activity screams ‘hammer’ as loud and clear as if he had blurted it out.

“To detect patterns of brain activity, a subject must agree to lie still in a neuroimaging device such as a functional magnetic resonance imaging (fMRI) tube, but in an age when many jurisdictions compel not only convicts but also suspects to provide a DNA sample, that isn’t difficult to imagine. Now, neither is the prospect of reading thoughts by decoding brain-activity patterns. Just a year ago, neuroscientists couldn’t do much better than distinguish thoughts of faces from thoughts of places (the brain has distinct regions that process images of each). ‘All we could do was tell which brain region was active,’ says neuroscientist John-Dylan Haynes of the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany. ‘There were real limits on our ability to read the content of that activity.’ No longer. ‘The new realization is that every thought is associated with a pattern of brain activity,’ says Haynes, ‘and you can train a computer to recognize the pattern associated with a particular thought.’

“We’ll get to the ethical implications of that, but first consider how quickly mind reading is advancing. Less than three years ago, it was a big deal when studies measured brain activity in people looking at a grating slanted either left or right; fMRI patterns in the visual cortex revealed which grating the volunteers saw. At the time, neuroscientist Geraint Rees of University College London said, ‘If our approach could be expanded upon, it might be possible to predict what someone was thinking or seeing from brain activity alone.’ Last year Haynes and colleagues found that even intentions leave a telltale trace in the brain. When people thought about either adding two numbers or subtracting them, an fMRI scan of their prefrontal cortex detected activity characteristic of either.

“Now research has broken the ‘content’ barrier. Scientists at Carnegie Mellon University showed people drawings of five tools (hammer, drill and the like) and five dwellings (castle, igloo…) and asked them to think about each object’s properties, uses and anything else that came to mind. Meanwhile, fMRI measured activity throughout each volunteer’s brain. As the scientists report this month in the journal PLoS One, the activity pattern evoked by each object was so distinctive that the computer could tell with 78 percent accuracy when someone was thinking about a hammer and not, say, pliers. CMU neuroscientist Marcel Just thinks they can improve the accuracy (which reached 94 percent for one person) if people hold still in the fMRI and keep their thoughts from drifting to, say, lunch.

“As always, the results have to be replicated by independent labs before they can be accepted. But this is the first time any mind-reading technique has achieved such specificity. Remarkably, the activity patterns – from visual areas to movement area to regions that encode abstract ideas like the feudal associations of a castle – were eerily similar from one person to another. ‘This establishes, as never before, that there is a commonality in how different people’s brains represent the same object,’ said CMU’s Tom Mitchell.

“If what your brain does when it thinks about an igloo is almost identical to what mine does, that suggests the possibility of a universal mind-reading dictionary, in which brain-activity pattern x means thought y in most people. It is not clear if that will be true for things more complicated than pliers and igloos, however. ‘The more detailed the thought is, the more different these patterns get, because different people have different associations for an object or idea,’ says Haynes. ‘We’re much closer to this than we were two years ago, but still far from a universal mind-reading machine.’ How far? The CMU group is determining the brain patterns that encode abstract ideas (honesty, democracy), words and sentences, a big step toward a mind-reading dictionary.

“Scientists are keenly aware of the ethical issues posed by reading minds. For one thing, it probably isn’t necessary, if you decide to read people’s thoughts, to get them to lie still in an fMRI tube and think. Nothing in physics rules out remote detection of brain activity. In fact, says law professor Hank Greely of Stanford, an infrared device under development might read thoughts using little more than a headband. He can imagine a despot scanning citizens’ brains while they look at photos of him, to see who’s an opponent. The use of mind reading in criminal and terrorism investigations seems inevitable, raising issues of reliability and self-incrimination. As with all technology, some uses will bring unalloyed benefits (translating a quadriplegic’s thoughts to move a prosthetic limb). Other uses… well, as Greely says, ‘we really don’t know where this will end.’ That mind reading has begun, however, there is now no doubt.”

The italicized portion of the article encodes the essence of the human brain physiology on the basis of which DARPA’s mind-reading technology can work at all: the functioning of the human brain is universal across individuals, races, and genders, which allows for the construction of a universal mind-reading dictionary. When a Chinese man who has grown up in Hong Kong sees a chair, the pattern of electrical activity in his brain corresponding to the perception will be the same as that of an American woman who has grown up in America and is now seeing a chair. The more advanced DARPA thought-reading computer must have overcome the problem Begley notes, that “the more detailed the thought is, the more different these patterns get, because different people have different associations for an object or idea”. Perhaps the DARPA computer has overcome this problem by breaking down the complex, and more idiosyncratic, patterns of electrical activity in the brain during complex thoughts into elementary components, separating the patterns for the object in question from the patterns for the associations engendered by the thought about the object. When complex and detailed thoughts are broken down into elementary components, you can be sure that the “constituent patterns” are still universal. When the DARPA thought-reading computer reads my thoughts by receiving data from the microchip implanted in my head about the patterns of electrical activity in my brain, specific software matches the patterns detected with those found in the “mind-reading dictionary”. This software is what I refer to as the “setting” in my “Secret History of the International Court of Justice.”
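To make this concrete, here is a minimal sketch in Python of how such a “setting” could match a detected activity pattern against a “mind-reading dictionary” – assuming, purely for illustration, that patterns are represented as feature vectors and that matching is done by correlation. The dictionary contents, dimensions, and function names are all hypothetical:

```python
import numpy as np

def correlate(a, b):
    """Pearson correlation between two activity patterns (feature vectors)."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def decode_thought(pattern, dictionary):
    """Return the dictionary label whose stored pattern best matches the
    detected pattern -- the role the text assigns to the 'setting'."""
    return max(dictionary, key=lambda label: correlate(pattern, dictionary[label]))

# Hypothetical usage: a tiny two-entry "mind-reading dictionary".
rng = np.random.default_rng(0)
dictionary = {"hammer": rng.normal(size=200), "pliers": rng.normal(size=200)}
observed = dictionary["hammer"] + 0.3 * rng.normal(size=200)  # noisy measurement
print(decode_thought(observed, dictionary))  # -> "hammer"
```

The CMU experiment quoted above amounts to exactly this kind of template matching, only with fMRI patterns for ten objects instead of two random vectors.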

Now I want to go to a New Scientist report, “Mind-reading machine knows what the eye can see”, 05 March 2008, by James Urquhart: http://www.newscientist.com/article/dn13415-mindreading-machine-knows-what-the-eye-can-see.html. I have talked about how the Russians are able to see on the computer screens in front of them the precise images which I am seeing – in which case I have become a walking surveillance camera for them – or which I am simply picturing to myself in my head (e.g. the “pyramid” or triangular nose of the Russian surveillance agent seen on June 2, 2009). They are able to do that, obviously, because the thought-reading computer has reconstructed the images I am seeing by matching the patterns of activity in my visual cortex with the universal patterns listed in the “mind-reading dictionary”. The 2008 UC Berkeley research reported in New Scientist deals precisely with this specialized operation of thought-reading: turning a living human being into a surveillance camera for the control center. Let me quote the New Scientist report in its entirety:

“A device that reveals what a person sees by decoding their brain activity could soon be a reality, say researchers who have developed a more sophisticated way to extract visual stimuli from brain signals.

“Scientists at the University of California, Berkeley, US, developed a computational model that uses functional MRI (fMRI) data to decode information from an individual’s visual cortex – the part of the brain responsible for processing visual stimuli.

“’Our research makes substantial advances towards being able to decode mental content from brain activity as measured using fMRI,’ Kendrick Kay, a co-author of the study, told New Scientist. ‘In fact, our results suggest it may soon be possible to reconstruct our visual experiences from brain activity.’ [This is how the thought-reading technology can transform you into a surveillance camera for the control center.]

“Previous research has shown that fMRI can pick out brain activity associated with viewing different images. But so far it has only been possible to identify very basic images, from fixed categories, such as a face or a house. The process also depends on prior knowledge of the associated brain activity.

“Now the Berkeley team has shown that brain imaging can reveal much more complex and arbitrary images, without prior knowledge of brain activity. [To do this, you need a ‘mind-reading dictionary’ in place.]

“The team first used fMRI to measure visual cortex activity in people looking at more than a thousand photographs. This allowed them to develop a computational model and ‘train’ their decoder to understand how each person’s visual cortex processes information. [This is the construction of the ‘mind-reading dictionary’.]

“Next, participants were shown a random set of just over 100 previously unseen photographs. Based on patterns identified in the first set of fMRIs, the team was able to accurately predict which image was being observed. [The ‘thought-reading computer’ has successfully reconstructed what the person is seeing when the ‘setting’ matches the detected brain-activity pattern with the same pattern stored in the ‘mind-reading dictionary’, which has associated that pattern with a previously decoded visual image.]

“’It is going to be particularly powerful in the field of visual perception and possibly the field of decoding motor responses,’ says John-Dylan Haynes of the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany.

“The research also hints that scientists might one day be able to access dreams, memories and imagery, says Haynes, providing the brain processes dreams in a way that is analogous to visual stimuli. [This is how the DARPA thought-reading computer tells the Russians that I am picturing to myself the particular triangular nose of their surveillance agent.]

“’The difficulty is that it’s very hard to set up models for other types of complex thoughts, such as memories and intentions,’ Haynes says.” [Well, as you have seen, this is simply not the case. Our Pentagon neurologists solved this problem long ago, around late 2007 to early 2008.]
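The identification step of the Berkeley study just quoted can be sketched as follows: first train an encoding model that predicts each voxel’s response from an image’s visual features (the report above says the decoder was trained on more than a thousand photographs), then identify a novel image by asking which candidate’s predicted response best matches the observed response. This is only a toy sketch under those assumptions; the linear model, dimensions, and names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n_voxels, n_features = 50, 20

# Hypothetical encoding model: each voxel's response is a learned linear
# function of an image's visual features -- a stand-in for the model trained
# on the thousand-plus photographs mentioned in the report.
W = rng.normal(size=(n_voxels, n_features))

def predict_activity(image_features):
    """Predicted visual-cortex response to an image with these features."""
    return W @ image_features

def identify(observed, candidates):
    """Pick the candidate image whose predicted response is closest to the
    observed response -- the identification step of the quoted study."""
    errors = [np.linalg.norm(observed - predict_activity(f)) for f in candidates]
    return int(np.argmin(errors))

# Usage: ~100 previously unseen candidate images (random features here).
candidates = [rng.normal(size=n_features) for _ in range(100)]
observed = predict_activity(candidates[42]) + 0.1 * rng.normal(size=n_voxels)
print(identify(observed, candidates))  # -> 42
```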

In fact, this is the problem I wish to point out to you. Even though DARPA had already developed functional mind-reading technology by the middle of 2008, researchers in universities continue to attempt to “reinvent the wheel” as if the wheel had never been invented at all. Are these scientists doing this out of ignorance?

The Macrospherians (Putin, Obama, the Chinese president, and the ICJ judges) must have considered the US military’s practice of deliberately keeping the amazing technology it has developed for itself out of the civilian world – the practice of letting civilian researchers waste their time reinventing a wheel which has already been invented – to be some sort of crime. During the third run of the International Court trial the Macrospherians thus also ordered the criminal defendants (the leading Microspherians) to recommit this crime of allowing university researchers to waste their time studying mind-reading technology. Thus we see that, more than three years after DARPA scientists were able to use a computer to reconstruct visions, Jack Gallant and his assistant Shinji Nishimoto are reproducing the DARPA scientists’ achievement at UC Berkeley – using a far less effective method at that. See the report by UC Berkeley’s News Center, “Scientists use brain imaging to reveal the movies in our mind” (http://newscenter.berkeley.edu/2011/09/22/brain-movies/). In their work they use functional magnetic resonance imaging (fMRI) and computational models to decode and reconstruct people’s dynamic visual experiences (in this case, while people are watching Hollywood movie trailers), and they presumably could reproduce the movies inside our heads that no one else sees, such as dreams and memories. Yasmin Anwar reports:

“Nishimoto and two other research team members served as subjects for the experiment, because the procedure requires volunteers to remain still inside the MRI scanner for hours at a time.

“They watched two separate sets of Hollywood movie trailers, while fMRI was used to measure blood flow through the visual cortex, the part of the brain that processes visual information. On the computer, the brain was divided into small, three-dimensional cubes known as volumetric pixels, or ‘voxels.’

“’We built a model for each voxel that describes how shape and motion information in the movie is mapped into brain activity,’ Nishimoto said.

“The brain activity recorded while subjects viewed the first set of clips was fed into a computer program that learned, second by second, to associate visual patterns in the movie with the corresponding brain activity. [Again, the construction of a mind-reading dictionary.]

“Brain activity evoked by the second set of clips was used to test the movie reconstruction algorithm. This was done by feeding 18 million seconds of random YouTube videos into the computer program so that it could predict the brain activity that each film clip would most likely evoke in each subject.

“Finally, the 100 clips that the computer program decided were most similar to the clip that the subject had probably seen were merged to produce a blurry yet continuous reconstruction of the original movie.”
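The reconstruction pipeline Anwar describes can be sketched in the same toy fashion: predict the brain activity each library clip would evoke, rank the library by how well those predictions match the observed activity, and average the top clips. Everything below – the stand-in encoding model, the tiny “library” replacing the 18 million seconds of YouTube video, the clip dimensions – is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
n_voxels, H, W = 40, 8, 8        # toy dimensions: 40 voxels, 8x8-pixel clips

# Stand-in for the per-voxel encoding model fit from the first set of clips:
# here just a fixed random linear map from clip pixels to voxel responses.
MODEL = rng.normal(size=(n_voxels, H * W))

def predicted_activity(clip):
    """Brain activity this clip would most likely evoke, per the model."""
    return MODEL @ clip.ravel()

def reconstruct(observed, library, k=100):
    """Average the k library clips whose predicted activity best matches the
    observed activity -- the 'merge the top 100 clips' step in the report."""
    errs = np.array([np.linalg.norm(observed - predicted_activity(c)) for c in library])
    top = np.argsort(errs)[:k]
    return np.mean([library[i] for i in top], axis=0)  # blurry average

library = [rng.random(size=(H, W)) for _ in range(5000)]  # toy video library
observed = predicted_activity(library[7]) + 0.05 * rng.normal(size=n_voxels)
blurry = reconstruct(observed, library, k=100)  # blurry 8x8 reconstruction
```

Averaging the top matches is exactly why the published reconstructions look blurry: the output is a blend of many similar clips, not a pixel-for-pixel readout.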

Jack Gallant is therefore trying to reinvent the specialized DARPA mind-reading technology which allows a human being implanted with a thought-reading microchip to become a mobile surveillance camera for the control center. Not only does this technology already exist in fully operational form, but Jack Gallant tries to detect the patterns of brain activity by following blood flow in the brain, which is far less effective than following the electrical activity in the brain. He is thus not just wasting his time reinventing an invented wheel; he isn’t even reinventing it correctly. Jack Gallant’s research project is truly pathetic, and he probably has no idea that he has been manipulated by the International Court system to waste his time and energy like this. (His webpage is here: http://neuroscience.berkeley.edu/users/users_profile.php?id=12; and his lab’s webpage is: http://gallantlab.org/.) Yasmin Anwar, reporting for UC Berkeley’s News Center, is thus completely wrong when she writes: “However, researchers point out that the technology is decades from allowing users to read others’ thoughts and intentions…” She is completely ignorant of the fact that the US military has already been reading people’s most detailed thoughts and intentions and deploying the technology on the battlefield, such as in Afghanistan – not to mention how Dick Cheney has ordered it deployed on foreign diplomats (and how it has been used on me).

Naweed Syed at the University of Calgary, whom I have mentioned before, is another civilian researcher who has been manipulated by the International Court system into reinventing a wheel which has already been invented by the US military. The brain-chip design he is so excited about is in fact an early, incomplete, experimental version of the DARPA brain chip which currently resides in my head. What an idiot. He is wasting his time and energy and does not know that the technology he aims to discover is already fully complete and operational and can be found right in my head. He should just take the brain chip out of my head to save himself the trouble. DO NOT TRUST MIND-READING RESEARCH PURSUED AFTER 2010: this research is orchestrated to waste the researchers’ time and energy.

There is already plenty of talk about how companies are ready to market mind-reading technology allowing consumers to remotely control household electronics and their computers (keyboard and mouse, for example) with their thoughts. See, for example, “Dump the mouse, control PC with brain” (May 11, 2012: http://www.hindustantimes.com/technology/IndustryTrends/Dump-the-mouse-control-PC-with-brain/SP-Article1-854241.aspx). Here Kevin Brown, senior inventor at IBM, describes how mind control, using headsets like the commercially available Emotiv Systems headset, can allow consumers to remotely control their computers with their thoughts alone. “[The Emotiv System] can… pick up on what our brain is telling our muscles to do… [eventually] our EEG brainwaves. Users can quickly train the software to understand various patterns… [Again, the construction of a ‘mind-reading dictionary’.] [It] is… recognizing certain patterns, and passing that information to a control unit which can then respond to that input.”

Most deserving of attention is the BrainGate Neural Interface. See this news release from Brown University, “BrainGate neural interface reaches 1,000-day milestone”: http://news.brown.edu/pressreleases/2011/03/braingate. Here it is reported that a microchip implanted in the head of a woman with paralysis, in order to translate her brain signals into control of assistive devices, has been allowing her to accurately control a computer cursor for almost three years since implantation, “providing a key demonstration that neural activity can be read out and converted into action for an unprecedented length of time.” Go research how neurologists are working on devices similar to BrainGate that allow disabled patients to remotely control their prostheses with their thoughts.

The microchip implanted in the heads of those people around me, allowing them to be forcibly interfaced with the computer in the control center, operates essentially on the same principle as this BrainGate Neural Interface. While the microchip allows the computer to read their thoughts by picking up the patterns of electrical activity in their brains, it can also allow the computer to remotely control them by artificially inducing in their brains the patterns of electrical activity which correspond to the actions or thoughts the computer wants them to perform or think. This is how the technology of “talking through people” works: the shadowy figures sitting in the control center have a microchip implanted in their heads allowing them to communicate to the computer what they want, and the computer can control you at the same time; so, when the shadowy figure thinks about how he wants you to raise your hand or stand up or say this word or that word, the computer will control you to raise your hand or stand up or say this word or that word. In other words, the microchip system has essentially reduced you to the status of a prosthesis or electronic device remotely controlled by a BrainGate Neural Interface patient – except, of course, that the shadowy figure in the control center doesn’t suffer any paralysis at all.
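Here is a minimal sketch of the two directions just described – reading a command out of a trained pattern, as in BrainGate-style or Emotiv-style decoding, and (speculatively, as claimed above) inducing the pattern that corresponds to a desired command. The templates, signal dimensions, and function names are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical command templates a user trains, as with the Emotiv headset:
# each command is associated with a characteristic neural-signal pattern.
templates = {"raise_hand": rng.normal(size=64),
             "stand_up":   rng.normal(size=64),
             "say_yes":    rng.normal(size=64)}

def decode_command(signal):
    """Read-out direction: match the recorded pattern to the closest trained
    template and emit the corresponding command (as in cursor control)."""
    return min(templates, key=lambda c: np.linalg.norm(signal - templates[c]))

def encode_command(command, noise=0.1):
    """The speculative reverse direction described in the text: induce the
    pattern that corresponds to a desired action. Purely illustrative."""
    return templates[command] + noise * rng.normal(size=64)

# One round trip: the operator intends 'raise_hand', the pattern is induced
# in the target, and the same decoder would read it back out as the command.
induced = encode_command("raise_hand")
print(decode_command(induced))  # -> "raise_hand"
```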

When the civilian world tries to reinvent the wheel which the US military has already invented, it is essentially playing with fire. IBM aside, in less than a year Intel would announce that by 2020 it would have developed brain chips enabling consumers to type without a keyboard or mouse, control the TV without a remote, make calls, and surf the Web simply by using their brain waves (see http://www.computerworld.com/s/article/9141180/Intel_Chips_in_brains_will_control_computers_by_2020?taxonomyId=11&pageNumber=1 and http://www.unwittingvictim.com/Intel.html). The consumer would be able to control computers and other devices with nothing more than thought. Intel would emphasize that the chips were for the consumer market – “Big Brother won’t be planting chips in your brain against your will”. Even if this were so – it is not, of course, since the government has already been forcibly planting such brain chips in people’s heads without their knowledge – the possibility emerges for the government to intercept the signals you would be transmitting from your head to the devices you are supposed to control remotely – to intercept your thoughts, in other words – just as, when you call someone on the phone, you open up the possibility for the government to intercept your conversation. Eventually, just as NSA’s Echelon automatically intercepts and stores every single communication made in the United States – phone call, email, fax – so its thought-reading computer would automatically intercept and store in a giant computer every single thought that has ever been thought in the United States. We thus come back to the same scenario of the “utopia” in “Cheney’s Plan”.

When I first understood the principle of this thought-reading technology, not yet understanding that the DARPA neurologists must have learned to break up the patterns of electrical activity measured in the human brain into their most elemental components, I was plagued by the inference that thought-reading must be hampered by the problem of intelligibility. This problem is something you will naturally think of when you are versed in Heidegger’s philosophy and in the interpretation of Heidegger’s early philosophy by the UC Berkeley professor Hubert Dreyfus. (See Dreyfus’ Being-in-the-World.) When we see a chair we don’t just see a chair; we also understand that a certain functionality goes with the chair. Inherent in the perception of a chair is the understanding that this is something to sit on and that it has a certain relationship with the table and the rest of the world of furniture. When you see a chair and the thought-reading computer picks up the electrical patterns of your brain and translates them into the perception of a chair, the understanding of the functionality of the chair must be part of the electrical patterns. (These are the “associations” which go with the perception of an object.) This means that, when someone from a primitive culture, or from a past culture where chairs have never been seen or used, sees a chair, he cannot exhibit the same electrical patterns in his brain. The thought-reading computer therefore will not translate the patterns it picks up from the brain of this primitive man or past man into “chair”. Maybe a four-legged platform (the vorhanden state of the chair), but not really a “chair” (the zuhanden state of the chair, to use Heidegger’s terminology). Since DARPA’s thought-reading technology was developed by studying the functional patterns of the brains of people in modern post-industrial societies, it cannot be expected to read the minds of New Guinea tribesmen entirely correctly (assuming that the New Guinea tribesmen are so completely isolated from outside civilizations that they have never seen the items commonly in use in post-industrial societies; such isolation is of course rarely the case anymore).
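The proposed solution – separating a measured pattern into an elementary “object” component and an “association” component – can be illustrated with a toy linear decomposition. Assuming, purely for illustration, that the two components add linearly in the measured pattern, a least-squares fit recovers how much of each is present; every vector and name here is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)
dim = 120

# Hypothetical elementary components: the bare percept ('vorhanden') and the
# learned functional associations ('zuhanden') are assumed, for the sake of
# the sketch, to contribute additively to the measured brain pattern.
object_component = rng.normal(size=dim)       # the four-legged platform
association_component = rng.normal(size=dim)  # "something to sit on"

B = np.column_stack([object_component, association_component])

def decompose(pattern):
    """Least-squares weight of each elementary component in the pattern."""
    coeffs, *_ = np.linalg.lstsq(B, pattern, rcond=None)
    return coeffs

modern_viewer = 1.0 * object_component + 1.0 * association_component
isolated_viewer = 1.0 * object_component + 0.0 * association_component

print(np.round(decompose(modern_viewer), 2))    # ~[1. 1.]: sees a "chair"
print(np.round(decompose(isolated_viewer), 2))  # ~[1. 0.]: sees a platform
```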

The control center has been telling me – assuming it is not lying to me – that the United Nations has secretly set up a committee to study the scientific value of this “thought-reading technology” and other weird things which the Russians have dug out of the secret boxes of the Pentagon, and that some scientists could not help but want to enlarge the “mind-reading dictionary” into a universal repository of all human thoughts (represented by their corresponding functional patterns of the brain) which can possibly be collected – the mental equivalent of the Human Genome Project, which can be said to be the universal repository of all human genes which can possibly be found in human beings. For this project to be complete, they would have to go to the jungles of the Amazon and New Guinea and Southern Africa to microchip the so-called primitive peoples (from the Yanomami through the New Guinean aboriginals to the San Bushmen) to see if the thought-reading computer, built from scanning the thoughts of “modern people”, can read the thoughts of primitive people as well. It will turn out that it in fact can! The thought-reading computer has no problem reconstructing from the brain activity of primitive people their visual images of even those objects from our high-tech civilization which they have either never seen or are hardly familiar with. The DARPA computer automatically breaks down the patterns detected in primitive people’s brains into those for the pure visual images of the objects (the Vorhandenheit of the objects) and those for understanding of, or familiarity with, the objects (their Zuhandenheit) – in the current case, for non-understanding. It will in fact sound very strange to someone following in the footsteps of Professor Dreyfus that, even without understanding what the object is for at all, a human being can see the object just as well as another human being who fully understands its use among the network of tools in the culture from which the object has come. (“See”: that is, exhibit the same patterns of electrical activity in the brain…) The story that the first Native Americans, like the Aztecs, could not tell apart the Christian white man in armor from the horse he was riding because they had never seen a white man or a horse is probably just a myth and no more.

On the other hand, these scientists, while looking at the brain functional patterns for concepts like “totem” and “taboo” which permeate the mental life of New Guinea tribesmen but which you can’t really find in the case of “modern people” (except remotely among voodoo practitioners or schizophrenic patients), are probably surprised to find that my “genealogy of primitive religiousness” is quite correct: the primitive people do practice their religious rituals because they see the cosmos as a biological organism and rituals as the system of catalysis which would allow the necessary energy to flow through the cosmic organism, allowing the latter to replenish itself (http://www.lawrencechin2011.com/scientlitfr.html: “A Thermodynamic Genealogy of Primitive Religions”).

I have a more thorough explanation of the DARPA mind-reading and mind-control technology here: http://www.lawrencechin2011.com/suppl_pld_8bg42/suppl_pld_8bg42c.pdf (pp. 32–49).

Below are some video presentations on BrainGate, the most celebrated case of brain-computer interfacing and the basis of the latest phase of mind-reading and mind-control:

60 Minutes report on BrainGate:

A “Back to Class with Brown Faculty” presentation (November 17, 2008), in which Leigh Hochberg ’90, associate professor of engineering, discusses Brown’s collaborative research toward developing assistive technology for people with paralysis and limb trauma.


Addendum: here is a recent news item about the BrainGate system 2: http://www.nature.com/news/mind-controlled-robot-arms-show-promise-1.10652. While I have theorized that the communication component on the DARPA brain chip inside my brain is a nano radio device made of carbon nanotubes, the researchers in the BrainGate project are working on a different wireless communication system for their micro brain chip (“sensor”): the Wireless, Ultra Low Power, Broadband Neural Recording Microsystem: http://nurmikko.engin.brown.edu/?q=node/1. “Our implantable microsystem enables presently 16-channel broadband neural recording in a non-human primate brain by converting these signals to a digital stream of infrared light pulses for transmission through the skin. The implantable unit employs a flexible polymer substrate onto which we have integrated ultra-low power amplification with analog multiplexing, an analog-to-digital converter, a low power digital controller chip, and infrared telemetry. The scalable 16-channel microsystem can employ any of several modalities of power supply, including via radio frequency by induction, or infrared light via a photovoltaic converter.”
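To make the quoted description concrete, here is a minimal sketch of the two stages it names – analog-to-digital conversion of the multiplexed channels, then serialization into a stream of infrared pulses. The bit depth, frame layout, and function names are my own hypothetical choices, not the Brown team’s actual design:

```python
import numpy as np

rng = np.random.default_rng(5)

def digitize(samples, bits=12):
    """Quantize amplified analog samples to n-bit codes (the ADC stage).
    Bit depth is a hypothetical choice for illustration."""
    lo, hi = samples.min(), samples.max()
    levels = 2 ** bits - 1
    return np.round((samples - lo) / (hi - lo + 1e-12) * levels).astype(np.uint16)

def to_pulse_stream(codes, bits=12):
    """Serialize the codes as a bit string; each '1' stands for an infrared
    pulse transmitted through the skin, each '0' for its absence."""
    return "".join(format(int(c), f"0{bits}b") for c in codes)

# One multiplexed frame: 16 channels, one amplified sample per channel.
frame = rng.normal(size=16)
codes = digitize(frame)
stream = to_pulse_stream(codes)
print(len(stream), "bits ->", stream[:24], "...")  # 192 bits per frame here
```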

