Debunking the Tech-Positive Myth of Wetware

Newsflash: the human brain is not, and never should be, like a computer!

by Elvia Wilk | Nov 29, 2018, 4:13pm

Photo by Sean Gladwell via Getty Images.

“You wanna see something cool?”

Nathan, the CEO brogrammer from Alex Garland’s 2015 movie Ex Machina, holds out a bluish, squishy pouch. He’s giving a visitor a tour of his bunker-like laboratory, where he develops the artificial brains for his collection of humanoid robots.

“Structured gel,” Nathan explains. “I had to get away from circuitry—I needed something that could arrange and rearrange on a molecular level, and keep its form when required. Holding for memories, shifting for thoughts.”

“This is your hardware?” asks the visitor.

“Wetware,” Nathan responds.

Ex Machina is (still) science fiction, but the term “wetware” is increasingly used IRL by technologists to describe various combinations of hardware, software, and organic material. The word could be applied to any number of inventions, from artificial brains assembled from scratch—as in the movie—to electronic implants and cyborg-like prostheses.

The term wetware, which probably emerged in the 1950s but didn’t enter popular discourse until the 70s, originally referred to the human brain itself: that “natural computer” in your skull. Programmers and engineers—and plenty of cyberpunk writers—imagined reproducing that wetware in hardware, that is, building machines that would one day replicate human intelligence. Most saw this as a two-way analogy: if a computer can become like a brain, the brain must already be like a computer.

Throughout history, theories of both consciousness and intelligence have been upgraded periodically to match the most advanced technology of the day. Research psychologist Robert Epstein describes this pattern in a 2016 essay titled “The Empty Brain.” In the third century BCE, he says, the human mind was conjectured to work like a hydraulic system; in the eighteenth, when chemistry was advancing, the brain was a chemical reactor; in the industrial era, the brain was a mechanical system. According to that trajectory, if we imagine the brain as an especially advanced computer—which has likely been the case since mathematician John von Neumann’s 1958 book The Computer and the Brain—it’s only because computers are the most sophisticated machines we have right now.

“Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused,” writes Epstein. Computational models of mind rest on the “information processing” metaphor, as if brains process, store, send, and receive information—but there is little evidence to suggest this is how thought actually works. “Information, data, rules, software, knowledge, lexicons, representations, algorithms, programs… Not only are we not born with such things, we also don’t develop them—ever.”

Yet the view Epstein debunks prevails, and continues to influence how computers are designed. This kind of circular, mutually reinforcing fiction is probably part of why we have so much trouble imagining, much less studying, consciousness: we insist on measuring it with the same computational instruments we’ve invented in its supposed image. In other words, maintaining that the brain is a replicable technology—no matter how powerful we believe it to be—is exactly what holds back our understanding of it. Certainly robotics will incorporate more organic material as the uncanny valley shrinks and machines become squishier and more lifelike. But even if computers can one day perform like brains, a brain will never be just a (wet) computer.
