The distinction between hardware and wetware is going to start blurring in the coming years:
Charles Higgins, an associate professor at the University of Arizona, has built a robot that is guided by the brain and eyes of a moth. Higgins told Computerworld that he basically straps a hawk moth to the robot and implants electrodes in the neurons that process vision in the moth’s brain. The robot then responds to what the moth is seeing — when something approaches the moth, the robot moves out of the way.
Higgins explained that he had been trying to build a computer chip that would do what brains do when processing visual images. He found that a chip that could function even close to the way the human brain does would cost about $60,000.
“At that price, I thought I was getting lower quality than if I was just accessing the brain of an insect which costs, well, considerably less,” he said. “If you have a living system, it has sensory systems that are far beyond what we can build. It’s doable, but we’re having to push the limits of current technology to do it.”
There are going to be some humdinger ethics issues to deal with along this road.