“[H]ominids definitely didn’t need exponentially vaster brains than chimpanzees. And John von Neumann didn’t have a head exponentially vaster than the head of an average human. And on a sheerly pragmatic level, human axons transmit information at around a millionth of the speed of light; even when it comes to heat dissipation, each synaptic operation in the brain consumes around a million times the minimum heat dissipation for an irreversible binary operation at 300 Kelvin, and so on. Why think the brain’s software is closer to optimal than the hardware? Human intelligence is privileged mainly by being the least possible level of intelligence that suffices to construct a computer; if it were possible to construct a computer with less intelligence, we’d be having this conversation at that level of intelligence instead.”
— Eliezer Yudkowsky, “AI Visionary Eliezer Yudkowsky on the Singularity, Bayesian Brains and Closet Goblins” (Scientific American, March 1, 2016)