In the 18th century, wealthy nobles and merchants would sometimes take a tour through the countryside, maybe for business reasons, maybe just because they were getting into the newfangled fad of appreciating nature.
When their carriage rolled through remote rural towns, often everyone nearby would stop what they were doing to stare. Kids, old stooped grannies, farmers in the fields, the local parson, everyone.
They did this because it was something new to see. In some towns in France, newcomers arrived so seldom that two towns a few miles apart might speak mutually incomprehensible dialects.
This leads to an idea I have about why we'll never really get around to developing artificial intelligence.
Here's the thing: people in the past were operating under a mental handicap that went unrecognized until recently. It had nothing to do with processing power or raw intelligence. The brains of Isaac Newton or Sun Tzu or the nameless Olmec priest-mathematician who came up with the concept of zero were better than our brains, unless you are Stephen Hawking or one of a few dozen other people currently alive.
But those brains could hold only a limited amount of information at any one time. They had a limited reference pool of ideas.
Sun Tzu never saw the pyramids. Newton never left England, never saw anywhere as far away as Ireland except in engravings. The Mesoamericans didn't know Europe or Africa or Asia existed (and vice versa, of course) until 1492.
I know all this. Moreover, I can look up almost any piece of information, if it is already known. I can add my information to this pool. I can share it with anyone with an internet connection, where it can be debated, refuted, commented upon, and copied.
Does this make me smarter? Again, not in terms of processing power. I am no more likely to make any great discoveries simply because of the amount of information we all possess.
What I have, what we share and maintain together, is a prosthetic memory.
The development of the prosthetic memory started with spoken language, which allowed our prehuman ancestors to get across important concepts like "watch out for tigers" to their offspring. Writing expanded this ability enormously, so that instead of passing along information from one living person to another, one person could continue to disseminate information, even after death, to multiple people. The moveable type printing press was another exponential expansion.
And then we got electronic communications, and the internet.
Some futurists and science fiction writers have been playing with the idea of the Singularity for a while now. This is the idea that artificial intelligence, once developed, will build still smarter robot brains, which will build smarter brains, which will in turn... you get the idea. Somewhere along this cycle, AIs become so clever that they either solve all our problems, from cancer to global warming, or they wipe us out as a pesky nuisance, Terminator-style.
But why would we ever build artificial minds, when we can enhance our own minds with all the knowledge the human race has ever accumulated? We cannot know everything at once, but we can look up anything, learn anything.
Our prosthetic brain can tell us everything, from where to turn left to facts about the Olmecs, or Isaac Newton, or how much information is in a zettabyte.
Matthew Claxton is a reporter with the Langley Advance.