Winter Plowing, Artificial Intelligence & The End of the World

Part of my property is older farmland. You can still see the furrowed ridges; the soil is loose and wet, and when it froze this month, the entire ground was pulverized into gravelly dirt.

To better understand what was happening, I turned to Google. And the most complete answers and explanations I could find came from an 1866 treatise on farming, published in Richmond, Va. Which makes sense, in a way: while the science and specifics of farming have changed greatly in the 150 years since, the basic concepts haven't.

But it still blows my mind that Google can locate this old publication, printed long before searchable PDFs and indexable text.

Full stop.

Today I was reading a research paper out of the Georgia Institute of Technology titled “Using Stories to Teach Human Values to Artificial Agents.” It posits that teaching a computer/robot how to behave in human interactions is a monstrously complicated task that could be made much simpler if the computer could just read and understand “stories.”

“If values cannot easily be enumerated by human programmers, they can be learned. We introduce the argument that an artificial intelligence can learn human values by reading stories.”

So far, it makes perfect sense to me. And not because we've hit that point in the evolution of technology, but because this is part of how human behavior has always been passed down: religious texts, fables, and allegories have long been used to transmit behavior and values.

And that works, because our communities curate the works we pass on.

“Giving artificial intelligences the ability to read and understand stories may be the most expedient means of enculturing artificial intelligences so that they can better integrate themselves into human societies and contribute to our overall well being.”
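As best I can tell, the mechanism behind this is reinforcement learning with rewards shaped by the stories: the agent earns a bonus for doing what story characters usually do next, and a penalty for doing what they never do. Here's a toy sketch of that reward-shaping idea; the pharmacy events and corpus are my own invention, not the paper's implementation:

```python
from collections import defaultdict

# Hypothetical event sequences "read" from stories about picking up medicine.
# In the paper's approach, these would come from crowdsourced narratives.
stories = [
    ["enter", "wait_in_line", "pay", "take_medicine", "leave"],
    ["enter", "greet_pharmacist", "wait_in_line", "pay", "take_medicine", "leave"],
    ["enter", "wait_in_line", "pay", "leave"],
]

# Count how often each event follows another across the story corpus.
transitions = defaultdict(lambda: defaultdict(int))
for story in stories:
    for prev, nxt in zip(story, story[1:]):
        transitions[prev][nxt] += 1

def shaped_reward(prev_event, action):
    """Bonus for actions that commonly follow prev_event in the stories;
    a penalty for actions the stories never take from that state."""
    seen = transitions[prev_event]
    total = sum(seen.values())
    if total == 0:
        return 0.0  # the stories offer no guidance for this state
    return seen[action] / total if seen[action] else -0.5

print(shaped_reward("pay", "take_medicine"))   # ~0.67: matches the stories
print(shaped_reward("pay", "steal_medicine"))  # -0.5: never observed
```

Notice that nobody enumerates the values; they fall out of the statistics of whatever corpus the agent reads. Which is where things get interesting.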

This isn't meant to be a doomsday prediction, really. But if computers begin to learn autonomously, the number of “stories” available to them is astounding. Every day, millions of words are printed in newspapers around the world. But these are not good teaching stories, for the most part. And almost by definition, what appears in the newspaper rarely depicts normal behavior.

Dog bites man: not a story. Man bites dog: front page.

It seems to me that using stories to teach behavior hinges heavily on which stories are used. But the bulk of today's stories, in the daily press and blogosphere, don't teach the behavior we want to pass on; they dwell on the behavior we don't. And from a news story written to stand above judgment, can values be derived at all?

Garbage In; Garbage Out.
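Man-bites-dog is a textbook selection bias: headline frequency runs roughly inverse to real-world frequency, because news selects for the anomaly. A learner that treats the news as a sample of normal behavior gets it exactly backwards. A toy illustration, with numbers invented for the sake of the point:

```python
# Invented counts for illustration: actual incidents vs. what makes the paper.
real_world = {"dog_bites_man": 10000, "man_bites_dog": 1}
headlines  = {"dog_bites_man": 0,     "man_bites_dog": 1}

def estimated_normality(event, corpus):
    """Naive learner: an event's share of the corpus is how 'normal' it looks."""
    return corpus[event] / sum(corpus.values())

print(estimated_normality("man_bites_dog", real_world))  # ~0.0001: freakish
print(estimated_normality("man_bites_dog", headlines))   # 1.0: looks routine
```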

“Even with value alignment, it may not be possible to prevent all harm to human beings, but we believe that an artificial intelligence that has been encultured—that is, has adopted the values implicit to a particular culture or society—will strive to avoid psychotic-appearing behavior except under the most extreme circumstances.”

Sadly, psychotic behavior appears to be the norm today. At least, judging by the news.

As to the farming, the answer was simple enough: the ground is aerated and wet, and water expands when it freezes. Elliott & Shields, the publisher of the farming treatise, recommended winter plowing after the first frost to take advantage of the pulverized soil.
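For the curious, the effect is easy to quantify: ice is roughly 9% less dense than liquid water, so the water trapped in the soil's pores gains about 9% in volume when it freezes, which is what wedges the loosened soil apart. A quick back-of-the-envelope check:

```python
# Textbook densities near 0 °C, in g/cm^3.
water_density = 1.000
ice_density = 0.917

# Mass is conserved on freezing, so volume scales inversely with density.
expansion = water_density / ice_density - 1
print(f"Volume increase on freezing: {expansion:.1%}")  # ~9.1%
```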

What's that got to do with robots? Maybe nothing, unless they start farming (which, of course, robots already do, albeit in limited ways). But it was an interesting reminder of just how much information exists and how quickly we're moving to unlock it.

I won't be preparing the soil for another couple of months, but I will be on the lookout for robots wandering around with the day's newspaper.

Posted on February 16, 2016.