Scientists
have begun what they say will be a century-long study of the effects of
artificial intelligence on society, including on the economy, war and
crime, officials at Stanford University announced Monday.
The
project, hosted by the university, is unusual not just because of its
duration but because it seeks to track the effects of these technologies
as they reshape the roles played by human beings in a broad range of
endeavors.
“My
take is that A.I. is taking over,” said Sebastian Thrun, a well-known
roboticist who led the development of Google’s self-driving car. “A few
humans might still be ‘in charge,’ but less and less so.”
Artificial
intelligence describes computer systems that perform tasks
traditionally requiring human intelligence and perception. In 2009, the
president of the Association for the Advancement of Artificial
Intelligence, Eric Horvitz, organized a meeting of computer scientists
in California to discuss the possible ramifications of A.I. advances.
The group concluded that the advances were largely positive and lauded the “relatively graceful” progress.
But
now, in the wake of recent technological advances in computer vision,
speech recognition and robotics, scientists say they are increasingly
concerned that artificial intelligence technologies may permanently
displace human workers, roboticize warfare and make Orwellian surveillance techniques easier to develop, among other disastrous effects.
Dr.
Horvitz, now the managing director of the Redmond, Wash., campus of
Microsoft Research, last year approached John Hennessy, a computer
scientist and president of Stanford University, about the idea of a
long-term study that would chart the progress of artificial intelligence
and its effect on society. Dr. Horvitz and his wife, Mary Horvitz,
agreed to fund the initiative, called the “One Hundred Year Study on
Artificial Intelligence.”
In
an interview, Dr. Horvitz said he was unconvinced by recent warnings
that superintelligent machines were poised to outstrip human control and
abilities. Instead, he believes these technologies will have both positive
and negative effects on society.
“Loss
of control of A.I. systems has become a big concern,” he said. “It
scares people.” Rather than simply dismiss these dystopian claims, he
said, scientists must monitor and continually evaluate the
technologies.
“Even if the anxieties are unwarranted, they need to be addressed,” Dr. Horvitz said.
He
declined to divulge the size of his gift to Stanford, but said it was
sufficient to fund the study for a century and suggested the amount
might be increased in the future.
Dr.
Horvitz will lead a committee with Russ Altman, a Stanford professor of
bioengineering and computer science. The committee will include Barbara
J. Grosz, a Harvard University computer scientist; Deirdre K. Mulligan,
a lawyer and a professor in the School of Information at the University
of California, Berkeley; Yoav Shoham, a professor of computer science
at Stanford; Tom Mitchell, the chairman of the machine learning
department at Carnegie Mellon University; and Alan Mackworth, a
professor of computer science at the University of British Columbia.
The
committee will choose a panel of specialists who will produce a report
on artificial intelligence and its effects that is to be published late
in 2015. In a white paper outlining the project,
Dr. Horvitz described 18 areas that might be considered, including law,
ethics, the economy, war and crime. Future reports will be produced at
regular intervals.
Dr. Horvitz said that progress in the field of artificial intelligence had consistently been overestimated.
Indeed,
news accounts in 1958 described a neural network circuit designed by
Frank Rosenblatt, a psychologist at Cornell University. The Navy
enthusiastically announced plans to build a “thinking machine” based on
the circuits within a year for $100,000. It never happened.
Still,
Dr. Horvitz acknowledged, the pace of technological change has
accelerated, as has the reach of artificial intelligence. He cited
Stuxnet, the malicious program developed by intelligence agencies to
attack Iranian nuclear facilities, as an example.
“My
grandmother would tell me stories about people running outside when
they saw a plane fly over, it was so unusual,” he said. “Now, in a
relatively few decades, our worry is about whether we are getting a
salt-free meal when we take off from J.F.K. in a jumbo jet.”