For those who may have missed it, an article in last week’s Bloomberg Businessweek — under the heading “creating chips that learn and respond as they gain experience” — described recent and ongoing advances in AI, cognition, and human-computer interaction:
In a windowless room deep inside IBM’s Almaden Research Center in San Jose, scientists are teaching a computer chip to learn from what it sees, much like a human.
The effort is paying off, if performance at Pong is any measure. When the chip, part of a project called SyNAPSE, first learned to play the classic videogame in March, it did poorly. Weeks later, the company reports, it was nearly unbeatable.
The SyNAPSE chip was designed to learn through experience, find correlations, create hypotheses, and remember outcomes. As chips such as the one from SyNAPSE become smarter and smaller, it will be possible to embed them in everyday objects. That portends a future in which the interaction between computer and user is far more natural and ubiquitous.
“Computers were originally designed to solve math problems and that’s what they’re really good at — symbolic computation,” says Steve Esser, one of three scientists teaching the SyNAPSE chip. “Anything that involves visual processing, auditory processing, or speech processing — they can do it, but they’re just not very good at it…”
“[But] computing is undergoing the most remarkable transformation since the invention of the PC,” said Intel Chief Executive Officer Paul Otellini during his company’s developer conference in September. “The innovation of the next decade is going to outstrip the innovation of the past three combined.”
Check out a video of SyNAPSE after the jump…
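The story doesn’t describe how the SyNAPSE chip actually learns, but for readers who want a rough feel for what “learning through experience, remembering outcomes” can look like in code, here is a tiny, purely conventional reinforcement-learning sketch in Python: tabular Q-learning on a toy catch-the-falling-ball game. Everything in it (the game, the parameters, the update rule) is illustrative and is not IBM’s method or architecture.

```python
# A minimal tabular Q-learning sketch for a toy "catch the falling ball" game.
# It illustrates learning from experience and remembering outcomes in general;
# it is NOT how IBM's SyNAPSE chip works, just a conventional RL toy example.
import random
from collections import defaultdict

WIDTH, HEIGHT = 5, 6            # size of the playing field
ACTIONS = (-1, 0, 1)            # move paddle left, stay, move right

def play_episode(q, epsilon=0.1, alpha=0.5, gamma=0.9):
    """Drop one ball and update the Q-table from the outcome."""
    ball_x, ball_y = random.randrange(WIDTH), 0
    paddle_x = WIDTH // 2
    while ball_y < HEIGHT - 1:
        state = (ball_x, ball_y, paddle_x)
        # epsilon-greedy: mostly exploit what has been learned, sometimes explore
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        paddle_x = min(WIDTH - 1, max(0, paddle_x + action))
        ball_y += 1
        # reward only at the bottom row: +1 for a catch, -1 for a miss
        done = ball_y == HEIGHT - 1
        reward = (1.0 if paddle_x == ball_x else -1.0) if done else 0.0
        next_state = (ball_x, ball_y, paddle_x)
        best_next = 0.0 if done else max(q[(next_state, a)] for a in ACTIONS)
        # Q-learning update: remember how well this action worked in this state
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
    return reward

q = defaultdict(float)
for block in range(5):
    results = [play_episode(q) for _ in range(2000)]
    print(f"block {block}: catch rate {results.count(1.0) / len(results):.2f}")
```

Run for a few thousand games, the catch rate should climb from near chance toward nearly perfect, loosely mirroring the weeks-of-Pong trajectory described above, though by a very different mechanism than the chip itself.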
And still more from the Businessweek story:
At Microsoft, researchers are looking at the ways in which users will interact with computers in 3D spaces. In September the company announced a project reminiscent of Star Trek’s Holodeck, a simulated reality room where people could interact naturally with virtual objects and other individuals.
Microsoft’s effort, called Holodesk, is like a mini-Holodeck for the office desk, letting workers interact with and manipulate virtual 3D images. It uses an Xbox Kinect camera and an optically transparent display to give people the illusion that they’re interacting with 3D graphics. For example, a user can juggle virtual balls or hold a virtual prototype of a smartphone.
The Holodesk is part of Microsoft’s work on what it calls natural user interfaces, research into how people will interact with computers when computing power is everywhere and no longer limited to a PC.
“There is this real sense that this is a dramatic new trend for the industry and for Microsoft,” said Steve Clayton, who writes about internal research for the company blog, Next at Microsoft. “We’ve been investing a lot of time and effort over this vision.”
At Intel, the world’s biggest chipmaker, Brian David Johnson spends quite a bit of time thinking about the future — the year 2020, to be precise. In fact, the futurist recently participated in a conference call about building Intel’s 2020 CPU. As chips become embedded in many different devices, and not just personal computers, the company has realized that it needs to change its approach.
“Fast and less-expensive and smaller isn’t enough anymore; we really need to have an understanding of what we’re going to do with it,” says Johnson, who travels the world talking to people about how they envision the future.
“To be a human in 2020, it will begin to feel like data is taking on a life of its own,” he says. The proliferation of computing into everyday objects will generate massive quantities of sensor and other data, with algorithms talking to algorithms and machines talking to machines, he adds. “That algorithm — that thing that processes that massive amount of data — will need to have an understanding of what it means to be human.”
Innovations in computing are also leading to discoveries in medicine and other industries. At IBM, researchers have found that the same particles used to create silicon chips can be used in the human body to fight antibiotic-resistant staph infections, heal wounds, and even help fight cancer.
At IBM, Esser’s focus will remain on teaching the SyNAPSE chip to learn and remember — something Watson, the company’s Jeopardy-playing supercomputer, couldn’t do.
“With Watson, someone had to type in the Jeopardy questions. He wasn’t able to listen to Alex Trebek and understand what he was saying,” says Esser. “This chip,” he says of the SyNAPSE project, “would be able to listen to him.”
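The Holodesk description above also suggests a simple mental model: a depth camera tracks the user’s hand, and the system decides frame by frame whether the hand is grabbing a virtual object and, if so, moves that object with it. Here is a toy Python sketch of that loop. The camera frames are mocked, and none of the names correspond to real Kinect or Microsoft Research APIs; it only illustrates the grab-and-drag idea.

```python
# A toy sketch of a Holodesk-style interaction loop: a (mocked) depth camera
# reports a hand position, and a pinching hand can grab and drag a virtual ball.
# The real Holodesk uses a Kinect and an optically transparent display, whose
# APIs are not shown or used here.
import math
from dataclasses import dataclass

@dataclass
class Ball:
    x: float
    y: float
    z: float
    radius: float = 0.05      # metres
    held: bool = False

def update(ball, hand, pinching):
    """Grab the ball when a pinching hand is inside it; drag it while pinched."""
    dist = math.dist((ball.x, ball.y, ball.z), hand)
    if pinching and (ball.held or dist <= ball.radius):
        ball.held = True
        ball.x, ball.y, ball.z = hand      # ball follows the hand
    else:
        ball.held = False
    return ball

# Mocked stream of (hand position, is_pinching) frames standing in for camera data
frames = [((0.10, 0.20, 0.50), False),
          ((0.30, 0.25, 0.40), True),      # pinch near the ball: grab it
          ((0.35, 0.30, 0.35), True),      # drag it
          ((0.35, 0.30, 0.35), False)]     # release

ball = Ball(0.30, 0.25, 0.40)
for hand, pinching in frames:
    update(ball, hand, pinching)
    print(f"ball at ({ball.x:.2f}, {ball.y:.2f}, {ball.z:.2f}) held={ball.held}")
```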
Read the full story here — and share your thoughts below.
(Contributed by Erwin Gianchandani, CCC Director)