Technology writer Steve Lohr had a great piece about the future of computing — and other fields — in The New York Times yesterday:
TECHNOLOGY tends to cascade into the marketplace in waves. Think of personal computers in the 1980s, the Internet in the 1990s and smartphones in the last five years.
Computing may be on the cusp of another such wave. This one, many researchers and entrepreneurs say, will be based on smarter machines and software that will automate more tasks and help people make better decisions in business, science and government. And the technological building blocks, both hardware and software, are falling into place, stirring optimism [more following the link…].
Michael R. Stonebraker, a pioneer in database research, is one of the optimists. Software used by companies and government agencies — in products sold by Oracle, IBM, Microsoft and others — descends from research done in the 1970s by Mr. Stonebraker and Eugene Wong, a colleague at the University of California, Berkeley, as well as a team of scientists at IBM.
Today, Mr. Stonebraker sees an opportunity for new kinds of ultrafast databases. The new software, he explains, takes advantage of rapid advances in computer hardware to help businesses and researchers find insights in the rising flood of data coming from so many sources, including Web-browsing trails, sensor data, genetic testing and stock trading.
So, at 68, Mr. Stonebraker is a co-founder and chief technology officer of two start-ups in the field of data-driven discovery, VoltDB and Paradigm4.
“Now is the time,” says Mr. Stonebraker, who is an adjunct professor at the Massachusetts Institute of Technology’s computer science and artificial intelligence laboratory. “The economics and the technology are ripe.”
The case for optimism is by no means unqualified. The march of these technologies raises social issues, including privacy concerns, and the timing is uncertain. All of the bold predictions in the 1990s that the Internet would disrupt traditional industries like media, advertising and retailing did come true — a decade later.
But a series of related technologies, scientists and entrepreneurs say, has reached a critical mass — come to a digital boiling point, so to speak — so that new products and capabilities become possible. The technical ingredients, they note, include powerful, low-cost computing and storage spread across thousands of computers. The digital engine rooms of Google and Amazon are prime examples.
Another fast-improving technology involves inexpensive and intelligent sensors, which are crucial to a new breed of automated machines like experimental driverless cars and battlefield drones. Clever software — notably machine-learning algorithms — animates much of the current wave of smarter technology. Two well-known examples are found in Watson, the “Jeopardy”-winning computer from IBM, and the movie recommendations on Netflix.
ADVANCES in such underlying technologies are fueling the current excitement in fields like artificial intelligence, robotics and data analysis and prediction. “All parts of the technology pipeline are gearing up at the same time, and that’s how you get this explosion of new applications and uses,” says Jon Kleinberg, a computer scientist at Cornell University.
Behind the seeming explosion, experts say, is a process of technology evolution. Paul Saffo, a technology forecaster, compares the process to the evolutionary biology concept known as “punctuated equilibria” formulated by the paleontologists Stephen Jay Gould and Niles Eldredge. The idea is that species often evolve in periodic spurts.
Yet, they say, there are typically years of progress before a commercial breakthrough in the technological realm.
“Even in Silicon Valley, it takes most technologies 20 years to become overnight successes,” says Mr. Saffo, a consulting professor at Stanford’s school of engineering.
The Internet provides a case study of both technology’s evolutionary progress and its exponential growth. In 1969, there were only four computers connected to the nascent Internet, compared with roughly a billion computing devices today, from laptops to cellphones, says Edward Lazowska, a computer scientist at the University of Washington [and chair of the Computing Community Consortium Council].
The early increases in connected computers drew scant attention. “But at some point in the late 1990s,” Mr. Lazowska says, “you were going from 4 million to 8 million to 16 million to 32 million to 64 million, and people started to notice that something revolutionary was going on.”
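[To make Mr. Lazowska's doubling arithmetic concrete, here is a minimal sketch of our own; the ~18-month doubling period is an assumed, illustrative figure chosen only so the numbers land in the right ballpark, not something taken from the article.]

```python
# Toy doubling calculation (our illustration, not from the article):
# start from the 4 machines on the 1969 ARPANET and double until we pass
# roughly a billion connected devices.
hosts, year = 4, 1969.0
while hosts < 1_000_000_000:
    hosts *= 2          # each step: twice as many connected machines
    year += 1.5         # assumed doubling interval of ~18 months
print(f"~{hosts:,} devices reached around {year:.0f}")
# The interesting part is the shape of the curve: the same doubling that
# goes unnoticed at 4 -> 8 -> 16 machines produces the 32M -> 64M jumps
# people finally noticed in the late 1990s.
```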
…
FOR Mr. Stonebraker, the hardware advance that opens the door to his start-ups is the striking improvement of solid-state memory, as performance climbs and prices plunge. Solid-state, or flash, memory is most widely known as the lightweight storage technology used in consumer devices like small music players and smartphones.
But increasingly, solid-state memory can be used in big computers, holding a hefty database in memory instead of sending data off to be stored on disk drives. According to Mr. Stonebraker, some data-handling tasks can now be completed 50 times faster than with conventional systems.
“Memory is the new disk,” he says. “The obvious thing to do is to exploit that technology.”
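[To illustrate the "memory is the new disk" point, here is a deliberately simple sketch of ours, in Python; it is hypothetical code, not VoltDB's or Paradigm4's actual architecture, and the disk path re-parses a whole file on every lookup, which exaggerates the gap, but the direction of the comparison is the idea.]

```python
import json, time

# Hypothetical contrast: the same lookup served by rereading a file from
# disk versus a table already resident in RAM.
records = {str(i): i * i for i in range(100_000)}

with open("records.json", "w") as f:       # disk-backed copy of the data
    json.dump(records, f)

def lookup_disk(key):
    with open("records.json") as f:        # go back to disk every time
        return json.load(f)[key]

def lookup_memory(key, table=records):     # data already held in memory
    return table[key]

for fn in (lookup_disk, lookup_memory):
    start = time.perf_counter()
    result = fn("12345")
    print(f"{fn.__name__}: {result} in {time.perf_counter() - start:.6f}s")
```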
In the yin and yang of computing, it is software that exploits hardware, enabling a computer to do useful things. And machine-learning programs and other data-sifting software are advancing swiftly.
“There is no point in collecting and storing all this data if the algorithms are not able to find useful patterns and insights in the data,” says Mr. Kleinberg at Cornell. “But the software is scaling up to the task.”
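[A toy example of what "finding useful patterns" can mean at the smallest possible scale, written by us rather than drawn from the article: an ordinary least-squares fit recovering a linear trend buried in noisy data. Real data-sifting systems operate at vastly larger scale, but the idea of an algorithm pulling structure out of noise is the same.]

```python
import random
import statistics

# Generate noisy observations around a hidden linear trend (true slope 3,
# intercept 7), then recover that trend with ordinary least squares.
random.seed(0)
xs = list(range(1000))
ys = [3.0 * x + 7.0 + random.gauss(0, 25) for x in xs]

mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(f"recovered trend: y ~ {slope:.2f}*x + {intercept:.2f}")  # close to 3x + 7
```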
Check out the full article here.
(Contributed by Erwin Gianchandani, CCC Director)