On October 20, as part of the rollout of the National Strategic Computing Initiative, the White House announced its nanotechnology-inspired grand challenge to develop transformational computing capabilities by combining innovations in multiple scientific disciplines:

> Create a new type of computer that can proactively interpret and learn from data, solve unfamiliar problems using what it has learned, and operate with the energy efficiency of the human brain.
In conjunction with this rollout, the Computing Community Consortium (CCC) has released a statement of support as well as a whitepaper, *Opportunities and Challenges for Next Generation Computing*.
This whitepaper articulates opportunities and challenges for dramatic performance improvements in computing, from the personal to the national scale, and discusses some “out of the box” possibilities for achieving computing at this scale.
The Large Scale Computing Challenges include:
- Anticipating extreme weather events through modeling and monitoring
- Understanding quantum effects in materials and chemistry models
- Search engines for science
- Prediction of human-in-the-loop systems
New approaches are needed to enable the next generation of computing innovations. An immense reservoir of innovation remains within reach if computing performance continues to advance at all levels, from personal devices to national-scale systems. The challenges outlined above frame some of these opportunities. To achieve them, we should invest in a diverse research portfolio that enables continued growth in performance and cost-performance.
For more information, see the Office of Science and Technology Policy (OSTP) blog and the list of supporting documents. Follow this blog for a synopsis of the NSCI Workshop later this week.