Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.


Archive for the ‘big science’ category


Cross-layer Reliability Visioning Progress

August 23rd, 2009 / in big science, research horizons, workshop reports / by Kapilendra Patnaik

The Cross-layer Reliability Visioning Study Group met July 8-9, 2009 in Los Alamos, NM.  This was the second of three scheduled meetings focused on how to address the growing challenges imposed by changes in device technology, system sizes, and application requirements.  A major goal of the Visioning process is to reach some consensus on how to achieve reliable computing using unpredictable components across different layers that dictate system reliability (i.e., device technology, design, architecture, software).  While the first meeting focused on defining the multi-dimensional cross-layer reliability design space, including both theoretical and practical aspects of the problem, the second meeting focused on considering cross-layer reliability from different application domains (e.g., […]

Does Better Security Depend on a Better Internet?

February 21st, 2009 / in big science, research horizons / by Peter Lee

Last week the New York Times printed an article by John Markoff entitled "Do We Need a New Internet?" In the article, Markoff states, "…there is a growing belief among engineers and security experts that Internet security and privacy have become so maddeningly elusive that the only way to fix the problem is to start over." Stanford's Nick McKeown is quoted in the article: "Unless we're willing to rethink today's Internet, we're just waiting for a series of public catastrophes." The article speculates that in a new network architecture, some users would "give up their anonymity and certain freedoms in return for safety." It's certainly exciting to see core computer […]

LSST Science Requirements

June 17th, 2008 / in big science / by Peter Lee

NSF maintains an account for Major Research Equipment and Facilities Construction (MREFC) to support the development of very large research instruments. Typically, the goal of these instruments, which may cost hundreds of millions of dollars to build and tens of millions of dollars annually to operate, is to answer some of the most fundamental questions in science today. For example, LIGO (the Laser Interferometer Gravitational-wave Observatory) is designed to detect ripples in space-time caused by changes in very large masses (e.g., a star exploding). Such observations, if made successfully, would finally confirm Einstein's prediction of the existence of gravitational waves. LIGO has a construction cost of about $300M and […]