Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.


Archive for the ‘big science’ category


Computing a Cure for HIV

June 27th, 2014 / in big science / by Ann Drobnis

On June 26, the National Science Foundation (NSF) released a Discovery article titled Computing a Cure for HIV, written by Aaron Dubrow, Public Affairs Specialist in the Office of Legislative & Public Affairs.  The article provides an overview of the disease and how it continues to afflict millions of people worldwide. Over the past decade, scientists have been using the power of supercomputers “to better understand how the HIV virus interacts with the cells it infects, to discover or design new drugs that can attack the virus at its weak spots and even to use genetic information about the exact variants of the virus to develop patient-specific treatments.” Here are 9 […]

Recent ISAT/DARPA Workshop Targeted Approximate Computing

June 23rd, 2014 / in big science, CCC, policy, Research News / by Ann Drobnis

The following is a special contribution to this blog by CCC Executive Council Member Mark Hill and workshop organizers Luis Ceze, Associate Professor in the Department of Computer Science and Engineering at the University of Washington, and James Larus, Full Professor and Head of the School of Computer and Communication Sciences at the École Polytechnique Fédérale de Lausanne.  Luis Ceze and Jim Larus organized a DARPA ISAT workshop on Approximate Computing in February 2014. The goal was to discuss how to obtain 10-100x improvements in performance, and comparable gains in MIPS/watt, from future hardware by carefully trading off the accuracy of a computation for these other goals. The focus was not the underlying […]
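The excerpt above captures the core bargain of approximate computing: give up a little accuracy, get back a lot of performance or energy. As a rough illustration (not drawn from the workshop itself), here is a minimal Python sketch of loop perforation, one commonly discussed approximate-computing technique, in which a computation deliberately skips a fraction of its loop iterations to cut work roughly proportionally while accepting a small, measurable error:

```python
# Illustrative sketch only: loop perforation, a commonly cited
# approximate-computing technique. Not code from the ISAT workshop.
import random

def exact_mean(values):
    """Baseline: visit every element (full accuracy, full cost)."""
    return sum(values) / len(values)

def perforated_mean(values, skip=10):
    """Approximate: visit only every `skip`-th element, trading a
    small accuracy loss for roughly a `skip`-fold reduction in work."""
    sampled = values[::skip]
    return sum(sampled) / len(sampled)

if __name__ == "__main__":
    random.seed(42)
    data = [random.gauss(100.0, 15.0) for _ in range(1_000_000)]
    exact = exact_mean(data)
    approx = perforated_mean(data, skip=10)  # ~10x less work
    print(f"exact  = {exact:.4f}")
    print(f"approx = {approx:.4f}")
    print(f"relative error = {abs(approx - exact) / exact:.2%}")
```

On an input like this, the perforated version does about a tenth of the work yet typically lands within a small fraction of a percent of the exact answer, which is the kind of accuracy-for-efficiency trade the workshop set out to explore systematically in hardware.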

DARPA Officially Launches Robotics Grand Challenge – Watch Pet-Proto Robot in Action

October 24th, 2012 / in big science, research horizons, Research News / by Kenneth Hines

Today, the Defense Advanced Research Projects Agency (DARPA) officially kicked off its newest Grand Challenge, the DARPA Robotics Challenge (DRC). As we’ve blogged previously, the Grand Challenge calls for “a humanoid robot (with a bias toward bipedal designs) that can be used in rough terrain and for industrial disasters.” DARPA also released a video of Pet-Proto, a humanoid robot manufactured by Boston Dynamics. Pet-Proto, a predecessor to DARPA’s Atlas robot, is an example of what the agency envisions for the challenge. Watch Pet-Proto in action as it navigates obstacles:  More about the challenge from DARPA: The Department of Defense’s strategic plan calls for the Joint Force to conduct humanitarian, disaster relief, and related operations.  The plan identifies requirements to extend aid […]

NSF Announces “Exploiting Parallelism and Scalability” (XPS) Program

October 23rd, 2012 / in big science, research horizons, resources / by Kenneth Hines

This week, the National Science Foundation issued a solicitation for its new Exploiting Parallelism and Scalability (XPS) program. The program aims to support groundbreaking research leading to a new era of scalable computing. NSF estimates that $15 million in awards will be made in FY 2013 for this program.  As the solicitation notes, the Computing Community Consortium (CCC) furnished a white paper earlier this year titled 21st Century Computer Architecture, in which members of the computing research community contributed strategic thinking in this space.  The white paper drew upon a number of earlier efforts, including CCC’s Advancing Computer Architecture Research (ACAR) visioning reports. Here is a synopsis of the Exploiting Parallelism and Scalability (XPS) program from the National Science Foundation: Computing […]

NSF Awards $21 Million to Enable Use of Big Data

October 15th, 2012 / in awards, big science, Research News / by Kenneth Hines

Last week, the National Science Foundation (NSF) awarded $21.6 million to 34 institutions across the country through the foundation’s Campus Cyberinfrastructure-Network Infrastructure and Engineering (CC-NIE) program. The projects will seek to improve the U.S. university and college computer networks needed to move the large data sets required for data-intensive scientific research. The awards, spanning 34 institutions across 23 states, fall into two categories: Network Integration and Applied Innovation awards provide support of up to $1 million for up to two years.  These awards address the challenges of achieving end-user network performance across complex, distributed research and education environments.  They seek to integrate existing and new technologies with applied innovations by […]

GNS Healthcare and Aetna Collaborate to Make Use of Big Data

September 27th, 2012 / in big science, research horizons, Research News / by Kenneth Hines

GNS Healthcare, a healthcare analytics company, and Aetna, an American managed health care organization, are collaborating to make use of GNS’ supercomputer “REFS” (Reverse Engineering and Forward Simulation). By applying predictive analytics to Aetna claims and other health information, the REFS platform will create data models to aid the early identification of metabolic syndrome, which increases the risk of heart disease, stroke, and diabetes. To see how REFS works with data and creates models, watch the video below: Read more about the collaboration in the GNS Healthcare press release below:
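The press release does not describe how REFS works internally, but as a rough, hypothetical illustration of the kind of claims-data screening such models support, here is a minimal Python sketch that flags member records meeting the widely cited ATP III-style definition of metabolic syndrome (three or more of five risk factors). The field names and thresholds here are illustrative assumptions, not GNS’ or Aetna’s actual data model:

```python
# Hypothetical sketch only: NOT the REFS platform or Aetna's data model.
# Flags records meeting an ATP III-style definition of metabolic syndrome
# (three or more of five risk factors); field names and thresholds are
# illustrative assumptions.
from dataclasses import dataclass

@dataclass
class MemberRecord:
    sex: str                        # "M" or "F"
    waist_cm: float                 # waist circumference
    triglycerides_mg_dl: float
    hdl_mg_dl: float
    systolic_bp: float
    diastolic_bp: float
    fasting_glucose_mg_dl: float

def risk_factor_count(r: MemberRecord) -> int:
    """Count how many of the five ATP III-style criteria a record meets."""
    factors = 0
    factors += r.waist_cm > (102 if r.sex == "M" else 88)
    factors += r.triglycerides_mg_dl >= 150
    factors += r.hdl_mg_dl < (40 if r.sex == "M" else 50)
    factors += r.systolic_bp >= 130 or r.diastolic_bp >= 85
    factors += r.fasting_glucose_mg_dl >= 100
    return factors

def flag_for_outreach(records: list[MemberRecord]) -> list[MemberRecord]:
    """Return records meeting >= 3 criteria: candidates for early intervention."""
    return [r for r in records if risk_factor_count(r) >= 3]

if __name__ == "__main__":
    sample = MemberRecord("M", 110.0, 180.0, 35.0, 135.0, 80.0, 105.0)
    print(risk_factor_count(sample))         # -> 5
    print(len(flag_for_outreach([sample])))  # -> 1
```

A production system like REFS presumably goes well beyond fixed thresholds, learning predictive models from claims histories; this sketch only illustrates the screening output such models feed.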