Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.

Archive for the ‘policy’ category

NSF-Led Merit Review Global Summit Results in Six Principles
June 6th, 2012 / in policy, resources, workshop reports / by Erwin Gianchandani
Back in January, the National Science Board (NSB) released a report — National Science Foundation’s Merit Review Criteria: Review and Revisions — recommending that the National Science Foundation (NSF) “better define the two criteria for the benefit of the science community.” The report specified three principles governing NSF’s approach to using these criteria. Last month, NSF, together with research councils from 50 countries, established a Global Research Council and issued six merit review principles at the conclusion of the first-ever Global Summit on Merit Review. The principles (following the link):

“Rethinking Privacy in an Era of Big Data”
June 5th, 2012 / in big science, conference reports, policy, research horizons, Research News / by Erwin Gianchandani
Last week, UC Berkeley’s School of Information held a forum — called the DataEDGE Conference — exploring the challenges and opportunities associated with the transition to a data-intensive economy. One of the speakers was danah boyd, Senior Researcher at Microsoft Research and an Assistant Professor at New York University, who discussed the implications of Big Data for privacy — and the role for researchers and technologists moving forward. The New York Times’ Bits Blog has coverage of boyd’s talk: “Privacy is a source of tremendous tension and anxiety in Big Data,” says Danah Boyd, a senior researcher at Microsoft Research. Speaking last week at a conference on Big Data at the University of […]
NIST Holding BIG DATA Workshop Next Week
June 4th, 2012 / in big science, policy, research horizons, resources / by Erwin Gianchandani
The National Institute of Standards and Technology’s (NIST) Information Technology Laboratory (ITL) has announced plans to hold a workshop on its Gaithersburg, MD, campus next week — Wednesday and Thursday, June 13 and 14 — exploring “key national priority topics” in support of the Federal government’s recently announced Big Data R&D Initiative. The BIG DATA Workshop is free and open to all, but attendees must pre-register online by this Wednesday, June 6th, in order to clear security. According to NIST/ITL (following the link):
“Five Reasons ‘Big Data’ is a Big Deal”
May 29th, 2012 / in big science, policy, research horizons, Research News / by Erwin Gianchandani
Mobiledia is out this week with an interesting article about “Big Data”: Technology is improving Siri, powering driverless cars, improving cancer treatment and even being called Big Brother. But “big data” is what makes it possible, and why it’s so important. Big data refers to the analytic algorithms applied to vast amounts of data across several different places, or simply the math and computer formulas used to sift through massive amounts of data and analyze the results to answer questions and solve problems. The edge big data has over traditional analytics is its ability to include data types that aren’t organized in tabular formats, including written documents, images and […]
Data, Computing at Center of Presidential Advisors’ Meeting
May 25th, 2012 / in big science, policy, research horizons, Research News / by Erwin Gianchandani
Data and computing were front and center at today’s meeting of the President’s Council of Advisors on Science and Technology (PCAST) in Washington, DC, with U.S. Chief Technology Officer (CTO) Todd Park summarizing the Administration’s rollout this week of a “digital roadmap” seeking to take advantage of existing government data repositories — and David Ferrucci, head of IBM’s Watson project, and Anthony Levandowski, product manager for Google’s self-driving car technology, delivering talks about the fundamental advances being enabled by their teams’ work (more following the link).
“Troves of Personal Data, Forbidden to Researchers”
May 21st, 2012 / in policy, Research News / by Erwin Gianchandani
The New York Times has posted an interesting story to its website this evening — authored by John Markoff — describing researchers’ access to personal data collected by companies: When scientists publish their research, they also make the underlying data available so the results can be verified by other scientists. At least that is how the system is supposed to work. But lately social scientists have come up against an exception that is, true to its name, huge. It is “big data,” the vast sets of information gathered by researchers at companies like Facebook, Google and Microsoft from patterns of cellphone calls, text messages and Internet clicks by millions of users around the world. Companies often refuse to make such […]