Grand Challenges from the 11th Heidelberg Laureate Forum

September 26th, 2024 / in conferences, research horizons / by Haley Griffin

Today is day 4 of the 11th Heidelberg Laureate Forum. Throughout the week I have been asking the computing laureates to identify the grandest challenges in computing research, and extrapolating additional challenges from relevant lectures and discussions. Here are some of the challenges that emerged:

  • Increasing Data Efficiency of Computing Systems. Dr. Alexei Efros posited that computers must learn from less data in order to solve a wider range of problems. While children are very good at learning from just a few examples, computers remain far less data efficient.
  • Improving Accuracy of Large Language Models. Dr. Vinton Cerf identified hallucination as a significant problem with LLMs today, especially on serious topics like financial or medical advice. He explained that the confident tone of chatbot-style systems reflects the quality of the human writing from which they generate their conclusions, not their actual performance. He concluded, “we have a big job in the community that works in this to find ways of detecting and defending against that kind of failure.”
  • Understanding the “Why” of AI. Dr. David A. Patterson discussed the dominance of AI in the field of computing today: while many experts have ideas on how to improve AI, especially how it understands and interprets information, there is no underlying theory as to why it achieves these things. He believes that if we were able to understand the “why,” we would be able to make more efficient use of AI.
  • Reducing the Power Requirements of Computing. Dr. Vinton Cerf explained the importance of reducing the power needs of computing systems, citing both environmental and cost concerns. Relatedly, he also expressed the need to find ways of generating power that do not produce carbon dioxide.
  • Identifying Malicious Deep Fakes and Disinformation. Dr. Raj Reddy spoke about deep fakes and disinformation, which he called “the bane of our society,” and suggested using AI tools to help identify them so that they can be corrected and/or removed.

When I asked Dr. John Hopcroft about the grand challenges in computing, he answered through a computing education lens: “I think the grand challenges are not actually in the computer itself, but in creating the talent of computer scientists. Some of the problems in the US, there is so much emphasis on success [publications and awarded funding] that I think it’s hurting things…one of the grand challenges is, how do we get the creation of talent better?”

The CCC is continuing to work on identifying and defining grand challenges in computing research, and there will be more to come on these efforts.

I look forward to continuing to share emerging ideas from Heidelberg!
