The following is a special contribution to this blog from Elizabeth L. Grossman, a member of Microsoft Corporation’s Technology Policy Group.
Yesterday, Microsoft’s Innovation & Policy Center in Washington, DC, hosted a panel discussion on “21st Century Cities” as part of the @Microsoft Conversations series. The panel explored the technology and policy opportunities and challenges in making our cities smarter and more energy efficient, such as how information technology (IT) can link people, transportation, and buildings. This blog post describes some of Microsoft’s sustainability activities in this area — particularly how we use our campus and facilities as a living lab — and future directions in computing and policy to realize these benefits at scale.
At the panel discussion, Rob Bernard, Chief Environmental Strategist at Microsoft, described a pilot project on Microsoft’s campus in Washington state that illustrates how IT can reduce energy use in buildings. Specifically, we are working with partners to instrument our buildings and to develop analytics and visualization tools that let us integrate, and make decisions from, the data (500 million data points a day) coming from new and existing sensors and systems. We had to develop connectors to abstract and integrate data from multiple systems, and, because we can now see as many as 9,000 alarms in a single day, we also had to develop models, algorithms, and rules that help determine not just what is wrong with a system, but which functions and people are affected and what the costs and business impact of fixing (or not fixing) a problem may be. That is the information needed to make good decisions about what to respond to, and how.
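To make that triage step concrete, here is a minimal sketch in Python of ranking alarms by the impact of leaving them unfixed rather than by arrival order. The alarm fields, weights, and scoring rule are illustrative assumptions, not the models actually used in the pilot.

```python
from dataclasses import dataclass

@dataclass
class Alarm:
    """One alarm from a building system (fields are illustrative)."""
    equipment: str
    fault: str
    people_affected: int   # occupants impacted while the fault persists
    cost_per_day: float    # estimated daily cost of leaving it unfixed
    repair_cost: float     # estimated cost of fixing it now

def priority(alarm: Alarm) -> float:
    """Score an alarm by the business impact of *not* fixing it,
    weighed against the cost of the repair (weights are made up)."""
    impact = alarm.cost_per_day + 50.0 * alarm.people_affected
    return impact / max(alarm.repair_cost, 1.0)

alarms = [
    Alarm("AHU-3", "stuck damper", 120, 400.0, 250.0),
    Alarm("VAV-17", "sensor drift", 4, 15.0, 600.0),
]

# Respond to the highest-impact alarms first, not simply the newest.
for a in sorted(alarms, key=priority, reverse=True):
    print(f"{a.equipment}: {a.fault} (priority {priority(a):.2f})")
```

In practice the scoring would draw on far richer operational and business data, but the shape of the problem is the same: turn thousands of raw alarms into a short, ordered to-do list.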
The project is in its first phase of deployment, but we are already seeing benefits and savings. One source is the integration of information that lets us run our buildings more cost- and energy-efficiently (e.g., staggering when heating and cooling systems turn on at the start of the day). Another is the real-time commissioning of building systems: by leveraging the continuous data flow coupled with fault-detection algorithms, we can tune or replace equipment based on performance and priority, not on a fixed schedule of on-site inspections.
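As a rough illustration of the staggered-startup example, the sketch below spaces out the power-on times of several air-handling units so each finishes warming up by occupancy time without all of them drawing peak power at once. The unit names, warm-up times, and spacing rule are hypothetical; the post does not describe the actual scheduling logic.

```python
from datetime import datetime, timedelta

# Hypothetical air-handling units and their warm-up times in minutes.
WARMUP = {"AHU-1": 45, "AHU-2": 30, "AHU-3": 60}

def staggered_starts(occupancy: datetime, gap: timedelta = timedelta(minutes=10)):
    """Pick a start time per unit so every unit is warmed up by occupancy
    time, but no two units switch on within `gap` of each other
    (avoiding a simultaneous demand spike)."""
    # Latest allowed start for each unit = occupancy minus its warm-up.
    deadlines = sorted(
        ((occupancy - timedelta(minutes=w), name) for name, w in WARMUP.items()),
        reverse=True,  # handle the latest deadline first
    )
    schedule = {}
    next_allowed = occupancy  # no start needs to be later than occupancy
    for deadline, name in deadlines:
        start = min(deadline, next_allowed)
        schedule[name] = start
        next_allowed = start - gap  # the next unit must start earlier still
    return schedule

for name, start in staggered_starts(datetime(2011, 6, 1, 8, 0)).items():
    print(name, "starts at", start.strftime("%H:%M"))
```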
As promising as the initial indications from the pilot project are, building owners, operators, and computing experts still face open questions and challenges, as was discussed at yesterday’s panel in Washington, DC. Future opportunities include:
- Better Analytic Models: Improving algorithms, rules, and tools is an important step toward effective presentation of information and good decision-making based on multiple factors, such as cost, energy use, and security. There will also need to be ways to track faults not just in the systems being monitored, but in the sensors and monitoring systems themselves (a simple sanity-check sketch follows this list). This challenge is made even more complicated by the myriad types of equipment and protocols with varying degrees of “smarts.”
- Scaling Up: There are questions about the technologies, business models, and incentives that will enable the use of IT in buildings to scale up. Local governments might want to go from one building to whole cities. Organizations with an international footprint (such as multinational companies or those that run Department of Defense installations) might want a global view of facilities in different countries. Multiple infrastructure sectors (e.g., buildings, transportation, and the power grid) might want to integrate across sectors, for example in planning for and adjusting the interactions of electric vehicles and building electricity use, or of traffic congestion and building location.
- Privacy and Security: As these systems, from automated toll collection with congestion pricing to adjustment of heating and cooling based on the number of people in a room, become more sophisticated, deployment strategies will need to be sensitive to security and privacy goals. It will be vital to be thoughtful about the policies and technologies that control who needs what access to data, at what level of detail, and for how long, as well as the ways information will be aggregated, anonymized, and protected (a toy aggregation sketch also follows this list).
- Installed Base: We’re not building new cities from scratch; we will have to figure out how to integrate information technology into existing infrastructure (such as buildings and automobiles). Buildings and other structures like bridges can last 50 to 100 years and are already in place and in use. Approaches to gathering data will need to be flexible about the level of disruption they impose, and both sensors and analytics will need to be upgradable.
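On the analytics point above, tracking faults in the sensors and monitoring systems themselves can start with very simple sanity checks. A minimal sketch, with illustrative thresholds for a temperature feed:

```python
from statistics import pstdev

def check_sensor(readings, lo=-30.0, hi=60.0, min_variation=0.01):
    """Flag two common sensor faults from a window of recent readings:
    values outside the physically plausible range, and a 'stuck'
    sensor whose output barely changes. Thresholds are illustrative."""
    faults = []
    if any(r < lo or r > hi for r in readings):
        faults.append("out-of-range reading")
    if len(readings) >= 2 and pstdev(readings) < min_variation:
        faults.append("possibly stuck (no variation)")
    return faults

# A stuck sensor repeats the same value; a healthy one varies slightly.
print(check_sensor([21.5] * 20))         # -> ['possibly stuck (no variation)']
print(check_sensor([21.5, 21.7, 21.4]))  # -> []
```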
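And on the privacy point, one common aggregation technique is to suppress or coarsen any reading that describes too few people, so individuals cannot be singled out. A toy sketch, with a hypothetical occupancy feed and an illustrative threshold:

```python
def aggregate_occupancy(room_counts, k=5):
    """Report per-room occupancy only when at least k people are
    present; otherwise report a coarse bucket so small groups (or
    individuals) cannot be identified. The threshold k is illustrative."""
    report = {}
    for room, count in room_counts.items():
        report[room] = count if count >= k else f"fewer than {k}"
    return report

print(aggregate_occupancy({"conf-A": 12, "office-117": 1, "lobby": 40}))
# {'conf-A': 12, 'office-117': 'fewer than 5', 'lobby': 40}
```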
These questions are all about enabling energy efficiency through IT. Another, related topic is the energy efficiency of IT itself, and Microsoft is thinking about that as well (data plants instead of data centers, and more efficient use of computational resources).
In the end, the goal of all these projects and all the participants in the 21st Century Cities panel discussion is to use constrained infrastructures and environments (from building systems to roads to the power grid) more effectively. Information technology has a central role to play in those efforts.