This week the New York Academy of Sciences published a meeting report I wrote and produced covering a recent symposium on "smart" buildings. You can see it here.
The basic idea behind smart buildings is that sensing devices such as photosensors, thermostats, and other more sophisticated gadgets are converging with increasingly powerful information technology, making it possible to use software to monitor a building's performance in real time, diagnose problems, and ideally find ways to improve it. The idea is becoming increasingly attractive not just for economic reasons, but also within the context of green building. Because buildings consume a huge share of the energy we use, improving their efficiency is seen as an attractive path toward meeting carbon emissions goals.
One big concept that came up repeatedly over the course of the meeting was the importance of thinking about building systems holistically. As automated building management systems improve, they are becoming increasingly able to integrate discrete parts of the building's operation (heating, lighting, security, etc.) into a single comprehensive picture. Such approaches offer potential ways to deal with the proverbial New York City apartment that is so overheated in the wintertime a tenant has to open the windows, to take a crude example. They could also help to identify whether a malfunction in one part of a single system really has any effect on the system as a whole.
I was particularly interested in Gregory Provan's talk. Provan is a mathematician at University College Cork in Ireland, and uses mathematical models and machine learning approaches to glean insight into how well a building is performing. In an approach called "continuous commissioning," he begins modeling the building's performance parameters during its design. Later, as the structure is built, occupied, and renovated, data are recorded on an ongoing basis and compared with the original projections to see when the building isn't meeting those goals. Already, machine learning algorithms can identify faults within a building when they spot fluctuations in the data. The future is in using this iterative approach to automate the process of improving how the building performs as a system, such that a computer could recommend solutions to inefficiencies in the way the building as a whole was planned.
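As a rough illustration of the comparison step described above (a minimal sketch, not Provan's actual models, and with entirely hypothetical numbers), fault detection in continuous commissioning might boil down to flagging periods when recorded performance drifts too far from the design-time projection:

```python
# Illustrative sketch only -- not Provan's actual method. Compares
# recorded hourly energy use against design-time projections and flags
# hours that deviate beyond a fractional tolerance.

def find_faults(projected, measured, tolerance=0.15):
    """Return the indices of hours where measured energy use deviates
    from the design projection by more than `tolerance` (a fraction)."""
    faults = []
    for hour, (expected, actual) in enumerate(zip(projected, measured)):
        if expected and abs(actual - expected) / expected > tolerance:
            faults.append(hour)
    return faults

# Hypothetical data: kWh projected at design time vs. recorded in operation
projected = [50, 52, 48, 51]
measured = [51, 70, 47, 50]  # hour 1 is well above projection

print(find_faults(projected, measured))  # → [1]
```

A real system would of course use far richer models and learned thresholds, but the iterative loop is the same: project, measure, compare, and surface the discrepancies for diagnosis.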
There was also a very interesting talk from Stephen Samouhos, a graduate student at MIT. He comes from a family that owns a construction business, and so has an insider's perspective on how building owners, builders, and operators think. His basic message was that the technology is great and will become even better, but it won't become scalable until the folks in these parts of the industry can be convinced of the economic and practical value of more expensive, more-complicated-to-run systems.
Jane Snowdon of IBM also gave a very nice overview of how some of the world's most powerful computers work, and how they could help manage not just individual buildings, but whole cities. She also touched on a very interesting point about information technology and its role in climate change. Although most of us don't think about it, computing consumes huge amounts of energy to enable us to do our Christmas shopping online. Finding ways to reduce that energy demand is something the machines themselves are helping to do.
The eBriefing includes a more detailed writeup, as well as multimedia of the speakers' lectures. Check it out.