Earlier in June I spent a week travelling around New Zealand as the keynote speaker at ProductSpec’s Paradigm Shift conference. Every night I’d give a presentation on the future of the architecture industry, and every morning we’d travel to a new city to do it all again. In total we visited five cities and spoke to hundreds and hundreds of architects (as well as most of my family).
Unfortunately the presentations couldn’t be recorded due to the confidentiality of some of the projects. What follows is a summary of the presentation for those who couldn’t make it and, for those who were there, the footnotes and backstory to the key points.
Introduction
https://www.youtube.com/watch?v=USyoT_Ha_bA
I introduced my Paradigm Shift lecture with a discussion of Ivan Sutherland’s Sketchpad (which I’ve written about before). It seemed appropriate to begin a presentation at a conference entitled “Paradigm Shift” with one of the canonical examples of a paradigm shift: Sketchpad.
Looking back, Sketchpad is an unmistakeable paradigm shift. It denotes the exact moment physical drawings were replaced by electronic representations. It is the moment that catalysed the drawing board’s death; the moment that shackled architects to the computer screen; the moment that redefined project delivery.
Although Sketchpad’s paradigm shift is obvious today, it wasn’t necessarily apparent at its conception back in 1963. It constituted a new paradigm, sure, but the shift was much less certain. Computers in 1963 were still rare, large, and expensive. It wasn’t at all obvious that we would use these highly sophisticated machines to do something as menial as drafting. Sketchpad’s paradigm was there in 1963 but the infrastructure to support the shift was almost unimaginable.
Two decades passed before computers were small enough and cheap enough that an individual architect could own one. Only then did CAD become widely adopted. Only then were we able to look back on Sketchpad and recognise it as the moment a paradigm shifted.
Introducing my talk with Sketchpad was a way of explaining the difficulty of presenting at Paradigm Shift. The organisers had asked me to reimagine architecture in twenty years' time. It’s something I often do while drinking but in the sobriety of preparing a presentation, I just couldn’t bring myself to sincerely speculate on the long-term future. The presentation I initially put together looked like a bad collection of Gizmodo articles – a top-ten listicle of cool-looking bullshit that almost definitely wouldn’t be in the future. After rebuilding the presentation a billion times, I decided the best I could do was identify a couple of major trends in America (which is already ten years in the future as far as New Zealand is concerned) and then speculate on where they might lead. Perhaps they would become new paradigms but, as Sketchpad showed, we won’t be able to definitively say they are paradigm shifts until many years later.
Data is one of the major trends in the AECO industry right now. I’m not just saying that because my employer, CASE, loves data. Sure, when I first arrived CASE gave me a t-shirt that says bldg=data. And yes, the same equation is on the back of my business cards. And yes, we start every presentation at CASE with that formula. But despite my post-doctorate indoctrination, I truly believe the logic behind bldg=data.
In some ways bldg=data is already true. Every major architecture firm is already using a database (a BIM model) to produce buildings.
But notice that the equation states ‘bldg=data’ rather than ‘bldg=BIM’. It is an important distinction. BIM has become something of a misnomer. The building information – the ‘BI’ of the acronym – is the unique aspect of BIM, but it is the model – the ‘M’ – that architects have found particularly captivating (unsurprisingly given their chosen profession). They seek the platonic ideal of a model. They seek the model described in their modelling standards; the model that is singular, infinitely interoperable, and perfectly consistent. While models are valuable, they are only valuable for the building information they contain. There is a wealth of building information that doesn’t make it into these models. At CASE it is the building information rather than the building information model that we are interested in. Hence we use the word data instead of BIM.
Also note that the equation states that ‘data equals buildings’ rather than ‘data becomes buildings’. The equals sign signifies that while data produces buildings, buildings also produce data (see my latest article in ARCHITECT). Buildings are producing data about their climate, occupancy, operational functioning, and a range of other factors. Whether the building truly equals data is up for debate, but there is no denying that buildings are increasingly becoming valuable sources of data for architects.
With data established as the paradigm, I spent the rest of the presentation considering how architects might be shifted by embracing data.
1. The client
In a recent article for ARCHITECT magazine I wrote about the technology paradox. I’ll leave you to read the article but I’ll summarise it by saying that the client, not the architect, has the most to gain from data.
To simplify a complicated situation, the architect’s typical fee structure (which is based on time rather than value) means that the architect tends to pass the benefits of technology on to their clients. This might manifest as lower fees or higher quality buildings. In either case, it is the client who stands to gain the most from architects using technology. It is the client who will shape how data is used by architects.
One thing to note about clients: they are immensely interested in what happens inside their buildings. In another article for ARCHITECT magazine I’ve discussed how building owners are beginning to track the location of people within their buildings. RetailNext (in the video above) are just one example of a company providing these tracking systems. There are far more sophisticated systems developed in-house at companies like Disney (read this article about the system they use in their theme parks).
At this stage in the presentation there were some nervous laughs and some worried looks. You could tell people were disturbed by the idea that their movements could be tracked without their knowledge.
The only thing that might allay some of these fears is that companies are not really tracking people as much as they are tracking the architecture. When a store owner wants to measure a new layout’s effectiveness, they track people’s movements as a proxy for the space’s performance. When Disney wants to measure the impact of queueing, they examine how long people remain in the park after a long queue.
The tracking of architecture might give architects reason for concern. Clients are already collecting massive amounts of data about how their buildings are performing. They will collect even more data as more and more sensors are embedded into architectural elements (which I have also written about for ARCHITECT magazine). Architecture will increasingly be evaluated in objective terms. No longer will it be enough to entice the client with a pretty render; architects will need to sell their buildings based on performance data.
I had a few examples of architects beginning to use data as a selling point, but it’s probably best I leave them out of this article.
2. The process
I’ve written a lot about the impact of technology on the design process. A few years ago I wrote an article about the MacLeamy curve and how technology potentially lowers the cost of making design changes. In my thesis I expanded this argument, drawing parallels between what has happened in software engineering and what is happening in architecture.
If you have read these articles, I don’t have much to add. For those that haven’t, a quick summary follows.
One consequence of using a data-driven model is that in some circumstances it becomes easier to make design changes. In extreme circumstances the designer can hold off seemingly pivotal decisions, such as a project’s massing, until the very end of the project. Normally these conceptual decisions would be impossible to change without discarding all prior work and starting again, but in a data-driven approach, the change might be as simple as updating some data and leaving the computer to figure out the rest.
There are two important implications. The first is that architects using data can make more informed design decisions by delaying important decisions until they have the best understanding of how these decisions impact the project. The second implication is that the design process becomes more iterative. If the project is easier to change, the architect will make more changes, test more design options, experiment more, learn more, design more.
Of course, anyone who has worked with a digital model will know that while they can be flexible, they can also break easily. I’ve written a lot about this in my thesis. So my description above is a little idealised. With that said, we are definitely moving in the direction of more flexible design processes as we adopt data-driven modelling methods.
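To make the idea of a data-driven change concrete, here is a toy sketch in Python. Everything in it is hypothetical – it is not a real BIM API, just an illustration of how dependent quantities can recompute from a few driving parameters, so that a "pivotal" late change becomes a data update rather than a restart:

```python
# Toy illustration of a data-driven model: downstream quantities are
# derived from a handful of driving parameters ("the data"), so a late
# change regenerates them automatically. All names are invented.

class MassingModel:
    def __init__(self, floors, floor_height, footprint_area):
        self.floors = floors                  # number of storeys
        self.floor_height = floor_height      # metres per storey
        self.footprint_area = footprint_area  # square metres

    @property
    def total_height(self):
        # Recomputed on every access, so it always reflects current data
        return self.floors * self.floor_height

    @property
    def gross_floor_area(self):
        return self.floors * self.footprint_area

model = MassingModel(floors=10, floor_height=3.5, footprint_area=800)
print(model.gross_floor_area)  # 8000

# A seemingly pivotal massing change, made late, is just a data update:
model.floors = 14
print(model.gross_floor_area)  # 11200
```

Real models are, of course, nowhere near this clean – which is exactly the fragility noted above – but the direction of travel is the same: decisions encoded as data, with the computer figuring out the rest.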
3. The knowledge
At CASE we occasionally talk about the DIKW Pyramid. At the base of the pyramid is data. Stacked above it sits information, then knowledge, and at the pinnacle, wisdom.
Most architectural practices are currently operating in the lower half of the pyramid. They generate vast quantities of data in everything from BIM to their emails. On occasion they transform this data into information – drawing sets and the like. Yet firms rarely go back through the data they’ve accumulated to generate new knowledge.
CASE has done a number of projects where we’ve gone back through a company’s data and extracted latent knowledge. We’ve uncovered modelling trends, we’ve exposed relationships between project economics and ecology, we’ve identified the sources of modelling errors.
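A minimal sketch of what mining accumulated project data might look like, assuming the data has already been exported to simple records. The field names and values here are invented for illustration – this is not the CASE tooling itself:

```python
# Hypothetical sketch: tally modelling warnings by element category
# across past projects to see where errors concentrate. The records
# and field names are invented for illustration.

from collections import Counter

warnings = [
    {"project": "A", "category": "Walls", "type": "overlap"},
    {"project": "A", "category": "Doors", "type": "duplicate mark"},
    {"project": "B", "category": "Walls", "type": "overlap"},
    {"project": "B", "category": "Walls", "type": "unjoined geometry"},
]

# Count warnings per element category across all projects
by_category = Counter(w["category"] for w in warnings)
print(by_category.most_common(1))  # [('Walls', 3)]
```

The analysis itself is trivial; the hard part, as with most of this, is getting the data out of project files and into a queryable form in the first place.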
As firms generate more and more data, they are going to gradually learn to leverage this data beyond immediate project aims. Like their clients, architects are going to have much better knowledge about how their design decisions are affecting the end users. Architects will use data to design from a much more knowledgeable position.
Paradigm Shifts in New Zealand
Data is changing the architecture industry. It is changing our clients, it is changing our design process, it is changing the way we learn. In my opinion, data will constitute a paradigm shift simply by virtue of its economic imperative – data is valuable.
At all five Paradigm Shift presentations, there was a question about whether New Zealand firms could benefit from data the way large American firms in my presentation had. For context: New Zealand firms tend to be small and working on tiny budgets.
Data certainly benefits from economies of scale. Larger firms are able to amortise and apply research across many projects. They also have access to far more data. As a result, larger firms have tended to be the pioneers of data in architecture. It’s strange to think of large firms as representing the avant-garde of architecture, but in this case the innovators tend to be large.
Fortunately small firms in New Zealand don’t compete with large American firms, although they tend to benefit from trickle-down innovation. BIM, for instance, has travelled from America to New Zealand. Firms in New Zealand benefit from this technology without needing to go first. They waited for others to generate the knowledge and processes, and then snapped them up once it became economically viable for them.
The same will happen with data. New Zealand firms, like all firms, are already generating vast amounts of data. Those that can harness the data will be in a better position to deliver projects to clients and ultimately make better architecture. These overseas techniques will make their way to New Zealand and perhaps then we can say that the paradigm has shifted.
Special thanks to GIB and the other sponsors of Paradigm Shift for flying me over, which saved my arms getting tired flying myself over. Dad-joke courtesy of Jon Thompson. Beyond the dad-jokes, I am also extremely thankful to Jon and Este Galinanes-Garcia for inviting me over and once more inspiring me. And thanks to Courtney and quiz-master-Don and the crew at Productspec.net for making the event happen.
Evgeny Shirinyan
Hello Daniel,
Thanks a lot for sharing the lecture synopsis!
I really appreciated your statement on building information and critique on current BIM-related buzzwords.
I’d like to share a project a friend of mine did last year; it’s more about people data than buildings: http://mathrioshka.ru/mobile-and-sensible-moscow/ It seems to me that here we have another technology (maybe less sophisticated than BIM) – GIS. A must-know for architects.
On data and information and knowledge etc.: you mean that architects work more with data and information than with knowledge. I think this statement relates first of all to the digital process, not to the design process itself, which is based more on knowledge and experience. An interesting case: I tested Inforbix, which is now included in Autodesk PLM 360 – http://www.writandraw.org/en/2012/04/11/testing-inforbix-reverse-engineering-of-an-information-management/ A similar direction (if I understand your words right) to what you are developing at CASE.
Daniel
Hey Evgeny,
I tend to agree that there’s a lot to be learnt from GIS. In internal discussions at CASE we often reference what is happening in the world of GIS for inspiration for where we might be going. The Mobile and Sensible Moscow project looks fascinating. It reminds me of a more refined version of one of the SmartGeometry clusters this year, flows and bits: flowbits.io
I hadn’t come across Inforbix before. It’s too bad that it was acquired. The CASE Building Analytics platform does something similar, although it only works with Revit at the moment. The Revit objects have lots of useful meta-data that we can use to further classify the components and check the model’s adherence to certain standards. I’m expecting we’ll see more solutions like this. All the data is already there, and as you say, if we can structure that data then we have the opportunity to extract knowledge from it.
Evgeny Shirinyan
GIS is awesome because it’s a pure database concept. Very simple, granular, and older than both of us. Have a look at QGIS, for example. It’s also a pure world of mathematics, topology, and calculation. Sometimes it’s good to reinvent the wheel, but I’ve become a bit sceptical of Processing mapping dataviz.
What was interesting with Sensible Moscow is that it’s not about technology. The results were groundbreaking for the city administration; they couldn’t believe that their picture of traffic in the city was wrong. And that is the valuable part.