Cite as: Dimitrie Stefanescu, Algorithmic Abuse, PLAT Architectural Journal, Houston, Fall 2011.
A recurring concern among the practitioners and promoters of what we shall refer to generally as “computational architecture” is the oft-mentioned (but rarely justified) “crisis of complexity” in which the world and its architects apparently find themselves. This condition manifests itself as an information overload, which is seen as the natural consequence of an ever-larger pool of numbers and decimals vying for our attention.[1] In response to this crisis, computational architecture is in a rush to dictate a paradigm shift by promoting the assimilation and implementation of concepts and theories emerging from science and philosophy – which, in combination, are intended to help us navigate the confusing world described by chaos theory.
Digital Computation and Limitations
Naturally, given the epistemological and ontological framework in which the architectural discourse defines its crisis, “information” and its attendant verb “compute” become the most critical terms of the design process. Information, rationalised as pure numeric data, becomes the driving morphological force not only for the natural world, but for architecture.
Information is processed through the act of computing. Though it is most often used in the context of digital media, “computation” can denote any kind of process or algorithm. Here we must distinguish between two types of computation: material and digital. Material computation refers to the processes that manifest nature’s way of shaping the world around us. These processes primarily drive toward the structural optimisation of matter. For example, we can look at the way a soap bubble negotiates (“computes”) the complex relationship between air pressure, gravity, and surface tension to find a shape of minimal energy which balances these parameters. Digital computation, on the other hand, concerns the comparatively rudimentary simulation of such processes within silicon chips. Despite the relative simplicity of the latter method, recent technological advancements have greatly increased our ability to simulate – and thereby explore – different natural processes. More and more complex behaviours and phenomena can be digitally approximated using new algorithms, allowing science to advance in its quest to discover the rules that make our world tick.
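To make the contrast concrete, here is a minimal sketch (not from the original essay, written in Python purely for illustration) of the kind of crude digital approximation described above: a soap film stretched over a wire frame settles into a minimal-energy surface, and a grid of heights repeatedly averaged with its neighbours, with the boundary held fixed, relaxes toward a comparable state. The grid resolution, boundary shape, and iteration count are arbitrary assumptions.

    # A minimal, hypothetical sketch of "digital computation" approximating a
    # material process: a membrane ("soap film") stretched over a fixed frame.
    # Jacobi relaxation drives the interior heights toward a minimal-energy,
    # harmonic configuration - a crude, linearised stand-in for the balance of
    # forces the bubble "computes" for free.
    import math

    N = 21                                   # grid resolution (illustrative assumption)
    z = [[0.0] * N for _ in range(N)]

    # Fix the boundary: an undulating "wire frame" the film must respect.
    for i in range(N):
        t = i / (N - 1)
        z[0][i] = z[N - 1][i] = math.sin(math.pi * t)   # front and back edges
        z[i][0] = z[i][N - 1] = 0.0                     # left and right edges

    # Relaxation: each interior height becomes the mean of its four neighbours.
    for _ in range(2000):
        new = [row[:] for row in z]
        for i in range(1, N - 1):
            for j in range(1, N - 1):
                new[i][j] = 0.25 * (z[i - 1][j] + z[i + 1][j] + z[i][j - 1] + z[i][j + 1])
        z = new

    print("height at the centre of the film:", round(z[N // 2][N // 2], 4))

Even this toy example makes the asymmetry plain: the physical film settles instantly and exactly, while the simulation only approaches the same state through thousands of discrete approximations.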
Architecture’s Use of Scientific Images
It is this trend, however, which brings into question the relationship between the built environment and science. Architecture’s use of scientific images is not new, but, as Antoine Picon notes, this meeting is productive only when there are similarities between the realities upon which both operate.[2] Contrary to Picon’s conditions for productivity, current relations between science and architecture are often based on superficial similarities or metaphors, necessitating a skeptical review.
As an example, we only have to consider the (in)famous Voronoi algorithm. Though it appears in nature at a variety of scales,[3] it makes few (if any) natural appearances at the architectural scale. Critically, “scale” concerns not only (or even primarily) physical dimensions, but also the forces that define the organisation of matter at a given threshold. There is a huge difference between the electrostatic-, pressure-, and tension-based factors that operate at the microscopic scale – where the effects of gravity are almost negligible – and the way these forces operate at the scale of the architectural design process.[4] Yet the Voronoi algorithm is often advertised as a generator of organic, natural, efficient designs, qualities which, needless to say, do not follow automatically: a Voronoi-pattern facade is not necessarily more environmentally friendly than one built from prefabricated elements.
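The algorithm itself is disarmingly simple, as the hypothetical Python sketch below suggests (the seed points and grid resolution are arbitrary assumptions): it merely partitions space by assigning every location to its nearest seed, and nothing in the procedure guarantees organic behaviour or environmental performance beyond whatever the designer reads into the resulting cells.

    # A minimal, hypothetical sketch of what the Voronoi algorithm actually does:
    # partition a region by nearest seed point. The seeds here are arbitrary, and
    # so is the pattern they produce.
    import random

    random.seed(1)
    seeds = [(random.random(), random.random()) for _ in range(8)]   # arbitrary sites

    def voronoi_cell(x, y):
        """Index of the seed closest to (x, y), i.e. the cell the point falls in."""
        return min(range(len(seeds)),
                   key=lambda k: (x - seeds[k][0]) ** 2 + (y - seeds[k][1]) ** 2)

    # Rasterise the partition on a coarse grid and print it as character art.
    for row in range(20):
        y = row / 19
        print("".join(chr(ord("a") + voronoi_cell(col / 19, y)) for col in range(20)))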
This analogical relationship to natural phenomena – this mimicry – is also problematic because of its failure to acknowledge the legitimacy of the built environment as an inherent part of nature.[5] Architectural products are already part of the natural world as a manifestation of material computation. This acknowledgement eliminates modernity’s two distinct ontological zones – the human, cultural regime and the non-human, natural regime – and paves the way for an understanding of architecture as a hybrid system in which social forces and natural mechanisms blend. We don’t need to abuse digital computation in order to “fake” nature – doing so only suppresses the ecological conflict between the built environment and the natural world. By replacing the dialectical relationship between the built environment and nature with one of inclusion, we discover a better framework for tackling the complexity and subtlety of the environmental issues confronting architecture.
Disembodied Data
Furthermore, there is a significant theoretical base supporting these experiments in computational architecture, one that attempts to legitimise the resulting architecture as meaningful and usable. Yet, more often than not, these jargon-cluttered texts are little more than rhetoric carefully disguised as unbiased theory. Links to abstract datasets are used to justify the claims of legitimacy and performance of a given design. The results, however, amount to nothing more than disembodied data[6] – unrigorous translations of information into geometric form by arbitrary rules and subjective algorithms.
The speculation that the diagram becomes reality and that reality becomes a diagram, while valid from a philosophical standpoint, is limited in practice – and certainly in architecture – by the precision of our measurements and our bounded predictive capabilities.[7] In short, computational architecture is far from being able to devise a Deleuzian abstract machine pertaining to itself.
To sum up, I would like to advocate for a more considered assimilation of computational tools. Rushing to repeat history and promote radical shifts has a high chance of failure and of improper application, and architecture is a realm in which mistakes are difficult and painful to fix. Speculation on the raw power of computation to generate novel formal languages should be treated with caution.
Conclusions
The desire to make the new scientific and philosophical paradigm legible through metaphorical translation of its ideas into architectonic expression is problematically and uncannily reminiscent of postmodernism’s failed formal project.
Arie Graafland has argued that “The computational universe turns dangerous when it stops being an [sic] useful heuristic device and transforms itself into an ideology that privileges information over everything else.”[8] Coupled with the natural limitations of a systematic approach, this warning denies computational architecture the convenient “unbiased” arguments often employed to justify its design process.
The apparent objectivity of computational techniques cannot mask the subjective, authorial aspects of the design process, still less erase the social, political, and cultural responsibility of the designer. Whatever methods and tools we use, we still play a critical role in the decision-making process out of which the built environment emerges.
Bibliography
1. DeLanda, Manuel. 2000. A Thousand Years of Nonlinear History. London: Zone Books.
2. Stewart, Ian and Jack Cohen. 1995. The Collapse of Chaos: Discovering Simplicity in a Complex World. London: Penguin Books.
3. Picon, Antoine. 2003. “Architecture, Science, Technology and the Virtual Realm.” In Architectural Sciences. Princeton: Princeton Press, 292–313.
4. Graafland, Arie. 2010. “From Embodiment in Urban Thinking to Disembodied Data.” In The Disappearance of Affect. Delft: TU Delft.
5. Latour, Bruno. 1993. We Have Never Been Modern. Hertfordshire: Harvester Wheatsheaf.
Notes
[1] Regarding information overload, Cory Doctorow proposes an interesting corollary of it in a blog post for The Guardian (22 February 2011): redundancy. He also argues that meaningful and important information will eventually surface into the mainstream through new social media mechanics.
[2] Picon, Antoine. 2003. “Architecture, Science, Technology and the Virtual Realm.” In Architectural Sciences. Princeton: Princeton Press, 294.
[3] From the way cells are organized and shaped to the veins on a dragonfly’s wing to the scales on a crocodile’s skin to the way matter is distributed in the universe, the principles of the Voronoi diagram fundamentally shape matter at completely different and surprising scales.
[4] For a more in-depth description of this phenomenon, see Dimitrie Ștefănescu, “f* Voronoi.” Last modified October 28, 2010. http://improved.ro/blog/2010/10/f-voronoi.
[5] This is probably the biggest change in thinking we, as architects, have to assimilate. Ever since the Roman rituals for founding a city by removing a plot of land from the chaotic influence of nature, the artificiality of the built environment has been conceived as existing in opposition to the natural world. The critique of this dichotomy is an important element of contemporary architectural discourse, and deserves a longer discussion than can be provided here. An extended and incisive treatment of the subject can be found in Manuel de Landa, A Thousand Years of Nonlinear History. London: Zone Books.
[6] Graafland, Arie. 2010. “From Embodiment in Urban Thinking to Disembodied Data.” The Disappearance of Affect. Delft: TU Delft, 42.
[7] We will never be able to measure to the last significant digit – therefore, whatever the accuracy of our predictions, they will always carry the seed of error. Coupling this with Kurt Gödel’s incompleteness theorem, which states that any consistent formal system rich enough to express arithmetic contains true statements that cannot be proven using the rules of that same system, we can clearly see the inherent limitations of any systemic or structuralist approach. For example, for all the advances of technology and science, weather predictions more than five days in advance have had the same accuracy rate since the 1950s.
[8] (Graafland 2010, 46).