Algorithmic Abuse

Published as:

Dimitrie Stefanescu, Algorithmic Abuse, PLAT Architectural Journal, Houston, Fall 2011.

Introduction

The world and its architects apparently find themselves in a “crisis of complexity.” This condition manifests itself as an information overload, which is seen as the natural consequence of an ever-larger pool of numbers and decimals vying for our attention.[1] In response to this crisis, computational architecture is in a rush to dictate a paradigm shift by promoting the assimilation and implementation of concepts and theories emerging from science and philosophy – which, in combination, are intended to help us to navigate the confusing world described by chaos theory.

Naturally, given the epistemological and ontological framework in which the architectural discourse defines its crisis, “information” and its attendant verb “compute” become the most critical terms of the design process. Information, rationalized as pure numeric data, becomes the driving morphological force not only for the natural world, but for architecture.

Information is processed through the act of computing. Though it is most often used in the context of digital media, “computation” can denote any kind of process or algorithm. Here we must distinguish between two types of computation: material and digital. Material computation refers to the processes that manifest nature’s way of shaping the world around us. These processes primarily drive toward the structural optimization of matter. For example, we can look at the way a soap bubble negotiates (“computes”) the complex relationship between air pressure, gravity, and surface tension to find a shape of minimal energy which balances these parameters. Digital computation, on the other hand, concerns the comparatively rudimentary simulation of such processes within silicon chips. Despite the relative simplicity of the latter method, recent technological advancements have greatly increased our ability to simulate – and thereby explore – different natural processes. More and more complex behaviors and phenomena can be digitally approximated using new algorithms, allowing science to advance in its quest to discover the rules that make our world tick.
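The soap-bubble example above can be sketched in code. The following is a minimal, illustrative simulation – not a physically rigorous one – of how digital computation approximates a material process: a film spanning a fixed frame settles into a minimal-energy shape, which for small deflections is approximated by iteratively averaging each free point with its neighbors. The grid size, boundary values, and iteration count are all arbitrary assumptions made for the sketch.

```python
# Sketch: a "soap film" on an n-by-n grid with a fixed boundary frame.
# Repeated neighbor-averaging (a discrete Laplace relaxation) drives the
# interior toward the minimal-energy surface spanning that frame.

def relax_soap_film(heights, fixed, iterations=2000):
    """Iteratively replace each free node with the mean of its 4 neighbors."""
    n = len(heights)
    for _ in range(iterations):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                if not fixed[i][j]:
                    heights[i][j] = 0.25 * (
                        heights[i - 1][j] + heights[i + 1][j]
                        + heights[i][j - 1] + heights[i][j + 1]
                    )
    return heights

n = 11
# Frame: the left edge of the boundary is lifted to 1.0, the rest held at 0.0.
heights = [[0.0] * n for _ in range(n)]
fixed = [[i in (0, n - 1) or j in (0, n - 1) for j in range(n)] for i in range(n)]
for i in range(n):
    heights[i][0] = 1.0

film = relax_soap_film(heights, fixed)
```

The interior settles smoothly between the lifted and flat edges – the digital analogue of the film "negotiating" its boundary conditions, though without the pressure and surface-tension physics of the real material computation.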

It is this trend, however, which brings into question the relationship between the built environment and science. Architecture’s use of scientific images is not new, but, as Antoine Picon notes, this meeting is productive only when there are similarities between the realities upon which both operate.[2] Contrary to Picon’s conditions for productivity, current relations between science and architecture are often based on superficial similarities or metaphors, necessitating a skeptical review.

As an example, we only have to consider the (in)famous Voronoi algorithm. Though it appears in nature at a variety of scales,[3] it makes few (if any) natural appearances at the architectural scale. Critically, “scale” concerns not only (or even primarily) physical dimensions, but also the forces that define the organization of matter at a given threshold. There is a huge difference between the electrostatic-, pressure-, and tension-based factors that operate at the microscopic scale – where the effects of gravity are almost negligible – and the way these forces operate at the scale of the architectural design process.[4] Yet the Voronoi algorithm is often advertised as a generator of organic, natural, efficient designs – a claim that, needless to say, does not automatically hold: a Voronoi-pattern facade is not necessarily more environmentally friendly than one using prefabricated elements.
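For readers unfamiliar with the algorithm named above, the Voronoi principle reduces to this: every point of a region is assigned to its nearest “seed,” and a cell is the set of points sharing a nearest seed. The sketch below rasterizes this by brute force over an integer grid; the seed positions and grid dimensions are arbitrary illustrative choices, and real implementations construct the cell geometry directly (e.g. via Fortune's sweep-line algorithm) rather than sampling.

```python
# Sketch: brute-force Voronoi labeling. Each grid point is tagged with
# the index of the nearest seed; contiguous runs of one label form a cell.
from math import dist  # Euclidean distance, Python 3.8+

def voronoi_cells(seeds, width, height):
    """Label each integer grid point with the index of its nearest seed."""
    return {
        (x, y): min(range(len(seeds)), key=lambda k: dist((x, y), seeds[k]))
        for x in range(width)
        for y in range(height)
    }

seeds = [(2, 2), (7, 3), (4, 8)]
labels = voronoi_cells(seeds, 10, 10)
```

The pattern that emerges is purely geometric – a partition by proximity – which is precisely why its “organic” appearance carries no guarantee of structural or environmental performance at the architectural scale.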

Architectural products are already part of the natural world as a manifestation of material computation.

This analogical relationship to natural phenomena – this mimicry – is also problematic because of its failure to acknowledge the legitimacy of the built environment as an inherent part of nature.[5] Architectural products are already part of the natural world as a manifestation of material computation. This acknowledgement eliminates modernity’s two distinct ontological zones – the human, cultural regime and the non-human, natural regime – and paves the way for an understanding of architecture as a hybrid system in which social forces and natural mechanisms blend. We do not need to abuse digital computation in order to “fake” nature – doing so only suppresses the ecological conflict between the built environment and the natural world. By replacing the dialectical relationship between the built environment and nature with one of inclusion, we discover a better framework for tackling the complexity and subtlety of the environmental issues confronting architecture.

Computational architecture is far from being able to devise a Deleuzian abstract machine pertaining to itself.

Furthermore, there is a significant theoretical base supporting these experiments in computational architecture that attempts to legitimize the resulting architecture as meaningful and usable. Yet, more often than not, these jargon-cluttered texts are little more than rhetoric carefully disguised as unbiased theory. Links to abstract datasets are used to justify the claims of legitimacy and performance of a given design. The results, however, amount to nothing more than disembodied data[6] – un-rigorous translations of information by arbitrary rules and subjective algorithms into geometric forms. The speculation that the diagram becomes reality and that reality becomes a diagram, while valid from a philosophical standpoint, is limited in practice – and certainly in architecture – by the precision of our measurements and our bounded predictive capabilities.[7] In short, computational architecture is far from being able to devise a Deleuzian abstract machine pertaining to itself.

The apparent objectivity of computational techniques cannot mask the subjective, authorial aspects of the design process, still less erase the social, political, and cultural responsibility of the designer.

To sum up, I would like to advocate for a more considered assimilation of computational tools. Rushing to repeat history and promote radical shifts has a high chance of failure and of improper application, and architecture is a realm in which mistakes are difficult and painful to fix. The raw power of computation to generate novel formal languages should be regarded with caution. The desire to make the new scientific and philosophical paradigm legible through metaphorical translation of its ideas into architectonic expression is problematically and uncannily reminiscent of postmodernism’s failed formal project. Arie Graafland has argued that “The computational universe turns dangerous when it stops being an [sic] useful heuristic device and transforms itself into an ideology that privileges information over everything else.”[8] Coupled with the natural limitations of a systematic approach, this warning denies computational architecture the convenient “unbiased” arguments often employed to justify its design process. The apparent objectivity of computational techniques cannot mask the subjective, authorial aspects of the design process, still less erase the social, political, and cultural responsibility of the designer. Whatever methods and tools we use, we still play a critical role in the decision-making process out of which the built environment emerges.

