June 15, 2009
Could advances in computing and materials sciences herald a return for architecture to its Humanist roots?
The scientific revolution of the Enlightenment divided the architecture-building profession into two distinct disciplines: one continuing the Renaissance graphical, Euclidean tradition; the other founded on a philosophical, mathematical abstraction of the physical world.
The discovery of the calculus by Gottfried Leibniz and Isaac Newton allowed the continuum to be modelled as a series of discrete mathematical units, 'monads' in Leibniz's term. The work of many others, such as Robert Hooke, who developed laws describing the elastic properties of materials, enabled the natural philosopher to abstract materials and forces into a non-graphical, written mathematical language. Before this, building designers had created their works using graphical techniques based on the geometries of Euclid, and Pythagorean methods, to resolve the forces and proportions of their designs. There was a direct, a priori connection between idea, form, and the proportioning of material to carry weight and environmental loads.
While the disjuncture of the building design process resulted in the specific professions of the architect and engineer, the scientific revolution and the mathematical abstraction of materials enabled the new engineering philosophers to develop new materials such as reinforced concrete, and push the limits of materials such as steel and timber.
The new materials enabled architects to extend the height and scale of their creations to previously unheard-of dimensions; however, the complexity of the handwritten mathematical formulas used to calculate building forces, and the resultant sizes of the various elements, required whole floors of white-coated engineers and drafters in the early-to-mid 20th century. Wind tunnels and seismic shake tables demanded lab technicians, model makers and large-scale facilities.
The architectural profession continued to resolve its designs graphically and by hand, in the centuries-old tradition. While engineer and architect worked collaboratively, and both produced drawings to communicate their designs to the builder, there was a clear demarcation between their roles: the engineer was the facilitator who enabled the construction of a design created by the architect. The result was a linear programme: concept > architectural drawing > engineering calculation > engineering drawing > construction.
The traditional structural design process is a reactive method of calculation based on a predetermined conceptual geometry. The structural designer determines the deflections, stresses and forces on a structure, then uses this information to size the members (beams, columns, slabs, etc.) within the original building geometry. Generally, the most highly stressed areas or members govern the size of the building's elements, so there can be significant overdesign and redundancy in the building. This does not necessarily produce the most efficient design or use of resources.
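The reactive sizing step can be sketched in a few lines of code: the worst-case bending moment governs, and every member in the run receives the section chosen for that worst case. The section catalogue, allowable stress and moments below are hypothetical values for illustration only, not design data.

```python
# Illustrative sketch of reactive member sizing: the peak moment
# governs, so lightly loaded bays end up with redundant capacity.
# Catalogue entries and the allowable stress are hypothetical.

ALLOWABLE_STRESS = 165e6  # Pa, assumed allowable bending stress

# Hypothetical catalogue: (section name, section modulus Z in m^3)
SECTIONS = [
    ("310UB32", 424e-6),
    ("360UB45", 689e-6),
    ("410UB54", 933e-6),
    ("460UB67", 1300e-6),
]

def size_for_moment(moment_nm):
    """Return the lightest catalogue section with Z >= M / f_allowable."""
    z_required = moment_nm / ALLOWABLE_STRESS
    for name, z in SECTIONS:
        if z >= z_required:
            return name
    raise ValueError("no catalogue section is adequate")

# Moments along a floor (N*m); the worst case sizes the whole run.
moments = [60e3, 85e3, 140e3]
print(size_for_moment(max(moments)))  # → 410UB54
```

The two lighter bays would individually need only the smaller sections, which is exactly the overdesign the passage describes.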
NON-EUCLIDEAN AND NON-LINEAR DESIGN PROCESSES
Post-1950s, the early digital age enabled mathematicians and engineers to codify calculations into ever smaller yet more powerful computers, so that computations of ever-increasing complexity could be undertaken in minutes by a small team of engineers.
The engineering challenges of World War II produced great advances in materials and techniques. Modernism, the delight in the mechanised form, and mass-production techniques found parallels in the architecture industry. Progress was expressed through complex non-orthogonal forms, lightweight structures, and buildings that reached into the sky. The new building challenges, realised in zones of high seismicity and/or wind, could only be analysed using computer applications. Structural engineering drew on the disciplines of aeronautics and information technology to solve the complexities of design and the search for new materials. Non-linear and non-Euclidean design processes were required. Architecture and structural engineering were no longer just about Euclidean construction; they required a close examination of the subtleties of nature.
The engineer was no longer just a facilitator of a predetermined architectural design, as in the pre-computer age, but actively collaborated with the architect in pushing the boundaries of material, form and construction.
The computer and CAD revolution of the late 1980s gave architects and engineers a common working platform for their designs, which began to blur the boundary between the professions, as more complex building geometries demanded closer collaboration and an understanding of each other's art. Alongside the traditional roles built on hand drawing there emerged the architect-programmer and the engineer-programmer, sharing common computer languages such as Lisp, C++ and Visual Basic. These shared languages created an empathy between the disciplines: engineers were using computer tools for analysis, and architects were using them for visualisation.
TWENTY-FIRST CENTURY INTEROPERABILITY
Now, in the new century, another shift in the design paradigm is taking place: beyond non-linearity, a fourth dimension is entering the design process. Computers are utilised well beyond the realm of techno-boffins; they are everyday tools that incorporate sophisticated means of communication and information transfer. Beyond architectural visualisation and engineering analysis, digital techniques and technology enable interoperability between platforms and manufacturing techniques. Ingeborg Rocker, assistant professor of architecture at the Harvard Graduate School of Design, describes this phenomenon as versioning:
In contrast, versioning, a term borrowed from the software development industry [...] linking software configuration management and engineering data management, suggests architecture as a processional convergence of projection and production. Hereby the genuine conceptual and pragmatic implications of the digital medium are equally considered... Departing from the traditional, predominantly ontological comprehension of architecture, versioning suggests that architecture is an evolving and dissolving differential data-design that no longer simply exists, but rather becomes, as it becomes informed in and through the process's different/ciation.(1)
The new paradigm enables architect and engineer to enter into a seamless dialogue, with the facility for real-time simulation and response: the architect orchestrates the design process in conversation with an engineer who responds creatively to technical matters. This shifts the emphasis from drawings to rapid-prototyping techniques and three-dimensional representations. The engineer of the 21st century draws less from the traditions of the industrial age and is instead an intellect who can source creative responses to design propositions from an interdisciplinary spectrum. In turn, architects embracing the creative possibilities offered by the new techniques are engaging with the technical territory that was traditionally the domain of the engineering specialist.
Versioning, as termed by Rocker, will eventually enable a direct transition from design to production. Computer techniques will serve less as a method to represent and analyse ideas in two dimensions, and more as tools to directly create the idea in physical space.
EVOLUTIONARY STRUCTURAL OPTIMISATION
One aspect of this new paradigm is an evolutionary technique of structural design that applies computer-based Finite Element Analysis in a dynamic morphological process. The Evolutionary Structural Optimisation (ESO) technique, developed by RMIT's Professor Mike Xie and Grant Steven, seeks the most efficient use of material by altering the shape, topology and geometry of the structure and its various elements. There is a direct and rational connection between form and material. The optimisation of topology is not based on a priori geometries, and the outcome of the process is therefore undetermined. It uses finite elements, or 'bricks', monad-like digital elements, to build a structure and analyse its stresses and deformations under loads, temperature and so on. Each finite element is given a set of properties and constraints. At every iteration each finite element is analysed to determine its state of stress and deformation, which is compared with that of every other element in the structure. The ESO method is an automated procedure in which each iteration consists of a finite element analysis (to determine stresses) followed by the removal of inefficient or redundant elements, of which there may be many thousands; it is therefore only feasible on a high-speed computer. This cycle of finite element analysis and element removal is repeated many times until a desired geometry is produced.
Typically, the number of iterations ranges from 10 to 100. Over a series of iterations the computer automatically determines the optimum distribution of struts, complex-shaped arches, columns and beams. The architect/engineer provides only the initial conditions, such as the external geometry, loads and constraints.
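The analyse-and-remove cycle can be illustrated with a toy sketch. A crude surrogate stands in for the finite element analysis here (elements far from the straight load path between support and load read as under-stressed); the grid size, rejection ratio and the surrogate itself are illustrative assumptions, not the published ESO implementation.

```python
# Toy sketch of the ESO cycle: "analyse" every live element, remove
# those whose stress falls below a rejection ratio times the peak
# stress, and tighten the ratio whenever a steady state is reached.
# The surrogate below is a stand-in for a real finite element solver.

def surrogate_stress(x, y, nx, ny):
    """Stand-in for an FEA result: elements near the diagonal load
    path from support (0, 0) to load (nx-1, ny-1) read as highly
    stressed; elements far from it read as under-utilised."""
    d = abs(x * (ny - 1) - y * (nx - 1))  # distance proxy from the diagonal
    return 1.0 / (1.0 + d)

def eso(nx=10, ny=10, target_fraction=0.4, rr_step=0.02):
    """Remove inefficient elements until only target_fraction remain."""
    alive = {(x, y) for x in range(nx) for y in range(ny)}
    target = int(target_fraction * nx * ny)
    rr = rr_step  # rejection ratio
    iterations = 0
    while len(alive) > target:
        stresses = {e: surrogate_stress(e[0], e[1], nx, ny) for e in alive}
        s_max = max(stresses.values())
        doomed = [e for e, s in stresses.items() if s < rr * s_max]
        if not doomed:           # steady state: tighten the rejection ratio
            rr += rr_step
            continue
        # remove the least-stressed first, never overshooting the target
        doomed.sort(key=stresses.get)
        alive -= set(doomed[: len(alive) - target])
        iterations += 1
    return alive, iterations

alive, iterations = eso()
print(len(alive), iterations)  # prints: 40 3
```

After a few iterations the surviving material clusters along the load path, the toy analogue of the struts and arches the full method discovers.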
Phyletic gradualism, to borrow a term from biology, describes a process whereby evolutionary change results within a lineage from the slow and continuous accumulation of those mutations that are favoured by natural selection. This causes descendant structures, and also species, to remain well-adapted to gradually changing habitats; there is no sharp demarcation between an ancestral species and its descendant. (2) Much like its biological reference, under ESO the structure evolves and mutates gradually according to the various conditions to which it is subjected.
It is with this background that Professor Mike Xie of RMIT University and I initiated the Innovative Structures Group at RMIT, to develop the Integrated Multidisciplinary Design Environment (ICMDE) process. A feedback loop of wind, earthquake, structural engineering and CAD software, ICMDE utilises Evolutionary Structural Optimisation (ESO), and the related Bi-directional Evolutionary Structural Optimisation (BESO) technique, as a core to which interoperable platforms are incorporated to create a seamless design process. It is a procedure that can produce novel structural forms, in which the shape of the structure has a direct relationship to the specific loads and support conditions applied. Gaudí's reference to natural growth and morphogenesis within his work has led the Group to collaborate with Mark and Jane Burry of SIAL in using the ESO technique to further understand Gaudí's design rationale.
The ICMDE process and ESO-BESO will enable architects and engineers to expand the structural possibilities of their projects. With regard to traditional building forms, they will enable more efficient utilisation of materials, by determining where material is being used inefficiently and redistributing it to other parts of the structure.
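The bi-directional idea of redistribution, rather than deletion alone, can be pictured as a single update step: remove the least-utilised elements and re-admit empty cells next to the best-utilised ones. The grid, scores and counts below are illustrative assumptions, not the published BESO algorithm.

```python
# Hedged sketch of one BESO-style update on a grid of elements with a
# precomputed efficiency score: the lowest-scoring elements are removed
# and void cells adjacent to the highest-scoring survivors re-admitted,
# so material migrates rather than simply vanishing. Illustrative only.

def beso_step(alive, score, nx, ny, n_remove=2, n_add=2):
    """Delete the n_remove lowest-scoring elements, then re-admit up to
    n_add empty neighbouring cells of the highest-scoring survivors."""
    ranked = sorted(alive, key=score)
    keep = set(ranked[n_remove:])
    candidates = []
    for x, y in sorted(keep, key=score, reverse=True):
        for c in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= c[0] < nx and 0 <= c[1] < ny \
                    and c not in keep and c not in candidates:
                candidates.append(c)
    keep.update(candidates[:n_add])
    return keep

# Toy usage: off-diagonal material scores poorly (the score function is
# a stand-in for analysis) and is shifted toward the diagonal load path.
alive = {(0, 0), (1, 1), (2, 2), (0, 2), (2, 0)}
score = lambda e: -abs(e[0] - e[1])
updated = beso_step(alive, score, nx=4, ny=4)
print(len(updated))  # total element count is preserved: prints 5
```

With equal removal and addition counts the quantity of material is held constant while its distribution improves, the essence of the redistribution described above.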
The introduction of versioning in the architectural debate consequently alters the nature of professional expertise as it renegotiates the boundaries between practice and theory, between design and concept. In contrast to traditional Taylorism, which emphasises the division of labour, the purpose of versioning is to integrate all processes performed throughout the design's literal and continuous in-formation. (Ingeborg Rocker)
In the post-digital age we are witnessing the commencement of a new revolution, one that could see the return of the building design professional as a pre-Enlightenment, Renaissance all-rounder. Computing power is now so great that engineers can model the forces from wind, earthquake, temperature and other loads on their laptops. Computational Fluid Dynamics techniques enable the digital visualisation of wind forces, which in time may make the wind tunnel redundant. Interoperability between CAD and analysis tools, and the portability of computers, is creating a seamless design interaction between architect and engineer. The graphical visualisation of environmental forces is enabling a clear dialogue between the design professions, resolving the 300-year-old disjuncture between the mathematical philosophers and the arts. This author believes we could well see the day when there is no longer a distinction between building engineer and architect.
Peter Felicetti is an engineer and architect, and principal of Felicetti Consulting Engineers, a practice recognised for its innovative approach and its involvement in many award-winning projects. With Professor Mike Xie he initiated the Innovative Structures Group at RMIT University.
1. Versioning: Evolutionary Techniques in Architecture, Architectural Design, Wiley-Academy, 2002, p 11.
2. M Hildebrand, Analysis of Vertebrate Structure, Fourth Edition, John Wiley & Sons, 1995, p 12.