Branching system strategies for Discrete Geometrical Elements – I



Through the implementation of relation-based design, parametricism introduced a degree of complexity unparalleled in any previous architectural style. This complexity enabled designers to search for new and unconventional approaches to design, but it is also one of the main points of critique against parametricism. When a continuous form is subdivided into constructor elements, only a one-way relation is established – "the whole" shapes "the part", but "the part" has no influence over "the whole". The independence of the part is lost, each part becomes different in shape, and the cost of production rises sharply.
If the relation were made bi-directional, we could define which parameters of the part, and which parameters of the whole, are read as canon (locked). Gothic cathedrals are a good example of this. Their elements were picked from a preset element (brick) library, and for each specific cluster of the building the most suitable morphological instance of a brick was chosen. The suitability of each element was evaluated against the design intention defined by the architectural drawings as the "ideal whole". This approach enabled a whole – part – whole relation.


If the typical form-to-element relation were flipped and the element (part) had complete control over the form (whole), the final output would be extremely hard to control. The examples in Fig01 show how extremely the overall form can change with a slight change of the constructor elements. Additional notions of how we describe the part therefore need to be implemented.
The definition of "the part" does not contain just physical objects as architectural elements. The whole assembly, together with the influencing forces, the geometrical, ideological and volumetric constraints, as well as the relations between these objects, forces and constraints, must be incorporated into the definition of "the part". With this shift the building becomes a part of its elements, its constraints and the forces that influence it. The whole becomes a marriage of the physical and the unseen, and thus becomes contextual not just to its environment but also to itself.
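One way to make this widened definition of "the part" concrete is as a data structure that bundles geometry with constraints and forces. The following Python sketch is illustrative only – the field names (`centroid`, `constraints`, `forces`, `relations`) are my assumptions, not part of the original toolchain:

```python
# Illustrative sketch: "the part" carries not only geometry but also the
# constraints, forces and relations that act on it. All names are assumptions.
from dataclasses import dataclass, field

@dataclass
class Part:
    centroid: tuple                                   # physical position (x, y, z)
    constraints: dict = field(default_factory=dict)   # named predicate per constraint
    forces: dict = field(default_factory=dict)        # influencing forces
    relations: list = field(default_factory=list)     # links to neighbouring parts

def within_constraints(part):
    """A part is valid when its centroid satisfies every registered
    constraint predicate (geometric, volumetric, ideological, ...)."""
    return all(check(part.centroid) for check in part.constraints.values())
```

With this encoding, a volumetric bound becomes just one more predicate in the part's own definition rather than an external rule applied to it.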
This approach can be rewritten as a schema:


Fig01. Part-to-whole relation base scheme

Changing the constructors of "the part's" domain enables us to implement much more precise design methods for the final assembly ("the whole"). If we were to extract this scheme from the structure shown in Fig02, it could be read like this:


Fig02. Part-to-whole relation scheme reconstructed for a specific instance

This structure can be read as a biological process: over generations, each element has the potential to birth two individuals, which are exact copies of the original except that they are born at new coordinates defined within the "mother" element. Right before all elements in the current generation are born, they check whether they are inside a "suitable" environment (mathematically: F(element_centroid) NOT in X_domain AND NOT in Y_domain). The tale of "Goldilocks and the Three Bears" is a fitting analogy here – one porridge was too hot (X_domain), one porridge was too cold (Y_domain), and one was just right (the domain between X and Y). A second check also takes place – the collision test. If the area in which an element of the current generation would be born is already occupied, no new element is generated there.
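The generational process above can be sketched in a few lines of Python. This is a minimal stand-in, not the original Grasshopper definition: I assume a simple growth rule (each mother births two children offset diagonally upward), a vertical band as the "suitable" domain, and a spherical collision radius; all names (`Element`, `grow`, `in_suitable_domain`) are illustrative:

```python
# Minimal sketch of the generational branching process: each element births
# two copies at new coordinates, subject to a domain check and a collision test.
from dataclasses import dataclass

@dataclass(frozen=True)
class Element:
    x: float
    y: float
    z: float

def in_suitable_domain(e, z_min=0.0, z_max=10.0):
    """Goldilocks check: reject centroids below z_min or above z_max
    (the X_domain / Y_domain of the text, modelled here as a z-band)."""
    return z_min <= e.z <= z_max

def collides(e, placed, min_dist=0.9):
    """Collision test: reject a candidate born too close to any
    already placed element."""
    return any((e.x - p.x) ** 2 + (e.y - p.y) ** 2 + (e.z - p.z) ** 2
               < min_dist ** 2 for p in placed)

def grow(seed, generations):
    placed = [seed]
    current = [seed]
    for _ in range(generations):
        nxt = []
        for mother in current:
            # each mother births two copies at new coordinates
            for dx in (-1.0, 1.0):
                child = Element(mother.x + dx, mother.y, mother.z + 1.0)
                if in_suitable_domain(child) and not collides(child, placed):
                    placed.append(child)
                    nxt.append(child)
        current = nxt
    return placed
```

Children landing on an already occupied spot (here, the overlapping inner branches of neighbouring mothers) are silently skipped, which is exactly what produces the non-self-intersecting structure described below.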

This enables us to produce a non-self-intersecting structure of homogeneous elements, which populates any volume that we specify. To rephrase: the whole informs the environment, which informs the assembly ruleset of the elements. The elements in turn keep their relations to one another, as well as their own attributes, in a separate information feed, which keeps the production and assembly of the structure simple and cheap.


Once the notion of a "suitable environment" has been implemented into the definition of "a part", more constructors begin to emerge – the most important one being the starting position of the population (the x, y, z coordinates as well as the x.rot, y.rot, z.rot alignment of the element(s) at generation #0). Given the constraints on how the elements multiply, it is clear that changing the starting position influences the number of generations that can fit inside the suitable environment. Finding the starting position that generates the maximum number of elements inside a given environment with given constraints is virtually impossible without the use of solvers such as the evolutionary solver Galapagos for Grasshopper or the multi-objective solver Octopus for Grasshopper.
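To illustrate why a solver is needed at all, the search can be reduced to its simplest possible form: a fitness function that counts elements for a candidate generation-0 position, and a search loop maximizing it. The random search below is only a crude stand-in for Galapagos, and the growth rule (a vertical stack inside a z-band) is an illustrative assumption:

```python
# Random-search stand-in for the evolutionary search over generation-0:
# sample candidate seed heights, keep the one whose stack fits most elements.
import random

def count_elements(seed_z, step=1.0, z_min=0.0, z_max=10.0):
    """Fitness: how many elements of a vertical stack, grown upward from
    seed_z in increments of `step`, stay inside the suitable z-band."""
    n, z = 0, seed_z
    while z_min <= z <= z_max:
        n += 1
        z += step
    return n

def search_seed(trials=1000, rng=random.Random(0)):
    """Sample candidate generation-0 heights and keep the best."""
    best_z, best_n = None, -1
    for _ in range(trials):
        z = rng.uniform(-5.0, 15.0)   # candidate generation-0 height
        n = count_elements(z)
        if n > best_n:
            best_z, best_n = z, n
    return best_z, best_n
```

Even in this one-dimensional toy the fitness landscape is flat almost everywhere with narrow optimal bands, which is precisely why gradient-free solvers like Galapagos are the practical choice once position and rotation are searched jointly.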

Fig03. Preliminary tests of collision detection as well as generation_0 search

It is also worth noting that the position and alignment of the starting element dictate what role that element receives in the final structure. If the element is placed on the ground plane and further generations are generated vertically upwards, the element will read as a column base. If the starting element is constructed at the top of the predefined environment bounds and further generations are generated vertically downwards, the element can read as the keystone of a pseudo-dome, whose curvature is convex or concave depending on the angle between the elements' branches.



The approach defined in the previous chapters relies heavily on the morphology of the constructors (parts), so when forming a library of said parts, close attention must be paid to the effects that each constructor has on the assembly. For instance, a constructor (a package of a physical part and its corresponding ruleset) with one branch will form a curve-like assembly. If the constructor has two branches with an angle between them greater than 180 degrees, the assembly will start folding into itself, and so on. A mixture of these behaviors, controlled by a specified pattern, enables a potentially infinite variety of wholes (final assemblies). The schema for the described library is a simple one: each element has one root node (the starting position and alignment data) and a number of branch nodes (anchor positions for the next generation of elements) fixed to it. While the root node only describes position and alignment in X, Y, Z coordinates, the branch nodes additionally have rotational information embedded, meaning that every generation of elements is rotated in relation to the previous generation.
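The root-node/branch-node schema can be sketched as follows. For brevity this toy works in the plane, handles only single-branch constructors, and folds the branch's rotational information into one angle per generation; the class names (`BranchNode`, `Constructor`, `assemble`) are illustrative assumptions:

```python
# Sketch of the library schema: a constructor couples a root node
# (position + alignment) with branch nodes carrying rotational data.
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class BranchNode:
    length: float      # offset from the root to the anchor of the next element
    rotation: float    # extra rotation (radians) applied to the next generation

@dataclass(frozen=True)
class Constructor:
    name: str
    branches: tuple    # tuple of BranchNode

def assemble(constructor, generations):
    """Grow a single-branch chain in the plane. The per-branch rotation
    accumulates generation by generation, so a zero rotation yields a
    straight chain and a constant rotation a curve-like assembly."""
    x, y, heading = 0.0, 0.0, 0.0   # the root node: position + alignment
    points = [(x, y)]
    for _ in range(generations):
        b = constructor.branches[0]
        x += b.length * math.cos(heading)
        y += b.length * math.sin(heading)
        heading += b.rotation
        points.append((x, y))
    return points
```

With `rotation = 0` the chain is straight; with a constant 90-degree rotation four generations fold the chain back onto its origin, the planar analogue of the assembly "folding into itself" described above.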
To prepare a library which can be used to control the behaviour of "the whole", a series of experiments must be run, recording how different parameters change the output. Since the parameters are not integer-based, meaning that they change fluidly rather than in steps, the evaluation must be made by recording and evaluating an animation rather than a series of instances.


Fig04. From left: single-branch element A and its assembly at 10 generations; multiple-branch element B and its assembly at 10 generations; assembly C created by the ABABABABABA pattern

As seen in Fig04, each library element receives an index tag (such as A, B, …, Z). A pattern is then defined either by a mereological formula (such as A=BB, B=C, C=ZA, etc.) or by a generation-based placeholder set (Generation_0 – A, Generation_1 – B, Generation_2 – A, Generation_3 – Z). For now I will focus on the second approach, since it is much easier to find behavioral patterns when dealing with semi-binary data than with sophisticated algorithms defining part-to-part relationships. The strength of the primitive pattern is its flexibility – the sequence can be adjusted manually (inserting symbols by hand), pseudo-manually (remapping music to numbers, remapping a DNA sequence to numbers, etc.) or automatically (using the previously mentioned evolutionary solvers to generate the most suitable pattern for given factors).
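The generation-based placeholder approach is easy to sketch: a pattern string assigns one library element per generation, and the assembly is produced by looking up each symbol's rule in turn. The library values below (one step vector per element type) are placeholder assumptions standing in for the full constructor rulesets:

```python
# Sketch of the generation-based placeholder pattern: symbol -> element rule,
# one symbol consumed per generation. The step vectors are illustrative.
library = {
    "A": (1.0, 0.0),   # element type A: steps along x
    "B": (0.0, 1.0),   # element type B: steps along y
}

def assemble_by_pattern(pattern):
    """Walk the pattern generation by generation, looking up each
    element's rule in the library and accumulating its step."""
    x, y = 0.0, 0.0
    trace = [(x, y)]
    for symbol in pattern:
        dx, dy = library[symbol]
        x, y = x + dx, y + dy
        trace.append((x, y))
    return trace
```

Because the pattern is just a string, it can be edited by hand, remapped from external data, or handed to a solver as a genome, which is exactly the flexibility argued for above.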


Fig05. Mixed element library implementation to generate the final structure. Left side – collisions off; right side – collisions on

Phase I of the research was completed during the winter of 2017. The findings were implemented in the Aggregation and Graph-based Modeling workshop at VAA, Vilnius, as well as in the AAHN15 – Creative Tools course at LTH, Lund, during the spring of 2017.