I was drawing in my notebook one day, varying the size of one or two simple "L" shaped abstractions, when I observed that my workflow had evolved from doodling into employing a basic algorithm to register forms (for me, of some speculative urbanism) in space.
So I began setting up an environment in which I could build a perspective 'machine': a boundary for randomly occurring urban aggregations, a perspective "view finder" that would situate itself amongst the urbanism, and a drawing 'machine' that would draw it according to the ruleset of my hand drawings over the years. This led to a Grasshopper script that takes geometry, breaks it up into an amorphous (for now) aggregation of blocks, flattens it into a 2D drawing, selects certain lines over others (supposing that only a sampling is needed to register the presence of an urban mass, and that detail might want to be withheld; more on this in the blog post titled ""), and then wiggles the path of these lines in a way that might give them the appearance of urban vitality.
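The "wiggle" step can be sketched in plain Python (the actual implementation lives in Grasshopper; function and parameter names here are illustrative). The idea is to resample a straight Make2D line into several points and nudge each interior point perpendicular to the line by a small random amount, approximating the tremor of a hand-drawn stroke:

```python
import math
import random

def wiggle_line(p0, p1, samples=8, amplitude=0.15, seed=0):
    """Return jittered points along the segment p0->p1.

    Divides the segment into `samples` steps, then nudges each
    interior point perpendicular to the line by a small random
    offset. Endpoints stay fixed so the drawing remains legible.
    (Plain-Python sketch; the real version runs in Grasshopper.)
    """
    rng = random.Random(seed)
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    length = math.hypot(dx, dy)
    # unit normal, perpendicular to the line direction
    nx, ny = -dy / length, dx / length
    pts = []
    for i in range(samples + 1):
        t = i / samples
        x, y = p0[0] + t * dx, p0[1] + t * dy
        if 0 < i < samples:  # keep endpoints anchored
            offset = rng.uniform(-amplitude, amplitude)
            x, y = x + offset * nx, y + offset * ny
        pts.append((x, y))
    return pts

path = wiggle_line((0, 0), (10, 0))
```

In Grasshopper the jittered points would then be interpolated into a curve; varying `amplitude` tunes how "hand-drawn" the result reads.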
The point here wasn't really to replicate my hand-drawn perspective, which is inherently contrived and distorted; the point was to explore whether the algorithm can register the space similarly. This proof of concept allowed me to move forward with introducing other elements that would hopefully strengthen the algorithm's ability to quickly communicate a perspective of a design iteration of a speculative urbanism.
Hand Drawn | Early Algorithm
The first element I wanted to try adding was foliage. So I developed a script that would take some of the Make2D lines already selected (the more horizontal ones, as greenery likes to hang from ledges) and redirect them to a vine algorithm. This simply consisted of calling a few points from the line, drawing vertical lines downward from them, finding points on those lines, jogging them left and right, and using these jogged points as the control points of a spline curve.
Squiggles representing greenery scaled to their position in the registered space
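The vine construction described above can be sketched as follows (again a plain-Python stand-in for the Grasshopper definition; `drop`, `joints`, and `sway` are illustrative parameter names). Each point sampled from a ledge line anchors one vine: a vertical drop whose joints are jogged sideways, ready to be interpolated as a spline:

```python
import random

def vine_points(line_pts, drop=1.0, joints=5, sway=0.3, seed=0):
    """For each sampled point on a (mostly horizontal) ledge line,
    build control points for one hanging vine: a vertical drop
    whose interior joints are jogged left/right. The anchor point
    stays fixed so the vine appears to hang from the ledge."""
    rng = random.Random(seed)
    vines = []
    for (x, y) in line_pts:
        ctrl = []
        for j in range(joints + 1):
            # walk down the vertical drop line...
            yy = y - drop * j / joints
            # ...jogging each joint sideways (anchor stays put)
            xx = x + (rng.uniform(-sway, sway) if j else 0.0)
            ctrl.append((xx, yy))
        vines.append(ctrl)
    return vines

ledge = [(i, 5.0) for i in range(4)]  # four sample points on a ledge
vines = vine_points(ledge)
```

Feeding each list of control points to a spline interpolation gives the squiggles shown in the images.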
However, the immediate execution flattened the registration of space, as my vine length didn't adjust to distance. (The vines get narrower because they sample more and more distant lines, but they are all roughly the same length, subverting the composition's depth.) Correcting this required bringing in metadata about where the lines came from (i.e. their distance from the view finder) and sorting the lines accordingly, so that a line at a shorter distance would get longer vines, and vice versa.
Squiggles representing greenery unscaled by distance
Repositioning the view finder and the aggregation to see how the algorithm adjusts demonstrated that the constraints discussed thus far were in fact working beyond a single test case, confirming the feasibility of an urban perspective producer.
Algorithm applied to a new position and new subject urbanism.
While "urbanism" is obviously a dubious claim for this representative task, placeholder plazas, bridges, people, and vertical circulation elements were introduced and adapted to their various constraints.
Process Image
In the process image, the upper-right viewport shows what is 3D modeled by Grasshopper, the upper-left shows what is drawn using Grasshopper, and the lower-left shows a zoomed-out view of the composition. I include this image to illustrate the algorithm's translation process: it is both abstracted and rather direct. Here are a few tests.
The next step will be setting this up to iterate a designed urbanism that auto-instantiates from a set of its own rules. It's a bold claim to employ rules to both lay out a spatial aggregation and employ rules of where and how to represent it. Stay tuned...
Another Algorithm Drawn Perspective of a Speculative Urbanism
Algorithm Drawn Perspective of a Speculative Urbanism