Let’s Face It: Connecting Nodes Sucks

Introduction to visual scripting in Dynamo BIM

I remember back in college I foolishly decided to dive into a generative design studio. I set out to use Grasshopper for the first time. It was magical. I created a point with three inputs, a second point with three more inputs, and then I connected the two points with a line node. The total effort? 9 nodes, 1 line, 30 minutes, three online tutorials, and one tall coffee.

Visual scripting example

After this introduction to visual scripting, I had it in the back of my mind that mass adoption of this kind of thing would take decades, because the barrier to entry was so high. Let’s face it: drawing a line on paper is easy. Drawing a line in CAD requires you to type “line”. Drawing a line in Grasshopper requires a 30-minute tutorial. It is not the most intuitive of things.

Introduction to Algorithms

My name is Clifton Harness. I am a CEO. This doesn’t mean much, other than people are more likely to return my sales emails because they know who they are dealing with (maybe?). I have a B.Arch from The University of Texas (Hook ‘Em) and I worked in real estate development for two years. I made the conscious decision to quit my job (a very fiscally irresponsible one) to do a software startup. TestFit was founded to help architects and developers get to the highest-and-best use for real estate (nearly) instantaneously. In this artfully written post, I will describe four algorithms that we built, and how they stack up against DesignScript (Dynamo BIM in this case).

What is an algorithm? Easy. It’s a list of steps to follow. You do them all the time, every day. Let’s look at one together: Not waking up.

  1. Realize sleepy time is now over
  2. Open Eyes
  3. Heavy sigh
  4. Roll over
  5. Close Eyes
  6. Fall back asleep /* Stay in this loop */
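Written as code, sloth.dyn is just a loop. Here is a playful Python sketch (every name in it is made up):

```python
def sloth(mornings=3):
    """The 'not waking up' algorithm: the same six steps, every day."""
    steps = [
        "realize sleepy time is over",
        "open eyes",
        "heavy sigh",
        "roll over",
        "close eyes",
        "fall back asleep",  # stay in this loop
    ]
    log = []
    for _ in range(mornings):  # in real life this is `while True`
        log.extend(steps)
    return log

log = sloth(mornings=2)
print(len(log))  # 12 steps across two mornings
```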

What just happened? This is the algorithm for sloth.dyn. I do this almost every day, and despite my wife’s many interrupt handling functions, nothing has changed. All an algorithm is? A list of steps to follow.

An Algorithm for the Placing of Fire Walls

Automated placement fire walls

It was extremely difficult for me to put into words how this one should work, but my partner (genius software developer Ryan Griege) came up with an amazing solution. We (he) wrote some code to do this instantaneously. Some basic rules:

  1. Buildings should have an adjustable maximum size
  2. Minimize the number of buildings
  3. Don’t place firewalls in corners

With this code, I will never draw another polyline. Ever.
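I can’t share Ryan’s actual code, but here is a toy one-dimensional sketch of rules 1 and 2, assuming the building is a straight bar (rule 3, keeping fire walls out of corners, needs real geometry):

```python
import math

def firewall_positions(building_length, max_building_size):
    """Minimize the number of buildings (rule 2): split a straight run
    into equal segments no longer than the adjustable max (rule 1).
    Returns distances from one end at which to place fire walls."""
    segments = math.ceil(building_length / max_building_size)
    spacing = building_length / segments  # even split across the run
    return [round(i * spacing, 1) for i in range(1, segments)]

# A 600 ft bar with a 250 ft max needs 3 segments, so 2 fire walls.
print(firewall_positions(600, 250))  # [200.0, 400.0]
```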

An Algorithm for Placing Staircases

automate staircase placement

This one is easy. All we need are some rules to follow:

  1. Stairs should be placed within 50 feet of a dead-end corridor
  2. Stairs should be placed at least every 250 feet from one another
  3. Place at least one stair per building area
  4. Place a stair in the corners of the garage
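Here is a minimal sketch of the corridor rules above, again in one dimension and with made-up names; the real solver works on actual building geometry:

```python
import math

def stair_positions(corridor_length, dead_end_max=50, spacing_max=250):
    """Place stairs along a straight corridor so that no dead end exceeds
    dead_end_max and no two stairs are more than spacing_max apart."""
    span = max(corridor_length - 2 * dead_end_max, 0)  # first to last stair
    gaps = max(math.ceil(span / spacing_max), 1)
    stairs = {round(dead_end_max + i * span / gaps, 1) for i in range(gaps + 1)}
    return sorted(stairs)

# A 600 ft corridor: stairs at 50, 300, and 550 ft satisfy both rules.
print(stair_positions(600))  # [50.0, 300.0, 550.0]
```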

We slowed down the algorithm by several orders of magnitude so that you can see what is happening. With this kind of logic we can place not only stairs: the same approach could be expanded to place other kinds of rooms, such as mechanical rooms, IDF closets, or elevators.

An Algorithm for the Placing of Units

This one was much harder, and I won’t go into a great deal of detail, but the basic premise is to course units perfectly given the shape of the building and the size of the building mass you are attempting to fill. Units have criteria on widths, depths, size, and glazing requirements.
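To give a flavor of the coursing problem, here is a coin-change style sketch that fills a facade length exactly with allowed unit widths. This is my illustration, not TestFit’s algorithm, and it ignores depth, area, and glazing:

```python
def course_units(facade_length, unit_widths):
    """Return a list of unit widths summing exactly to facade_length,
    or None if no combination fits. Dynamic programming over 1-ft steps."""
    best = {0: []}  # target length -> one list of widths achieving it
    for target in range(1, facade_length + 1):
        for w in unit_widths:
            if target - w in best and target not in best:
                best[target] = best[target - w] + [w]
    return best.get(facade_length)

# Fill a 100 ft facade with 24, 28, and 32 ft wide units.
print(course_units(100, [24, 28, 32]))
```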

An Algorithm of Many Algorithms

multiple algorithms power TestFit

When we take these three algorithms and have them work together, we have a pretty useful test-fit solution for multifamily buildings. I skipped a lot of detail because that is our intellectual property (I hope you understand), but I gave enough for you to get a sense of what is actually going on under the hood of this software.

The Explosive Power of Dynamo

TestFit Dynamo

TestFit is a test fit solution (did we name it correctly?). A test fit is a diagrammatic wireframe; it is not a building. How can we make it into a building? In this sample Dynamo script, we turn that wireframe into Revit geometry. I cannot even imagine what a proficient Dynamo programmer could do with this. I imagine they would perform VDC operations on the wireframe and start building out rooms with doors, windows, and facade details. Even better? Run energy analytics, route mechanical systems, or pull costing information. But that is a painfully limited scope of what Dynamo can do. Its true power is its breadth: a lot of different things can be done with it. TestFit is narrow and powerful. Both TestFit and Dynamo BIM together? Lethal.

The Future of Generative Design

TestFit generative design

Point solutions for specific typologies and morphologies written in custom code (like Python or C) will knock generative design out of the park. We did 120 iterations (imagine a million iterations!) on a site and came back with a fat stack of data to analyze. Trace paper cannot compare to this. The problem is that most generative platforms (I am looking at you, Project Fractal) don’t constrain the geometry enough for the result to be worth anything. With TestFit, we will get to the point where the geometry is constrained enough for generative design to make sense. Some examples of constraints:

  • Adding additional fire stairs when buildings get too big
  • Adding additional elevators when buildings get too tall
  • Growing parking when units become too numerous
  • Regulatory constraints (max FAR, max lot coverage, max height)

With these constraints plugged in, we can then define goals for the generative analysis:

  • Most energy efficient
  • Most natural light
  • Most economic to build
  • Best views from the building

Without a proper constraint engine that encompasses almost everything, these goals are useless. We are building that constraint engine, right now, and for us it starts with TestFit.
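As a sketch of what generate-constrain-score could look like: every name and number below is hypothetical, and the random "solver" is a stand-in for a real one like TestFit.

```python
import random

def generate_candidate():
    """Hypothetical stand-in for one solver iteration on a site."""
    return {
        "far": random.uniform(0.5, 3.0),           # floor area ratio
        "height_ft": random.uniform(20, 120),      # building height
        "daylight_score": random.random(),         # 0..1, higher is better
    }

def satisfies(design, max_far=2.0, max_height_ft=85):
    """Hard regulatory constraints: violating designs are worthless."""
    return design["far"] <= max_far and design["height_ft"] <= max_height_ft

random.seed(42)
candidates = [generate_candidate() for _ in range(120)]  # 120 iterations
feasible = [d for d in candidates if satisfies(d)]       # constraint engine
best = max(feasible, key=lambda d: d["daylight_score"])  # goal: most light
```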

Clifton is CEO of TestFit.io. Learn more about TestFit to Revit via Dynamo BIM.


Macro BIM: When global parameters rule everything

CAD to micro BIM to macro BIM

A skilled CAD draftsman finds hand-drafting painfully slow, and a BIM modeler finds CAD painfully slow. So what makes BIM painfully slow? It is an interesting thought experiment.

The Glory Days of Blueprints (pre-1980s)

Glory Days of Blueprints 1980s

The tools for building design have always reflected the available technology. Medieval masons scratched on plaster in their lofts, and modern architectural masters sketch with ink on paper. The glory days of blueprints ended well before I was born in 1991, but alas, we are stuck with the word “blueprint” from a chemical reprographic process.

From Hand To Mouse

Introduction of CAD real estate architecture

In the AECO industry, we never fully captured the power of CAD. Many are still manually setting the height of annotations, creating sheets one at a time, or using static blocks or no blocks at all. And now BIM has emerged. The concept of BIM is simply Building Information Modeling: take an entire building’s worth of information and model it! As an intern, I asked myself whether “going to BIM” was a good idea. If we could not harness the power of CAD fully, how would we capture BIM fully? As I have learned in the real world, this view is far too pessimistic. Even partial adoption of either of these ideas is better than hand-drafting everything.

From CAD to BIM


Within a single generation, one architect can do the work of an entire team of architects by utilizing BIM. Technology is enabling us to do significantly more with less. The latest innovation is BIM, and what most folks use right now is micro BIM. Micro BIM can model EVERYTHING. It holds so much detail that there are even options for it to show you less detail. Micro BIM can tell you how many nails (edit: this is hyperbole; most won’t do this in practice), linear feet of pipe, or water closets there are in a 3,000,000 sf building.

It is very impressive, but attaining accuracy from a BIM model takes a significant amount of investment up-front. The whole building must be built, by hand (mouse?) inside a computer. And just like a real world building, editing this virtual building can become very time consuming. Every object must be modeled, hosted, and placed. It is a painstaking bottom-up process… and for a while is slower at immediate results than CAD.

CAD vs BIM implementation AECO

But what will my generation do? Can we squeeze this process again?


Macro BIM

Micro BIM has been successful because building a virtual building piece-by-piece is more accurate than drawing a representation of it. Macro BIM is simply BIM from the top-down. Global parameters rule everything. Macro BIM significantly reduces the amount of time spent on design by accurately defining the relationship between micro BIM components.

TestFit and Macro BIM aka BIM from the top down

Here is how to implement it:

Algorithm Architects

This group of people will use macro BIM to efficiently solve micro BIM problems. The lines of code these algorithm architects write read more like design parameters governing all lower-level building components:

  • Ensure all entry doors are within 100′ of a parking space
  • Ensure all units are at least 30 feet deep
  • Provide fire safety stairs in every building area
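Those three parameters could be sketched as a rule check over a simplified building model. The data shape here is hypothetical; a real macro BIM engine would evaluate rules like these against live geometry:

```python
def check_building(building):
    """Evaluate the three macro BIM parameters above against a
    simplified building model. Returns a list of violations."""
    violations = []
    for door in building["entry_doors"]:
        if door["dist_to_parking_ft"] > 100:
            violations.append(f"entry door {door['id']} > 100 ft from parking")
    for unit in building["units"]:
        if unit["depth_ft"] < 30:
            violations.append(f"unit {unit['id']} shallower than 30 ft")
    if building["fire_stairs"] < 1:
        violations.append("building area lacks a fire safety stair")
    return violations

sample = {
    "entry_doors": [{"id": "A", "dist_to_parking_ft": 80}],
    "units": [{"id": 101, "depth_ft": 32}, {"id": 102, "depth_ft": 28}],
    "fire_stairs": 2,
}
print(check_building(sample))  # ['unit 102 shallower than 30 ft']
```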

When several (hundreds?) of these parameters work together, we see a real building take shape with macro BIM. How all of these parameters fight one another for prominence will create friction, but in the end they create a symphony. There are a few implementations of macro BIM out there already: Site Ops, Dprofiler, and Residential Engine.

This video gives a brief overview of what macro BIM can do for multifamily buildings.

The long term aim? Generative design. Stay tuned.

Re-thinking site selection for real estate developers

Site Identify for site identification

What is the typical process for site selection? It is usually very bottom-up: talking to brokers, driving around looking, or hearing about a site at happy hour. With the power of public records and massive amounts of data, we can look to software for a different route.

What is a top-down approach to site selection?

I am going to need to know two things, minimally, in order to run an economic model on new potential sites: first, where the site is, and second, what kind of building (if any) is on it. For this workflow I will use two different platforms.

  1. Site Identify is a tool to do massive search functions on parcel + assemblage data.
  2. TestFit is a tool to solve site plans quickly.

For this search I have specific building criteria in mind:

  • Building Type: Multifamily (TestFit specializes in this)
  • Construction Type: 4-story stick (Type V) wrapping a garage (1.5 stalls per unit)
  • Total Units: Roughly 270 (900 sf avg)
  • Roughly a 60 DU/AC density

Some math: 270 units / 60 units per acre = 4.5 acres zoned for multifamily. This helps me target the minimum and maximum site size for development.
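The same math as a few lines of Python, with the parking and area figures that follow from the criteria above:

```python
# Criteria from the search above.
units = 270           # target unit count
density_du_ac = 60    # dwelling units per acre
avg_sf = 900          # average unit size, sf
stalls_per_unit = 1.5 # garage parking ratio

site_acres = units / density_du_ac       # acres of multifamily zoning needed
rentable_sf = units * avg_sf             # total unit area, sf
garage_stalls = units * stalls_per_unit  # parking stalls required

print(site_acres, rentable_sf, garage_stalls)  # 4.5 243000 405.0
```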

Taking these criteria, I can program the search into Site Identify:

  • Market: San Antonio (inside loop 410)
  • Acreage: 4-5 Acres
  • Existing Improvement Age: Built before 1968 (50 years)

This search yielded twelve possibilities: the first four are a no-go, the middle four are maybes, and the final four are pretty good: link to map

The two most promising of these twelve are definitely 421 Roosevelt and 815 E. Ashby. I am going to plug them into Residential Engine to get a site plan to work with.

421 Roosevelt easily yields more than 275 units in a four level wrap.

815 E. Ashby knocks it out of the park at first glance, but it has a freeway lining its north side:

I am going to flip to a two-tray garage and try to turn my back on 281 as much as possible. Here are the results:

I think the second option is much more realistic for the location. Here is a video of how this workflow looked in Residential Engine:

To Plug into the Pro-Forma:

421 Roosevelt: 281 units / 887 sf average unit size on 4.76 acres

815 E. Ashby: 273 units / 880 sf average unit size on 5.02 acres

Final Thoughts

While this is not an ordinary workflow, it could be the blueprint for the future of development. Instead of waiting for something to come along, parcels could be targeted during a down-cycle, relationships built with land-owners, and deals executed during the next cycle.

Just to recap: We looked at San Antonio (inside loop 410) and found 12 parcels (not assemblages) that could be developed into a 4 level wrap apartment community. We culled out sites that were not very desirable and landed on two to test for a site plan. Residential Engine gave us a few options to consider with our financial model.

A special thanks to Site Identify for allowing me to test their awesome online software! If you are interested in it contact David Morin.