How computational design, modeling are changing engineering
Visual scripting tools empower engineers by leveraging the power of coding and computational design to automate tasks and analyze data
- Define computational design and modeling.
- Identify ways in which computational design can be applied to engineering workflows.
- Explore how design space exploration informed the design of a higher education facility.
The building design and engineering industries are changing. Tools like Grasshopper for Rhino and Dynamo for Revit make it easier than ever for architects, engineers and contractors to leverage the power of coding to automate tasks and analyze data.
Visual scripting platforms replace text-based programming languages with a “connect-the-dots” platform designed for beginner programmers and visual thinkers. Rather than write code from scratch, users connect pre-made code “nodes” within a 2-D visual canvas (see Figure 1). By making the power of coding accessible to nonprogrammers, these tools are changing how projects are designed and constructed.
These technologies are unleashing a wave of new techniques that streamline existing workflows and enable entirely new modes of design integration. Engineering firms across the world strive to comprehend not only how new technology will affect practice, but also how to integrate the technology into design. Design workflows that leverage these technologies constitute what is commonly called computational design.
New processes never come without challenges. Applying computational design requires a significant time investment and skill sets formerly uncommon to engineering practice. Computational script writers need to possess both a firm knowledge of building design principles (by discipline) and basic coding logic. Because these skill sets are disparate, visual scripting is commonly learned by designers within the professional environment.
Resources such as Lynda, LinkedIn Learning and Performance Network aid in establishing foundational knowledge of computational design software. Visual scripting platforms are supported by vibrant open-source communities that produce freely available code nodes and share example files through forums, but new users may find navigating these digital ecosystems challenging. If executed properly, the initial time investment in visual scripting can pay off tenfold as scripts are applied across multiple projects. As coding propagates across industries and curricula, computational design will evolve from an intimidating design option to a necessary tool.
The power of computational design is best conveyed through examples that showcase how computation affects workflows. This discussion will begin with overviews of several fundamental automation scripts that improve Autodesk Revit workflows, collectively referred to as computational building information modeling (BIM), and progress in complexity to explore fully automated data-centric workflows that support computational performance analysis. These examples share one thing in common: they were built by engineers and designers who learned to code.
Diving into computational design
The fundamental building block of computational BIM is its ability to reduce repetitive tasks. A common example is a “get-set” script that pulls data from a BIM model and assigns new data across multiple elements.
For example, updating title block information is a common repetitive task performed across disciplines and is often accomplished through manual data input. A get-set script can read (“get”) elements within the title block and output (“set”) new parameters. A get-set title block script can read sheet names and/or numbers, then update the title block’s key plan to reflect the associated building area.
If nomenclature varies across projects, users are not pigeonholed; scripts are flexible and can be easily modified to align with specific project needs and/or client standards. Get-set scripts are a great way to break ground in computational design as they can usually be completed with a few nodes and offer immediate time-saving benefits.
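The get-set pattern can be illustrated with a short sketch. In Dynamo or pyRevit the "get" and "set" steps would call the Revit API against live title block elements; here title blocks are mocked as dictionaries, and the sheet-prefix-to-area lookup rule is a hypothetical project convention.

```python
# Hypothetical mapping from sheet number prefix to building area.
AREA_BY_PREFIX = {"M1": "Area A", "M2": "Area B", "E1": "Area A"}

def get_set_title_blocks(title_blocks):
    """Read ("get") each title block's sheet number, then write ("set")
    its key plan parameter to the matching building area."""
    for tb in title_blocks:
        prefix = tb["sheet_number"].split("-")[0]   # get: read existing data
        area = AREA_BY_PREFIX.get(prefix)
        if area is not None:
            tb["key_plan"] = area                   # set: assign new data
    return title_blocks

sheets = [
    {"sheet_number": "M1-101", "key_plan": ""},
    {"sheet_number": "E1-201", "key_plan": ""},
]
print(get_set_title_blocks(sheets))
```

Because the lookup table sits in one place, adapting the script to a different client's sheet-numbering standard means editing only that mapping, which is what makes get-set scripts flexible across projects.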
While BIM models have advanced cross-discipline collaboration considerably, room for improvement remains, particularly in mechanical and electrical coordination. Computational BIM becomes increasingly complex with the development of “find-create” scripts, which can enhance mechanical, electrical and plumbing coordination by creating new elements based on information in the model.
A find-create script builds on the coordination tools available within design software, using computation to create a platform that enhances that coordination. Consider the example of typical MEP coordination within the building model, which involves identifying powered mechanical equipment and manually adding electrical connections and their parameters.
A mechanical and electrical, or MEEL, coordination schedule can be created in the building model to identify all powered equipment and their parameters. A find-create script can filter all mechanical elements populated by the building model’s MEEL coordination schedule and create a new element: in this case, an electrical connection. The electrical connection is placed using a standard node that locates the center point of an element, then reads and assigns all electrical parameters populated by the mechanical equipment family.
Intelligence can be added to the script by modifying the electrical connection to align with equipment needs. The script reads the mechanical equipment tag and modifies the electrical connection type; e.g., all electrical connections with tags prefixed by “RTU-” can be converted to direct connections. The MEEL find-create script enhances coordination by seamlessly creating electrical components that align with mechanical equipment parameters while avoiding the potential for human error.
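The find-create pattern described above can be sketched as follows. A real MEEL script would query the Revit API for equipment populated by the coordination schedule; here equipment is mocked as dictionaries, all field names are illustrative, and the "RTU-" direct-connection rule mirrors the example in the text.

```python
def find_create_connections(equipment):
    """Find powered mechanical equipment and create a matching electrical
    connection element for each, copying its electrical parameters."""
    connections = []
    for eq in equipment:
        if not eq.get("powered"):
            continue  # find: skip equipment with no electrical load
        conn = {
            "host_tag": eq["tag"],
            "location": eq["center_point"],  # placed at the element's center
            "voltage": eq["voltage"],        # parameters copied from the
            "load_va": eq["load_va"],        # mechanical equipment family
            # intelligence: rooftop units get direct connections
            "type": "direct" if eq["tag"].startswith("RTU-") else "receptacle",
        }
        connections.append(conn)             # create: the new element
    return connections

equipment = [
    {"tag": "RTU-1", "powered": True, "center_point": (10.0, 4.0, 30.0),
     "voltage": 460, "load_va": 15000},
    {"tag": "EF-2", "powered": True, "center_point": (2.0, 8.0, 12.0),
     "voltage": 120, "load_va": 600},
]
for c in find_create_connections(equipment):
    print(c["host_tag"], c["type"])
```

Because every connection is generated from the equipment's own parameters, the data can never drift out of sync the way manually entered connections can.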
The role of computational BIM in the design process can evolve from small discrete tasks to a comprehensive workflow. Although most engineering involves unique calculation and design, common workflows within the design process can be streamlined.
An example of a streamlined design process can be found in standard office mechanical design, in which square plaque diffusers are commonly sized based on noise criterion. Mechanical ceiling layout involves calculating room design airflow, placing diffusers in the building model, then manually sizing and assigning airflows to diffusers. A mechanical workflow script can read space loads within the building model, assign diffuser airflow, size the diffuser based on user-specified noise criterion and tag the diffuser to coordinate with the grille, register and diffuser schedule. The grille, register and diffuser workflow saves time, reduces human error and ensures design consistency across a project.
While these capabilities are inherent within most BIM software, computation flourishes through its ability to customize design parameters. This workflow is especially beneficial for projects involving more complex designs or large-scale floorplates.
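The sizing step of the diffuser workflow can be sketched as below, assuming a hypothetical capacity table of maximum airflow per neck size at the specified noise criterion. A real project would read space loads from the model and use manufacturer performance data; the capacities here are placeholders.

```python
# Hypothetical maximum airflow (cfm) per square plaque diffuser neck
# size (inches) at NC 25; real values come from manufacturer data.
CAPACITY_AT_NC25 = [(6, 100), (8, 200), (10, 350), (12, 550)]

def size_diffusers(room_cfm, diffuser_count):
    """Split the room's design airflow across its diffusers and pick the
    smallest neck size that meets the noise criterion."""
    cfm_each = room_cfm / diffuser_count
    for neck_in, max_cfm in CAPACITY_AT_NC25:
        if cfm_each <= max_cfm:
            return {"cfm_each": cfm_each, "neck_in": neck_in}
    raise ValueError("Airflow exceeds largest diffuser; add diffusers")

# A 1,200 cfm office zone served by four ceiling diffusers.
print(size_diffusers(1200, 4))  # 300 cfm each -> 10 in. neck
```

In the full script, the returned size and airflow would then be written back to each diffuser family and its tag so the grille, register and diffuser schedule stays coordinated automatically.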
Enhancing design workflows
Recognizing that computational methods can automate everything from discrete tasks to full workflows offers an opportunity to reevaluate the “what” and “when” of the design process. Rather than simply improving existing workflows, computational design methods offer opportunities for entirely new workflows.
These changes don’t happen overnight. Grassroots development of new workflows is often a product of incremental organic improvements that react to immediate project needs. Innovation on one project often becomes the foundation for breakthroughs on the next. Once digital capabilities have matured, firms can challenge when in a project key services are provided.
Slow analysis that was once relegated to the design development phase of a project is now feasible in the schematic or even conceptual phases of design. Early introduction of automation and data analysis techniques within the design process changes the value proposition. Nowhere is this more apparent than in the realm of performance analysis.
Traditional performance analysis workflows can validate design options only after design decisions have been made, informed by the static perspective of a designer’s past experience that may not be specific to the current design. This is because of one simple constraint: running analysis requires a design to analyze. Even the best performance analysis is forced to contend with this fundamental delay.
If anything were possible, analysis would guide design decisions as they are being made, much like an expert calling upon years of experience and knowledge. Fortunately, the constraints of the past no longer hold, and computational performance analysis provides the ability to inform at the speed of design.
Parametric modeling, access to vast computational resources and machine learning techniques enable a new approach to performance modeling that allows integrated design teams to consider nuanced, design-specific analysis in real time. This is possible because the data required to power potent statistical models and machine learning algorithms can be generated with advanced visual scripting workflows. This data and the statistical models it supports can be mined to make sense of complexity and predict performance, rather than simulate it.
The process can be referred to as design space exploration and it consists of three steps:
- Model the design challenge parametrically using visual scripting.
- Generate design permutations.
- Explore the resulting data for trends and insight.
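The three steps above can be sketched with a toy parametric model: each design parameter gets a range of values, every combination is enumerated, and a stand-in "simulation" populates the design-space table. The parameter names, their ranges and the scoring function are all illustrative, not the project's actual model.

```python
import itertools

# Step 1: model the design challenge parametrically. Each key decision
# becomes a parameter with a range of reasonable values (illustrative).
parameters = {
    "tilt_deg": [0, 15, 30],
    "glazing_vlt": [0.4, 0.6],
    "fin_depth_ft": [0.0, 1.0, 2.0],
}

def simulate(design):
    # Placeholder for an EnergyPlus/Radiance run: returns a fake
    # daylight score so the pipeline is runnable end to end.
    return design["glazing_vlt"] * 100 - design["fin_depth_ft"] * 5

# Step 2: generate and evaluate every design permutation.
design_space = []
for combo in itertools.product(*parameters.values()):
    design = dict(zip(parameters.keys(), combo))
    design["daylight_score"] = simulate(design)
    design_space.append(design)

# Step 3: the resulting table is the "design space" to be mined.
print(len(design_space))  # 3 * 2 * 3 = 18 permutations
```

On a real façade study the permutation count runs into the hundreds or thousands, which is why the simulation loop itself must be fully automated through scripting.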
Design space exploration was leveraged on the ongoing Virginia Tech Innovation Campus project in Alexandria, Va., to inform the various facets of a gemlike façade design. The challenge was to balance daylighting, energy loads and aesthetics while generating as much electricity as possible from integrated photovoltaic systems. Each facet provided a unique solar condition, yet all façades needed to be integrated into a cohesive whole.
The team wanted to understand how design decisions like orientation, tilt, window shape, glazing visible light transmittance, fin depth, photovoltaic type and interior program affected the various design criteria for each façade. Which decisions matter most? Which combinations best balance competing criteria?
To answer these and other questions, a series of Grasshopper parametric models were developed to explore photovoltaic generation, internal daylight and peak cooling and heating loads. Façade geometry was parametrically modeled in Grasshopper so that every key design decision was captured as a unique parameter with value ranges that represented reasonable design possibilities. This way, every possible state of the parametric model represented a potential design that the team might consider.
Different questions required different key parameters and thus were modeled as separate parametric models. This geometry was then connected to the EnergyPlus and Radiance analysis software using Grasshopper’s Ladybug and Honeybee plugins. Hundreds of design permutations were simulated using Thornton Tomasetti’s Colibri plugin until a database of simulation results was collected that represented all positions of the parametric model. This database can be referred to as a “design space,” as it represents the space of all design possibilities contained within the parametric model. Once the parametric models had been translated into databases, that data could be mined for insight.
At this stage, the team operated more as data scientists than as engineers or architects. They leveraged Power BI to explore and visualize the multidimensional data and used multivariate linear regression, a simple machine learning algorithm, to measure the relative impact of each design decision.
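Using regression coefficients to rank design decisions can be sketched as follows. Because the inputs are standardized (unit variance), the magnitude of each fitted coefficient is directly comparable; the synthetic data below stands in for the project's simulated design-space database, and the parameter assignments are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Synthetic design-space samples: three standardized design parameters
# (say, tilt, glazing VLT, fin depth -- labels are illustrative).
X = rng.normal(size=(n, 3))
# Fake response: parameter 0 matters most, parameter 1 barely matters.
y = 3.0 * X[:, 0] + 0.2 * X[:, 1] - 1.0 * X[:, 2] + rng.normal(0, 0.1, n)

# Fit y ~ X @ coef + intercept via ordinary least squares.
A = np.column_stack([X, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# With standardized inputs, |coefficient| measures relative impact.
impact = np.abs(coef[:3])
ranking = np.argsort(impact)[::-1]
print(ranking)
```

The same readout is what lets a team answer "which decisions matter most?" before committing to any single design.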
The results confirmed some assumptions and also surprised some team members. The architects expected that adding vision glass between floor and sill would have little impact on internal daylight levels, but they were surprised by the degree to which internal program affected peak cooling loads. These insights helped focus the team on the most impactful decisions. In the case of peak loads, it offered the opportunity for the heating, ventilation and air conditioning engineer to explain the limitations of radiant cooling systems so that the team could locate high internal load spaces away from sunny façades, reducing peak loads enough to allow the highly efficient systems to be used. Rather than simply identify optimal possible designs, the team leveraged data to guide priorities and inform general approaches.
A typical meeting consisted of an integrated team of architects, HVAC engineers and daylighting experts discussing potential solutions and consulting pre-simulated data. Because the carefully crafted parametric models generated data for all key decision combinations, the analysis could follow wherever the conversation happened to go. Effectively, the team instantly “analyzed” new design ideas on the fly because the analysis already existed. Critically, these discussions happened in early schematic design when the team still had time to adjust crucial decisions. Armed first with a greater understanding of the problem, the team could then develop the best design. This enabled the team to inform design rather than validate options.
The vast amount of data caused its own set of challenges and often led to analysis paralysis. It is easy to comprehend four options, but 400? The team quickly discovered that visualizing and simplifying complexity was key. They needed a way to navigate the maze of information and react to design decisions on the fly.
Applying data to design
To this end, the team leveraged artificial neural network regressors (a type of machine learning algorithm) to power models that could predict performance in real time. By training the artificial neural networks on the design space data, the performance of any state of the parametric model could be predicted, even states that had not been pre-simulated.
Essentially, the machine learning model interpolates between existing simulation data and displays the predicted data on top of the physical form of the parametric model. Seeing colored data in context with the physical design is much more understandable than interpreting a dot on a scatter plot.
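The interpolation idea can be sketched with a toy regressor: train a small network on pre-simulated points, then predict a design state that was never simulated. This pure-NumPy single-hidden-layer network and its one-parameter "performance curve" are stand-ins; a production workflow would use a library regressor trained on the real design-space database.

```python
import numpy as np

rng = np.random.default_rng(1)
# "Pre-simulated" training data: performance vs. one design parameter.
x = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * x)              # stand-in performance metric

# One hidden layer of 16 tanh units, trained by full-batch gradient descent.
W1 = rng.normal(0.0, 1.0, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
losses = []
for _ in range(5000):
    h = np.tanh(x @ W1 + b1)           # forward pass
    pred = h @ W2 + b2
    err = pred - y
    losses.append(float((err ** 2).mean()))
    # backward pass: mean squared error gradients
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Predict an unsimulated design state between the training points.
pred_new = float(np.tanh(np.array([[0.125]]) @ W1 + b1) @ W2 + b2)
print(losses[0], losses[-1], pred_new)
```

Once trained, evaluating the network takes microseconds, which is what makes "analysis at the speed of design" possible in a live meeting.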
This method of contextualizing data can provide teams with an interactive parametric model that can be used to discuss possible scenarios. Rather than develop a PowerPoint that only reports the performance of options considered before the meeting, teams are now able to create custom apps that can respond to live decisions and unforeseen questions. Such design interfaces can communicate the effect of owner preferences on initial design options, helping to build consensus and navigate alternatives. As a bonus, these models can be reused on future projects at no cost if they represent common design challenges rather than analysis of specific options.
This new approach to informing design relies upon the ability to automate the entire analysis workflow using scripting. Once it is feasible to analyze hundreds of potential designs, engineers are no longer required to wait for design options to begin the analysis process. The team can model the design challenge parametrically and deliver information as key decisions are made. This and other data-centric workflows can revolutionize how engineers and architects work together.
All of the workflows discussed herein involve the generation, interpretation or manipulation of data through code and, as such, should be considered BIM applications. Firms that use Revit and Rhino are well-positioned to leverage the power of coding because these software platforms provide robust access to their underlying data. Firms that use AutoCAD or SketchUp lack easy access to data and risk being left behind by the continuing digital revolution.
Although not immediately apparent, the building industry has been using computational BIM for more than 10 years through Revit add-ins, such as Ideate Software. These add-ins extend the power of BIM models by accessing the application programming interface. Dynamo and Grasshopper provide the same benefits, but through visual coding on a free and open-source platform. These visual coding platforms free companies from the limitations of building modeling software and allow users to adapt their software to their workflows, not vice versa.
Education is key to staying up to date with an ever-changing technology landscape. As more universities add programming and parametric logic to their architecture and engineering curricula, professionals without formal training will need to develop the digital literacy necessary to direct those who have it.
While every architect, engineer or contractor cannot be expected to attend night school to learn to code, we hope that many recognize that visual scripting is a pathway to applicable digital skills designed for people like us. Coding is becoming a part of everyday life faster than many anticipated, and it is only a matter of time until these practices are fully immersed in the building design process.