How computational design, modeling are changing engineering
Visual scripting tools empower engineers by leveraging the power of coding and computational design to automate tasks and optimize data
- Define computational design and modeling.
- Identify ways in which computational design can be applied to engineering workflows.
- Explore how design space exploration informed the design of a higher education facility.
The building design and engineering industries are changing. Tools like Grasshopper for Rhino and Dynamo for Revit make it easier than ever for architects, engineers and contractors to leverage the power of coding to automate tasks and analyze data.
Visual scripting platforms replace text-based programming languages with a “connect-the-dots” platform designed for beginner programmers and visual thinkers. Rather than write code from scratch, users connect pre-made code “nodes” within a 2-D visual canvas (see Figure 1). By making the power of coding accessible to nonprogrammers, these tools are changing how projects are designed and constructed.
These technologies are unleashing a wave of new techniques that streamline existing workflows and enable entirely new modes of design integration. Engineering firms across the world strive to comprehend not only how new technology will affect practice, but also how to integrate the technology into design. Design workflows that leverage these technologies constitute what is commonly called computational design.
New processes never come without challenges. Applying computational design requires a significant time investment and skill sets formerly uncommon in engineering practice. Computational script writers need both a firm knowledge of building design principles (by discipline) and basic coding logic. Because these skill sets are disparate, visual scripting is commonly learned by designers within the professional environment.
Resources such as Lynda, LinkedIn Learning and Performance Network aid in establishing foundational knowledge of computational design software. Visual scripting platforms are supported by vibrant open-source communities that produce freely available code nodes and share example files through forums, though new users may find navigating these digital ecosystems challenging. If executed properly, the initial time investment in visual scripting can pay off tenfold as scripts are applied across multiple projects. As coding propagates across industries and curricula, computational design will evolve from an intimidating design option into a necessary tool.
The power of computational design is best conveyed through examples that show how computation affects workflows. This discussion begins with overviews of several fundamental automation scripts that improve Autodesk Revit workflows, collectively referred to as computational building information modeling, then progresses in complexity to fully automated, data-centric workflows that support computational performance analysis. These examples share one thing in common: they were built by engineers and designers who learned to code.
Diving into computational design
The fundamental building block of computational BIM is its ability to reduce repetitive tasks. A common example is a “get-set” script that pulls data from a BIM model and assigns new data across multiple elements.
For example, updating title block information is a common repetitive task performed across disciplines and is often accomplished through manual data input. A get-set script can read (“get”) elements within the title block and output (“set”) new parameters. A get-set title block script can read sheet names and/or numbers, then update the title block’s key plan to reflect the associated building area.
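The get-set pattern can be sketched in a few lines. In practice this logic runs inside a Dynamo Python node against the Revit API; the dictionary-based sheets, parameter names and area mapping below are hypothetical stand-ins used purely to illustrate the read-then-write flow.

```python
# Minimal "get-set" sketch: read sheet numbers, write key plan values.
# The data structures are hypothetical stand-ins for BIM elements; a real
# script would use the Revit API via a Dynamo Python node.

# Hypothetical mapping from sheet-number prefix to building area.
AREA_BY_PREFIX = {"A1": "Area A - North Wing", "A2": "Area B - South Wing"}

def update_key_plans(sheets):
    """Get each sheet's number, then set its title block's key plan."""
    for sheet in sheets:
        prefix = sheet["number"].split("-")[0]           # "get" step
        area = AREA_BY_PREFIX.get(prefix, "Overall")
        sheet["title_block"]["key_plan"] = area          # "set" step
    return sheets

sheets = [
    {"number": "A1-101", "title_block": {"key_plan": ""}},
    {"number": "A2-201", "title_block": {"key_plan": ""}},
]
update_key_plans(sheets)
print(sheets[0]["title_block"]["key_plan"])  # Area A - North Wing
```

The value of the pattern is that the loop scales: the same few lines update two sheets or two hundred, which is where manual data entry breaks down.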
Figure 3: Parametric modeling leads to data-heavy outputs, which may be overwhelming. Programs such as Thornton Tomasetti’s Design Explorer provide flexible, interactive data dashboards to help visualize complex data. Courtesy: SmithGroup
While these capabilities are inherent within most BIM software, computation flourishes in its ability to customize design parameters. This workflow is especially beneficial on projects involving more complex designs or large-scale floorplates.
Enhancing design workflows
Recognizing that computational methods can automate everything from discrete tasks to full workflows offers an opportunity to reevaluate the “what” and “when” of the design process. Rather than simply improving existing workflows, computational design methods offer opportunities for entirely new workflows.
These changes don’t happen overnight. Grassroots development of new workflows is often a product of incremental, organic improvements that react to immediate project needs. Innovation on one project often becomes the foundation for breakthroughs on the next. Once digital capabilities have matured, firms can challenge when in a project key services are provided.
Slow analysis that was once relegated to the design development phase of a project is now feasible in the schematic or even conceptual phases of design. Introducing automation and data analysis techniques early in the design process changes the value proposition. Nowhere is this fact more apparent than in the realm of performance analysis.
Traditional performance analysis workflows validate design options only after design decisions have been made, guided by the static perspective of a designer’s past experience, which may not be specific to the current design. This is because of one simple constraint: running analysis requires a design to analyze. Even the best performance analysis is forced to contend with this fundamental delay.
If anything were possible, analysis would guide design decisions as they are being made, much like an expert calling upon years of experience and knowledge. Fortunately, the constraints of the past no longer hold, and computational performance analysis provides the ability to inform at the speed of design.
Parametric modeling, access to vast computational resources and machine learning techniques enable a new approach to performance modeling that allows integrated design teams to consider nuanced, design-specific analysis in real time. This is possible because the data required to power potent statistical models and machine learning algorithms can be generated with advanced visual scripting workflows. This data and the statistical models it supports can be mined to make sense of complexity and predict performance, rather than simulate it.
The process can be referred to as design space exploration and it consists of three steps:
- Model the design challenge parametrically using visual scripting.
- Generate design permutations.
- Explore the resulting data for trends and insight.
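The three steps above can be sketched end to end. The parameter names, value ranges and `solar_gain()` function below are hypothetical stand-ins; on a real project, step 2 would drive EnergyPlus or Radiance simulations rather than a toy formula.

```python
# Design space exploration sketch: parametric model -> permutations -> data.
import itertools
import math

# Step 1: key design decisions as parameters with discrete value ranges
# (hypothetical names and values, for illustration only).
parameters = {
    "tilt_deg": [0, 15, 30],
    "fin_depth_m": [0.0, 0.3, 0.6],
    "glazing_vlt": [0.4, 0.6],
}

def solar_gain(tilt_deg, fin_depth_m, glazing_vlt):
    """Toy performance model standing in for a real simulation."""
    return round(glazing_vlt * math.cos(math.radians(tilt_deg)) * (1 - 0.5 * fin_depth_m), 3)

# Step 2: generate every permutation of the parametric model.
names = list(parameters)
design_space = []
for values in itertools.product(*parameters.values()):
    row = dict(zip(names, values))
    row["solar_gain"] = solar_gain(**row)
    design_space.append(row)

# Step 3: explore the resulting "design space" database for insight.
best = max(design_space, key=lambda r: r["solar_gain"])
print(len(design_space))  # 18 permutations (3 x 3 x 2)
```

Every row of `design_space` is one potential design, which is exactly what makes the resulting database minable for trends.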
Design space exploration was leveraged on the ongoing Virginia Tech Innovation Campus project in Alexandria, Va., to inform the various facets of a gemlike façade design. The challenge was to balance daylighting, energy loads and aesthetics while generating as much electricity as possible from integrated photovoltaic systems. Each facet provided a unique solar condition, yet all façades needed to be integrated into a cohesive whole.
The team wanted to understand how design decisions like orientation, tilt, window shape, glazing visible light transmittance, fin depth, photovoltaic type and interior program affected the various design criteria for each façade. Which decisions matter most? Which combinations best balance competing criteria?
To answer these and other questions, a series of Grasshopper parametric models were developed to explore photovoltaic generation, internal daylight and peak cooling and heating loads. Façade geometry was parametrically modeled in Grasshopper so that every key design decision was captured as a unique parameter with value ranges that represented reasonable design possibilities. This way, every possible state of the parametric model represented a potential design that the team might consider.
Different questions required different key parameters and thus were modeled as separate parametric models. This geometry was then connected to the EnergyPlus and Radiance analysis software using Grasshopper’s Ladybug and Honeybee plugins. Hundreds of design permutations were simulated using Thornton Tomasetti’s Colibri plugin until a database of simulation results was collected that represented all positions of the parametric model. This database can be referred to as a “design space,” as it represents the space of all design possibilities contained within the parametric model. Once the parametric models had been translated into databases, that data could be mined for insight.
At this stage, the team operated more as data scientists than as engineers or architects. They leveraged Power BI to explore and visualize the multidimensional data and used multivariate linear regression, a simple machine learning algorithm, to measure the relative impact of each design decision.
Figure 6: Running multiple linear regression on the design space data quantifies the relative impact of each input parameter. Courtesy: SmithGroup
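The regression step can be illustrated with a self-contained sketch. The toy design-space rows below are generated from known weights so the fit is easy to check; in practice the team would use a library (and would standardize inputs so coefficients on different scales are comparable as "relative impact").

```python
# Multivariate linear regression via the normal equations (X^T X) b = X^T y,
# in pure Python for illustration. The data is a toy design space built from
# known weights: response = 2*tilt + 0.5*fin, so the fit should recover them.

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col and M[col][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit(X, y):
    """Return [intercept, b1, b2, ...] minimizing squared error."""
    Xa = [[1.0] + row for row in X]  # prepend an intercept column
    k = len(Xa[0])
    XtX = [[sum(r[i] * r[j] for r in Xa) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xa, y)) for i in range(k)]
    return solve(XtX, Xty)

# Toy design-space rows: (tilt, fin) inputs and a known linear response.
X = [[0, 0], [1, 0], [0, 1], [1, 1], [2, 1], [2, 0]]
y = [2 * t + 0.5 * f for t, f in X]
coefs = fit(X, y)
print([round(c, 3) for c in coefs])  # recovers intercept ~0, tilt 2.0, fin 0.5
```

Here the tilt coefficient is four times the fin coefficient, which is the kind of "which decisions matter most?" ranking Figure 6 depicts.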
Applying data to design
Building on the design space data, the team leveraged artificial neural network regressors (a type of machine learning algorithm) to power models that could predict performance in real time. By training the artificial neural networks on the design space data, the performance of any state of the parametric model could be predicted, even states that had not been pre-simulated.
Essentially, the machine learning model interpolates between existing simulation data and displays the predicted data on top of the physical form of the parametric model. Seeing colored data in context with the physical design is much more understandable than interpreting a dot on a scatter plot.
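The interpolation idea can be shown with a simpler stand-in. The project used neural network regressors; inverse-distance weighting below is an assumed substitute that demonstrates the same principle — estimating an unsimulated state from nearby pre-simulated results. The sample data is hypothetical.

```python
# Predict performance for an unsimulated design state by interpolating
# between pre-simulated design space results. Inverse-distance weighting
# stands in here for the trained neural network regressor.

def predict(query, samples, power=2):
    """samples: list of (inputs, simulated_value); query: new input point."""
    num = den = 0.0
    for point, value in samples:
        d2 = sum((q - p) ** 2 for q, p in zip(query, point))
        if d2 == 0:
            return value  # the query matches a simulated state exactly
        w = 1.0 / d2 ** (power / 2)
        num += w * value
        den += w
    return num / den

# Pre-simulated (tilt, fin_depth) -> daylight score (hypothetical data).
samples = [((0, 0.0), 0.90), ((30, 0.0), 0.75),
           ((0, 0.6), 0.60), ((30, 0.6), 0.50)]
print(round(predict((15, 0.3), samples), 3))  # a blend of the four corner results
```

In practice inputs should be normalized before computing distances (degrees and meters are on very different scales), and a real regressor generalizes better than this sketch, but the contract is the same: any parametric state in, a performance estimate out, fast enough for live design discussions.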
This method of contextualizing data can provide teams with an interactive parametric model that can be used to discuss possible scenarios. Rather than develop a PowerPoint that only reports the performance of options considered before the meeting, teams are now able to create custom apps that can respond to live decisions and unforeseen questions. Such design interfaces can communicate the effect of owner preferences on initial design options, helping to build consensus and navigate alternatives. As a bonus, these models can be reused on future projects at no cost if they represent common design challenges rather than analysis of specific options.
This new approach to informing design relies upon the ability to automate the entire analysis workflow using scripting. Once it is feasible to analyze hundreds of potential designs, engineers are no longer required to wait for design options to begin the analysis process. The team can model the design challenge parametrically and deliver information as key decisions are made. This and other data-centric workflows can revolutionize how engineers and architects work together.
All of the workflows discussed herein involve the generation, interpretation or manipulation of data through code and, as such, should be considered BIM applications. Firms that use Revit and Rhino are well-positioned to leverage the power of coding because these software platforms provide robust access to their underlying data. Firms that use AutoCAD or SketchUp lack easy access to data and risk being left behind by the continuing digital revolution.
Although not immediately apparent, the building industry has been using computational BIM for more than 10 years through Revit add-ins, such as Ideate Software. These add-ins extend the power of BIM models by accessing the application programming interface. Dynamo and Grasshopper provide the same benefits, but through visual coding on a free and open-source platform. These visual coding platforms free companies from the limitations of building modeling software and allow users to adapt their software to their workflows, not vice versa.
Education is key to staying up to date with an ever-changing technology landscape. As more universities add programming and parametric logic to their architecture and engineering curricula, professionals without formal training will need to develop the digital literacy necessary to direct those who did.
While every architect, engineer or contractor cannot be expected to attend night school to learn to code, we hope that many recognize that visual scripting is a pathway to applicable digital skills designed for people like us. Coding is becoming a part of everyday life faster than many anticipated, and it is only a matter of time until these practices are fully immersed in the building design process.