Taking BIM for Structural Engineering to the Limits and Beyond

Autodesk University
Apr 12, 2017

By Matt Wash, Graham Aldwinckle, and Xavier Nuttall

The AEC industry is in a transition period from traditional ways of working to embracing new technologies and processes whose benefits other industries have enjoyed for many years. Through real-life examples, this article discusses how using structured data through collaborative workflows can yield productivity gains, minimize risk and waste, and provide the best possible outcome for all stakeholders involved in the design, construction, and operation of a facility.

Let’s be realistic: this change is not going to happen overnight, so now is the perfect opportunity for you and your teams to make a step change towards continuous improvement. Toyota has embraced lean manufacturing for many years. They have proven that by eliminating waste and adding value at every step of the way, they have not only produced products of the highest quality but have also remained consistently profitable. So how do they do it, and how can their principles be applied to structural engineering?

Look at the way your company traditionally designs and documents a structural solution and identify the areas where you can eliminate waste and ensure you are adding value at every step.

Interoperability
In the projects discussed in our presentation, we can identify a number of areas where we eliminated the duplication of effort and ensured we were able to adapt to possible future changes. For the steel canopy study, the initial considerations were primarily those of the architecture and structure. By using parametric tools, constraints and variables can be established to identify an almost infinite number of possible solutions accounting for every stakeholder request. On every project, we know the client is looking for value for money, so it is essential we provide a solution that is both optimized and flexible enough to accommodate change. To achieve all of the above, it was essential to have a single source of data that flows bidirectionally between multiple platforms.
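
To make the idea concrete, here is a minimal sketch in plain Python (not the project’s actual Grasshopper definition) of sweeping two canopy variables against stakeholder constraints and ranking the surviving options. The parameter names, limits, and weight proxy are all hypothetical.

```python
# Illustrative sketch only: sweep two canopy variables, reject options that
# break the stakeholder constraints, and rank the survivors by a crude
# steel-weight proxy. All values are hypothetical.
from itertools import product

MAX_HEIGHT = 12.0        # m - architectural constraint (hypothetical)
MAX_RISE_TO_SPAN = 0.25  #   - structural constraint (hypothetical)
SPAN = 40.0              # m

def steel_weight_proxy(height, curvature):
    """Very rough stand-in for an analysis result: flatter, lower canopies
    attract more bending, so the proxy rises as curvature and height drop."""
    return SPAN * (1.0 / max(curvature, 0.05)) * (1.0 + 2.0 / max(height, 1.0))

candidates = []
for height, curvature in product(
        [h / 2 for h in range(12, 25)],       # heights 6.0 m .. 12.0 m
        [c / 100 for c in range(5, 31, 5)]):  # rise/span 0.05 .. 0.30
    if height > MAX_HEIGHT or curvature > MAX_RISE_TO_SPAN:
        continue  # violates an architectural or structural constraint
    candidates.append((steel_weight_proxy(height, curvature), height, curvature))

# Present the lightest few options back to the stakeholders for review.
for weight, height, curvature in sorted(candidates)[:5]:
    print(f"height {height:.1f} m, rise/span {curvature:.2f}, proxy weight {weight:.0f}")
```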

Parametric Steel Studies
On the canopy, the architect was responsible for the form, and as such fed their requirements into the parametric Grasshopper model (i.e., curvature of the canopy, maximum height, etc.). As the structural engineer, Arup took these constraints together with the same centerline geometry and imported the data directly into our analysis software. Once analyzed, the required structural member sizes were pushed back into Rhino, and the architect could visualize whether or not the architectural form was satisfied.

If not, the parameters could be tweaked and the process repeated until both parties were satisfied with the outcome. At various points during the process, the client, builder, or quantity surveyor (QS) could be involved to add their own constraints and variables. This iterative process can be repeated until a satisfactory outcome is reached for all stakeholders.

For the pedestrian bridge, we needed to engage the client in an environment they were familiar with. We encouraged them to help solve the problem and explained the design considerations in a very simple visual way. By immersing the stakeholders in this process, we enabled them to influence the solution creatively. By controlling the start and end point locations of the bridge in a virtual model, the client was able to quickly identify the impact of design decisions and could see how parametric modelling gave almost infinite possible solutions to the challenge.

So how did this change the traditional approach to designing and detailing structures of this nature? As with the steel canopy, there was no double handling of information. Only one centerline geometry model was created during the process, and there were no parallel analysis and documentation models, so a detail was never updated twice. The entire team was immersed in a truly holistic collaboration in which traditional silos were broken down.

In the early stages, hand sketches were generated to explore possible structural options, and this is a critical point to note. Not everything will need to be parametrically scripted at all stages. What is the point of wasting a day generating a parametric script for a solution that the architect may reject straight away? A back-of-the-envelope sketch may be all that is required to convey the intent of a crude solution that can be accepted or rejected. Once the principle of the solution is agreed, then is the time to apply parametric techniques to refine and optimize it.

So how do you embrace this type of workflow without needing to create drawings along every step of the journey?

If your client is open to the idea of using the 3D model as the deliverable, then great. There is no reason why you shouldn’t be able to go through the entire design process and issue an LOD 300 model without creating a single drawing, but is this a realistic ambition? The traditionalists will say they still need to see plans, sections, elevations, and details of the solutions, and we all know that, however much we automate this process, manual input is still required to convey everything that needs to be read in conjunction with the 3D model.

The only reason we produce 2D views is that, historically, without today’s technology it was impossible to embed all the necessary information into a single drawing. Making use of 3D interactive PDF views embedded within traditional drawings is a good way to clearly demonstrate design intent without the need to add manual 2D information. The industry will, we hope, eventually work entirely digitally.

Queensland State Velodrome BIM Model Deliverable, Brisbane Australia — Arup
Queensland State Velodrome 2D Drawing Deliverable, Brisbane Australia — Arup

Optimization ‘v’ Rationalization
Balancing optimization against rationalization is a critical part of any project, and this is where engineering judgment must be weighed alongside stakeholder requirements. Optimizing the steel canopy to provide a minimum-weight solution may look the cheapest to a QS, but consideration must also be given to the number of connection types that might be required and the time taken on site to build the canopy. Having a flexible parametric workflow enables a number of solutions to be assessed for cost, time, and quality.
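
As a hedged illustration of that balance, the sketch below weights cost, programme, and buildability criteria to compare a minimum-tonnage scheme against a rationalized one. The schemes, figures, and weights are invented for the example.

```python
# Illustrative only: weight cost, time, and buildability metrics to compare a
# fully optimized scheme against a rationalized one. All figures are hypothetical.
schemes = {
    "optimized (min tonnage)":   {"steel_t": 180, "connection_types": 24, "site_weeks": 14},
    "rationalized (repetition)": {"steel_t": 205, "connection_types": 6,  "site_weeks": 10},
}

# Stakeholder priorities (hypothetical): programme matters most, tonnage next.
WEIGHTS = {"steel_t": 1.0, "connection_types": 0.5, "site_weeks": 2.0}

def score(metrics):
    # Lower is better: normalise each metric against the worst performer.
    worst = {k: max(s[k] for s in schemes.values()) for k in WEIGHTS}
    return sum(WEIGHTS[k] * metrics[k] / worst[k] for k in WEIGHTS)

for name, metrics in sorted(schemes.items(), key=lambda kv: score(kv[1])):
    print(f"{name}: weighted score {score(metrics):.2f}")
```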

The construction industry is forcing us to find better ways of working and to become more efficient and productive. Quite simply, if you are not embracing these opportunities, you will eventually go out of business. So what do you and your teams need to take the next step?

Identify the value you are adding for your client:

  • “By using parametric and optimization techniques on our last project we reduced the tonnage of steel by x%, which equates to around $y.”
  • “By rationalizing the connection design on our last project we saved x weeks in fabrication time and were able to complete the on-site work x months earlier.”

Identify the value you are adding to your company:

  • “By investing two weeks at the start of the construction documentation phase building a parametric script to adapt to architectural changes, we were able to update our model in x days, which would traditionally have taken x+y days.”

Future Skills and Training
The role of both the structural engineer and the technician will need to change. Having an understanding of visual coding/scripting through the likes of Dynamo will allow mundane tasks to be automated so you can focus on the unique and creative aspects of your project. Make a list of the tasks you carry out on every project that take time. How can you avoid the mark-up process? Make the data the focus of your design and documentation, and move the data from one source to another without recreating the information, through the likes of “Flux” or “Excel,” or Industry Foundation Classes (IFC) export/import.
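
As a small example of moving data rather than recreating it, the sketch below writes a hypothetical member list out to a CSV file that Excel (or another platform) can consume directly; the fields and values are illustrative only.

```python
# Minimal sketch of "move the data rather than recreate it": write member data
# out to a CSV that Excel or another tool can pick up. The member list and
# field names are hypothetical.
import csv

members = [
    {"mark": "B01", "section": "610UB125", "length_m": 8.4, "level": "L03"},
    {"mark": "B02", "section": "460UB82",  "length_m": 6.0, "level": "L03"},
]

with open("members.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["mark", "section", "length_m", "level"])
    writer.writeheader()
    writer.writerows(members)
```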

Once you have identified the processes that are not adding value to your business, “Google it.” More often than not there will be a free Dynamo script that does exactly what you are trying to do. If not, there will likely be something similar that you can modify for your challenge.

Dynamo ‘v’ Grasshopper
A question that is frequently asked within our firm is, “When should I use Dynamo and when should I use Grasshopper?” There are many contributing factors as to which is the right tool for the job, but as a general rule Dynamo has far more potential during the documentation phases and provides the opportunity to develop multiple scripts that will work on jobs of all types. Grasshopper is better suited to the concept phase of a project where change is at its most fluid. That is not to say you couldn’t use Dynamo from start to finish of a project, but if you are documenting within the Revit environment, Dynamo has a clear advantage over Grasshopper when drawings are part of the deliverable or when the rest of the design team is using Revit.

Interoperability: Use the Right Software/Workflow for the Job in Hand
“Open BIM” through the use of IFC is breaking down the barriers to exchanging data between platforms. The exchange is by no means perfect, but it is much better than the alternatives of building parallel models or choosing a single product to do everything. In the examples of the bridge and the canopy structure in our presentation, we used Revit, Dynamo, Grasshopper, Geometry Gym, Tekla, GSA Structural Analysis, Strand7, Excel, Statica, and Advance Steel. Most of these tools are freely available or already part of most consultants’ armory.
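
As one hedged example of consuming an IFC exchange outside the authoring tools, the sketch below uses the open-source IfcOpenShell library to read the beams out of an exported model; the file name is hypothetical.

```python
# Minimal sketch of consuming an IFC exchange outside the authoring tool,
# using the open-source IfcOpenShell library (pip install ifcopenshell).
# The file name is hypothetical.
import ifcopenshell

model = ifcopenshell.open("canopy.ifc")

# Pull every beam and report its identity and name for downstream checks.
for beam in model.by_type("IfcBeam"):
    print(beam.GlobalId, beam.Name)
```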

Building Your Team and Sharing Knowledge
Whilst it would be fantastic to send you and your entire team on an external course to learn every piece of software, for most companies this is just not financially viable. Software overload has replaced information overload. So how can you up-skill your teams and encourage continuous improvement? It is important to identify who in your team deserves the opportunity to learn new skills and who will be happy to share those new skills with colleagues. Nominate a champion for each piece of software and give them time to continue improving their knowledge of the product. In return, ask the champions to run lunchtime sessions to show your colleagues how their skills have improved productivity and efficiency.

Demonstrate the Return on Investment to Your Company Leaders
“Shawnee learnt how to automate the checking of the shop detailer’s model and drawings using “Solibri” software to check our Revit model member sizes, geometry, and connections. This saved us 100 hours compared to checking paper copies of the shop detailer’s drawings.”

Most new software that your company will have to pay for will offer a free trial, so use this opportunity to demonstrate how much money and time you could save.

Develop an Internal Network of Like-Minded People
Don’t reinvent the wheel. In a large organization there will likely be someone who has faced a similar problem to the one you are trying to solve. Use the lessons learnt from their experience to ensure you capture what went well and what didn’t. If your company has an Intranet, post your question to a forum where you will get maximum exposure.

Sharing knowledge is easy, but we don’t do it as often as we should. When we have solved a problem and provided the solution to our immediate client, how often do we share that knowledge amongst people outside of the project team? At Arup we use Yammer forums to post challenges to cast the net outside the immediate project team to capture the opinions of anyone who wants to offer one.

Tools Register
We are creating a tools register of every piece of software we use within Arup, identifying everything about each tool: who the champions are, what the pros, cons, and watch-its are, and what the latest version is. The tools register is to be used at the start of a project, as part of the inception review, to identify which workflow is best suited to each job.
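
A register like this can start life as nothing more than structured data. The sketch below is a hypothetical illustration of the fields just described (champion, pros, cons, watch-its, latest version), not Arup’s actual register.

```python
# Hypothetical illustration of a tools register as structured data - the real
# register's fields and entries will differ.
tools_register = [
    {
        "tool": "Dynamo",
        "champion": "A. Engineer",
        "latest_version": "2.x",
        "pros": ["automates Revit documentation tasks"],
        "cons": ["scripts need maintenance between Revit releases"],
        "watch_its": ["agree graph-naming conventions before sharing"],
    },
    {
        "tool": "Grasshopper",
        "champion": "B. Designer",
        "latest_version": "Rhino 6+",
        "pros": ["fast geometric iteration at concept stage"],
        "cons": ["less suited to drawing production"],
        "watch_its": ["lock down input parameters before issuing geometry"],
    },
]

# At project inception, scan the register for candidate workflows and owners.
for entry in tools_register:
    print(entry["tool"], "- champion:", entry["champion"])
```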

Simple and Complex Projects
Parametric modelling and visual coding through Dynamo are not just applicable to complex projects. As a general rule, our parametric workflows start with Grasshopper and Rhino in the concept and scheme stages, when geometry is the primary data exchange. Grasshopper provides instant visual previews of major changes, and in these initial stages of the project there is no requirement to add “information” to the parametric workflow. As the design progresses, the “information/data” becomes more and more important for documentation, and this “information/data” is better suited to Dynamo.

For example, we have developed a number of parametric scripts that apply to our documentation on all projects, regardless of their complexity, and that are not possible through the use of Grasshopper. Setting up the project with regard to generating views, including plans, elevations, sections, etc., is scripted using the exact same process for any job. Adding penetrations through beams to allow ducts, cable trays, etc., to pass is only possible through the use of our Dynamo parametric toolkit.
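
To give a flavour of what such a script looks like, here is a generic sketch of scripted view set-up from a Python node in Dynamo, creating one floor plan per level through standard Revit API calls. It is not Arup’s toolkit, and the naming convention is illustrative.

```python
# Generic sketch of scripted view set-up from a Dynamo Python node, using
# standard Revit API calls - not Arup's actual parametric toolkit.
import clr
clr.AddReference("RevitAPI")
clr.AddReference("RevitServices")
from Autodesk.Revit.DB import (FilteredElementCollector, Level,
                               ViewFamilyType, ViewFamily, ViewPlan)
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager

doc = DocumentManager.Instance.CurrentDBDocument

# Find a floor-plan view family type to create the new views from.
plan_type = next(vft for vft in FilteredElementCollector(doc).OfClass(ViewFamilyType)
                 if vft.ViewFamily == ViewFamily.FloorPlan)

# Snapshot the levels first so the collector is not invalidated mid-loop.
levels = list(FilteredElementCollector(doc).OfClass(Level))

TransactionManager.Instance.EnsureInTransaction(doc)
new_views = []
for level in levels:
    view = ViewPlan.Create(doc, plan_type.Id, level.Id)
    view.Name = "STR - " + level.Name  # naming convention is illustrative
    new_views.append(view)
TransactionManager.Instance.TransactionTaskDone()

OUT = new_views  # Dynamo output port
```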

Content and Structured Data
It’s all about the data. And it’s the content that holds this data. By filtering the Revit schedules, it’s easy to find anomalies in the entered data, which assists the team in ensuring they have suitably well-structured data. It’s still amazing how often we see projects with different ways of specifying the same element (e.g., a beam size sometimes entered as WxD and sometimes as DxW).
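
A simple sketch of that kind of data-hygiene check: normalise the size strings so WxD and DxW entries can be compared, and flag anything that does not parse. The assumption that the larger dimension is listed first, and the sample values, are purely illustrative.

```python
# Small sketch of a data-hygiene check: normalise "<number>x<number>" size
# strings so WxD and DxW entries can be compared, and flag anything that
# doesn't parse. Values are hypothetical.
import re

SIZE_RE = re.compile(r"^\s*(\d+)\s*[xX]\s*(\d+)\s*$")

def normalise_size(size):
    """Return the size with the larger dimension first (illustrative rule)."""
    match = SIZE_RE.match(size)
    if not match:
        return None  # anomaly - flag for the team to fix at source
    a, b = sorted((int(match.group(1)), int(match.group(2))), reverse=True)
    return f"{a}x{b}"

scheduled_sizes = ["310x165", "165 x 310", "310X165", "UB310"]
for raw in scheduled_sizes:
    print(raw, "->", normalise_size(raw) or "ANOMALY")
```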

Once structured data is mastered, one of the most compelling uses is in standard views within Revit. Our presentation demonstrates some examples for rebar views, but the principles apply across the whole range of materials and family types in Revit.

Data Harvesting
Once structured data is used across a number of projects, data harvesting, feeding into big data, becomes a valuable exercise. But it only works if the data is consistent, to allow easy comparisons. The benefit of data harvesting is access to a wider pool of knowledge that can be gathered automatically across a vast array of projects. Presenting this back to the engineer and allowing them to compare project types means a greater quality of output, because a design can be benchmarked against average project metrics, such as rebar densities.
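
As a hedged illustration, the sketch below benchmarks one project’s rebar density against a small harvested data set; every figure is invented for the example.

```python
# Illustrative benchmarking of a project's rebar density (kg of rebar per m3
# of concrete) against a harvested data set. All figures are hypothetical.
harvested = {  # project name -> rebar density in kg/m3
    "Project A": 135.0,
    "Project B": 158.0,
    "Project C": 122.0,
    "Project D": 147.0,
}

current_project = ("New Stadium", 171.0)

average = sum(harvested.values()) / len(harvested)
name, density = current_project
deviation = (density - average) / average * 100

print(f"Average harvested density: {average:.0f} kg/m3")
print(f"{name}: {density:.0f} kg/m3 ({deviation:+.0f}% vs average)")
```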

Matt Wash is a professional engineer currently working as the Australasian structural digital design leader at Arup in Brisbane. Matt has over 20 years of experience with Arup globally.

Graham Aldwinckle is a chartered structural engineer with over 22 years of industry experience. He has a wide range of experience with different building types, including residential, commercial, retail, education, and mixed use.

Xavier Nuttall is a professional structural engineer working out of the Arup Sydney office. Over the last six years, Xavier has specialised in complex geometry and structural optimisation.

Learn more with the full class at AU online: Taking BIM for Structural Engineering to the Limits — and Beyond.
