AECbytes "Building the Future"
Article (October 7, 2008)
atomicBIM: Splitting Data to Unleash BIM’s Power
John Tobin, Principal, Einhorn Yaffee Prescott Architecture & Engineering PC
To be most effective, future BIM solutions will require greater attention to the entire life-cycle of the BIM process. Current BIM applications create massive, singular datasets. However, given that the emerging BIM team is going to be distributed and ever-larger, greater granularity of data is going to be vital for the future, collaborative process.
“Greek columns and their entablatures were at first entirely of timber, with terra-cotta decorations in the upper trabeation, but were converted into stone quite early in the [Hellenic] period, about 600 BC. The translation was quite direct, timber forms being imitated in stonework with remarkable exactness. For this reason, Greek architecture sometimes has been called a ‘carpentry in marble.’”
A History of Architecture, 18th ed., p. 192
Sir Banister Fletcher
It is not uncommon for one generation to continue in the familiar ways of a previous era even while a significant shift is occurring around them. The Greeks, for example, continued to build their temples according to visual principles of earlier wood construction even though a new medium, stone, had made its woodcraft symbolism obsolete. The old iconography was quite literally “enshrined,” and the new generation chose simply not to question the previous path despite a new material and medium.
We may find that a similarly significant shift is occurring in the development of BIM—and now is a good time to re-focus and take stock lest we unintentionally build a faulty workflow. In particular, with the accelerating success of BIM adoption, the shift from a single-player to a multi-player BIM process may soon prompt careful reconsideration of our entire approach to BIM models.
The most obvious concern—unmanageable file sizes—has been unfolding over several years. In our present BIM workflow, we are creating ever larger, difficult-to-access BIM models within a single file, or a series of large linked files which are likely to choke the BIM workflow. An additional concern, however, is that we have no workable protocols or established tools to retrieve and manage the data at the other end for “data extraction.” If we are to fully address these issues, we may want a comprehensive re-evaluation before we get too far along our present path.
Instead of continuing to create ever larger files, we ought to conceptualize and structure the BIM environment for quick and easy access. We could imagine an arrangement in which BIM is composed of many tiny pieces of data. We can call this atomicBIM—that is, BIM in small, discrete pieces of data. An atomized information structure would provide granularity and rapid access, so that subsets of BIM information could be more easily retrieved without a massive download.
The Atomic Structure of BIM
The term “atomicBIM” evokes an image of BIM data in small packets, similar to atoms of an element. This notion’s core difference from a large single file, which also contains many pieces, is that in atomicBIM these pieces would be distinctly addressed and more easily accessed in isolation. This leads to an intriguing question: what exactly might the atomic structure of BIM look like in this scenario? If we could get closer to the very heart of BIM’s core properties, we might better understand how to unpack it for downstream uses.
Most dictionaries define an atom as: “…the smallest unit of an element, having all the characteristics of that element…” Thus, an atom is understood to be a very small but nonetheless still recognizable piece of the substance that it comprises. An atom of BIM then should still possess all the required characteristics and capabilities for the construction process.
Let’s examine how BIM objects have evolved over time and try to discern their “atomic nature” from that exercise.
BM: Today’s BIM movement is wholly predicated on the initial idea that buildings are three-dimensional endeavors, and thus that 3D models are a valuable tool in predicting aspects of the design and construction effort. This 3D-modeling effort in isolation can be simply called “BM,” for Building Modeling. The value proposition of BM alone is to understand the relationship between purely physical, geometric components. The “information” that exists in the BM model is simply spatial—where things start and stop, and how they are arranged—which, while valuable, is really just graphical information.
In the first pure BM phase, the atoms were simply 3D objects—there was no other data, and no opportunity to create schedules of components, arrange them on a timeline, or count them for cost estimating. It was simply geometry. To do any of those other things, we had to add data tags to the objects. This is largely what happened in the next stage of evolution—BM+I.
BM+I: In the first advance, data tags were added to 3D objects. The geometry object dataset was simply expanded to contain fields of data that were attached to the geometry. The tags were added without much architectural “context,” i.e., the objects did not understand that they were parts of a building. They were simply digital 3D objects with data fields—they might as well have been parts of a Black & Decker drill.
However, if users could pluck the data from the 3D objects, arrange them into a spreadsheet, and add the necessary context—often in their head—it would be possible to get some useful findings, like an equipment schedule. The data transfer was not, however, reciprocated. Once a design change was made, the data fields—again without context—would be exported and again manipulated.
Atoms of BM+I therefore were 3D objects with data attached to them, similar to a “pin cushion” configuration. The “pinheads” containing object data were not automatically related to each other.
B.I.M.: Before too long, a new breed of software, originating in the manufacturing world, was maturing that took a totally new approach—3D modeling that was to be in context. In the manufacturing arena, software such as Parametric Technology Corporation’s Pro/ENGINEER emerged that could emulate manufacturing processes. However, plain CAD objects were not the centerpiece of this software; instead, fabrication management and process simulation were. It was a bold new direction in digital design, and it soon arrived at the door of the AEC industry. Similar software soon emerged, particularly Revit, which had at its heart a database. In this new arrangement, the “information engine” was at the center of the software, and both graphical representations and schedules were driven by data contained in the engine.
In addition, data objects were clearly situated in an architectural context; walls for example, were “hardwired” to have certain behaviors, such as hosting doors and windows; gridlines and floor levels were understood as they actually exist in the construction world, that is, as major determinants of building layout. Every component “knew” which floor level it belonged to, and all manner of architectural objects were capable of being scheduled.
At this stage, contextualized building-related data was born: atoms of 3D objects with attached data floating in a further data context, what Victorians might have referred to as “ether.” This heralded the arrival of B.I.M.—“BM” linked to information-management. This is roughly where we find ourselves today, with 3D objects in a context that also creates linkages between the object data.
But that is not the end of the story.
BI(m): The B.I.M. phase generally captures BIM’s status as we now know it, and much of its present focus revolves around its use in the design process—the data-input phase. However, we are quickly evolving into a new phase where models get transferred downstream to an increasing cast of builders, owners, and operations people. At this still-developing phase, which we could call BI(m), information about the project as a whole may become of greater importance than any particular model atom. This is because once a project passes a certain point, the workflow emphasis shifts, and it becomes equally crucial to get that data out. Though data-extraction is an ongoing activity during design, the nexus of the significant switch from data-input to data-extraction is often the “bid date”—the point where the design is complete, and the project enters the Bid, Construction and Operations phases.
Obviously, it is not possible to jettison model geometry completely in BI(m), but this context—project schedule, budget, and scope items—provides critical information for comprehensive tracking of construction projects.
The Spectrum of BIM: As we can see, the composition of BIM has gradually evolved from “BM” (Building Modeling) towards “BI” (Building Information) with various combinations in between. This sequential evolution of BIM, illustrated below, often mirrors the shift in emphasis seen over a project’s phases as the work moves from design to construction—the shift from modeling to information, from placing objects to retrieving information about them.
In the final analysis then, a Building Information model can be viewed as a collection of BIM atoms in an ether or context of project information. Over the life of a project, various participants will shift their focus from geometric aspects to contextual aspects.
Atoms of BIM. Image © EYP Architecture & Engineering
Authoring and Integration
Breaking large datasets into atom-sized packets is not the only solution for ballooning BIM files. It is possible to approach the problem by simply making the aggregation of large files more efficient. In this aggregative approach, large chunks of a project created by multiple authors and software would be compressed in a common format so that they could be dealt with as a unified whole.
In the construction field, for example, this approach can frequently be seen when contractors use software like NavisWorks for clash-detection, timeline-management, and coordination. Typically, very large BIM files are translated from their native format into a unified, proprietary format and then combined for various operations.
One of the advantages of this aggregation is that it is somewhat independent of the native file formats used; files from most common CAD applications can be exported to the NavisWorks format, for example, and thus can be integrated and read together. Another advantage is that it is quick and relatively easy to coordinate with today’s technology.
In the file translation process, however, 3D objects lose a lot of their native intelligence, and the extra intelligence added by 4D or clash-detection exercises is not easily transferred back to the authoring BIM application. This is one of the reasons why the alternative “atomic” approach is worth considering. In the long run, if we could formulate atoms of BIM in such a way that they remain intact as they are passed from application to application for information authoring in a non-destructive fashion, then that workflow would be more robust and extensible.
Authoring BIM Atoms. Image © EYP Architecture & Engineering
This introduces a further, significant concept for future BIM software development—that there will be authoring software, and there will be an integrative setting. Authoring software will populate a BIM environment with atoms, or add discipline-specific information to those atoms. Current BIM applications, for instance, are examples of atom-generation authoring software. Likewise, energy or cost estimating programs also will be information-authoring software.
The authoring concept is especially important because ultimately BIM authoring is unlikely to be the sole preserve of architects, engineers and other building designers; it will expand to include property managers, financiers, estimators, suppliers, procurers—in fact, anyone whose day-to-day job deals with the built environment.
Once the authoring of atoms is established, it is time to consider the nature of the second component, the integrative context. It is likely that the BIM “model” most people will interact with will be a static repository of atoms rather than a live interactive design environment. Each participant’s interactive BIM authoring software will produce or add atoms of data, placing them into this static context; the BIM environment will be the “ether” which manages those atoms. An energy analysis application, for example, might check out the rooms and spaces from a BIM model, along with the climatic context (Southern exposure, etc.). It would then author new information on those atoms and upload them again to the BIM repository for all participants to view. In an ideal arrangement, atoms will be able to be continuously “checked out” of the BIM repository, and authored anew with updated information.
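The check-out/author/check-in cycle described above can be sketched in a few lines of code. This is only an illustration of the idea, not any real product’s API; the repository class, atom fields, and method names are all invented for the sketch:

```python
# Illustrative sketch of the "check out / author / check in" cycle between
# authoring software and a static BIM repository (the "ether").
# All names here are hypothetical.
class BimRepository:
    def __init__(self):
        self._atoms = {}          # guid -> atom data (the static repository)
        self._checked_out = set() # atoms currently held by an authoring tool

    def add(self, guid, atom):
        self._atoms[guid] = atom

    def check_out(self, guid):
        """Hand a copy of an atom to an authoring application."""
        if guid in self._checked_out:
            raise RuntimeError(f"{guid} is already checked out")
        self._checked_out.add(guid)
        return dict(self._atoms[guid])

    def check_in(self, guid, atom):
        """Accept the atom back with newly authored information."""
        self._atoms[guid] = atom
        self._checked_out.discard(guid)

repo = BimRepository()
repo.add("room-101", {"area_m2": 42.0, "exposure": "South"})
room = repo.check_out("room-101")   # e.g. an energy-analysis application
room["annual_kwh"] = 3100           # authors new information on the atom...
repo.check_in("room-101", room)     # ...and uploads it for all to view
```

The point of the sketch is the division of labor: the repository only stores and dispenses atoms, while all intelligence about energy, cost, or scheduling lives in the authoring applications.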
Atoms and Ether
The concepts underlying atomicBIM may appear unfamiliar and remote, but in fact, the recognition that large files could be managed by fragmenting them was once present in very early CAD packages such as MicroGDS and ArrisCAD. In ArrisCAD for instance, layers were often saved as entirely separate .LYR files which were combined at the user’s desktop into the “drawing” file they required. Though BIM’s rich data adds several layers of complexity to the atom management, the key thrust was discernible. Fortunately many of the core concepts required for BIM “atoms” and “ether” are already under development.
Currently, the foremost candidate for the title of “BIM atom” is the IFC (Industry Foundation Class) effort, together with its allied initiatives in the BuildingSMART Alliance. Though originally created to address interoperability and the operations lifecycle, the IFC initiative has continued to draw further contributions from other groups, which taken together offer one of the best examples of defining atoms of BIM today. IFCs hold both geometric representations and associated data, so they already reflect many of the major characteristics of BM+I as described earlier.
IFC is structured so that its classifications describe building components in a uniform way, allowing designers, suppliers, and fabricators to agree on common terminology and property sets. More recently, an international working group has added important functionality to the IFC with the IFD (International Framework for Dictionaries). The IFD effort has developed a scheme in which building components can be consistently identified and classified. The IFD initiative is set to unite with the IFC efforts in the next IFC2x4 release for universal tracking and naming of objects in the lifecycle of a project. Essentially, the IFC/IFD combination will create unique, traceable atoms of BIM, which are 3D geometry with information data fields attached to them—which we previously termed BM+I.
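Such a uniquely identified atom can be pictured as a simple data structure: a stable global identifier, a reference into a shared dictionary of concepts, some geometry, and attached property fields. The sketch below is a loose stand-in for that idea, not the actual IFC schema; every field name is illustrative:

```python
# A loose, illustrative stand-in for an IFC/IFD-style atom: 3D geometry plus
# data fields, traceable by a globally unique identifier. Field names are
# invented for this sketch and do not mirror the real IFC schema.
import uuid
from dataclasses import dataclass, field

@dataclass
class BimAtom:
    guid: str = field(default_factory=lambda: uuid.uuid4().hex)
    ifd_concept: str = ""                           # shared-dictionary term, e.g. "Door"
    geometry: list = field(default_factory=list)    # placeholder vertex data
    properties: dict = field(default_factory=dict)  # attached data fields

door = BimAtom(ifd_concept="Door",
               properties={"FireRating": "60 min", "Width_mm": 914})
print(door.guid, door.ifd_concept, door.properties)
```

Because the identifier is generated once and travels with the atom, any downstream application can refer back to exactly the same object, which is the essence of the “unique, traceable” property described above.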
A full discussion of the IFC/IFD effort is beyond the scope of this article, but suffice to say that it is a creative and exciting initiative with much promise for BIM’s future. With common definitions established, various pieces of software can deal with the same IFC instance and add information as required. The IFC/IFD pairing is ready to take the atomicBIM effort as far as the “BM+I” stage with a uniformly accepted file format.
IFD as a Mapping Mechanism. Image courtesy Lars Bjørkhaug and Håvard Bell, www.ifd-library.org
The second key component of atomicBIM, the “ether,” is also under development, with several different efforts underway. Similar to the aggregative approach discussed earlier, some applications that aggregate models already use IFCs as their core interchange medium. Solibri Model Checker, which originated in Europe, is an integrative environment that uses IFCs as its aggregation medium and offers a host of authoring capabilities. In a similar vein, Building Explorer, a relatively new, integrated 4D/5D BIM solution, uses IFC as its file format for many operations.
In the long run, however, the concept most often judged the most promising is that of the “model server”: a central server environment which manages, classifies, and distributes pieces of the model in much the same way as a central personnel database holds and dispenses data records in response to queries. In the AEC domain, we can already see some moves toward server-based collaboration with applications such as Bentley’s ProjectWise Integration Server. The pursuit of model servers is not confined to the AEC industry, which demonstrates that this is an established approach to large object datasets. In the product manufacturing arena, applications have been developed to control parts and objects that can be accessed by multiple players from a central database. Many of these server solutions employ widely accepted querying techniques, such as SQL, to structure their operation.
Given that IFCs are the leading candidate for the role of “atoms of construction,” there are also several efforts underway to develop IFC-based model servers which will fulfill this functionality. There are already some commercial IFC servers available, and a number of others in development. Several of these initiatives are based in Europe or are joint initiatives with US organizations. This includes Oracle, a company whose products are known for their ability to manage large databases. Oracle is already an accepted standard in other industries, with its products for managing production on a large, distributed basis. It is possible that we will see some of that technology migrate over to our industry in the near future in the form of a model server.
It remains to be seen whether the IFC format would actually become the final working prototype of atoms of BIM, but if not, at the very least much of its core thinking is invaluable to an atomic approach to BIM. Whichever final form they take, the eventual role of BIM atoms and contextual ether will be to create remote, universally-accessible model repositories that contain all data concerning a particular design project.
The Benefits of atomicBIM
The atomic approach to BIM may soon become a pressing need as BIM adoption spreads into more and more fields of activity. When mature, the BIM workflow will involve a multitude of players; a fully developed BIM repository will ideally house a vast store of information about the building involved. Regardless of how powerful computer hardware becomes, it is likely that we will simply add more information and stress the speed of the new hardware as soon as it is available. Even if speed were not a factor, there will clearly be massive amounts of data, and so some ability to parse and slice the data will be a requirement.
There are several ways in which atomicBIM can streamline the BIM workflow:
- Extracting slices of data and processing them in any number of authoring applications
- Enabling the use of thin-client devices for lean, efficient access to large datasets
- Easing interoperability and aggregation of data from multiple sources
Slicing Data: As previously described, a large but granular data structure will permit many changes in workflow over our current situation. Some of the major advantages will include lean sowing and harvesting of data in central databases. In addition, there will be a short-term benefit to atomicBIM—helping control file size during production. But it is the long-term goal that is the most important: setting up a structure that, regardless of the rate of growth, remains a robust, scalable platform for future additions.
There are countless forms that “data-slices” from a BIM model could take. In all likelihood, these slices will take the form of structured queries from a team member to the central model server. For instance, sample queries might ask: “How many fire-rated doors are in this building?” or, “What is the detail of the floor/beam/wall connection on the 12th floor at column line F-12?” Ideally the model server would deliver only the appropriate atoms and associated data in response to these various queries.
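The first sample query above can be made concrete with a small, self-contained sketch that uses SQLite as a stand-in for a model server. The table layout and column names are invented purely for illustration; a real IFC model server would expose a far richer schema:

```python
# "Data slicing" as a structured query against a stand-in model server.
# SQLite plays the server here; the schema and sample rows are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE atoms (
    guid        TEXT PRIMARY KEY,
    category    TEXT,     -- e.g. Door, Wall
    level       INTEGER,  -- floor level the atom belongs to
    fire_rating TEXT)""")
conn.executemany("INSERT INTO atoms VALUES (?, ?, ?, ?)", [
    ("d1", "Door", 1, "60 min"),
    ("d2", "Door", 2, None),
    ("w1", "Wall", 1, "120 min"),
])

# "How many fire-rated doors are in this building?"
(count,) = conn.execute(
    "SELECT COUNT(*) FROM atoms "
    "WHERE category = 'Door' AND fire_rating IS NOT NULL").fetchone()
print(count)  # -> 1
```

The query touches only the handful of atoms it needs—the team member never downloads the whole model, which is precisely the workflow advantage the atomic structure is meant to deliver.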
Thin Client BIM Access: BIM data will ultimately need to become a ubiquitous resource for project participants. This means we will really want BIM information where it can provide the most value—on the job site. Today, there are already forward-thinking design teams who bring a BIM presence from the design environment to the job site. The next obvious step will be to expand BIM beyond the job trailer and have it universally accessible.
The ultimate experience of BIM model accessibility will probably occur when a device as lean as an iPhone or any other handheld can query a large BIM database and extract precisely the information required for the task at hand without much latency. We will need atomicBIM to enable a low threshold so that anyone can easily be involved.
Thin client access to large BIM datasets. Image © EYP Architecture & Engineering
Interoperability: This is a critical issue for the new workflow. Despite recent welcome moves by Autodesk and Bentley to allow interoperability between their design software, everyone, including these companies themselves, acknowledges that this is only a step toward a larger, more difficult goal. No matter how many corporations agree to make their software more compatible, it will never be practical for every piece of software needed for BIM purposes to talk directly to every other.
Luckily most BIM applications already export to the IFC format. The common language of IFCs will greatly help to promote collaboration. Universal, open interoperability will enable the traffic of commonly agreed definitions of construction items in a BIM environment, independent of any single software application. This will enable the authoring capabilities of a whole host of software applications to participate in a unified model.
Conclusion: Avoiding “Carpentry in Marble”
The technical challenges facing BIM proliferation today and in the near future are largely the result of legacy workflow protocols from current software. The resulting large files are triggering a host of unprecedented and unintended issues—WAN acceleration, hardware upgrades, and software overload. Even the impending move to 64-bit software computing is unlikely to remedy the problem in the long run. These issues are unlikely to be solved by simply optimizing current BIM software, no matter how many resources are devoted to that task. What is needed is a reassessment of our vision for the eventual BIM model.
Clearly, transitioning to a granular form of BIM will be a wrenching, but important re-alignment for the evolution of BIM. There are many exciting research initiatives related to model servers (many originating in Europe) that are exploring new ways to manage building data. These efforts point to a more sustainable proposition than our current ever-growing BIM models.
Though our current BIM solutions have served us well in the last decade, they may not be setting us up for future success. In particular, they have not created scalable, open or granular access to the information we create during design activities. The concept of atomicBIM will help us structure that information in a much more manageable way.
About the Author
John Tobin is a licensed Architect and a Principal with EYP Architecture & Engineering PC, a firm with offices in Albany, NY, Boston, MA, Washington, DC, New York City, and Orlando, FL.
Tobin has spent his career on the subject of technology in architecture, most recently with a focus on 3D technologies. A long-time Revit user, he is currently charged with implementing BIM across all architecture and engineering disciplines at EYP, including structural and MEP work. Prior to joining EYP, John spent ten years as a faculty member at Rensselaer School of Architecture in Troy, NY, where he taught design and technology courses. He can be reached at firstname.lastname@example.org.