AECbytes Viewpoint #61 (July 31, 2011)
Debunking the Myths About BIM in the “Cloud”
The Latest Panacea for All IT Problems
“Cloud Computing is all the rage…” as a recent InfoWorld.com article put it plainly. Cloud computing is not only considered THE solution to many IT-related problems but is also becoming the platform for brand new business models. All the tech giants, such as Apple, Microsoft, and Amazon, have been extremely busy launching their own cloud platforms and services. The “cloud” is, in fact, transforming our understanding of computing—we are starting to consider computing resources not as actual tools but rather as a utility that is ubiquitously and infinitely available, similar to electricity or water.
As no serious IT discussion can bypass “the cloud” these days, it is little wonder that the AEC space is also abuzz with news and announcements about “brand new” cloud solutions and offerings. Reading the news, AEC professionals may think that BIM has already moved, or will very soon fully move, to the cloud, and that without implementing BIM as a cloud solution, their firms are well behind the times. Yet I strongly suggest that they refrain from drawing conclusions too early. Contrary to public opinion, cloud computing is not a recent phenomenon, and it is a much more complex matter than it may appear, so it is worth understanding the full puzzle first.
Cloud Computing Fundamentals
Cloud computing is not a specific technology or a particular software solution. Instead, it is an umbrella concept for different methods of sharing resources over computer networks, as shown below in the cloud computing logical diagram (source: Wikipedia).
From the end-user perspective, there are four fundamentally different approaches to cloud computing, popularized by different companies, as described in the following sections.
#1 – Data in the Cloud (The “Amazon” Approach)
The most basic approach to cloud computing is to treat the cloud as a giant server (hard drive) that anyone can access for a fee. In this case, users can store their files (data) in the cloud with virtually unlimited storage space that scales dynamically; data can be accessed remotely from any computer in the network; files can easily be shared with others; and the data itself is stored in a far more secure environment.
This technology has long been used in data centers, but it was first productized for the large-scale consumer market by Amazon Cloud Drive, for simple file storage as well as for specific media storage services for music, photos, and video. Cloud data storage is obviously a natural component of all the other cloud solution approaches, but as a standalone function, it appears most clear-cut at Amazon.com.
While the cloud is clearly an auxiliary business for Amazon today, it is worth mentioning that there are businesses whose entire business model is built around data storage in the cloud. Dropbox, with its 25+ million users, may be the most popular such solution. Here the functional extra is that your cloud folder automatically syncs with all of your client computers in both directions—a small addition, but a huge business success.
#2 – Software Virtualization in the Cloud (The “Citrix” Approach)
Another approach makes actual product functionality, in addition to data, available through the cloud. The most basic technique here is software virtualization, which has been available for almost two decades in the form of remote desktop access. Recent developments in virtualization technologies now make this platform capable of running full desktop applications close to real time on a virtualization layer in a runtime environment such as the Citrix XenApp solution.
Software virtualization basically means that the software is only installed on the server and is provided to client computers “on-demand.” The software runs on the server and only the UI (user interface) is streamed to the client computers.
This approach provides great benefits both to the IT organization—central software and license management gives greater flexibility in software deployment and updates—and to the user, through better scaling performance. Client computers need less powerful hardware, although the approach requires very strong and expensive server hardware and IT setup. At the same time, virtualization does not change the main paradigm of desktop applications; it only centralizes their running and management, and it does not solve the problem of different users or teams working in parallel on one file. Consequently, it cannot offer real solutions to many of the issues that users expect cloud-based solutions to solve.
#3 – Web Applications in the Cloud (The “Google” Approach)
It is interesting to take a look at the realm of web applications. Google was long a pure “search” company, but with the launch of Gmail and Google Docs, web-based application programming has become immensely popular in areas such as social media and online document creation and sharing. Recently, Microsoft also announced the launch of its web-based office solution, Office 365.
There are many web-ported desktop applications available in which the desktop application’s functionality is “optimized” for the web, which naturally means limitations in both functionality and performance. The web is a great platform for data creation, access, and management; Google has even tried to put an entire operating system onto the web platform with Google Chrome OS. But given its different primary purpose, the web platform should not aim to be the primary platform for porting complex, graphics- and calculation-intensive applications. At the same time, web-optimized versions of such applications can be perfect for focused functional requirements, such as analysis in the cloud.
#4 – Business Logic in the Cloud (The “Apple” Cloud Approach)
Apple was largely silent about the cloud for a long time, until the company recently announced its cloud solution, iCloud. As always, Apple has a different idea about things, including what the cloud is good for and how it should best be utilized for maximum end-user benefit. The essence of Apple’s approach is that the cloud is not the purpose but the tool to solve end-user problems; hence it can even remain invisible to the end-user. It is indeed a different approach, but what does it mean to implement it in reality?
In Apple’s understanding, the cloud is not only an infinite storage device; it can also be extended with solution-specific business logic in the background. The default iCloud end-user applications (such as Apple Pages and other iOS applications) constantly and intelligently sync in the background—not only saved files but also the actual ongoing progress of documents. (Steve Jobs envisions that the act of “saving files” will soon disappear.) In this way, Apple combines the live-collaboration benefits of web apps (such as Google Docs) with full offline access and functionality.
This approach has further potential, as Apple allows third-party applications to add their own business logic to the iCloud platform through an application programming interface (API). In this way, Apple is building a hybrid solution in which the cloud is an active component of the workflow, but without the drawbacks and limitations that a pure cloud solution inherently carries. Obviously, what matters here is the approach itself; a solution does not need to run on Apple’s iCloud to offer a similarly optimized hybrid architecture.
BIM in the Cloud
Selecting a cloud solution typically involves a very extensive list of factors and considerations. The very first question to decide is whether to go for a public or a private cloud setup. The fundamental difference is that a private cloud gives IT full control over everything, since it is installed and maintained on one’s own premises; the trade-off is that managing the infrastructure and hardware remains an in-house “headache,” even if it can be done centrally. The obvious factors to consider include, but are not limited to: data security; user management; administration tools; license management; bandwidth requirements; resilience (no downtime); technical support services; operating system support; and cost (implementation and maintenance).
Although all of these are very important factors to consider, they are mostly general considerations with little impact on the actual BIM workflow. Therefore, for now, I will focus on the following two factors that can greatly influence the effectiveness of the actual BIM workflow, especially for larger projects that will typically be the focus of a cloud-based solution.
#1 – Accessibility of the BIM Model
Since BIM models are large, complex, and highly integrated databases, it is especially important to provide concurrent access to BIM models in a way that maintains their highly integrated nature without introducing unnecessary limitations. Issues to be considered here include:
Concurrent “Real-time” Access to the BIM Project
This is clearly an issue to consider regardless of any cloud infrastructure, but it is a definite must-have for the cloud. Should your BIM solution not be able to provide real-time, model-based collaboration inside the same physical office (on a LAN), no “cloud solution” can cure your basic problem, which is creating and editing the BIM model with a larger team. Once this issue is sorted out, concurrent access to the BIM project from remote locations is possible with the four cloud approaches in different ways. The “data storage cloud” (the Amazon way) contributes little to solving the problem, while “virtualization solutions” (the Citrix way) may help (note that the first requirement still needs to be met). Concurrent remote access through “web applications” (the Google way) is also easily provided, but as discussed above, the authoring capabilities are in most cases seriously limited. The “business logic” or “active component in the cloud” approach (the Apple way) can be a real solution here: an active component in the cloud, such as the GRAPHISOFT BIM Server, can concurrently serve multiple clients with relatively small “delta” data packages transmitted over the WAN.
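The idea behind "delta" data packages can be sketched in a few lines. This is a hypothetical illustration only—the function and element names below are invented for this sketch and do not reflect the actual GRAPHISOFT BIM Server protocol: the client compares its local model against the last synchronized state and sends only the elements that changed, not the whole model.

```python
# Hypothetical sketch of delta-based model synchronization: only the
# elements added, changed, or removed since the last sync travel over
# the WAN. All names are illustrative, not from any real product.

def compute_delta(local_model, last_synced):
    """Collect only the elements that were added, changed, or removed."""
    delta = {"changed": {}, "removed": []}
    for elem_id, elem in local_model.items():
        if last_synced.get(elem_id) != elem:
            delta["changed"][elem_id] = elem
    for elem_id in last_synced:
        if elem_id not in local_model:
            delta["removed"].append(elem_id)
    return delta

def apply_delta(server_model, delta):
    """Merge a delta package into the server-side copy of the model."""
    server_model.update(delta["changed"])
    for elem_id in delta["removed"]:
        server_model.pop(elem_id, None)
    return server_model

# A tiny model in which one wall was moved and one door was deleted:
last_synced = {"wall-1": {"x": 0}, "wall-2": {"x": 5}, "door-1": {"x": 2}}
local_model = {"wall-1": {"x": 0}, "wall-2": {"x": 7}}
delta = compute_delta(local_model, last_synced)
# Only 'wall-2' and the removal of 'door-1' need to be transmitted.
```

However small the edit, the payload stays proportional to the change, not to the model, which is what makes remote collaboration over a WAN practical.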
Purpose-built “Light-weight” Access to the BIM Project
Next to remote access, another consideration of growing importance is accessing the BIM model’s different layers through purpose-built interfaces; here, naturally, we face a growing demand for solutions based on mobile devices. Mobile access can be provided with all four approaches, again at different levels: with software virtualization, web applications, and active-server clients, the actual BIM model can be made accessible from virtually anywhere, while in the “data” cloud, only auxiliary information, such as pre-saved presentation materials (images, documents, etc.), may be accessed from mobile devices. Obviously, a purpose-written mobile client app with direct access to the BIM project through an active BIM server in the cloud can offer the best user experience and the highest project intelligence.
Offline access to the BIM Project
The last, but equally important, factor is accessing the BIM model during offline periods. For this purpose, the only viable solution is the “active server” in the cloud, which keeps the BIM project available and editable on the local computer while offline, while ensuring no conflicting edits among team members and perfect model integrity upon the next synchronization. The other solutions are, by their nature, not accessible during offline periods or have serious limitations in the considerations mentioned above.
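One common way an active server can guarantee conflict-free offline editing is element reservation: before going offline, a team member reserves the elements they intend to edit, and the server refuses overlapping reservations. The sketch below is a minimal illustration of that idea, assuming invented names throughout; it does not describe any vendor's actual implementation.

```python
# Hypothetical sketch of conflict-free offline editing via element
# reservation: overlapping reservations are refused up front, so merges
# on reconnection can never conflict. Illustrative names only.

class ReservationServer:
    def __init__(self):
        self.reservations = {}  # element id -> reserving user

    def reserve(self, user, elem_ids):
        """Grant a reservation only if no element is held by someone else."""
        conflicts = [e for e in elem_ids
                     if self.reservations.get(e, user) != user]
        if conflicts:
            return False, conflicts
        for e in elem_ids:
            self.reservations[e] = user
        return True, []

    def merge_offline_edits(self, user, edits):
        """On reconnection, accept only edits to elements the user reserved."""
        # Releasing reservations after a merge is omitted for brevity.
        return {e: v for e, v in edits.items()
                if self.reservations.get(e) == user}

server = ReservationServer()
granted, _ = server.reserve("alice", ["wall-1", "wall-2"])   # granted
granted2, clash = server.reserve("bob", ["wall-2", "slab-1"])  # refused
```

Because conflicts are prevented at reservation time rather than detected at merge time, the server never has to guess how to reconcile two divergent versions of the same element.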
#2 – System Performance of the BIM Model
Here, the ultimate question is scalability—in other words, how the highly integrated BIM project performs with a growing model, a growing team, and growing geographic distances in the cloud. Issues to be considered here include:
The BIM Project Scaling with the Hardware
This is, again, clearly an issue to consider regardless of any cloud infrastructure, but a definite must-have for the cloud. Without the ability to scale with a growing number of processors and increased available memory, BIM projects very soon bump up against a rock-solid ceiling in project size. Both multiprocessor and 64-bit support are mission critical, since the “cloud” has a virtually infinite number of processors and limitless available memory; if a particular BIM solution supports only one or the other, the true potential of the cloud infrastructure physically cannot be utilized.
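The hardware-scaling argument can be made concrete with a small sketch: a model computation that can be partitioned across worker processes uses however many cores the cloud provides, while a single-threaded engine cannot. The function and element names below are invented stand-ins, not code from any actual BIM product.

```python
# Hypothetical sketch of multiprocessor scaling: partition the model's
# elements into one work package per core and recalculate in parallel.
# Illustrative only; no real BIM engine is this simple.

import os
from multiprocessing import Pool

def chunk(elements, n_workers):
    """Split the elements into one interleaved work package per worker."""
    return [elements[i::n_workers] for i in range(n_workers)]

def recalc(elems):
    """Stand-in for a CPU-heavy per-element computation (e.g. geometry)."""
    return sum(e * e for e in elems)

if __name__ == "__main__":
    elements = list(range(100_000))   # stand-in for model elements
    workers = os.cpu_count() or 1     # in the cloud, this can be large
    with Pool(workers) as pool:
        partials = pool.map(recalc, chunk(elements, workers))
    total = sum(partials)  # same result as sequential, spread across cores
```

The partitioning is what matters: if the engine cannot express its work as independent packages, adding processors (which is exactly what the cloud makes cheap) buys nothing.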
The BIM Project Scaling with the Software
Next to scaling with the basic hardware, cloud solutions are in most cases “multitenant” environments. This means that the client computers deal with one “virtual machine” while, in the background, a whole “grid” of computers serves the client requests. This has little to do with the client machines; it is the server-side component that needs to be prepared to manage and share data on the “grid.” This capability is not only a prerequisite for an optimized IT setup and budget, but becomes essential when scaling on large public clouds, where one installation is supposed to serve a large number of clients.
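The essence of multitenancy is that one server installation serves many client firms while keeping each tenant's data strictly isolated. The sketch below illustrates that routing idea with invented names; a real multitenant BIM host would add authentication, per-tenant databases, and grid distribution on top of this.

```python
# Hypothetical sketch of multitenancy: every request is routed to the
# calling tenant's isolated project store, even though all tenants share
# one installation. Illustrative names only.

class MultitenantBIMHost:
    def __init__(self):
        self.tenants = {}  # tenant id -> {project name -> model data}

    def handle(self, tenant_id, project, request):
        """Route a request to the calling tenant's isolated storage."""
        store = self.tenants.setdefault(tenant_id, {})
        if request == "create":
            store[project] = {}
        return store.get(project)

host = MultitenantBIMHost()
host.handle("firm-a", "tower", "create")
host.handle("firm-b", "villa", "create")
# 'firm-b' cannot see 'firm-a' projects despite sharing the same host:
assert host.handle("firm-b", "tower", "read") is None
```

Because isolation lives in the server-side routing rather than in separate installations, adding a tenant costs a dictionary entry, not a new server—which is what makes one public-cloud deployment able to serve many firms.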
Conclusions
Cloud computing is a huge and dynamically growing trend in the IT industry, and one needs to clearly understand the differences between the main system approaches applied in the various cloud solutions.
BIM projects in particular have unique requirements that need to be considered in order to exploit the true potential of the cloud. The solution, I believe, is an integrated BIM model instead of a collection of standalone files. This integrated model requires an active BIM server component that enables both real-time collaboration among professionals and direct, bidirectional access to the live BIM model for all stakeholders through purpose-built tools running on various mobile devices. Such an active BIM server requires a multitenant software architecture in order to scale with the “cloud” and utilize its infinite data storage and computing capacity, providing the best ROI on the IT infrastructure.
Another question that would be worth discussing in a separate article is the approach to client-side applications in a cloud setup. There are, again, different approaches here with “thin” or “thick” clients, but in my opinion, regardless of the actual approach, software vendors will need to keep optimizing their solutions for ever-better support of multiprocessing and multithreading software architectures.
About the Author
Viktor Várkonyi is the CEO of Graphisoft. He is a seasoned industry professional who has spent almost two decades designing workflow solutions for ArchiCAD, a leading BIM solution for architects.
For questions or comments related to this article, you can write to Viktor at: email@example.com.
Note: The views expressed in Viewpoint articles are those of the individual authors and do not necessarily reflect those of AECbytes. Also, AECbytes content should not be reproduced on any other website, blog, print publication, or newsletter without permission.