Interior construction state recognition with 4D BIM registered image sequences
Introduction
Building Information Modelling (BIM) is an essential step toward the digital management of construction projects. One major advantage of BIM is the holistic collection, linking, and provision of data for different planning, construction, and operation tasks. In the context of construction management, the application of 4D building models, created by linking the activities of a schedule with the corresponding building elements, is very common. Based on 4D building models, the construction sequence can be analyzed and progress monitoring can be supported.
In current practice, progress monitoring is mainly performed manually. To this end, weekly or daily paper-based progress reports are collected by field personnel by hand [1]. This procedure places a high workload on the field personnel if a reasonable frequency of inspections is to be maintained. Hence, manual progress monitoring involves either a massive amount of human intervention, or inspection updates cannot be performed as frequently as required. In particular, interior finishing works represent a major part of every construction project that is sensitive to schedule disruptions; their average share of a typical total budget lies between 25% and 40% [2]. As a result, a higher degree of automation in progress monitoring would yield substantial cost savings. Deriving the actual state requires the automatic comparison of the as-planned information present in BIM with the actual as-built state of the building.
Recently, the research community has been investigating methods that aim to achieve a higher degree of automation in progress monitoring. Several approaches and studies that address the comparison of as-built and BIM-based as-planned data for this purpose have been presented. For example, sensing technologies like Radio-Frequency Identification (RFID) [3], [4], Ultra-Wideband (UWB) [5], [6], [7], [8], Wi-Fi [9], ZigBee [10], laser scanners [11], [12], Global Positioning System [13], image [14], [15], [16], [17], [18], [19], video [20], [21], [22], [23], and depth image [24] capturing devices are used to obtain data of the as-built state.
Each of these technologies has its advantages and limitations in a specific environment. For instance, radio-based technologies are able to determine the presence of objects through walls and can, therefore, be deployed indoors, but are limited to presence detection or localization. In contrast, stationary line-of-sight 3D laser sensing technologies obtain accurate measurements within general site scenes but are too inflexible for indoor applications. Although depth images from portable devices are more flexible, they are currently limited in range. The main aspects of BIM-based automation of progress monitoring that need to be considered for a specific sensing technology and a target environment are:
1. Mapping of the sensed data to BIM object instances and association with scheduled activities. Comparing as-built and as-planned states is only possible if registration is performed: objects from the BIM model need to be identified in the raw data. Detecting only the type or class is not sufficient for precise progress statements; when a column is detected, the determined state must be associated with one specific instance among all available columns.
2. The raw data delivered by the sensing technology must be capable of supporting state detection for all activities that are supposed to be performed in the actual environment. Sensing technologies like RFID or laser scanning cannot determine states defined solely by appearance changes, for example, wall painting.
3. The sensing technology must meet the requirements of the environment's structure. It needs to handle permanent and temporary construction obstacles and the distances of the actual environment. For instance, using common RFID technology for triangulation of objects on construction sites yields positions with an error of about 2 m [25], making it impossible to determine whether materials are already mounted or still stored. However, short-range RFID can help detect the presence of materials or construction objects within a range of up to 1 m (ISO/IEC 18000-3), which is useful in interior finishing inspections.
4. Sensing technologies may depend on infrastructure in order to collect data. For these technologies, it must be considered whether the state of the building provides the required infrastructure. For instance, Wi-Fi-based localization of objects by triangulation assumes appropriate hardware densely distributed in every part of the building, which conflicts with the stage of interior finishing.
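As a minimal sketch of the instance-association requirement stated in the first aspect, the snippet below assigns a detected element's estimated position to the nearest BIM instance of the same class; the GUIDs, positions, and the 0.5 m tolerance are illustrative assumptions, not values from the paper.

```python
import numpy as np

def associate_instance(detected_xyz, instances, tol=0.5):
    """Assign a detected object position to the nearest BIM instance.

    detected_xyz : (3,) estimated position of the detected element [m]
    instances    : dict mapping instance GUIDs to (3,) model positions [m]
    tol          : maximum accepted distance [m]; beyond it, no match
    """
    best_id, best_d = None, float("inf")
    for guid, pos in instances.items():
        d = float(np.linalg.norm(np.asarray(detected_xyz) - np.asarray(pos)))
        if d < best_d:
            best_id, best_d = guid, d
    return best_id if best_d <= tol else None
```

This resolves a class-level detection ("a column") into one specific instance ("column col-B"), which is the precondition for linking a detected state to a scheduled activity.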
Currently, there exists no proper solution that covers the difficult situation imposed by view-obscuring walls and registration issues, together with the dense repetition of similar (but distinct) building elements. This makes registration challenging for line-of-sight sensing technologies and requires complex infrastructure for radio-based technologies. Moreover, there are challenges associated with the wide range of construction activities found indoors.
Vision-based as-built data acquisition methods can meet all four main aspects by leveraging computer vision and image processing methods on sequential imagery. Registration is possible based on discovered correspondences between the image and the known geometry of the digital building model. A wide range of activities can be covered by appearance-based processing methods or by scene and object structure recognition approaches. In addition, image sensing devices are highly portable, can easily circumvent obstacles for a better view, and do not depend on infrastructure. In Table 1, sensing technologies are rated for each of the four stated main aspects regarding BIM-based progress monitoring.
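Image-to-model registration of this kind ultimately rests on a camera model that projects known BIM geometry into the image so that correspondences can be established. The following is a minimal pinhole-projection sketch, not the paper's actual registration pipeline; the intrinsics `K` and the pose `R`, `t` are assumed to be given.

```python
import numpy as np

def project_points(K, R, t, points_world):
    """Project 3D BIM points (world frame) into the image plane.

    K : (3,3) camera intrinsic matrix
    R : (3,3) world-to-camera rotation
    t : (3,)  world-to-camera translation
    Returns an (N,2) array of pixel coordinates.
    """
    X = np.asarray(points_world, dtype=float).reshape(-1, 3)
    Xc = X @ R.T + t               # transform points into the camera frame
    uv = Xc @ K.T                  # apply intrinsics (homogeneous pixels)
    return uv[:, :2] / uv[:, 2:3]  # perspective division
```

Projected model edges or corners can then be compared against features detected in the image to confirm or refine the assumed camera pose.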
In this context, vision-based methods appear to be the appropriate means to target indoor construction state recognition. However, current methods for indoor construction state recognition are independent of, and not associated with, 4D BIM. This paper presents an approach to automate 4D BIM-based progress monitoring.
Related work
Automation in progress monitoring has been the subject of research in recent years. A variety of approaches have been presented, based on different acquisition and processing techniques. In this section, a summary of existing approaches is given, and the technologies that this work builds on are introduced.
Methodology
In this section, a novel method is presented that targets a higher degree of automation for indoor progress monitoring through the consistent integration of as-planned 4D BIM information into the as-built inspection process. For the first time, a coordinated set of processes is aligned that addresses the challenges and requirements of indoor scenes. The method makes extensive use of the information present in the 4D BIM model in order to enable and support the progress monitoring processes. …
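The 4D BIM information that drives such a method can be pictured as a link between element identifiers and scheduled activities with start and end dates, from which the as-planned state at any inspection date follows. A minimal sketch of this lookup, with purely illustrative element IDs, activity names, and dates:

```python
from datetime import date

# Illustrative 4D link: BIM element ID -> scheduled activity with start/end dates.
schedule = {
    "wall-07": {"activity": "drywall installation",
                "start": date(2024, 3, 4), "end": date(2024, 3, 8)},
    "wall-07-paint": {"activity": "wall painting",
                      "start": date(2024, 3, 11), "end": date(2024, 3, 12)},
}

def planned_state(element_id, day):
    """Return the as-planned state of an element's activity on a given day."""
    act = schedule[element_id]
    if day < act["start"]:
        return "not started"
    if day > act["end"]:
        return "finished"
    return "in progress"
```

Comparing this as-planned state with the state recognized in the registered imagery for the same element is what turns state recognition into progress monitoring.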
Experiments and results
The aim of the experiments was to evaluate the individual steps of the presented framework for validity, as well as the viability of the whole framework as one unit. The evaluation of the registration steps is presented first, followed by the test results for the recognition, concluding with the final overall experimental output.
Discussion
The results of the experiments in the preceding section show the overall viability of the framework. However, in analyzing the results, several issues that the proposed method has yet to address were encountered. These mainly concern the as-built acquisition and the registration block of the framework. During the processing of the image sequences, it could be observed that very rough camera motion, especially rotation-dominant motion in front of close-range objects, …
Conclusions and outlook
In order to determine the actual state of buildings under construction, this paper contributes a promising novel method that increases the degree of automation for inspections. The method is based on 4D BIM model information and image sequences that are merged and processed to derive rich information about the individual construction tasks of interest. Each image of a sequence is registered to the 4D BIM model in terms of its origin in the building model coordinate system and its point in time in the construction schedule. …
Acknowledgements
The authors gratefully acknowledge the financial support by the German Research Foundation (DFG) for this work under the grants KO 4311/4-1 and KO 3473/8-1.
References (95)
- et al., An object-based 3D walk-through model for interior construction progress monitoring, Autom. Constr. (2011)
- et al., Multisensor data fusion for on-site materials tracking in construction, Autom. Constr. (2010)
- et al., Using reference RFID tags for calibrating the estimated locations of construction materials
- et al., Performance evaluation of ultra wideband technology for construction resource location tracking in harsh environments, Autom. Constr. (2011)
- et al., Automated task-level activity analysis through fusion of real time location sensors and worker's thoracic posture data, Autom. Constr. (2013)
- et al., Onsite 3D marking for construction activity tracking, Autom. Constr. (2013)
- et al., Accuracy assessment of ultra-wide band technology in tracking static resources in indoor construction scenarios, Autom. Constr. (2013)
- et al., Application of WiFi-based indoor positioning system for labor tracking at construction sites: a case study in Guangzhou MTR
- et al., Wireless sensor networks for resources tracking at building construction sites, Tsinghua Sci. Technol. (2008)
- et al., Integrating 3D laser scanning and photogrammetry for progress measurement of construction work, Autom. Constr. (2008)
- Automatic spatio-temporal analysis of construction site equipment operations using GPS data, Autom. Constr.
- Detection of large-scale concrete columns for automated bridge inspection, Autom. Constr.
- Towards automated progress assessment of workpackage components in construction projects using computer vision, Adv. Eng. Inform.
- 3D structural component recognition and modeling method using color and 3D data for construction progress monitoring, Autom. Constr.
- Automated vision tracking of project related entities, Adv. Eng. Inform.
- Construction worker detection in video frames for initializing vision trackers, Autom. Constr.
- Vision-based action recognition of earthmoving equipment using spatio-temporal features and support vector machine classifiers, Adv. Eng. Inform.
- Learning and classifying actions of construction workers and equipment using bag-of-video-feature-words and Bayesian network models, Adv. Eng. Inform.
- Status quo and open challenges in vision-based sensing and tracking of temporary resources on infrastructure construction sites, Adv. Eng. Inform.
- Construction performance monitoring via still images, time-lapse photos, and video streams: now, tomorrow, and the future, Adv. Eng. Inform.
- Tracking and locating components in a precast storage yard utilizing radio frequency identification technology and GPS, Autom. Constr.
- Automatically tracking engineered components through shipping and receiving processes with passive identification technologies, Autom. Constr.
- Assessing the impact of materials tracking technologies on construction craft productivity, Autom. Constr.
- Situational awareness of construction equipment using GPS, wireless and web technologies
- Integrating automated data acquisition technologies for progress reporting of construction projects
- A survey of active and passive indoor localisation systems, Comput. Commun.
- Evaluation of image-based modeling and laser scanning accuracy for emerging automated performance monitoring techniques, Autom. Constr.
- Automatic reconstruction of as-built building information models from laser-scanned point clouds: a review of related techniques, Autom. Constr.
- Automated progress tracking using 4D schedule and 3D sensing technologies
- Automated recognition of 3D CAD model objects in laser scans and calculation of as-built dimensions for dimensional compliance control in construction, Adv. Eng. Inform.
- Automating progress measurement of construction projects, Autom. Constr.
- Automated construction activity monitoring system, Adv. Eng. Inform.
- Fully automated registration of 3D data to a 3D CAD model for project progress monitoring, Autom. Constr.
- State of research in automatic as-built modelling, Adv. Eng. Inform.
- New algorithms for 2D and 3D point matching: pose estimation and correspondence, Pattern Recogn.
- Iterative pose estimation using coplanar feature points, Comput. Vis. Image Underst.
- Automated computer vision-based detection of components of under-construction indoor partitions, Autom. Constr.
- EDLines: a real-time line segment detector with a false detection control, Pattern Recogn. Lett.
- MSLD: a robust descriptor for line matching, Pattern Recogn.
- An efficient and robust line segment matching approach based on LBD descriptor and pairwise geometric consistency, J. Vis. Commun. Image Represent.
- Multimodal image matching based on multimodality robust line segment descriptor, Neurocomputing
- Ablaufplanung
- The value of integrating scan-to-BIM and scan-vs-BIM techniques for construction monitoring using laser scanning and BIM: the case of cylindrical MEP components, Autom. Constr.
- Data acquisition from construction sites for tracking purposes, Eng. Constr. Archit. Manag.
Cited by (117)
- Activity-level construction progress monitoring through semantic segmentation of 3D-informed orthographic images, Automation in Construction (2024)
- Automated vision-based construction progress monitoring in built environment through digital twin, Developments in the Built Environment (2023)
- Platform-independent visual installation progress monitoring for construction automation, Automation in Construction (2023)
- Automated progress monitoring technological model for construction projects, Ain Shams Engineering Journal (2023)
- Automated UAV image-to-BIM registration for building façade inspection using improved generalised Hough transform, Automation in Construction (2023)
- Zero-reference deep learning for low-light image enhancement of underground utilities 3D reconstruction, Automation in Construction (2023)