Automation in Construction

Volume 86, February 2018, Pages 11-32

Interior construction state recognition with 4D BIM registered image sequences

https://doi.org/10.1016/j.autcon.2017.10.027

Highlights

  • Interior activity state recognition framework with BIM-registered image sequences.

  • Novel pipeline for robust recognition of activity states is presented and evaluated.

  • Available BIM information is leveraged for method development.

  • Only minimal manual intervention is required for each inspection.

  • The proposed method allows for simultaneous recognition of multiple activities.

Abstract

Deviations from planned schedules in construction projects frequently lead to unexpected financial losses. However, early assessment of delays or accelerations during the construction phase enables the adjustment of subsequent and dependent tasks. Performed manually, this assessment ties up considerable human resources whenever as-built information is not immediately available. This is particularly true for indoor environments, where no general overview of the tasks is available.

In this paper, we present a novel method that increases the degree of automation in indoor progress monitoring. The method recognizes the actual state of construction activities from as-built video data based on as-planned BIM data using computer vision algorithms. To achieve this, two main steps are incorporated. The first step registers the images with the underlying 4D BIM model, i.e., the pose of each image of a sequence is determined in the coordinate system of the building model. Knowing the origin of each image allows for an advanced interpretation of its content in the subsequent processing. In the second step, the building elements linked to the relevant tasks of the expected state of the 4D BIM model are projected onto the image space. The resulting image regions of interest are then taken as input for the determination of the activity state.
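To illustrate the second step, the following is a minimal sketch of how geometry linked to an expected activity could be projected into the image space of a registered camera in order to obtain a region of interest. It assumes a calibrated pinhole camera with intrinsics K and a registered pose (R, t); the function name project_element_roi and all variable names are illustrative and not taken from the paper.

```python
import numpy as np

def project_element_roi(corners_world, K, R, t, image_size):
    """Project the 3D bounding-box corners of a BIM element into the image
    and return the enclosing 2D region of interest (ROI).

    corners_world : (N, 3) corner points in building-model coordinates
    K             : (3, 3) camera intrinsic matrix
    R, t          : rotation (3, 3) and translation (3,) of the registered pose
    image_size    : (width, height) in pixels
    """
    # Transform corners from the building-model frame into the camera frame.
    pts_cam = (R @ corners_world.T).T + t

    # Keep only points in front of the camera to avoid invalid projections.
    pts_cam = pts_cam[pts_cam[:, 2] > 0]
    if len(pts_cam) == 0:
        return None  # element lies behind the camera for this image

    # Perspective projection with the pinhole model.
    pts_img = (K @ pts_cam.T).T
    pts_img = pts_img[:, :2] / pts_img[:, 2:3]

    # Clip the enclosing axis-aligned box to the image bounds.
    w, h = image_size
    x_min, y_min = np.clip(pts_img.min(axis=0), 0, [w - 1, h - 1])
    x_max, y_max = np.clip(pts_img.max(axis=0), 0, [w - 1, h - 1])
    if x_max <= x_min or y_max <= y_min:
        return None  # element projects outside the visible image
    return int(x_min), int(y_min), int(x_max), int(y_max)
```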

The method is extensively tested in the experiment section of this paper. Since each process builds on the output of the preceding steps, every process of the introduced method is first evaluated in isolation. In addition, the overall applicability is evaluated by means of two exemplary tasks as a concluding demonstration of the novel method. All experiments show promising results and point towards automatic indoor progress monitoring.

Introduction

Building Information Modelling (BIM) is an essential step towards the digital management of construction projects. A major advantage of BIM is the holistic collection, linking and provision of data for different planning, construction and operation tasks. In the context of construction management, the use of 4D building models, in which the activities of a schedule are linked with the corresponding building elements, is very common. Based on 4D building models, the construction sequence can be analyzed and progress monitoring can be supported.
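As an illustration of the 4D linking described above, the following sketch models the association between scheduled activities and building element instances and selects the activities expected to be in progress on an inspection day. The class and field names (Activity, element_ids) as well as the example identifiers are hypothetical and serve only to illustrate the concept.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Activity:
    """A scheduled construction activity linked to BIM element instances (4D)."""
    activity_id: str
    name: str
    planned_start: date
    planned_finish: date
    element_ids: list[str] = field(default_factory=list)  # ids of linked elements

def expected_activities(schedule: list[Activity], inspection_day: date) -> list[Activity]:
    """Return the activities planned to be in progress on the inspection day."""
    return [a for a in schedule if a.planned_start <= inspection_day <= a.planned_finish]

# Example: a drywall activity linked to two wall instances (hypothetical ids).
schedule = [
    Activity("A-101", "Drywall installation, room 2.04",
             date(2017, 3, 6), date(2017, 3, 10),
             ["wall-2.04-north", "wall-2.04-east"]),
]
print([a.name for a in expected_activities(schedule, date(2017, 3, 8))])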

In current practice, progress monitoring is mainly performed manually. To this end, weekly or daily paper-based progress reports are collected by field personnel by hand [1]. This procedure places a high workload on the field personnel in order to achieve a reasonable inspection frequency. Hence, manual progress monitoring either involves a massive amount of human intervention or inspection updates cannot be performed as frequently as required. Interior finishing works in particular represent a large, schedule-sensitive part of every construction project: their average share of a typical total budget lies between 25% and 40% [2]. As a result, a higher degree of automation in progress monitoring would yield considerable cost savings. Deriving the actual state requires the automatic comparison of the as-planned information present in BIM with the actual as-built state of the building.

Recently, the research community has been investigating methods that aim at a higher degree of automation in progress monitoring. Several approaches and studies that address the comparison of as-built and BIM-based as-planned data for this purpose have been presented. For example, sensing technologies such as Radio-Frequency Identification (RFID) [3], [4], Ultra-Wideband (UWB) [5], [6], [7], [8], Wi-Fi [9], ZigBee [10], laser scanners [11], [12], the Global Positioning System [13], image [14], [15], [16], [17], [18], [19], video [20], [21], [22], [23], and depth image [24] capturing devices are used to obtain data on the as-built state.

Each of these technologies has its advantages and limitations in a specific environment. For instance, radio-based technologies are able to determine the presence of objects through walls and can therefore be deployed indoors, but they are limited to presence detection or localization. In contrast, stationary line-of-sight 3D laser sensing technologies obtain accurate measurements of general site scenes, but are too inflexible for indoor applications. Although depth images from portable devices are more flexible, they are currently limited in range. For a specific sensing technology and a target environment, the main aspects of BIM-based automation of progress monitoring that need to be considered are:

1. Mapping of the sensed data to the BIM object instances and association with scheduled activities. The comparison of as-built and as-planned states is only possible if registration is performed: objects from the BIM model need to be identified in the raw data. Detecting only the type or class is not sufficient for precise progress statements. When a column is detected, the determined state needs to be associated with one specific instance out of all available columns (see the sketch after this list).

2. The raw data delivered by the sensing technology must be capable of supporting the state detection of all activities that are supposed to be performed in the actual environment. Sensing technologies such as RFID or laser scanning cannot determine states that are defined solely by appearance changes, for example, wall painting.

3. The sensing technology must meet the requirements of the environment's structure. It needs to handle the permanent and temporary construction obstacles and the distances of the actual environment. For instance, using common RFID technology to triangulate objects on construction sites yields positions with an error of about 2 m [25], which makes it impossible to determine whether materials are already mounted or still stored. However, short-range RFID can detect the presence of materials or construction objects within a range of up to 1 m (ISO/IEC 18000-3), which is useful in interior finishing inspections.

4. Sensing technologies may depend on infrastructure in order to collect data. For the use of these technologies, it needs to be considered whether the state of the building provides the required infrastructure. For instance, Wi-Fi-based localization of objects by triangulation assumes appropriate hardware densely distributed in every part of the building, which is typically not the case at the interior finishing stage.
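As referenced in the first item above, one simple way to associate a detection with one specific element instance, once the camera pose is registered, is to project the candidate instances into the image and pick the one closest to the detection. The sketch below is only an illustration under that assumption, not the association procedure of the paper; all names are hypothetical.

```python
import numpy as np

def associate_detection(det_center, instances, K, R, t):
    """Associate an image detection with one specific BIM element instance.

    det_center : (u, v) pixel coordinates of the detected object's centre
    instances  : dict mapping element id -> 3D centroid in model coordinates
    K, R, t    : camera intrinsics and registered pose

    Returns the id of the instance whose projected centroid lies closest to
    the detection, or None if every candidate is behind the camera.
    """
    best_id, best_dist = None, np.inf
    for element_id, centroid in instances.items():
        p_cam = R @ np.asarray(centroid) + t
        if p_cam[2] <= 0:
            continue  # instance is behind the camera in this image
        p_img = K @ p_cam
        u, v = p_img[:2] / p_img[2]
        dist = np.hypot(u - det_center[0], v - det_center[1])
        if dist < best_dist:
            best_id, best_dist = element_id, dist
    return best_id
```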

Currently, no adequate solution covers the difficult indoor situation imposed by view-obscuring walls and registration issues, along with the dense repetition of similar (but distinct) building elements. This makes registration challenging for line-of-sight sensing technologies and requires complex infrastructure for radio-based technologies. Moreover, there are challenges associated with the wide range of construction activities found indoors.

Vision-based as-built data acquisition methods can address all four main aspects by leveraging computer vision and image processing methods on sequential imagery. Registration is possible based on correspondences discovered between the image and the known geometry of the digital building model. A wide range of activities can be covered by appearance-based processing methods or by scene and object structure recognition approaches. In addition, image sensing devices are highly portable, can easily be moved around barriers for a better view, and do not depend on infrastructure. In Table 1, the sensing technologies are rated for each of the four stated main aspects with regard to BIM-based progress monitoring.
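Registration from such 2D-3D correspondences is commonly formulated as a Perspective-n-Point (PnP) problem. The following minimal sketch uses OpenCV's solvePnPRansac as one standard solver for this formulation; it is not necessarily the registration pipeline used in the paper, and the function name register_image is illustrative.

```python
import numpy as np
import cv2

def register_image(points_3d, points_2d, K):
    """Estimate the camera pose in the building-model coordinate system from
    2D-3D correspondences (a standard PnP formulation with RANSAC).

    points_3d : (N, 3) model-space points, e.g. corners of known BIM geometry
    points_2d : (N, 2) matching pixel locations in the image (N >= 4)
    K         : (3, 3) camera intrinsic matrix
    """
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(points_3d, dtype=np.float64),
        np.asarray(points_2d, dtype=np.float64),
        K, distCoeffs=None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)    # rotation matrix from the rotation vector
    return R, tvec.reshape(3)     # pose: model frame -> camera frame
```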

In this context, vision-based methods appear to be the appropriate means for targeting indoor construction state recognition. However, current methods for indoor construction state recognition are independent of, and not associated with, 4D BIM. This paper presents an approach to automate 4D BIM-based progress monitoring.

Section snippets

Related work

Automation in progress monitoring has been the subject of research in recent years. A variety of approaches based on different acquisition and processing techniques have been presented. In this section, a summary of existing approaches is given and the technologies that this work builds on are introduced.

Methodology

In this section, a novel method is presented that targets a higher degree of automation for indoor progress monitoring through the consistent integration of as-planned 4D BIM information into the as-built inspection process. For the first time, a coordinated set of processes is aligned that addresses the challenges and requirements of indoor scenes. The method makes extensive use of the information present in the 4D BIM model in order to enable and support the progress monitoring processes. This

Experiments and results

The aim of the experiments was to evaluate the validity of the output of the individual steps of the presented framework as well as the viability of the whole framework as one unit. The evaluation of the registration steps is presented first, followed by the test results for the recognition, concluding with the final overall experimental results.

Discussion

The results of the experiments in the preceding section show the overall viability of the framework. However, when analyzing the results of the experiments, several issues that the proposed method has yet to address were encountered. These mainly concern the as-built acquisition and the registration block of the framework. During the processing of the image sequences, it could be observed that very rough camera motion, especially rotation-dominant motion in front of close-range objects

Conclusions and outlook

In order to determine the actual state of buildings under construction, this paper contributes a promising novel method that increases the degree of automation for inspections. The method is based on the 4D BIM model information and image sequences that are merged and processed to derive rich information about single construction tasks of interest. Each image of a sequence is registered to the 4D BIM model in terms of its origin in the building model coordinate system and its point in time in

Acknowledgements

The authors gratefully acknowledge the financial support by the German Research Foundation (DFG) for this work under the grants KO 4311/4-1 and KO 3473/8-1.

References (95)

  • N. Pradhananga et al.

    Automatic spatio-temporal analysis of construction site equipment operations using GPS data

    Autom. Constr.

    (2013)
  • Z. Zhu et al.

    Detection of large-scale concrete columns for automated bridge inspection

    Autom. Constr.

    (2010)
  • Y.M. Ibrahim et al.

    Towards automated progress assessment of workpackage components in construction projects using computer vision

    Adv. Eng. Inform.

    (2009)
  • H. Son et al.

    3D structural component recognition and modeling method using color and 3D data for construction progress monitoring

    Autom. Constr.

    (2010)
  • I. Brilakis et al.

    Automated vision tracking of project related entities

    Adv. Eng. Inform.

    (2011)
  • M.W. Park et al.

    Construction worker detection in video frames for initializing vision trackers

    Autom. Constr.

    (2012)
  • M. Golparvar-Fard et al.

    Vision-based action recognition of earthmoving equipment using spatio–temporal features and support vector machine classifiers

    Adv. Eng. Inform.

    (2013)
  • J. Gong et al.

    Learning and classifying actions of construction workers and equipment using bag-of-video-feature-words and Bayesian network models

    Adv. Eng. Inform.

    (2011)
  • J. Teizer

    Status quo and open challenges in vision-based sensing and tracking of temporary resources on infrastructure construction sites

    Adv. Eng. Inform.

    (2015)
  • J. Yang et al.

    Construction performance monitoring via still images, time-lapse photos, and video streams: now, tomorrow, and the future

    Adv. Eng. Inform.

    (2015)
  • E. Ergen et al.

    Tracking and locating components in a precast storage yard utilizing radio frequency identification technology and GPS

    Autom. Constr.

    (2007)
  • D. Grau et al.

    Automatically tracking engineered components through shipping and receiving processes with passive identification technologies

    Autom. Constr.

    (2012)
  • D. Grau et al.

    Assessing the impact of materials tracking technologies on construction craft productivity

    Autom. Constr.

    (2009)
  • A.A. Oloufa et al.

    Situational awareness of construction equipment using GPS, wireless and web technologies

  • S. El-Omari et al.

    Integrating automated data acquisition technologies for progress reporting of construction projects

  • G. Deak et al.

    A survey of active and passive indoor localisation systems

    Comput. Commun.

    (2012)
  • M. Golparvar-Fard et al.

    Evaluation of image-based modeling and laser scanning accuracy for emerging automated performance monitoring techniques

    Autom. Constr.

    (2011)
  • P. Tang et al.

    Automatic reconstruction of as-built building information models from laser-scanned point clouds: a review of related techniques

    Autom. Constr.

    (2010)
  • Y. Turkan et al.

    Automated progress tracking using 4D schedule and 3D sensing technologies

  • F. Bosché

    Automated recognition of 3D CAD model objects in laser scans and calculation of as-built dimensions for dimensional compliance control in construction

    Adv. Eng. Inform.

    (2010)
  • X. Zhang et al.

    Automating progress measurement of construction projects

    Autom. Constr.

    (2009)
  • D. Rebolj et al.

    Automated construction activity monitoring system

    Adv. Eng. Inform.

    (2008)
  • C. Kim et al.

    Fully automated registration of 3D data to a 3D CAD model for project progress monitoring

    Autom. Constr.

    (2013)
  • V. Pătrăucean et al.

    State of research in automatic as-built modelling

    Adv. Eng. Inform.

    (2015)
  • S. Gold et al.

    New algorithms for 2D and 3D point matching: pose estimation and correspondence

    Pattern Recogn.

    (1998)
  • D. Oberkampf et al.

    Iterative pose estimation using coplanar feature points

    Comput. Vis. Image Underst.

    (1996)
  • H. Hamledari et al.

    Automated computer vision-based detection of components of under-construction indoor partitions

    Autom. Constr.

    (2017)
  • C. Akinlar et al.

    EDLines: a real-time line segment detector with a false detection control

    Pattern Recogn. Lett.

    (2011)
  • Z. Wang et al.

    MSLD: A robust descriptor for line matching

    Pattern Recogn.

    (2009)
  • L. Zhang et al.

    An efficient and robust line segment matching approach based on LBD descriptor and pairwise geometric consistency

    J. Vis. Commun. Image Represent.

    (2013)
  • C. Zhao et al.

    Multimodal image matching based on multimodality robust line segment descriptor

    Neurocomputing

    (2016)
  • P. Greiner et al.

    Ablaufplanung

  • F. Bosché et al.

    The value of integrating scan-to-BIM and scan-vs-BIM techniques for construction monitoring using laser scanning and BIM: the case of cylindrical MEP components

    Autom. Constr.

    (2014)
  • K. Han et al.
  • S. El-Omari et al.

    Data acquisition from construction sites for tracking purposes

    Eng. Constr. Archit. Manag.

    (2009)