The analysis of human motion is an important task in many surveillance applications. Obtaining 3D information through a calibrated camera system can substantially enhance such analysis. This paper presents a novel technique to automatically recover both the intrinsic and extrinsic parameters of each surveillance camera in a camera network using only a walking human. The same feature points of a pedestrian are used both to compute each camera's intrinsic parameters and to determine the relative orientations of the cameras in the network, as well as their absolute positions within a common coordinate system. Experimental results demonstrating the accuracy and practicability of the approach conclude the paper.
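The abstract does not specify which pedestrian feature points are used; a common choice in pedestrian-based self-calibration is the head and foot locations of a walking person. As an illustrative sketch only (the paper's actual method may differ), the lines joining each head/foot pair of an upright pedestrian all meet at the vertical vanishing point, which can be estimated by a least-squares intersection in homogeneous coordinates:

```python
import numpy as np

def vertical_vanishing_point(heads, feet):
    """Estimate the vertical vanishing point from head/foot point pairs.

    Each pair (head_i, foot_i) of an upright pedestrian defines an image
    line; for a static camera all such lines meet at the vertical
    vanishing point. We stack the homogeneous line vectors l_i = h_i x f_i
    and take the null-space direction of the stack via SVD.
    """
    lines = []
    for (hx, hy), (fx, fy) in zip(heads, feet):
        h = np.array([hx, hy, 1.0])
        f = np.array([fx, fy, 1.0])
        lines.append(np.cross(h, f))  # homogeneous line through both points
    L = np.array(lines)
    # v minimizing ||L v|| is the right singular vector of the smallest
    # singular value (total least squares intersection of all lines).
    _, _, Vt = np.linalg.svd(L)
    v = Vt[-1]
    return v[:2] / v[2]  # back to inhomogeneous image coordinates
```

With pedestrian detections at three or more distinct positions, the recovered vanishing point (together with the horizon from the head and foot trajectories) constrains the camera intrinsics; the function names and the head/foot choice above are assumptions for illustration, not the paper's stated algorithm.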
- Multiple Camera Self-calibration and 3D Reconstruction Using Pedestrians
- Springer Berlin Heidelberg