Abstract:
A process for specifying a correspondence relationship of feature points between two sets of optical data can be carried out with high precision and efficiency. In the process for integrating a three-dimensional model obtained from point cloud position data with a three-dimensional model obtained from a stereophotographic image, a correspondence relationship of perpendicular edges is obtained based on the assumption that the object is a building. In this case, each perpendicular edge is defined by its positional relationship relative to the other perpendicular edges, and the correspondence relationship is searched with high precision and speed.
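Although the abstract does not disclose an implementation, the idea of defining a perpendicular edge by its position relative to the other edges can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the patented method: it reduces each vertical edge of a building to a hypothetical 2D footprint, describes it by its sorted distances to neighboring edges (which are invariant under the unknown rotation and translation between the two models), and matches edges between the two data sets by comparing these descriptors.

```python
import numpy as np

def edge_descriptor(footprints, k=3):
    """For every edge footprint, return the sorted distances to its k nearest edges."""
    diffs = footprints[:, None, :] - footprints[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    dists.sort(axis=1)
    return dists[:, 1:k + 1]              # drop the zero self-distance

def match_edges(footprints_a, footprints_b, k=3):
    """Pair each edge in set A with the edge in set B whose descriptor is closest."""
    da = edge_descriptor(footprints_a, k)
    db = edge_descriptor(footprints_b, k)
    cost = np.linalg.norm(da[:, None, :] - db[None, :, :], axis=2)
    return cost.argmin(axis=1)            # index into B for every edge of A

# Hypothetical footprints (x, y) of the vertical edges of a building, seen
# from two viewpoints that differ by an unknown rotation and translation.
edges_a = np.array([[0.0, 0.0], [5.0, 0.0], [6.0, 4.0], [1.0, 3.0]])
theta = np.deg2rad(30)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
edges_b = edges_a @ R.T + np.array([2.0, 1.0])   # same building, different pose
print(match_edges(edges_a, edges_b))             # expected: [0 1 2 3]
```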
Abstract:
A technique is provided for efficiently processing three-dimensional point cloud position data that are obtained from different viewpoints. A projecting plane is set in a measurement space as a parameter for characterizing a target plane among the plural planes that form an object. The target plane and the other planes are projected onto the projecting plane. Then, the distance between each plane and the projecting plane is calculated at each grid point on the projecting plane, and the calculated matrix data are used as a range image that characterizes the target plane. Range images are likewise formed for the other planes and for the planes viewed from another viewpoint. The range images of the two viewpoints are compared, and the pair of planes having the smallest difference between their range images is identified as matching planes between the two viewpoints.
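As an illustration only, the following sketch shows one way the range-image comparison could look under simplifying assumptions that are not stated in the abstract: the projecting plane is taken as z = 0, each plane is given in the hypothetical parametric form z = a*x + b*y + c, the distance to the projecting plane is evaluated on a grid to form the range image, and planes from two viewpoints are paired by the smallest sum of squared differences between their range images.

```python
import numpy as np

# Projecting plane z = 0 sampled on a 16 x 16 grid of (x, y) points.
GRID = np.stack(np.meshgrid(np.linspace(-1, 1, 16),
                            np.linspace(-1, 1, 16)), axis=-1)

def range_image(plane):
    """Distance from each grid point to the plane z = a*x + b*y + c."""
    a, b, c = plane
    return a * GRID[..., 0] + b * GRID[..., 1] + c

def match_planes(planes_v1, planes_v2):
    """Pair each plane of viewpoint 1 with the plane of viewpoint 2 whose
    range image differs the least (sum of squared differences)."""
    imgs1 = [range_image(p) for p in planes_v1]
    imgs2 = [range_image(p) for p in planes_v2]
    return [(i, int(np.argmin([np.sum((im1 - im2) ** 2) for im2 in imgs2])))
            for i, im1 in enumerate(imgs1)]

# Hypothetical plane parameters (a, b, c); viewpoint 2 lists the same planes
# in a different order with a little measurement noise.
v1 = [(0.0, 0.0, 1.0), (0.5, 0.0, 0.2), (0.0, 0.8, -0.3)]
v2 = [(0.0, 0.79, -0.31), (0.01, 0.0, 1.02), (0.5, 0.01, 0.2)]
print(match_planes(v1, v2))   # expected: [(0, 1), (1, 2), (2, 0)]
```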
Abstract:
A point cloud data processing device is equipped with a non-plane area removing unit 101, a plane labeling unit 102, and a contour calculating unit 106. The non-plane area removing unit 101 removes point cloud data relating to non-plane areas from the point cloud data because the non-plane areas impose a high calculation load. In the point cloud data, a two-dimensional image of an object is linked with data of three-dimensional coordinates of the plural points that form the two-dimensional image. The plane labeling unit 102 adds labels for identifying planes to the point cloud data from which the data of the non-plane areas have been removed. The contour calculating unit 106 calculates a contour of the object by using local flat planes based on a local area that is connected with the labeled plane.
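A minimal sketch of the first two steps is given below; it is not the device's implementation. It assumes unorganized XYZ points rather than point cloud data linked to a two-dimensional image, removes points whose local neighborhood fits a plane poorly as non-plane areas, and then assigns plane labels by a simple region growing over neighboring points with similar normals. The thresholds max_residual and max_angle_deg are illustrative parameters, not values from the abstract.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_plane_fit(points, k=12):
    """Return, for every point, the unit normal and the RMS residual of the
    best-fit plane through its k nearest neighbours."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals, residuals = [], []
    for nb in points[idx]:
        centered = nb - nb.mean(axis=0)
        _, s, vt = np.linalg.svd(centered, full_matrices=False)
        normals.append(vt[-1])                       # direction of least variance
        residuals.append(s[-1] / np.sqrt(len(nb)))   # RMS distance to the local plane
    return np.asarray(normals), np.asarray(residuals)

def label_planes(points, k=12, max_residual=0.02, max_angle_deg=10.0):
    """Remove non-plane points (poor local fit), then label planar regions."""
    normals, residuals = local_plane_fit(points, k)
    keep = residuals < max_residual                  # non-plane area removal
    labels = np.full(len(points), -1)
    tree = cKDTree(points)
    cos_thr = np.cos(np.deg2rad(max_angle_deg))
    current = 0
    for seed in np.where(keep)[0]:
        if labels[seed] != -1:
            continue
        labels[seed] = current
        stack = [seed]
        while stack:                                 # region growing over similar normals
            p = stack.pop()
            for q in tree.query(points[p], k=k)[1]:
                if keep[q] and labels[q] == -1 and abs(normals[p] @ normals[q]) > cos_thr:
                    labels[q] = current
                    stack.append(q)
        current += 1
    return labels                                    # -1 marks removed non-plane points
```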
Abstract:
The device includes a unit for obtaining point cloud position data of an object, a unit for obtaining image data of the object, a correspondence identifying unit that identifies a correspondence relationship between the point cloud position data or image data obtained from a primary viewpoint and the image data obtained from a secondary viewpoint that differs from the primary viewpoint, a model forming unit that forms a three-dimensional model from the data obtained by the point cloud position data obtaining unit, and a unit that controls display of the model formed by the model forming unit on a display device. The model forming unit forms a three-dimensional model viewed from the secondary viewpoint, based on the correspondence relationship identified by the correspondence identifying unit. An operator can thereby view the model from the secondary viewpoint as an image.
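For illustration, the sketch below shows one way a model viewed from the secondary viewpoint could be produced once 2D-3D correspondences are known, assuming a pinhole camera and using OpenCV's solvePnP and projectPoints; the variable names, the synthetic data, and the pose-estimation approach are assumptions, not the device's actual method.

```python
import numpy as np
import cv2

def render_from_secondary_view(cloud_xyz, corr_3d, corr_2d, camera_matrix):
    """Estimate the secondary-viewpoint pose from 2D-3D correspondences and
    project the whole point cloud into that view."""
    dist = np.zeros(5)                               # assume no lens distortion
    ok, rvec, tvec = cv2.solvePnP(corr_3d.astype(np.float64),
                                  corr_2d.astype(np.float64),
                                  camera_matrix, dist)
    if not ok:
        raise RuntimeError("pose estimation failed")
    pixels, _ = cv2.projectPoints(cloud_xyz.astype(np.float64),
                                  rvec, tvec, camera_matrix, dist)
    return pixels.reshape(-1, 2)                     # 2D position of every cloud point

# Synthetic demonstration: a known pose generates the image-side correspondences.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0,   0.0,   1.0]])
true_rvec = np.array([0.10, 0.20, 0.05])
true_tvec = np.array([0.30, -0.20, 5.00])
rng = np.random.default_rng(0)
corr_3d = rng.uniform(-1, 1, (8, 3))                 # matched points in the cloud
corr_2d, _ = cv2.projectPoints(corr_3d, true_rvec, true_tvec,
                               camera_matrix, np.zeros(5))
cloud_xyz = rng.uniform(-1, 1, (1000, 3))            # full measured point cloud
print(render_from_secondary_view(cloud_xyz, corr_3d, corr_2d.reshape(-1, 2),
                                 camera_matrix)[:3])
```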