Abstract:
A technique is provided for efficiently processing three-dimensional point cloud position data obtained at different viewpoints. A projecting plane is set in a measurement space as a parameter for characterizing a target plane among the plural planes that form an object. The target plane and the other planes are projected onto the projecting plane. A distance between each plane and the projecting plane is then calculated at each grid point on the projecting plane, and the resulting matrix data is used as a range image that characterizes the target plane. Range images are likewise formed for the other planes and for planes viewed from another viewpoint. The range images of the two viewpoints are compared, and the pair of planes having the smallest difference between their range images is identified as matching planes between the two viewpoints.
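The matching step described above can be sketched as follows. This is an illustrative toy, not the patented implementation: each plane is reduced to a grid of point-to-plane distances over a common projecting plane, and planes from two viewpoints are paired by the smallest sum of absolute differences between their range images. All function names and the sample planes are assumptions.

```python
# Hypothetical sketch of range-image matching between two viewpoints.

def range_image(plane_normal, plane_d, grid_points):
    """Distance from each grid point to the plane n.x + d = 0 (|n| = 1)."""
    nx, ny, nz = plane_normal
    return [abs(nx * x + ny * y + nz * z + plane_d) for (x, y, z) in grid_points]

def image_difference(img_a, img_b):
    """Sum of absolute differences between two range images."""
    return sum(abs(a - b) for a, b in zip(img_a, img_b))

def match_planes(images_view1, images_view2):
    """Pair each plane of viewpoint 1 with the closest plane of viewpoint 2."""
    matches = {}
    for i, img1 in enumerate(images_view1):
        matches[i] = min(range(len(images_view2)),
                         key=lambda j: image_difference(img1, images_view2[j]))
    return matches

# Toy example: a 3x3 grid on the projecting plane z = 0; each viewpoint sees
# two planes, z = 2 and z = 5, observed in a different order.
grid = [(x, y, 0.0) for x in range(3) for y in range(3)]
planes_v1 = [((0, 0, 1), -2.0), ((0, 0, 1), -5.0)]
planes_v2 = [((0, 0, 1), -4.9), ((0, 0, 1), -2.1)]
imgs_v1 = [range_image(n, d, grid) for n, d in planes_v1]
imgs_v2 = [range_image(n, d, grid) for n, d in planes_v2]
print(match_planes(imgs_v1, imgs_v2))  # {0: 1, 1: 0}
```

The toy data makes the pairing visible: plane 0 of viewpoint 1 (z = 2) matches plane 1 of viewpoint 2 (z = 2.1), and vice versa.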
Abstract:
A technique for efficiently calibrating a camera is provided. Reference laser scan data is obtained by scanning a building 131 with a laser scanner 115, which is fixed on a vehicle 100 and has known exterior orientation parameters, while the vehicle 100 travels. An image of the building 131 is photographed at a predetermined timing by an onboard camera 113. Reference point cloud position data, in which the reference laser scan data is described in a coordinate system defined on the vehicle 100 at the predetermined timing, is calculated based on the trajectory the vehicle 100 has traveled. Matching points are selected between feature points in the reference point cloud position data and feature points in the image. Exterior orientation parameters of the camera 113 are calculated based on the relative relationships between the reference point cloud position data of the matching points and their image coordinate values.
Abstract:
Exterior orientation parameters of a camera are easily determined. For example, a reference image of a building 131 is obtained with a camera 112, whose exterior orientation parameters are determined, while a vehicle 100 travels, and a comparative image of the building 131 is simultaneously obtained with a camera 113, whose exterior orientation parameters are undetermined. Points that match between the reference image and the comparative image are then selected, and relative orientation and scale adjustment using a predetermined scale are performed, whereby the exterior orientation parameters of the camera 113 are calculated.
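The scale-adjustment step alone can be illustrated with a minimal sketch (the full relative orientation is not shown): relative orientation from matched image points yields a model that is correct only up to scale, and a known distance, here a hypothetical baseline between the two cameras, fixes that scale. All names and values below are illustrative assumptions.

```python
# Hypothetical sketch of fixing the scale of a relative-orientation model.

def apply_scale(model_points, estimated_baseline, known_baseline):
    """Rescale model coordinates so the estimated baseline matches the known one."""
    s = known_baseline / estimated_baseline
    return [(s * x, s * y, s * z) for (x, y, z) in model_points]

# The relative model put the cameras 1.0 model unit apart, but the real mount
# separation is assumed to be 0.5 m, so every model coordinate is halved.
model = [(2.0, 4.0, 6.0), (1.0, 0.0, 3.0)]
scaled = apply_scale(model, estimated_baseline=1.0, known_baseline=0.5)
print(scaled)  # [(1.0, 2.0, 3.0), (0.5, 0.0, 1.5)]
```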
Abstract:
A technique is disclosed for easily performing calibration of a camera by using an MMS. A calibration device for a camera configured to photograph the sun includes an in-image sun position identifying unit 115, a sun position estimating unit 113, and a camera attitude calculating unit 116. The in-image sun position identifying unit 115 identifies the position of the sun in an image photographed by the camera. The sun position estimating unit 113 estimates the position of the sun in the image based on orbital information of the sun. The camera attitude calculating unit 116 calculates the attitude of the camera based on the differences between the identified and the estimated positions of the sun in the images.
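The attitude calculation can be sketched under simplifying assumptions that are not from the patent: an ideal pinhole camera with a hypothetical focal length `f` (in pixels) and principal point `(cx, cy)`, where the sun's predicted direction (azimuth, elevation) is compared with the direction recovered from its identified pixel position, and the angular differences give the camera's yaw and pitch errors.

```python
import math

# Hypothetical sketch: camera attitude error from predicted vs. observed sun position.

def pixel_to_angles(u, v, cx, cy, f):
    """Convert a sun pixel position to (azimuth, elevation) offsets in radians
    relative to the camera's optical axis, assuming an ideal pinhole camera."""
    return math.atan2(u - cx, f), math.atan2(cy - v, f)

def attitude_error(observed_px, predicted_az_el, cx, cy, f):
    """Yaw/pitch error = predicted sun direction minus observed direction."""
    az_obs, el_obs = pixel_to_angles(*observed_px, cx, cy, f)
    az_pred, el_pred = predicted_az_el
    return az_pred - az_obs, el_pred - el_obs

# One image: sun identified at pixel (600, 400); ephemeris predicts it at
# azimuth 0.12 rad, elevation 0.08 rad relative to the nominal optical axis.
yaw_err, pitch_err = attitude_error((600.0, 400.0), (0.12, 0.08),
                                    cx=500.0, cy=500.0, f=1000.0)
```

Since the abstract refers to differences across multiple images, a real device would presumably average such per-image errors over many sun observations.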
Abstract:
A technique for efficiently calibrating a laser scanner is provided. Point cloud position data of a building 131 is obtained by a laser scanner 141 whose exterior orientation parameters are already known. Meanwhile, the building 131 is scanned by a laser scanner 115 while a vehicle 100 travels, and point cloud position data of the building 131 measured by the laser scanner 115 is obtained based on the trajectory the vehicle 100 has traveled. Exterior orientation parameters of the laser scanner 115 are then calculated based on a correspondence relationship between these two sets of point cloud position data.
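The core of such a calibration, recovering the rigid transform that aligns corresponding points of the two point clouds, can be sketched in 2-D for brevity (the actual problem is 3-D, and the correspondence search itself is not shown). The closed-form least-squares solution below and its toy data are illustrative assumptions.

```python
import math

# Hypothetical sketch: closed-form 2-D rigid alignment of corresponding points.

def align_2d(src, dst):
    """Rigid 2-D transform (theta, tx, ty) mapping src points onto dst points,
    assuming known point-to-point correspondences (least squares)."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    s_cos = s_sin = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy      # centered source point
        bx, by = dx - cdx, dy - cdy      # centered destination point
        s_cos += ax * bx + ay * by       # accumulates cos(theta) * |a|^2
        s_sin += ax * by - ay * bx       # accumulates sin(theta) * |a|^2
    theta = math.atan2(s_sin, s_cos)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty

# Toy check: rotate a cloud by 0.3 rad and translate by (1, 2), then recover it.
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 1.0)]
c, s = math.cos(0.3), math.sin(0.3)
dst = [(c * x - s * y + 1.0, s * x + c * y + 2.0) for x, y in src]
theta, tx, ty = align_2d(src, dst)
```

In the calibration scenario, the recovered transform between the cloud from scanner 141 (known orientation) and the cloud from scanner 115 would yield the exterior orientation parameters of scanner 115.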
Abstract:
The device includes a unit that obtains point cloud position data of an object, a unit that obtains image data of the object, a unit that identifies a correspondence relationship between the point cloud position data or image data obtained from a primary viewpoint and image data obtained from a secondary viewpoint that differs from the primary viewpoint, a unit that forms a three-dimensional model from the data obtained by the point cloud position data obtaining unit, and a unit that controls display of the model formed by the model forming unit on a displaying device. The model forming unit forms a three-dimensional model oriented as seen from the secondary viewpoint, based on the correspondence relationship identified by the correspondence identifying unit. Operators thereby see the model from the secondary viewpoint as an image.
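Rendering the model as seen from the secondary viewpoint amounts to transforming the point cloud into a camera frame at that viewpoint and projecting it. The pinhole model below is a minimal sketch; the rotation `R`, translation `t`, and focal length `f` are illustrative assumptions, not values from the abstract.

```python
# Hypothetical sketch: project a point cloud as seen from a secondary viewpoint.

def project_from_viewpoint(points, R, t, f):
    """Map 3-D points to 2-D image coordinates for an ideal pinhole camera
    whose pose relative to the model is (R, t)."""
    image = []
    for (x, y, z) in points:
        # Camera-frame coordinates: p_cam = R @ p + t
        xc = R[0][0] * x + R[0][1] * y + R[0][2] * z + t[0]
        yc = R[1][0] * x + R[1][1] * y + R[1][2] * z + t[1]
        zc = R[2][0] * x + R[2][1] * y + R[2][2] * z + t[2]
        image.append((f * xc / zc, f * yc / zc))
    return image

# Secondary viewpoint looking down the z-axis, 5 units in front of the model.
R_id = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
pixels = project_from_viewpoint([(1.0, 2.0, 0.0)], R_id, (0.0, 0.0, 5.0), f=100.0)
print(pixels)  # [(20.0, 40.0)]
```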
Abstract:
The position and the attitude of a device or a member are determined. A camera 115, whose exterior orientation parameters are not determined, is photographed by a camera 113, whose exterior orientation parameters in an IMU coordinate system are determined. The camera 115 is rotatable around a shaft whose direction is set in advance, and the distance between the camera 113 and the camera 115 is known. Under these conditions, the exterior orientation parameters of the camera 115 are calculated by analyzing the image of the camera 115 photographed by the camera 113.