Calibration of multiple cameras
Abstract:
A plurality of cameras obtains images of an environment. Calibration data describing the translation and rotation of the cameras is used to support functionality such as determining the location of an object that appears in those images. To determine the calibration data, the images are processed to find features produced by points in the environment. A cluster is designated that includes the images of the same point as viewed from at least some of the cameras. A triangulation matrix and a back-projection error are calculated for each cluster. The candidate translations and rotations are then perturbed to find the values that minimize the eigenvalues of the triangulation matrices, whose associated eigenvectors are representative of the feature. These eigenvalues and eigenvectors are then used to determine the rotation and translation of the respective cameras with respect to an origin.
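The abstract does not disclose implementation details, but the triangulation-matrix and eigenvalue steps it describes correspond closely to the standard direct linear transform (DLT) formulation of multi-view triangulation. The sketch below is a minimal illustration under that assumption: it stacks the DLT constraints for one feature cluster from candidate camera projection matrices, then takes the smallest eigenvalue of AᵀA as a back-projection-style residual whose eigenvector yields the triangulated point. The function names, arguments, and use of NumPy are illustrative and are not taken from the patent.

```python
import numpy as np

def projection_matrix(K, R, t):
    """Compose a 3x4 projection matrix P = K [R | t] for one camera."""
    return K @ np.hstack([R, t.reshape(3, 1)])

def triangulation_matrix(points_2d, proj_matrices):
    """Stack the DLT constraints for one feature cluster.

    Each observation (u, v) of the same world point, seen through a
    camera with projection matrix P, contributes two rows:
        u * P[2] - P[0]
        v * P[2] - P[1]
    """
    rows = []
    for (u, v), P in zip(points_2d, proj_matrices):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    return np.asarray(rows)              # shape (2 * num_views, 4)

def cluster_residual(points_2d, proj_matrices):
    """Smallest eigenvalue of A^T A and its eigenvector for one cluster.

    The eigenvector (scaled so its last entry is 1) is the triangulated
    homogeneous point; the eigenvalue acts as a back-projection-style
    residual that shrinks as the candidate extrinsics become consistent
    with the observations.
    """
    A = triangulation_matrix(points_2d, proj_matrices)
    eigvals, eigvecs = np.linalg.eigh(A.T @ A)   # eigenvalues ascending
    X_h = eigvecs[:, 0]                          # eigenvector of smallest eigenvalue
    X = X_h[:3] / X_h[3]                         # inhomogeneous 3D point
    return eigvals[0], X
```

In a calibration loop, one would perturb each camera's rotation and translation, recompute these residuals across all clusters, and retain the extrinsics that minimize the total, which corresponds to the perturbation step described in the abstract.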