Abstract:
Disclosed is a head-up display apparatus based on AR that three-dimensionally displays augmented image information on an object external to a vehicle on the basis of actual distance information between the vehicle and the external object, and thus can provide realistic information to a driver. The head-up display apparatus includes a distance information generating unit configured to receive an image signal from an image signal inputting apparatus that captures an image in front of the vehicle and generate distance information on each of a plurality of objects in the front image, an information image generating unit configured to generate an information image of each object in the front image, and an augmentation processing unit configured to generate an augmented information image of each object on the basis of the distance information on each object.
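The three-unit pipeline described above can be illustrated with a minimal sketch; the class names, data fields, and the distance-to-scale rule below are hypothetical stand-ins for whatever detection, ranging, and rendering the apparatus actually uses, not its defined interfaces.

    # Minimal sketch of the three-unit AR head-up display pipeline.
    # All names and values are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class DetectedObject:
        label: str          # e.g. "pedestrian", "lead vehicle"
        distance_m: float   # estimated distance from the vehicle in meters
        bearing_deg: float  # horizontal angle relative to the driving direction

    class DistanceInformationGenerator:
        """Receives a front-camera image signal and estimates per-object distance."""
        def generate(self, frame) -> list[DetectedObject]:
            # Placeholder: a real unit would run detection and ranging on the frame.
            return [DetectedObject("lead vehicle", 32.5, -2.0)]

    class InformationImageGenerator:
        """Builds an information image (icon/text overlay) for each detected object."""
        def generate(self, obj: DetectedObject) -> dict:
            return {"text": f"{obj.label}: {obj.distance_m:.0f} m"}

    class AugmentationProcessor:
        """Scales and places each information image according to its object's
        distance so the overlay appears at a matching depth on the display."""
        def augment(self, obj: DetectedObject, info: dict) -> dict:
            scale = max(0.2, 10.0 / obj.distance_m)   # nearer objects drawn larger
            return {**info, "scale": scale, "depth_m": obj.distance_m,
                    "bearing_deg": obj.bearing_deg}

    def render_hud(frame):
        distance_unit = DistanceInformationGenerator()
        info_unit = InformationImageGenerator()
        aug_unit = AugmentationProcessor()
        return [aug_unit.augment(obj, info_unit.generate(obj))
                for obj in distance_unit.generate(frame)]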
Abstract:
An autostereoscopic three-dimensional (3D) display apparatus is provided. The autostereoscopic 3D display apparatus includes an image display unit configured to display a 3D image including a 3D virtual object, either alone or together with text; and an optical unit configured to reflect or transmit the displayed 3D image from the image display unit toward a viewer, transmit an image of a real object facing the viewer, and display a combination of the 3D image and the image of the real object to the viewer.
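As a rough software analogue of the optical unit's behavior (not the apparatus's actual optics), the view reaching the viewer can be modeled as a weighted blend of the rendered 3D image and the transmitted real-scene view; the function name and the reflectance value are assumptions.

    # Software analogue of combining the displayed 3D image with the
    # transmitted real-scene view. Names and values are assumptions.
    import numpy as np

    def combine_views(rendered_3d: np.ndarray, real_scene: np.ndarray,
                      reflectance: float = 0.4) -> np.ndarray:
        """Approximate a half-mirror optical unit: the reflected/transmitted 3D
        image is superimposed on the transmitted image of the real object."""
        return np.clip(reflectance * rendered_3d
                       + (1.0 - reflectance) * real_scene, 0, 255).astype(np.uint8)

    # Usage with placeholder frames (H x W x 3, 8-bit):
    rendered_3d = np.zeros((480, 640, 3), dtype=np.uint8)     # image display unit output
    real_scene = np.full((480, 640, 3), 128, dtype=np.uint8)  # view of the real object
    viewer_image = combine_views(rendered_3d, real_scene)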
Abstract:
A three-dimensional (3D) transparent display device is provided. The 3D transparent display device includes a position obtaining unit configured to obtain information regarding 3D positions of both eyes of a user and a 3D position of a real object; a controller configured to estimate a two-dimensional (2D) position on a display screen at which an image of a virtual object is to be displayed on the basis of the 3D positions of both eyes of the user and the 3D position of the real object; and a 3D transparent display panel configured to display the image of the virtual object on the display screen on the basis of the estimated 2D position of the virtual object.
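The controller's estimate can be sketched as a line-plane intersection: assuming the display screen lies in the z = 0 plane of a shared coordinate frame, the on-screen point is where the line of sight from the (cyclopean) eye position to the real object crosses that plane. The coordinate convention and the names below are assumptions, not the device's defined interfaces.

    # Minimal sketch of estimating the 2D on-screen position from the 3D eye
    # positions and the 3D position of the real object. Assumes the display
    # plane is z = 0; all names are illustrative.
    import numpy as np

    def estimate_screen_position(eye_left: np.ndarray, eye_right: np.ndarray,
                                 real_object: np.ndarray) -> np.ndarray:
        """Return the (x, y) point on the z = 0 display plane where the virtual
        object's image should be drawn so it appears aligned with the real object."""
        eye = (eye_left + eye_right) / 2.0       # cyclopean viewpoint
        direction = real_object - eye            # line of sight toward the object
        if np.isclose(direction[2], 0.0):
            raise ValueError("line of sight is parallel to the display plane")
        t = -eye[2] / direction[2]               # parameter where z reaches 0
        hit = eye + t * direction
        return hit[:2]

    # Example: eyes 60 cm in front of the screen, real object 2 m behind it.
    p = estimate_screen_position(np.array([-0.03, 0.0, 0.6]),
                                 np.array([0.03, 0.0, 0.6]),
                                 np.array([0.5, 0.2, -2.0]))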
Abstract:
Provided is a user interface (UI) apparatus based on a hand gesture. The UI apparatus includes an image processing unit configured to detect a position of an index finger and a center position of a hand from a depth image obtained by photographing a user's hand, and detect a position of a thumb on the basis of the detected position of the index finger and the detected center position of the hand, a hand gesture recognizing unit configured to recognize a position change of the index finger and a position change of the thumb, and a function matching unit configured to match the position change of the index finger to a predetermined first function, match the position change of the thumb to a predetermined second function, and output a control signal for executing each of the matched functions.
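The three units map naturally onto a small processing loop; the detection stub, movement threshold, and function names below are illustrative assumptions rather than the apparatus's actual method.

    # Minimal sketch of the gesture-to-function mapping. The detection step is
    # stubbed; thresholds and function names are assumptions.
    import numpy as np

    MOVE_THRESHOLD = 15.0  # pixels of displacement treated as a gesture (assumed)

    def detect_hand(depth_image: np.ndarray) -> dict:
        """Image processing unit stub: locate the index fingertip and hand center
        from the depth image, then infer the thumb position relative to them."""
        # A real implementation would segment the hand in the depth image.
        return {"index": np.array([120.0, 80.0]),
                "center": np.array([150.0, 140.0]),
                "thumb": np.array([100.0, 130.0])}

    def recognize_changes(prev: dict, curr: dict) -> dict:
        """Hand gesture recognizing unit: report which fingertips moved."""
        return {name: bool(np.linalg.norm(curr[name] - prev[name]) > MOVE_THRESHOLD)
                for name in ("index", "thumb")}

    def match_functions(changes: dict) -> list[str]:
        """Function matching unit: map finger movements to control signals."""
        signals = []
        if changes["index"]:
            signals.append("FIRST_FUNCTION")   # e.g. pointer move / select
        if changes["thumb"]:
            signals.append("SECOND_FUNCTION")  # e.g. click / confirm
        return signals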