ENHANCED VIRTUAL TOUCHPAD AND TOUCHSCREEN

Abstract:
A method, including presenting, by a computer (26), multiple interactive items (36) on a display (28) coupled to the computer, and receiving an input indicating a direction of a gaze of a user (22) of the computer. In response to the gaze direction, one of the multiple interactive items is selected, and subsequent to the one of the interactive items being selected, a sequence of three-dimensional (3D) maps is received containing at least a hand (31) of the user. The 3D maps are analyzed to detect a gesture performed by the user, and an operation is performed on the selected interactive item in response to the gesture.
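The abstract describes a two-stage interaction: a gaze input selects one of the displayed items, and a hand gesture recovered from a sequence of 3D maps then triggers an operation on that item. The following is a minimal sketch of that flow, not the patented implementation; all names (InteractiveItem, select_by_gaze, detect_press) and the press-gesture heuristic are illustrative assumptions.

```python
# Illustrative sketch only: gaze selects an on-screen item, then a hand
# gesture inferred from per-frame 3D hand positions acts on that item.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class InteractiveItem:
    name: str
    # Screen-space bounding box: (x_min, y_min, x_max, y_max) in pixels.
    bbox: Tuple[int, int, int, int]

def select_by_gaze(gaze_xy: Tuple[int, int],
                   items: List[InteractiveItem]) -> Optional[InteractiveItem]:
    """Return the item whose bounding box contains the gaze point, if any."""
    gx, gy = gaze_xy
    for item in items:
        x0, y0, x1, y1 = item.bbox
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return item
    return None

def detect_press(hand_z_series: List[float], threshold_m: float = 0.05) -> bool:
    """Toy gesture detector: a 'press' is the hand moving toward the display
    (decreasing z distance) by more than threshold_m across the map sequence.
    In practice each 3D map would first be segmented to locate the hand."""
    return len(hand_z_series) >= 2 and \
        hand_z_series[0] - hand_z_series[-1] > threshold_m

if __name__ == "__main__":
    items = [InteractiveItem("icon_a", (0, 0, 100, 100)),
             InteractiveItem("icon_b", (120, 0, 220, 100))]
    selected = select_by_gaze((150, 40), items)   # gaze input selects icon_b
    z_per_frame = [0.42, 0.39, 0.35, 0.30]        # hand z from each 3D map, meters
    if selected and detect_press(z_per_frame):
        print(f"activating {selected.name}")      # operation on the selected item
```

Decoupling selection (gaze) from activation (gesture) in this way avoids the "Midas touch" problem of gaze-only interfaces, where every glance would trigger an action.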
Abstract:
A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user (22) of a computerized system, and receiving a two-dimensional (2D) image of the user, the image including an eye (34) of the user. 3D coordinates of a head (32) of the user are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.
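This second abstract combines two modalities: the depth map supplies the head's 3D position, while the 2D image supplies the pupil's offset within the eye. One hedged way to picture the combination is below; the pixel-to-angle constant and the rotation-based composition are assumptions for illustration, not details from the source.

```python
# Assumed sketch: start from a baseline "looking at the camera" direction
# derived from the head's 3D position, then rotate it by yaw/pitch angles
# proportional to the pupil's offset in the 2D eye image.
import numpy as np

def gaze_direction(head_xyz: np.ndarray,
                   pupil_px: Tuple[float, float],
                   eye_center_px: Tuple[float, float],
                   px_to_rad: float = 0.002) -> np.ndarray:
    """Return a unit gaze vector in camera coordinates.

    head_xyz      -- 3D head position from the depth map, in meters.
    pupil_px      -- pupil location in the 2D eye image, in pixels.
    eye_center_px -- eye-corner midpoint in the same image, in pixels.
    px_to_rad     -- assumed pixel-offset-to-angle conversion factor.
    """
    # Baseline: from the head toward the camera (gazing straight at it).
    forward = -head_xyz / np.linalg.norm(head_xyz)
    # Horizontal pupil offset -> yaw; vertical offset -> pitch.
    dx = (pupil_px[0] - eye_center_px[0]) * px_to_rad
    dy = (pupil_px[1] - eye_center_px[1]) * px_to_rad
    yaw = np.array([[np.cos(dx), 0.0, np.sin(dx)],
                    [0.0, 1.0, 0.0],
                    [-np.sin(dx), 0.0, np.cos(dx)]])
    pitch = np.array([[1.0, 0.0, 0.0],
                      [0.0, np.cos(dy), -np.sin(dy)],
                      [0.0, np.sin(dy), np.cos(dy)]])
    g = pitch @ yaw @ forward
    return g / np.linalg.norm(g)

if __name__ == "__main__":
    from typing import Tuple  # noqa: E402 (kept local to the demo)
    head = np.array([0.0, 0.1, 0.6])  # head ~0.6 m from the camera
    print(gaze_direction(head, (322.0, 158.0), (310.0, 160.0)))
```

Intersecting the resulting gaze ray with the display plane would then yield the on-screen gaze point used for item selection in the first abstract.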