Abstract:
An integrated remote control device and method using multi-modal inputs are provided. The integrated remote control device may include an input module that receives inputs of at least two modalities from a user, a first modality recognition module that recognizes and interprets an input of a first modality received from the input module, a second modality recognition module that recognizes and interprets an input of a second modality received from the input module, a command database (DB) that stores commands, a control module that interprets a user command by synthesizing information about the inputs interpreted by the first and second modality recognition modules, and extracts a control command that corresponds to the interpreted user command from the command DB, and an output module that outputs the control command extracted by the control module.
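As a rough illustration of how such a device might fuse two modalities, the Python sketch below wires hypothetical speech and gesture recognizers into a control module that looks up a command in a small in-memory command database. All class names, inputs, and codes are assumptions for illustration, not the implementation described in the abstract.

# Minimal sketch of a two-modality remote control pipeline.
# All names and codes are illustrative assumptions.

class SpeechRecognizer:                    # first modality recognition module
    def interpret(self, raw_text):
        return {"action": raw_text.strip().lower()}      # e.g. "volume up"

class GestureRecognizer:                   # second modality recognition module
    def interpret(self, gesture_label):
        return {"target": gesture_label.strip().lower()}  # e.g. "tv"

COMMAND_DB = {                             # command database (DB)
    ("tv", "volume up"): "IR_CODE_TV_VOL_UP",
    ("tv", "power off"): "IR_CODE_TV_PWR_OFF",
}

class ControlModule:
    def __init__(self, first, second, db):
        self.first, self.second, self.db = first, second, db

    def handle(self, speech_input, gesture_input):
        action = self.first.interpret(speech_input)["action"]
        target = self.second.interpret(gesture_input)["target"]
        # Synthesize both modalities into one user command, then look it up.
        return self.db.get((target, action))

if __name__ == "__main__":
    ctrl = ControlModule(SpeechRecognizer(), GestureRecognizer(), COMMAND_DB)
    print(ctrl.handle("Volume Up", "TV"))   # -> IR_CODE_TV_VOL_UP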
Abstract:
The user interaction system comprises a portable pointing device (101) connected to a camera (102) and sending pictures to a digital signal processor (120), which is capable of recognizing an object (130) and a command given by the user (100) by moving the pointing device (101) in a specific way, and of controlling an electrical apparatus (110) on the basis of this recognition.
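A very rough sketch of the recognition step, under the assumption that the signal processor reduces each camera frame to a 2-D coordinate the device is aimed at: object identification then becomes a region lookup and the command a simple classification of the point trajectory. The region table, thresholds, and command labels below are hypothetical.

# Hypothetical sketch: identify the aimed-at object from the last frame's
# pointing coordinate and classify the pointing-device motion as a command.

OBJECT_REGIONS = {          # assumed object bounding boxes in camera coordinates
    "lamp": (0, 0, 100, 100),
    "tv":   (200, 0, 400, 150),
}

def find_object(point):
    x, y = point
    for name, (x0, y0, x1, y1) in OBJECT_REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def classify_motion(trajectory):
    # Crude command recognition: a net horizontal sweep maps to next/previous.
    dx = trajectory[-1][0] - trajectory[0][0]
    if dx > 50:
        return "next"
    if dx < -50:
        return "previous"
    return None

trajectory = [(210, 60), (250, 62), (300, 61), (330, 60)]
print(find_object(trajectory[-1]), classify_motion(trajectory))   # tv next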
Abstract:
The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, a gesture recognition subsystem employing a wireless pointing device, and a pointing analysis subsystem also employing the pointing device are combined to determine which component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs.
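The command/referent split could be prototyped roughly as follows: each input channel contributes a candidate referent and/or command with a confidence, and a fusion step picks the best-supported pair. The channel names, scores, and events below are assumed for illustration and are not the patented integration scheme itself.

# Sketch of multimodal fusion into a (referent, command) pair.
# Input events are (channel, kind, value, confidence) tuples; values assumed.

def fuse(events):
    referents, commands = {}, {}
    for channel, kind, value, conf in events:
        bucket = referents if kind == "referent" else commands
        bucket[value] = bucket.get(value, 0.0) + conf   # accumulate evidence

    best = lambda d: max(d, key=d.get) if d else None
    return best(referents), best(commands)

events = [
    ("pointing", "referent", "media_player", 0.8),   # user points at the player
    ("speech",   "referent", "media_player", 0.6),   # "...the media player"
    ("speech",   "command",  "volume_up",    0.7),   # "turn it up"
    ("gesture",  "command",  "volume_up",    0.5),   # upward flick of the wand
]
print(fuse(events))   # ('media_player', 'volume_up')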
Abstract:
Systems and methods according to the present invention address these needs and others by providing a handheld device, e.g., a free space pointing device, which uses hand tremor as an input. One or more sensors within the handheld device detect a user's hand tremor and identify the user based on the detected tremor.
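One plausible, purely illustrative way to use tremor as a biometric is to compare the frequency content of a short accelerometer trace against enrolled tremor signatures. The abstract does not prescribe a particular feature or matching rule; the spectrum features, nearest-neighbor match, and synthetic data below are assumptions.

import numpy as np

# Illustrative tremor matching: compare the power spectrum of a short
# accelerometer trace against enrolled per-user tremor signatures.

def tremor_signature(samples, n_bins=16):
    spectrum = np.abs(np.fft.rfft(samples))[:n_bins]
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)   # normalized features

def identify(samples, enrolled):
    sig = tremor_signature(samples)
    # Nearest enrolled signature wins (assumed matching rule).
    return min(enrolled, key=lambda user: np.linalg.norm(sig - enrolled[user]))

rng = np.random.default_rng(0)
alice_trace = rng.normal(size=256) + np.sin(np.linspace(0, 40, 256))  # fake data
enrolled = {"alice": tremor_signature(alice_trace),
            "bob":   tremor_signature(rng.normal(size=256))}
print(identify(alice_trace + 0.1 * rng.normal(size=256), enrolled))   # alice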
Abstract:
A method for controlling a predetermined control-targeted device using three-dimensional pointing, wherein the method includes: calculating position information of a controlling apparatus; calculating attitude information of the controlling apparatus; calculating sight line information of the controlling apparatus by using the position information and the attitude information of the controlling apparatus; selecting the predetermined control-targeted device by using the sight line information of the controlling apparatus; and controlling the selected control-targeted device.
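The selection step could be sketched as a ray test: the controller's position and attitude define a sight line, and the control-targeted device closest to that line (within a tolerance) is selected. The vector math below is a generic assumption about how such a sight-line check might look; device positions and the tolerance are hypothetical.

import numpy as np

# Sketch: select the device whose known position lies closest to the
# controller's sight line (position + attitude -> a ray in space).

def point_to_ray_distance(point, ray_origin, ray_dir):
    v = point - ray_origin
    t = max(np.dot(v, ray_dir), 0.0)          # only in front of the controller
    return np.linalg.norm(v - t * ray_dir)

def select_device(devices, position, direction, tolerance=0.5):
    direction = direction / np.linalg.norm(direction)
    best = min(devices, key=lambda name: point_to_ray_distance(
        np.array(devices[name], float), position, direction))
    dist = point_to_ray_distance(np.array(devices[best], float), position, direction)
    return best if dist <= tolerance else None

devices = {"tv": (3.0, 0.0, 1.0), "lamp": (0.0, 2.0, 1.5)}
position = np.array([0.0, 0.0, 1.0])
attitude_dir = np.array([1.0, 0.0, 0.0])      # controller aimed along +x
print(select_device(devices, position, attitude_dir))   # tv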
Abstract:
A handheld device includes a display having a viewable surface and operable to generate an image indicating a currently controlled remote device, and a gesture database maintaining a plurality of remote command gestures. Each remote command gesture is defined by a motion of the device with respect to a first position of the handheld device. The device includes a gesture mapping database comprising a mapping of each of the remote command gestures to an associated command for controlling operation of the remote device, and a motion detection module operable to detect motion of the handheld device within three dimensions and to identify components of the motion in relation to the viewable surface. The device includes a control module operable to track movement of the handheld device using the motion detection module, to compare the tracked movement against the remote command gestures to determine a matching gesture, and to identify the command corresponding to the matching gesture. The device also includes a wireless interface operable to transmit the identified command to a remote receiver for delivery to the remote device.
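A minimal sketch of the gesture-to-command dispatch this abstract describes, assuming gestures are stored as template directions and matched by the net direction of the tracked movement. The gesture templates, matching rule, and command names are illustrative assumptions.

# Sketch of gesture-to-command dispatch on a handheld remote.
# Gesture templates and the matching rule are assumed for illustration.

GESTURE_DB = {                      # remote command gestures (unit directions)
    "flick_up":    (0.0, 1.0),
    "flick_right": (1.0, 0.0),
}
GESTURE_TO_COMMAND = {              # gesture mapping database
    "flick_up":    "VOLUME_UP",
    "flick_right": "CHANNEL_NEXT",
}

def match_gesture(track):
    # Reduce the tracked movement to its net direction relative to the start
    # position, then pick the closest template by dot product.
    dx = track[-1][0] - track[0][0]
    dy = track[-1][1] - track[0][1]
    mag = (dx * dx + dy * dy) ** 0.5
    if mag < 1e-6:
        return None
    direction = (dx / mag, dy / mag)
    return max(GESTURE_DB, key=lambda g: direction[0] * GESTURE_DB[g][0]
                                       + direction[1] * GESTURE_DB[g][1])

def handle_motion(track, transmit):
    gesture = match_gesture(track)
    if gesture:
        transmit(GESTURE_TO_COMMAND[gesture])   # send via wireless interface

handle_motion([(0, 0), (0.1, 0.4), (0.2, 1.1)], transmit=print)   # VOLUME_UP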
Abstract:
A broadcast receiving apparatus whose functions are controlled by a received code signal includes a code signal output device that detects its own motion and outputs a control signal corresponding to the detected motion as a code signal, a code signal function setting portion that sets a function for controlling the broadcast receiving apparatus in response to the code signal, and a control portion that receives the code signal output by the code signal output device according to its motion and carries out control based on the function set at the code signal function setting portion. In this manner, the broadcast receiving apparatus can be operated smoothly and intuitively by moving the remote control main body.
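On the receiver side, the function setting portion amounts to a configurable table from motion-derived code signals to receiver functions. The sketch below assumes hypothetical code names and functions; it is not the apparatus's actual protocol.

# Sketch of the receiver side: a configurable table maps motion-derived
# code signals to broadcast-receiver functions. Codes and functions assumed.

class BroadcastReceiver:
    def __init__(self):
        self.code_functions = {}        # code signal function setting portion

    def set_function(self, code, function):
        self.code_functions[code] = function

    def on_code_signal(self, code):     # control portion
        function = self.code_functions.get(code)
        if function:
            function()

rx = BroadcastReceiver()
rx.set_function("SHAKE_LEFT",  lambda: print("channel down"))
rx.set_function("SHAKE_RIGHT", lambda: print("channel up"))
rx.on_code_signal("SHAKE_RIGHT")        # remote moved right -> channel up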
Abstract:
A system and method for controlling the movement of a cursor on a monitor screen are provided. The system comprises at least one remote control unit having a plurality of push buttons for remotely controlling the moving direction of the cursor on the monitor screen; at least one light emitting element for emitting light that indicates a signal generated by the remote control unit; a light detector for extracting the light movement transmitted sequentially from the remote control unit; and a control unit for displaying the moving position of the cursor on the monitor screen corresponding to the extracted movement of the light from the remote control unit, the control unit also being adapted to stop the cursor when the push button of the remote control unit is released. The movement of the cursor on the monitor screen can also be stopped if the light movement transmitted from the remote control unit reverses direction.
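The two stop conditions can be captured by a small state update: the cursor keeps moving while the button is held and the detected light motion keeps its direction, and stops on release or on a direction reversal. The update rule and values below are a hypothetical sketch.

# Sketch of the cursor update rule: move while the button is held and the
# detected light motion keeps its direction; stop on release or reversal.

def update_cursor(cursor, button_down, light_delta, last_delta):
    reversed_dir = (light_delta[0] * last_delta[0]
                    + light_delta[1] * last_delta[1]) < 0
    if not button_down or reversed_dir:
        return cursor, light_delta          # cursor stops
    return (cursor[0] + light_delta[0], cursor[1] + light_delta[1]), light_delta

cursor, last = (100, 100), (0, 0)
cursor, last = update_cursor(cursor, True, (5, 0), last)     # moves right
cursor, last = update_cursor(cursor, True, (5, 0), last)     # keeps moving
cursor, last = update_cursor(cursor, True, (-5, 0), last)    # reversal -> stop
print(cursor)   # (110, 100)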
Abstract:
A portable master controller for a locomotive remote control system. The portable master controller has a user interface for receiving commands to control the movement of the locomotive. The user interface is responsive to operator commands to generate control signals. A processing unit receives the control signals from the user interface to generate digital command signals directing the movement of the locomotive. A transmission unit receives the digital command signals and generates a RF transmission conveying the digital command signals to the slave controller. A solid-state tilt sensor in communication with the processing unit communicates inclination information to the processing unit about the portable master controller. The processing unit receives and processes the inclination information. If the inclination information indicates that the portable master controller is in an unsafe operational condition, the processing unit generates an emergency digital command signal to the transmission unit, without input from the operator, for directing the locomotive to acquire a secure condition.
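The safety behavior described here amounts to a watchdog on the tilt reading: if the inclination exceeds a safe range (for example, the controller has been dropped or is lying on its side), an emergency command is transmitted without operator input. The 45-degree threshold and command name in this sketch are assumptions, not values from the abstract.

# Sketch of the tilt-based safety check in the processing unit.
# The 45-degree threshold and command name are illustrative assumptions.

SAFE_TILT_DEGREES = 45.0

def process_tilt(tilt_degrees, transmit):
    if abs(tilt_degrees) > SAFE_TILT_DEGREES:
        # Unsafe orientation: direct the locomotive to a secure condition
        # without waiting for operator input.
        transmit("EMERGENCY_STOP")
        return True
    return False

process_tilt(12.0, transmit=print)    # within limits, nothing sent
process_tilt(80.0, transmit=print)    # EMERGENCY_STOP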