Abstract:
A remote wand for controlling the operations of a media system is provided. The wand may be operative to control the movement of a cursor displayed on a screen based on the position and orientation at which the wand is held. As the user moves the wand, the on-screen cursor may move accordingly. The user may use the wand to control a plurality of operations and applications available from the media system, including, for example, zoom operations, a keyboard application, an image application, an illustration application, and a media application.
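To make the motion-to-cursor mapping concrete, here is a minimal sketch (illustrative only, not the patent's implementation) that maps a wand's yaw and pitch, as reported by its motion-detection components, onto screen coordinates. The ±30° sweep and all type names are assumptions.

struct WandPose {
    var yaw: Double    // radians; left/right rotation of the wand (assumed input)
    var pitch: Double  // radians; up/down rotation of the wand (assumed input)
}

struct Screen { let width: Double; let height: Double }

// Assume the wand sweeps +/-30 degrees per axis across the full screen.
func cursorPosition(for pose: WandPose, on screen: Screen) -> (x: Double, y: Double) {
    let sweep = 30.0 * Double.pi / 180.0
    let nx = (clamp(pose.yaw, to: sweep) + sweep) / (2 * sweep)   // 0...1 across width
    let ny = (sweep - clamp(pose.pitch, to: sweep)) / (2 * sweep) // 0...1 down height
    return (nx * screen.width, ny * screen.height)
}

func clamp(_ value: Double, to limit: Double) -> Double {
    min(max(value, -limit), limit)
}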
Abstract:
A wireless device controls an object, such as a physical device, directly or through interaction with a virtual representation of the object situated at a predefined physical location. The wireless device identifies an intent gesture performed by a user that indicates intent to control the object, such as pointing or orienting the wireless device toward the object, with or without additional input; determines the object associated with the intent gesture using wireless ranging and/or device orientation; interprets sensor data from one or more sensors associated with the wireless device to determine an action gesture corresponding to a command; and transmits a corresponding command value to control the object. Additionally, in some embodiments, the wireless device presents output information that indicates the range/direction of the object, such as on a map or image of the proximate area with an indicator representative of the object shown on the map/image.
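As an illustration of the control flow described above, the following sketch resolves the pointed-at object from hypothetical ranging results (a distance and bearing per object) and maps an action gesture to a command. The pointing-cone threshold, gesture set, and all type names are assumptions, not the patent's API.

struct RangedObject {
    let identifier: String
    let distance: Double  // meters, from wireless ranging (assumed)
    let bearing: Double   // radians, relative to the device's pointing axis (assumed)
}

enum ActionGesture { case flickUp, flickDown, tap }
enum Command: String { case volumeUp, volumeDown, togglePower }

// Resolve the pointed-at object: within a ~15 degree cone, prefer the
// smallest bearing offset, breaking ties by nearest distance.
func targetObject(among objects: [RangedObject],
                  coneHalfAngle: Double = 0.26) -> RangedObject? {
    objects
        .filter { abs($0.bearing) <= coneHalfAngle }
        .min { (abs($0.bearing), $0.distance) < (abs($1.bearing), $1.distance) }
}

// Map an interpreted action gesture to the command value to transmit.
func command(for gesture: ActionGesture) -> Command {
    switch gesture {
    case .flickUp:   return .volumeUp
    case .flickDown: return .volumeDown
    case .tap:       return .togglePower
    }
}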
Abstract:
A system comprising a first control assembly (230) comprising a first sensor (236) and a first indicator (235), the first sensor configured to receive a first user input and control a presentation of the first indicator in response to the first user input; and a second indicator (205) configured to receive a signal from the first sensor of the first control assembly, the signal synchronizing a presentation of the second indicator with the presentation of the first indicator.
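A minimal sketch of the synchronization idea, assuming a simple callback model: the first assembly's sensor updates its own indicator and forwards the same signal to any linked second indicator so both presentations stay in step. All names here are illustrative.

final class Indicator {
    private(set) var level: Int = 0
    func present(_ level: Int) { self.level = level }
}

final class ControlAssembly {
    let indicator = Indicator()             // the first indicator
    var linkedIndicators: [Indicator] = []  // e.g. the second indicator

    // Sensor callback: one user input drives every linked presentation.
    func sensorDidReceive(input level: Int) {
        indicator.present(level)
        linkedIndicators.forEach { $0.present(level) }
    }
}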
Abstract:
A method for controlling operations of a media application implemented on a media system by moving a wand, the method comprising: displaying a media application interface on a screen of the media system; displaying a cursor on the screen; receiving a transmission from the wand, the transmission comprising the output of at least one motion detection component incorporated in the wand; moving the cursor in response to receiving the transmission from the wand; identifying a media application operation to perform in response to receiving the transmission based on the displayed media application interface and the movement of the cursor, wherein identifying the media application operation to perform includes identifying a first media application operation if the cursor is moved near a first edge of the screen and identifying a second media application operation if the cursor is moved beyond the first edge of the screen; and performing the identified media application operation.
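The edge distinction in this method can be illustrated with a short sketch; the margin width and the two example operations (revealing a menu near the edge, scrolling beyond it) are hypothetical stand-ins for whatever operations the media application interface defines.

enum MediaOperation { case revealMenu, scrollBeyondEdge, noOperation }

// x is the cursor's horizontal position; x < 0 means the wand transmission
// moved the cursor past the first (left) edge of the screen.
func operation(forCursorX x: Double, edgeMargin: Double = 20) -> MediaOperation {
    if x < 0 { return .scrollBeyondEdge }     // second operation: beyond the edge
    if x < edgeMargin { return .revealMenu }  // first operation: near the edge
    return .noOperation
}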
Abstract:
A method for controlling an object by a wireless device, the method comprising: by the wireless device: interpreting first sensor data to identify an intent gesture indicating an intent to control the object, wherein the object is within wireless range of the wireless device; identifying the object to be controlled based at least in part on an anonymous identifier received from the object; interpreting second sensor data to determine an action gesture indicating a command for controlling the object; and transmitting a command value associated with the command for controlling the object.
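Read as a single routine, the claimed sequence might look like the hedged sketch below, where the gesture interpreters and the transport closure are caller-supplied stand-ins; only the ordering (intent gesture, identification by anonymous identifier, action gesture, transmission) comes from the claim.

struct ControlPacket {
    let objectID: String   // anonymous identifier received from the object
    let commandValue: Int  // command value for the interpreted action gesture
}

func controlObject(intentGestureDetected: Bool,
                   anonymousIdentifier: String?,
                   actionCommandValue: Int?,
                   transmit: (ControlPacket) -> Void) {
    guard intentGestureDetected,          // first sensor data: intent gesture
          let id = anonymousIdentifier,   // identify the object to control
          let value = actionCommandValue  // second sensor data: action gesture
    else { return }
    transmit(ControlPacket(objectID: id, commandValue: value))
}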
Abstract:
The computing system includes a primary display, memory, and a housing at least partially containing a physical input mechanism and a touch screen adjacent to the physical input mechanism. The computing system displays, on the primary display, a first user interface comprising one or more user interface elements, and identifies an active user interface element among the one or more user interface elements that is in focus on the primary display. In accordance with a determination that the active user interface element that is in focus on the primary display is associated with an application executed by the computing system, the computing system displays a second user interface on the touch screen, including: (A) a first set of affordances corresponding to the application; and (B) at least one system-level affordance corresponding to a system-level functionality.
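The affordance selection described here can be sketched with hypothetical types: when the focused element is associated with an application, the touch screen shows that application's affordance set plus at least one system-level affordance; otherwise only the system-level affordance is assumed to remain.

struct Affordance { let title: String }

struct FocusedElement {
    // Non-nil when the focused element is associated with an application.
    let appAffordances: [Affordance]?
}

func touchScreenAffordances(for element: FocusedElement,
                            systemLevel: Affordance) -> [Affordance] {
    guard let appSet = element.appAffordances else { return [systemLevel] }
    return appSet + [systemLevel]  // (A) app-specific set + (B) system-level affordance
}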
Abstract:
A system comprising a first control assembly (201) comprising a first sensor (206) and a first indicator (205), the first sensor configured to receive a first user input and control a presentation of the first indicator in response to the first user input; and a second indicator (235) configured to receive a signal from the first sensor of the first control assembly, the signal synchronizing a presentation of the second indicator with the presentation of the first indicator.