Abstract:
A grip of a primary user on a touch-sensitive computing device and a grip of a secondary user on the touch-sensitive computing device are sensed and correlated to determine whether the primary user is sharing or handing off the computing device to the secondary user. In the case of handoff, capabilities of the computing device may be restricted, while in a sharing mode only certain content on the computing device is shared. In some implementations, both a touch-sensitive pen and the touch-sensitive computing device are passed from the primary user to the secondary user. Sensor inputs representing the grips of the users on both the pen and the touch-sensitive computing device are correlated to determine the context of the grips and to initiate a context-appropriate command in an application executing on the touch-sensitive pen or the touch-sensitive computing device. Metadata is also derived from the correlated sensor inputs.
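A minimal sketch of the grip-correlation idea described above. The class, field, and function names (GripReading, classify_transfer, apply_mode) and the simple holding/not-holding rule are illustrative assumptions, not details from the abstract.

```python
from dataclasses import dataclass

@dataclass
class GripReading:
    user_id: str      # which user the grip is attributed to
    coverage: float   # fraction of grip sensors covered (0..1), assumed signal
    holding: bool     # whether the grip is firm enough to count as holding

def classify_transfer(primary: GripReading, secondary: GripReading) -> str:
    """Correlate two concurrent grips to decide sharing vs. handoff."""
    if primary.holding and secondary.holding:
        # Both users hold the device at once: treat as sharing.
        return "sharing"
    if secondary.holding and not primary.holding:
        # Primary has released while secondary holds: treat as handoff.
        return "handoff"
    return "primary_only"

def apply_mode(mode: str, device_state: dict) -> dict:
    """Restrict capabilities on handoff; expose only flagged content when sharing."""
    if mode == "handoff":
        device_state["capabilities"] = ["view_current_document"]
    elif mode == "sharing":
        device_state["visible_content"] = [
            item for item in device_state["content"] if item.get("shared")
        ]
    return device_state

if __name__ == "__main__":
    primary = GripReading("alice", coverage=0.05, holding=False)
    secondary = GripReading("bob", coverage=0.40, holding=True)
    mode = classify_transfer(primary, secondary)
    state = {"capabilities": ["full"], "content": [{"name": "doc", "shared": True}]}
    print(mode, apply_mode(mode, state))
```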
Abstract:
Described herein are techniques and systems that allow modification of functionalities based on distances between a shared device (e.g., a shared display, etc.) and an individual device (e.g., a mobile computing device, etc.). The shared device and the individual device may establish a communication connection to enable the exchange of data. In some embodiments, the shared device or the individual device may measure a distance between the shared device and the individual device. Based on the distance, the individual device may operate in a different mode. In some instances, the shared device may then instruct the individual device to modify a functionality corresponding to the mode.
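A rough sketch of the distance-to-mode idea under assumed names and thresholds: the distance bands, mode labels (control, companion, notify), and the modify_functionality call are illustrative, not taken from the abstract.

```python
DISTANCE_MODES = [
    (1.0, "control"),          # within ~1 m: act as a controller for the shared display
    (5.0, "companion"),        # within ~5 m: show companion content
    (float("inf"), "notify"),  # farther away: notifications only
]

def mode_for_distance(distance_m: float) -> str:
    """Map a measured distance to an operating mode."""
    for limit, mode in DISTANCE_MODES:
        if distance_m <= limit:
            return mode
    return "notify"

class IndividualDevice:
    def __init__(self):
        self.mode = None

    def modify_functionality(self, mode: str) -> None:
        # The shared device instructs this device to enable the functionality
        # corresponding to the current mode.
        self.mode = mode
        print(f"individual device now operating in '{mode}' mode")

class SharedDevice:
    def on_distance_measured(self, device: IndividualDevice, distance_m: float) -> None:
        device.modify_functionality(mode_for_distance(distance_m))

if __name__ == "__main__":
    shared, phone = SharedDevice(), IndividualDevice()
    for d in (0.5, 3.0, 12.0):
        shared.on_distance_measured(phone, d)
```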
Abstract:
The subject disclosure is directed towards a graphical or printed keyboard having keys removed, in which the removed keys are those made redundant by gesture input. For example, a graphical or printed keyboard may have the same overall size and the same key sizes as other graphical or printed keyboards that have no numeric keys, yet, because of the removed keys, fit both numeric and alphabetic keys into that same footprint. Also described is having three or more characters per key, with a tap corresponding to one character and different gestures on the key differentiating among the other characters.
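An illustrative sketch of mapping a tap or a directional gesture on a single key to one of several characters, so that keys made redundant by gesture input can be dropped from the layout. The key labels and gesture names (flick_up, flick_down) are assumptions for illustration.

```python
KEY_MAP = {
    # key label: {gesture: character}
    "q1": {"tap": "q", "flick_up": "1", "flick_down": "!"},
    "w2": {"tap": "w", "flick_up": "2", "flick_down": "@"},
    "e3": {"tap": "e", "flick_up": "3", "flick_down": "#"},
}

def resolve_input(key: str, gesture: str) -> str:
    """Return the character for a tap or gesture on the given key."""
    return KEY_MAP.get(key, {}).get(gesture, "")

if __name__ == "__main__":
    print(resolve_input("q1", "tap"))         # -> 'q'
    print(resolve_input("q1", "flick_up"))    # -> '1'
    print(resolve_input("e3", "flick_down"))  # -> '#'
```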
Abstract:
Pen and computing device sensor correlation technique embodiments correlate sensor signals received from various grips on a touch-sensitive pen and touches to a touch-sensitive computing device in order to determine the context of such grips and touches and to issue context-appropriate commands to the touch-sensitive pen or the touch-sensitive computing device. A combination of concurrent sensor inputs received from both a touch-sensitive pen and a touch-sensitive computing device is correlated. How the touch-sensitive pen and the touch-sensitive computing device are touched or gripped is used to determine the context of their use and the user's intent. A context-appropriate user interface action can then be initiated. The context can also be used to label metadata.
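A simplified sketch of correlating concurrent pen-grip and tablet-touch signals to select a context-appropriate command. The grip and touch labels and the command table are assumed for illustration and are not drawn from the abstract.

```python
def infer_context(pen_grip: str, tablet_touch: str) -> str:
    """Combine how the pen is gripped with how the tablet is touched."""
    if pen_grip == "writing_grip" and tablet_touch == "palm_rest":
        return "inking"
    if pen_grip == "tucked" and tablet_touch == "two_finger":
        return "pan_zoom"
    if pen_grip == "baton" and tablet_touch == "thumb_edge":
        return "tool_palette"
    return "default"

COMMANDS = {
    "inking": "enable pen input, reject palm touches",
    "pan_zoom": "interpret touches as pan/zoom gestures",
    "tool_palette": "show a pen-specific tool palette",
    "default": "standard touch handling",
}

def handle_sensors(pen_grip: str, tablet_touch: str) -> str:
    context = infer_context(pen_grip, tablet_touch)
    # The same context label could also be attached to captured content as metadata.
    return f"{context}: {COMMANDS[context]}"

if __name__ == "__main__":
    print(handle_sensors("writing_grip", "palm_rest"))
    print(handle_sensors("tucked", "two_finger"))
```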
Abstract:
A method, system, and one or more computer-readable storage media for providing multi-dimensional haptic touch screen interaction are provided herein. The method includes detecting a force applied to a touch screen by an object and determining a magnitude, direction, and location of the force. The method also includes determining a haptic force feedback to be applied by the touch screen on the object based on the magnitude, direction, and location of the force applied to the touch screen, and displacing the touch screen in a specified direction such that the haptic force feedback is applied by the touch screen on the object.
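A minimal sketch, with assumed gain and displacement limits, of computing an opposing haptic feedback vector from the magnitude, direction, and location of a force applied to a touch screen, along with the small screen displacement used to deliver it.

```python
from dataclasses import dataclass

@dataclass
class AppliedForce:
    magnitude: float   # newtons
    direction: tuple   # unit vector (x, y, z) of the applied force
    location: tuple    # (x, y) screen coordinates of the touch

def feedback_for(force: AppliedForce, gain: float = 0.5):
    """Return an opposing feedback vector and a small screen displacement (mm)."""
    fx, fy, fz = force.direction
    feedback = (-gain * force.magnitude * fx,
                -gain * force.magnitude * fy,
                -gain * force.magnitude * fz)
    # Displace the screen a fraction of a millimetre along the feedback direction;
    # the cap and scaling are assumed values.
    displacement_mm = min(0.5, 0.1 * force.magnitude)
    return feedback, displacement_mm

if __name__ == "__main__":
    press = AppliedForce(magnitude=2.0, direction=(0.0, 0.0, -1.0), location=(120, 340))
    print(feedback_for(press))
```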
Abstract:
Systems and methods for presenting a dynamic user-interaction control are presented. The dynamic user-interaction control enables a device user to interact with a touch-sensitive device in a single-handed manner. A triggering event causes the dynamic user-interaction control to be temporarily presented on a display screen. In various embodiments, the dynamic user-interaction control is presented at the location corresponding to the triggering event (i.e., the location of the device user's touch). The dynamic user-interaction control remains present on the display screen, and the device user can interact with the control, until a dismissal event is encountered. A dismissal event can occur under multiple conditions, including when the device user breaks touch connection with the dynamic user-interaction control for a predetermined amount of time.
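A rough sketch, using assumed event names and an assumed dismissal delay, of presenting a dynamic control at the touch location and dismissing it once touch contact has been broken for a predetermined amount of time.

```python
import time

class DynamicControl:
    DISMISS_AFTER_S = 2.0   # assumed predetermined dismissal delay

    def __init__(self):
        self.visible = False
        self.position = None
        self._released_at = None

    def on_trigger(self, x: float, y: float) -> None:
        """Triggering event: show the control at the touch location."""
        self.visible, self.position, self._released_at = True, (x, y), None

    def on_touch_released(self) -> None:
        self._released_at = time.monotonic()

    def on_touch_resumed(self) -> None:
        self._released_at = None

    def update(self) -> None:
        """Dismiss once contact has been broken for long enough."""
        if (self.visible and self._released_at is not None
                and time.monotonic() - self._released_at >= self.DISMISS_AFTER_S):
            self.visible, self.position = False, None

if __name__ == "__main__":
    ctrl = DynamicControl()
    ctrl.on_trigger(200, 480)   # thumb touch triggers the control
    ctrl.on_touch_released()
    ctrl.update()               # not yet dismissed; the delay has not elapsed
    print(ctrl.visible)
```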