Abstract:
PROBLEM TO BE SOLVED: To provide electronic devices with touch-sensitive surfaces using faster, more efficient touch-based accessibility methods.
SOLUTION: The method includes: displaying a plurality of user interface elements on a display, wherein a current focus is on a first user interface element; detecting a first finger gesture on the touch-sensitive surface, wherein the first finger gesture is independent of contacting a location on the touch-sensitive surface that corresponds to a second user interface element; and, in response to detecting the first finger gesture: changing the current focus from the first user interface element in the plurality of user interface elements to the second user interface element in the plurality of user interface elements; and outputting accessibility information associated with the second user interface element.
Abstract:
PROBLEM TO BE SOLVED: To provide a more efficient human-machine interface for a touch screen display and/or a track pad by reducing the cognitive burden on visually impaired users.
SOLUTION: A method includes displaying a plurality of user interface elements on a display, wherein a current focus is on a first user interface element; detecting, on a touch-sensitive surface, a first finger gesture that is independent of contacting a location on the touch-sensitive surface corresponding to a second user interface element; and, in response to detecting the first finger gesture, changing the current focus from the first user interface element in the plurality of user interface elements to the second user interface element in the plurality of user interface elements, and outputting accessibility information associated with the second user interface element.
Abstract:
An electronic device with a display and a touch-sensitive surface displays, on the display, a first visual indicator that corresponds to a virtual touch. The device receives a first input from an adaptive input device. In response to receiving the first input from the adaptive input device, the device displays a first menu on the display. The first menu includes a virtual touches selection icon. In response to detecting selection of the virtual touches selection icon, a menu of virtual multitouch contacts is displayed.
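The menu flow described in this abstract can be sketched as a small state machine: an input from the adaptive input device opens a first menu containing a virtual-touches selection icon, and selecting that icon opens a menu of virtual multitouch contacts. All class and method names below are illustrative assumptions, not part of any real API.

```python
class VirtualTouchUI:
    def __init__(self):
        self.visible_menu = None  # None, "first_menu", or "multitouch_menu"

    def receive_adaptive_input(self):
        # A first input from the adaptive input device displays the first
        # menu, which includes a virtual-touches selection icon.
        self.visible_menu = "first_menu"

    def select_icon(self, icon):
        # Selecting the virtual-touches icon replaces the first menu with a
        # menu of virtual multitouch contacts (e.g. 2-, 3-, 4-finger touches).
        if self.visible_menu == "first_menu" and icon == "virtual_touches":
            self.visible_menu = "multitouch_menu"

ui = VirtualTouchUI()
ui.receive_adaptive_input()
ui.select_icon("virtual_touches")
print(ui.visible_menu)  # multitouch_menu
```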
Abstract:
Techniques for increasing accessibility of touch-screen devices are disclosed. In one aspect, container regions on a touch-sensitive user interface of a touch screen device are defined. A touch event corresponding to a location on the user interface is received, and it is determined that the location corresponds to a particular container region. When another touch event is received, content is determined according to a context of the particular container region. The content is then presented. In another aspect, data specifying locations of user interface items on a user interface is received. The data is modified to enlarge an area for a particular item. A touch input event corresponding to a particular location on the user interface is received. It is determined that the location is within the enlarged area for the item, and input is provided to an application indicating that the item was selected.
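The two aspects above can be sketched concretely: (1) mapping a touch location to a defined container region so that later touch events are interpreted in that region's context, and (2) enlarging an item's hit area so that touches landing just outside its visual bounds still select it. The rectangle representation and function names are assumptions for illustration.

```python
def find_container(regions, point):
    """Return the name of the container region containing point, if any."""
    x, y = point
    for name, (left, top, right, bottom) in regions.items():
        if left <= x < right and top <= y < bottom:
            return name
    return None

def enlarge(rect, margin):
    """Grow an item's hit rectangle outward by margin on every side."""
    left, top, right, bottom = rect
    return (left - margin, top - margin, right + margin, bottom + margin)

# Aspect 1: a touch event is resolved to a container region.
regions = {"toolbar": (0, 0, 320, 44), "content": (0, 44, 320, 480)}
print(find_container(regions, (10, 10)))    # toolbar

# Aspect 2: a touch just outside the item still hits its enlarged area.
button = (100, 10, 140, 34)
big_button = enlarge(button, 8)
x, y = 96, 38                               # outside the original rectangle
hit = big_button[0] <= x < big_button[2] and big_button[1] <= y < big_button[3]
print(hit)                                  # True
```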
Abstract:
A method, comprising: at an electronic device with a display and a touch-sensitive surface: while in a virtual-gesture recording mode: receiving, from an input device, a first user input indicating a first movement; in response to the first user input: displaying a first visual indicator in accordance with the first user input; and displaying a first trail corresponding to the first movement indicated by the first user input; while displaying the first visual indicator: receiving, from the input device, a second user input indicating a second movement; in response to the second user input: displaying a second visual indicator in accordance with the second user input; and displaying a second trail corresponding to the second movement indicated by the second user input; creating a user-defined virtual gesture that corresponds to the first and second movements indicated by the first and second user inputs; after creating the user-defined virtual gesture: displaying an icon associated with the user-defined virtual gesture; receiving, from the input device, a third user input corresponding to a selection of the icon; and in response to receiving the third user input corresponding to the selection of the icon, performing the user-defined virtual gesture corresponding to the first movement indicated by the first user input and the second movement indicated by the second user input.
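The record-and-replay flow claimed above can be sketched in a few lines: each input in recording mode contributes a movement (shown on screen as a trail), the recorded movements are bundled into a user-defined virtual gesture associated with an icon, and selecting that icon later performs both movements. All names here are illustrative, not from any real framework.

```python
class GestureRecorder:
    def __init__(self):
        self.trails = []          # movements captured in recording mode
        self.saved_gestures = {}  # icon name -> list of movements

    def record_movement(self, movement):
        # Display a visual indicator and trail for this movement (display
        # code elided); store the movement for the gesture being defined.
        self.trails.append(movement)

    def create_gesture(self, icon_name):
        # Bundle the recorded movements into a user-defined virtual gesture
        # associated with an icon, then clear the recording buffer.
        self.saved_gestures[icon_name] = list(self.trails)
        self.trails = []

    def select_icon(self, icon_name):
        # Selecting the icon performs the stored gesture: the device replays
        # every recorded movement. Here we simply return them.
        return self.saved_gestures[icon_name]

rec = GestureRecorder()
rec.record_movement([(0, 0), (50, 0)])    # first movement (first "finger")
rec.record_movement([(0, 20), (50, 20)])  # second movement (second "finger")
rec.create_gesture("two_finger_swipe")
print(len(rec.select_icon("two_finger_swipe")))  # 2
```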
Abstract:
The present disclosure describes technology, which can be implemented as a method, apparatus, and/or computer software embodied in a computer-readable medium, and which can, among other things, be used to create custom vibration patterns in response to user input, for example, in response to the user tapping out a desired pattern on the display of a mobile device. For example, one or more aspects of the subject matter described in this disclosure can be embodied in one or more methods that include receiving tactile input from a user of an electronic device specifying a custom vibration pattern, in concert with receiving tactile input, providing visual feedback to the user corresponding to the received tactile input, and storing the specified custom vibration pattern for use by the electronic device to actuate haptic feedback signaling a predetermined notification event.
Abstract:
This invention relates generally to creating custom vibration patterns for playback by a mobile electronic device, for example, in response to receiving a notification event. In particular, the invention relates to a method (Figure 1) performed by one or more processes executing on an electronic device. The method comprises receiving tactile input from a user of the electronic device specifying a custom vibration pattern (112), in concert with receiving tactile input, providing visual feedback (114) to the user corresponding to the received tactile input, and storing the specified custom vibration pattern for use by the electronic device to actuate (118) haptic feedback signalling a predetermined notification event.
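One way to turn tapped input into a stored vibration pattern, as described above, is sketched below under assumed data shapes: each tap is a (press, release) timestamp pair, and the resulting pattern alternates vibrate and pause durations that the device could later play back for a notification event. This is not a real device API.

```python
def taps_to_pattern(taps):
    """Convert (press, release) timestamps in ms to [vibrate, pause, ...]."""
    pattern = []
    for i, (press, release) in enumerate(taps):
        pattern.append(release - press)           # vibrate while finger is down
        if i + 1 < len(taps):
            next_press = taps[i + 1][0]
            pattern.append(next_press - release)  # pause until the next tap
    return pattern

# Three taps: short, short, long.
taps = [(0, 100), (200, 300), (500, 900)]
print(taps_to_pattern(taps))  # [100, 100, 100, 200, 400]
```

This is the same alternating-durations shape that platform vibration APIs commonly consume, which is why it is a natural storage format for a user-specified pattern.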
Abstract:
A method and an accessible electronic device with a touch-sensitive surface and a display configured to implement the method are provided. The method includes displaying a plurality of user interface elements and in response to a first user interface navigation gesture on the touch sensitive surface, navigating in the plurality of user interface elements in accordance with a current navigable unit type of a plurality of navigable unit types. In response to detecting a first user interface navigation setting gesture on the touch-sensitive surface, the current navigable unit type is changed from the first navigable unit type to a second navigable unit type. Accessibility information about the second navigable unit type is outputted. The method also includes, in response to detecting a second user interface navigation gesture by the finger on the touch sensitive surface, navigating in the plurality of user interface elements in accordance with the current navigable unit type, wherein the current navigable unit type is set to the second navigable unit type.
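The rotor-style behavior described above can be sketched as follows: a navigation-setting gesture cycles the current navigable unit type (character, word, line, and so on), its name is output as accessibility information, and subsequent navigation gestures move through the content by that unit. The unit list and all names are assumptions for illustration.

```python
UNIT_TYPES = ["character", "word", "line"]

class Navigator:
    def __init__(self, text):
        self.text = text
        self.unit_index = 0  # index of the current navigable unit type

    def setting_gesture(self):
        # Change the current navigable unit type and output its name as
        # accessibility information (represented here by the return value).
        self.unit_index = (self.unit_index + 1) % len(UNIT_TYPES)
        return UNIT_TYPES[self.unit_index]

    def units(self):
        # Split the content according to the current navigable unit type;
        # navigation gestures would then step through this sequence.
        unit = UNIT_TYPES[self.unit_index]
        if unit == "character":
            return list(self.text)
        if unit == "word":
            return self.text.split()
        return self.text.splitlines()

nav = Navigator("hello world\ngood day")
print(nav.setting_gesture())  # word
print(nav.units()[1])         # world
```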
Abstract:
A method is performed by an accessible electronic device with a display and a touch-sensitive surface. The method includes: displaying a plurality of user interface elements on the display, wherein a current focus is on a first user interface element; detecting a first finger gesture on the touch-sensitive surface, wherein the first finger gesture is independent of contacting a location on the touch-sensitive surface that corresponds to a second user interface element; and, in response to detecting the first finger gesture: changing the current focus from the first user interface element in the plurality of user interface elements to the second user interface element in the plurality of user interface elements; and outputting accessibility information associated with the second user interface element.
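The claimed method can be illustrated with a minimal sketch: a flick-style gesture that never touches the second element's own location still moves the current focus to it, and accessibility information for the newly focused element is output (for example, spoken aloud). All names below are assumptions, not the patented implementation.

```python
class AccessibleUI:
    def __init__(self, elements):
        self.elements = elements  # ordered user interface elements
        self.focus = 0            # current focus starts on the first element

    def flick_gesture(self, direction):
        # The gesture is location-independent: it moves focus to the next or
        # previous element regardless of where on the surface it occurs.
        step = 1 if direction == "right" else -1
        self.focus = (self.focus + step) % len(self.elements)
        return self.speak()

    def speak(self):
        # Output accessibility information for the newly focused element.
        return f"focused: {self.elements[self.focus]}"

ui = AccessibleUI(["Back button", "Title", "Search field"])
print(ui.flick_gesture("right"))  # focused: Title
```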