Abstract:
An electronic device, while in an interaction configuration mode: displays a first user interface that includes a plurality of user interface objects; and, while displaying the first user interface, detects one or more gesture inputs on a touch-sensitive surface. For a respective gesture input, the device determines whether one or more user interface objects of the plurality of user interface objects correspond to the respective gesture input. The device visually distinguishes a first set of user interface objects in the plurality of user interface objects that correspond to the detected one or more gesture inputs from a second set of user interface objects in the plurality of user interface objects that do not correspond to the detected one or more gesture inputs. The device detects an input; and, in response to detecting the input, exits the interaction configuration mode and enters a restricted interaction mode.

[Figure 2: Portable Multifunction Device 100 — speaker 111, optical sensor, proximity sensor 166, SIM card slot 210, headphone jack 212, touch screen 112, microphone 113, accelerometer(s) 168, external port 124]
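The matching step described above can be sketched as a simple hit test: each gesture input (modeled here as a tap point) is compared against the bounds of every user interface object, and objects no gesture touched are set aside to be visually distinguished (e.g. dimmed) and deactivated in the restricted interaction mode. This is a minimal illustrative sketch; the abstract does not specify an implementation, and all names and the rectangle-based hit test are assumptions.

```python
from dataclasses import dataclass


@dataclass
class UIObject:
    """A user interface object with an axis-aligned bounding box."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        # True when the gesture point falls inside this object's bounds.
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


def partition_objects(objects, gesture_points):
    """Split objects into those that correspond to at least one detected
    gesture input (first set) and those that do not (second set, to be
    visually distinguished and disabled in the restricted mode)."""
    corresponding, not_corresponding = [], []
    for obj in objects:
        if any(obj.contains(px, py) for px, py in gesture_points):
            corresponding.append(obj)
        else:
            not_corresponding.append(obj)
    return corresponding, not_corresponding
```

For example, with one tap at (5, 5), an object covering (0, 0)–(10, 10) lands in the first set and an object at (20, 20)–(25, 25) in the second.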
Abstract:
Disclosed herein are systems and methods that enable low-vision users to interact with touch-sensitive secondary displays. An example method includes: displaying, on a primary display, a first user interface for an application and displaying, on a touch-sensitive secondary display, a second user interface that includes a plurality of application-specific affordances that control functions of the application. Each respective affordance is displayed with a first display size. The method also includes: detecting, via the secondary display, an input that contacts at least one application-specific affordance. In response to detecting the input and while it remains in contact with the secondary display, the method includes: (i) continuing to display the first user interface on the primary display and (ii) displaying, on the primary display, a zoomed-in representation of the at least one application-specific affordance. The zoomed-in representation is displayed with a second display size that is larger than the first display size.
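The sizing requirement above can be sketched in a few lines: while the contact persists, the touched affordance is mirrored on the primary display at a strictly larger size. The 2x magnification factor below is an illustrative assumption; the abstract requires only that the second display size be larger than the first.

```python
def zoomed_representation_size(first_size, magnification=2.0):
    """Return the second display size (width, height) for the zoomed-in
    representation of an affordance. The magnification factor must be
    greater than 1 so the representation is larger than the original;
    the default of 2.0 is an illustrative choice, not taken from the
    source."""
    if magnification <= 1.0:
        raise ValueError("zoomed-in representation must be larger than the original")
    width, height = first_size
    return (width * magnification, height * magnification)
```

For instance, an affordance displayed at 40 x 15 points on the secondary display would be mirrored at 80 x 30 points on the primary display.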
Abstract:
An electronic device with a display and a touch-sensitive surface displays, on the display, a first visual indicator. The electronic device receives a first single touch input on the touch-sensitive surface at a location that corresponds to the first visual indicator; and, in response to detecting the first single touch input on the touch-sensitive surface at a location that corresponds to the first visual indicator, replaces display of the first visual indicator with display of a first menu. The first menu includes a virtual touches selection icon. In response to detecting selection of the virtual touches selection icon, the electronic device displays a menu of virtual multitouch contacts.

[Figure 2: Portable Multifunction Device 100 — speaker 111, optical sensor 164, proximity sensor 166, SIM card slot 210, headphone jack 212, touch screen 112, microphone 113, accelerometer(s) 168, external port 124]
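The interaction above is essentially a small state machine: a single touch on the first visual indicator replaces it with the first menu, and selecting that menu's virtual touches icon brings up the menu of virtual multitouch contacts. The state names and class below are purely illustrative.

```python
class VirtualTouchMenu:
    """Minimal state-machine sketch of the described interaction.
    States: 'indicator' (first visual indicator shown), 'first_menu'
    (indicator replaced by the first menu), and 'virtual_multitouch_menu'
    (menu of virtual multitouch contacts shown)."""

    def __init__(self):
        self.state = "indicator"

    def touch_indicator(self):
        # A single touch at the indicator's location replaces it
        # with the first menu.
        if self.state == "indicator":
            self.state = "first_menu"

    def select_virtual_touches_icon(self):
        # Selecting the virtual touches icon in the first menu shows
        # the menu of virtual multitouch contacts.
        if self.state == "first_menu":
            self.state = "virtual_multitouch_menu"
```

Note that selecting the virtual touches icon has no effect unless the first menu is already displayed, mirroring the ordering of steps in the abstract.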
Abstract:
While an electronic device with a display and a touch-sensitive surface is in a screen reader accessibility mode, the device displays an application launcher screen including a plurality of application icons. A respective application icon corresponds to a respective application stored in the device. The device detects a sequence of one or more gestures on the touch-sensitive surface that correspond to one or more characters. A respective gesture that corresponds to a respective character is a single finger gesture that moves across the touch-sensitive surface along a respective path that corresponds to the respective character. The device determines whether the detected sequence of one or more gestures corresponds to a respective application icon of the plurality of application icons, and, in response to determining that the detected sequence of one or more gestures corresponds to the respective application icon, performs a predefined operation associated with the respective application icon.
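The final determination step can be sketched as accumulating the recognized characters and comparing them against the names of the application icons on the launcher screen; when the characters uniquely match one icon, the predefined operation (e.g. announcing or launching that application) is performed. Case-insensitive prefix matching is an assumed strategy here, not one stated in the source.

```python
def match_application_icon(characters, app_names):
    """Return the application name that the recognized gesture
    characters uniquely prefix-match (case-insensitive), or None if
    zero or several icons match. A unique match would trigger the
    predefined operation associated with that icon."""
    prefix = "".join(characters).lower()
    if not prefix:
        return None
    matches = [name for name in app_names if name.lower().startswith(prefix)]
    return matches[0] if len(matches) == 1 else None
```

For example, with icons Mail, Maps, and Music, the gesture sequence "m", "a" is still ambiguous (Mail and Maps), while "m", "u" uniquely selects Music.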