Abstract:
An electronic device with a display and a touch-sensitive surface displays a first keyboard on the display, the first keyboard comprising a first plurality of keys. A key activation gesture at a first time is detected at a location on the touch-sensitive surface that corresponds to a location of a first key in the first keyboard. In response to detecting the key activation gesture at the first time, the first key is activated. One or more contacts on the touch-sensitive surface are detected at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture. In response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time, the first keyboard is replaced with a second keyboard when the second time exceeds a predefined period of time after the first time, and the display of the first keyboard is maintained when the second time is less than the predefined period of time after the first time.
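The timing rule described above can be sketched in a few lines of Swift. The names here (Keyboard, KeyboardController, predefinedPeriod) and the 0.3-second threshold are illustrative assumptions, not details taken from the abstract.

```swift
import Foundation

// Minimal sketch of the timing-based keyboard replacement rule.
enum Keyboard {
    case first   // e.g. the currently displayed keyboard
    case second  // e.g. the keyboard selected by the gesture
}

struct KeyboardController {
    var displayedKeyboard: Keyboard = .first
    let predefinedPeriod: TimeInterval = 0.3   // assumed threshold
    private var lastKeyActivation: Date?

    // Called when a key activation gesture is detected at `time`.
    mutating func keyActivated(at time: Date) {
        lastKeyActivation = time
        // ... activate the key (e.g. enter the character) ...
    }

    // Called when contacts matching the keyboard selection gesture are
    // detected at `time`. The first keyboard is replaced only if enough
    // time has passed since the last key activation; otherwise the
    // first keyboard stays on screen.
    mutating func keyboardSelectionGesture(at time: Date) {
        guard let activation = lastKeyActivation else {
            displayedKeyboard = .second
            return
        }
        if time.timeIntervalSince(activation) > predefinedPeriod {
            displayedKeyboard = .second
        }
        // else: maintain display of the first keyboard
    }
}
```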
Abstract:
An electronic device with a display and a touch-sensitive surface concurrently displays on the display an application content area with a first size and an input area with a keyboard, the input area being adjacent to and separate from the application content area and located at the bottom of the display. A gesture is detected on the touch-sensitive surface. In response to detecting the gesture on the touch-sensitive surface, the input area is moved away from the bottom of the display over the application content area, and the application content area is increased to a second size larger than the first size.
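A minimal sketch of this undocking behavior, assuming hypothetical names (LayoutState, undockInputArea) and a simple frame-based layout; the abstract does not specify how the second size is computed, so full display height is used here as an assumption.

```swift
import Foundation

// Sketch only: the content area and input area are modeled as frames.
struct LayoutState {
    var contentFrame: CGRect      // application content area
    var inputAreaFrame: CGRect    // input area with keyboard, docked at the bottom
}

// In response to the detected gesture, move the input area up from the
// bottom, over the content area, and enlarge the content area.
func undockInputArea(_ state: LayoutState, displayBounds: CGRect,
                     gestureTranslationY: CGFloat) -> LayoutState {
    var updated = state
    // A negative translation moves the input area upward, away from the bottom.
    updated.inputAreaFrame.origin.y =
        max(0, state.inputAreaFrame.origin.y + gestureTranslationY)
    // The content area grows to a second, larger size (here: full display),
    // with the undocked input area floating over it.
    updated.contentFrame = displayBounds
    return updated
}
```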
Abstract:
Methods and graphical user interfaces for editing on a portable multifunction device with a touch screen display are disclosed. While displaying an application interface of an application, the device detects a multitouch edit initiation gesture on the touch screen display. In response to detection of the multitouch edit initiation gesture, the device displays a plurality of user-selectable edit option icons in an area of the touch screen display that is independent of a location of the multitouch edit initiation gesture. The device also displays a start point object and an end point object to select content displayed by the application in the application interface.
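One way to model the described flow is sketched below; EditSession, EditOption, and SelectionHandles are invented names, and the concrete set of edit options is an assumption rather than something stated in the abstract.

```swift
import Foundation

// Hypothetical edit options shown as icons in a fixed screen area.
enum EditOption { case cut, copy, paste, undo }

// Start and end point objects that bracket the selected content.
struct SelectionHandles {
    var start: Int   // index of the start point object in the content
    var end: Int     // index of the end point object
}

struct EditSession {
    // The edit option icons appear in an area that is independent of
    // where the multitouch gesture occurred (e.g. a fixed bar).
    var visibleOptions: [EditOption] = []
    var handles: SelectionHandles?

    mutating func multitouchEditInitiationGestureDetected(initialIndex: Int) {
        visibleOptions = [.cut, .copy, .paste, .undo]
        // Start and end point objects bracket an initial selection that
        // the user can then drag to adjust.
        handles = SelectionHandles(start: initialIndex, end: initialIndex + 1)
    }
}
```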
Abstract:
An electronic device with a display and a touch-sensitive surface concurrently displays on the display an application content area and an unsplit keyboard, the unsplit keyboard being located at a bottom of the display. The device detects a first gesture on the touch-sensitive surface. In response to detecting the first gesture on the touch-sensitive surface, the device converts the unsplit keyboard into a split keyboard and moves the split keyboard away from the bottom of the display over the application content area in accordance with the first gesture.
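A rough sketch of the split-and-undock step, with hypothetical names (KeyboardState, splitAndUndock); the first gesture is reduced to a vertical translation for illustration.

```swift
import Foundation

enum KeyboardLayout { case unsplit, split }

struct KeyboardState {
    var layout: KeyboardLayout = .unsplit
    var originY: CGFloat          // top edge of the keyboard on the display
}

// Handle the first gesture: convert the unsplit keyboard into a split
// keyboard and move it away from the bottom, over the content area,
// by an amount taken from the gesture.
func splitAndUndock(_ state: KeyboardState,
                    gestureTranslationY: CGFloat) -> KeyboardState {
    var updated = state
    updated.layout = .split
    updated.originY = max(0, state.originY + gestureTranslationY)  // negative = upward
    return updated
}
```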
Abstract:
An electronic device with a display and a touch-sensitive surface concurrently displays a keyboard and one or more text entry areas. A keyboard may be split with an arrangement of keys. A keyboard may be split and characters may be displayed in a first and a second text entry area. An input area with a keyboard may be moved. An input area may be reconfigured.
Abstract:
Computing equipment may display data items in a list on a touch screen display. The computing equipment may use the touch screen display to detect touch gestures. A user may select a data item using a touch gesture such as a tap gesture. In response, the computing equipment may display a selectable option. When the option is displayed, movable markers may be placed in the list. The markers can be dragged to new locations to adjust how many of the data items are selected and highlighted in the list. Ranges of selected items may be merged by moving the markers to unify separate groups of selected items. A region that contains multiple selectable options may be displayed adjacent to a selected item. The selectable options may correspond to different ways to select and deselect items. Multifinger swipe gestures may be used to select and deselect data items.
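The marker-driven selection model can be sketched as range merging over row indices; ListSelection, tap, and moveEndMarker are invented names, and the merge rule here is an assumption consistent with the abstract's description of unifying separate groups of selected items.

```swift
import Foundation

// Sketch of range-based list selection with draggable markers.
struct ListSelection {
    // Each selected range is a closed interval of row indices, bracketed
    // by a pair of movable markers.
    private(set) var ranges: [ClosedRange<Int>] = []

    // A tap gesture on an item selects just that item.
    mutating func tap(on index: Int) {
        ranges.append(index...index)
        merge()
    }

    // Dragging the end marker of range `i` to a new row adjusts how many
    // items are selected; if the marker crosses the start marker the
    // range flips. Moving a marker so two ranges touch or overlap
    // unifies them into a single range.
    mutating func moveEndMarker(ofRange i: Int, to newEnd: Int) {
        let r = ranges[i]
        ranges[i] = min(r.lowerBound, newEnd)...max(r.lowerBound, newEnd)
        merge()
    }

    private mutating func merge() {
        ranges.sort { $0.lowerBound < $1.lowerBound }
        var merged: [ClosedRange<Int>] = []
        for r in ranges {
            if let last = merged.last, r.lowerBound <= last.upperBound + 1 {
                merged[merged.count - 1] =
                    last.lowerBound...max(last.upperBound, r.upperBound)
            } else {
                merged.append(r)
            }
        }
        ranges = merged
    }
}
```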