Abstract:
A computer-human interface provides a mechanism to manage the available space of a computer display (28) in a manner that facilitates navigation among multiple windows (42-50) that are overlaid upon one another. The interface includes a user-selectable mode (Figure 5) in which the windows are rearranged, and resized if necessary, so that all open windows can be simultaneously viewed within the area of the display, thereby enabling any one of the windows to be easily selected for access. In effect, the presentation of the windows is "flattened" so that all windows appear at the same virtual depth, rather than overlapping one another. With this approach, there is no need to minimize windows in order to access one that is overlaid by another, thereby enabling the user to keep the content of all windows visible and accessible. Subsets of windows can be repositioned in the same manner (Figure 23b), or all windows can be removed from the display area for access to desktop objects.
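The "flattening" described above can be modeled as a layout computation: divide the display into a grid of cells, one per open window, and shrink each window only as needed to fit its cell. The following Python sketch is illustrative only (the function name, grid strategy, and centering are assumptions, not the patent's method):

```python
import math

def flatten_windows(display_w, display_h, windows):
    """Arrange all open windows into a non-overlapping grid ("flattened" view).

    windows: list of (width, height) sizes.
    Returns a list of (x, y, width, height) placements, each window
    resized (if necessary) to fit within its grid cell.
    """
    n = len(windows)
    if n == 0:
        return []
    cols = math.ceil(math.sqrt(n))
    rows = math.ceil(n / cols)
    cell_w, cell_h = display_w / cols, display_h / rows
    placed = []
    for i, (w, h) in enumerate(windows):
        # Shrink only when the window exceeds its cell; preserve aspect ratio.
        scale = min(1.0, cell_w / w, cell_h / h)
        sw, sh = w * scale, h * scale
        col, row = i % cols, i // cols
        # Center the (possibly resized) window within its cell.
        x = col * cell_w + (cell_w - sw) / 2
        y = row * cell_h + (cell_h - sh) / 2
        placed.append((x, y, sw, sh))
    return placed
```

With four 800x600 windows on a 1920x1080 display, each window lands in its own quadrant at 90% scale, so every window remains visible and selectable at once.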
Abstract:
A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture.
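The unlock check described above can be sketched as a predicate over the sequence of touch points: the contact must begin on the unlock image, stay near the predefined path, and end at the predefined location. This Python sketch assumes a horizontal channel and a simple distance tolerance; none of this detail comes from the abstract:

```python
def gesture_unlocks(touch_path, start, target, tolerance=20.0):
    """Return True if the drag moves the unlock image from `start` to
    within `tolerance` of `target`, staying near the straight horizontal
    channel between them (one possible "predefined path").

    touch_path: list of (x, y) contact samples.
    """
    if not touch_path:
        return False
    sx, sy = start
    tx, ty = target
    # First contact must begin on (near) the unlock image.
    fx, fy = touch_path[0]
    if (fx - sx) ** 2 + (fy - sy) ** 2 > tolerance ** 2:
        return False
    # Every sample must stay within `tolerance` of the channel,
    # here taken to lie at the start point's y-coordinate.
    if any(abs(y - sy) > tolerance for _, y in touch_path):
        return False
    # Final contact must reach the predefined location.
    lx, ly = touch_path[-1]
    return (lx - tx) ** 2 + (ly - ty) ** 2 <= tolerance ** 2
```

A drag that releases short of the target leaves the device locked, which is why the visual cues mentioned in the abstract matter: they show the user where the gesture must end.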
Abstract:
The invention relates to a computer-implemented method for initiating floating controls via a touch-sensitive device, the method comprising: detecting the presence of an object on the touch-sensitive device; recognizing the object; and generating a user interface element on the touch screen in the vicinity of the object based on the recognized object.
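The "in the vicinity of the object" step amounts to a placement computation: position the control relative to the detected object, then clamp it to stay on screen. A minimal Python sketch, with the offset and clamping policy as assumptions:

```python
def place_control(obj_x, obj_y, ctrl_w, ctrl_h, disp_w, disp_h, offset=10):
    """Position a floating control just above the recognized object,
    clamped so it remains fully within the display bounds.

    Returns the control's top-left (x, y).
    """
    x = obj_x - ctrl_w / 2          # center horizontally on the object
    y = obj_y - ctrl_h - offset     # float above the touch point
    x = max(0, min(x, disp_w - ctrl_w))
    y = max(0, min(y, disp_h - ctrl_h))
    return x, y
```

Clamping matters near the display edges: an object touched in the top-left corner would otherwise push the control off screen.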
Abstract:
Deletion gestures for use on a portable multifunction device with a touch-sensitive display are disclosed. In some embodiments, a computer-implemented method for use in conjunction with the portable multifunction device comprises displaying a list of items on the touch-sensitive display, detecting a first gesture on the touch-sensitive display to edit the list of items, responding to the first gesture by displaying a first icon next to each deletable item in the list, detecting a second gesture on the touch-sensitive display to select one of the deletable items, and responding to the second gesture by displaying a second icon next to the selected item. If a third gesture on the second icon is detected, the selected deletable item is deleted. If a fourth gesture on the first icon next to the selected deletable item is detected, the second icon is deleted.
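The four gestures above form a small state machine: edit mode reveals a per-item badge (first icon), tapping the badge reveals a confirm button (second icon), tapping confirm deletes the item, and tapping the badge again dismisses the confirm button. A Python sketch of that state machine (class and method names are illustrative):

```python
class DeletableList:
    """Two-step deletion flow for a list of deletable items."""

    def __init__(self, items):
        self.items = list(items)
        self.editing = False
        self.confirming = None  # index of the item showing the second icon

    def edit_gesture(self):
        """First gesture: enter edit mode, showing the first icon per item."""
        self.editing = True

    def select_gesture(self, index):
        """Second gesture (tap the first icon): show the second icon.
        Fourth gesture (tap it again): dismiss the second icon."""
        if not self.editing:
            return
        self.confirming = None if self.confirming == index else index

    def confirm_gesture(self):
        """Third gesture (tap the second icon): delete the selected item."""
        if self.confirming is not None:
            del self.items[self.confirming]
            self.confirming = None
```

Requiring the second, confirming tap is the point of the design: a single stray touch in edit mode cannot delete an item.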
Abstract:
Methods and graphical user interfaces for editing on a portable multifunction device with a touch screen display are disclosed. While displaying an application interface of an application, the device detects a multitouch edit initiation gesture on the touch screen display. In response to detection of the multitouch edit initiation gesture, the device displays a plurality of user-selectable edit option icons in an area of the touch screen display that is independent of a location of the multitouch edit initiation gesture. The device also displays a start point object and an end point object to select content displayed by the application in the application interface.
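The start point and end point objects described above define a selected range of content; a robust implementation must tolerate the two handles being dragged past each other. A minimal Python sketch over character offsets (the offset representation is an assumption, not stated in the abstract):

```python
def selected_text(text, start_obj, end_obj):
    """Return the content between the start point and end point objects,
    given as character offsets into `text`.

    Normalizes handle order and clamps both offsets to the text bounds,
    so dragging one handle past the other still yields a valid range.
    """
    lo, hi = sorted((start_obj, end_obj))
    lo = max(0, lo)
    hi = min(len(text), hi)
    return text[lo:hi]
```

The edit option icons (cut, copy, and the like) would then operate on whatever range the two handles currently span.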
Abstract:
Providing a bridge interface for managing virtual workspaces is disclosed. A plurality of workspace images is presented in a user interface, each workspace image corresponding to a different virtual workspace available to a user of a computer system. A user input indicating a selection of a presented workspace image is received. The user interface is updated to display a plurality of application windows associated with the selected virtual workspace in addition to displaying the plurality of workspace images.
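The bridge interface above keeps every workspace thumbnail visible while switching which workspace's windows are displayed. A Python sketch of that model (the class shape and the dict-of-workspaces representation are assumptions):

```python
class BridgeInterface:
    """Thumbnails of every virtual workspace stay visible while the
    application windows of the selected workspace are displayed."""

    def __init__(self, workspaces):
        # workspaces: dict mapping workspace name -> list of window titles
        self.workspaces = workspaces
        self.selected = next(iter(workspaces))  # default to the first one

    def workspace_images(self):
        """One thumbnail per workspace, always shown."""
        return list(self.workspaces)

    def select(self, name):
        """User input selecting a presented workspace image."""
        if name in self.workspaces:
            self.selected = name

    def visible_windows(self):
        """Application windows of the currently selected workspace."""
        return self.workspaces[self.selected]
```

Note that `workspace_images()` returns all workspaces regardless of selection, mirroring the abstract's point that the thumbnails remain displayed alongside the selected workspace's windows.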
Abstract:
In accordance with some embodiments, a computer-implemented method for use in conjunction with a device (100) with a touch screen display (112) is disclosed. The method comprises displaying an electronic document (3912) at a first magnification; detecting a gesture (3951, 3953) on or near the touch screen display (112) corresponding to a command to zoom out by a user-specified amount; in response to detecting the gesture (3951, 3953), displaying the electronic document (3912) at a magnification less than the first magnification; in response to the gesture (3951, 3953) causing zoom out below a magnification where a document length (3957) or a document width (3959) is entirely displayed while the gesture (3951, 3953) is still detected on or near the touch screen display (112), displaying the electronic document (3912) at a magnification wherein areas (3955) beyond opposite edges of the electronic document (3912) are displayed; and, in response to detecting termination of the gesture (3951, 3953), displaying the electronic document (3912) at a magnification wherein the areas (3955) beyond opposite edges of the electronic document (3912) are no longer displayed.
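The behavior above is a rubber-band zoom: while the gesture is active, the magnification may drop below the value at which the document fits the view (exposing areas beyond its edges), and on gesture termination it snaps back. A Python sketch of the two magnifications involved, taking the fit-to-view magnification as the smaller of the per-axis ratios (an assumption for illustration):

```python
def zoom_display(doc_w, doc_h, view_w, view_h, requested_mag):
    """Return (during_gesture_mag, final_mag) for a zoom-out gesture.

    During the gesture the requested magnification is honored even below
    the fit-to-view value, so areas beyond the document edges become
    visible; when the gesture ends, the magnification snaps back to a
    value at which those areas are no longer displayed.
    """
    # Smallest magnification at which the whole document is displayed.
    fit_mag = min(view_w / doc_w, view_h / doc_h)
    during = requested_mag                # rubber-band: honor the request
    final = max(requested_mag, fit_mag)   # snap back on gesture termination
    return during, final
```

For a 1000x2000 document in a 500x500 view, the fit magnification is 0.25; a gesture requesting 0.1 shows the over-zoomed state only while the fingers remain on the display.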