Abstract:
In accordance with some embodiments, a computer-implemented method for use in conjunction with a device (100) with a touch screen display (112) is disclosed. In the method, a multifinger twisting gesture (1508) with a corresponding degree of rotation is detected. If the corresponding degree of rotation exceeds a predefined degree of rotation, a 90° screen rotation command is executed. If the corresponding degree of rotation is less than the predefined degree of rotation, a screen rotation command with an acute angle of rotation is executed and, upon ceasing to detect the multifinger twisting gesture (1508), a screen rotation command with an angle of rotation opposite to the acute angle is executed.
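A minimal sketch of the rotation-threshold decision described above, in Swift. The type and member names (e.g. `RotationAction`, `rotationThreshold`) are illustrative assumptions and do not come from the disclosure.

```swift
import Foundation

/// Possible responses to a multifinger twisting gesture, per the logic above.
enum RotationAction {
    case rotateScreen90            // full 90° screen rotation
    case rotatePartially(Double)   // temporary rotation by an acute angle
    case snapBack(Double)          // opposite rotation once the gesture ends
}

/// Hypothetical threshold (in degrees) standing in for the "predefined degree of rotation".
let rotationThreshold = 45.0

/// Decide what to do while the twisting gesture is in progress.
func actionDuringGesture(degreesRotated: Double) -> RotationAction {
    if abs(degreesRotated) >= rotationThreshold {
        return .rotateScreen90
    } else {
        // Follow the fingers with an acute-angle rotation for visual feedback.
        return .rotatePartially(degreesRotated)
    }
}

/// Decide what to do when the gesture ends without crossing the threshold.
func actionOnGestureEnd(degreesRotated: Double) -> RotationAction {
    // Undo the partial rotation by rotating back through the opposite angle.
    return .snapBack(-degreesRotated)
}

// Example: a 30° twist falls short of the threshold, so the screen rotates
// partially and then snaps back when the fingers lift.
print(actionDuringGesture(degreesRotated: 30))
print(actionOnGestureEnd(degreesRotated: 30))
```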
Abstract:
A computer-human interface provides a mechanism to manage the available space of a computer display (28) in a manner that facilitates navigation among multiple windows (42-50) that are overlaid upon one another. The interface includes a user-selectable mode (Figure 5) in which the windows are rearranged, and resized if necessary, so that all open windows can be simultaneously viewed within the area of the display, thereby enabling any one of the windows to be easily selected for access. In effect, the presentation of the windows is "flattened" so that all windows appear at the same virtual depth, rather than overlapping one another. With this approach, there is no need to minimize windows in order to access one that is overlaid by another, thereby enabling the user to keep the content of all windows visible and accessible. Subsets of windows can be repositioned in the same manner (Figure 23b), or all windows can be removed from the display area for access to desktop objects.
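As a rough illustration of the "flattened" arrangement, the sketch below lays open windows out in a non-overlapping grid, scaling each one down if needed. The `Window` struct, the grid strategy, and all names are assumptions for illustration, not the interface's actual layout algorithm.

```swift
import Foundation

/// A window's on-screen frame (origin at top-left, sizes in points).
struct Window {
    var x: Double, y: Double, width: Double, height: Double
}

/// Arrange all windows in a simple grid so none overlap and every window
/// is fully visible within the display bounds.
func flatten(windows: [Window], displayWidth: Double, displayHeight: Double) -> [Window] {
    guard !windows.isEmpty else { return [] }
    let columns = Int(ceil(sqrt(Double(windows.count))))
    let rows = Int(ceil(Double(windows.count) / Double(columns)))
    let cellW = displayWidth / Double(columns)
    let cellH = displayHeight / Double(rows)

    return windows.enumerated().map { index, window in
        let col = index % columns
        let row = index / columns
        // Shrink the window (never enlarge) so it fits inside its grid cell.
        let scale = min(1.0, cellW / window.width, cellH / window.height)
        let w = window.width * scale
        let h = window.height * scale
        // Center the scaled window within its cell.
        let x = Double(col) * cellW + (cellW - w) / 2
        let y = Double(row) * cellH + (cellH - h) / 2
        return Window(x: x, y: y, width: w, height: h)
    }
}

// Example: five overlapping windows spread into a 3×2 grid on a 1440×900 display.
let open = Array(repeating: Window(x: 100, y: 100, width: 800, height: 600), count: 5)
for w in flatten(windows: open, displayWidth: 1440, displayHeight: 900) {
    print(w)
}
```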
Abstract:
A machine-implemented method of searching data, the method comprising: storing metadata for a plurality of files created by a plurality of different software applications which execute on a data processing system, wherein the type of information in the metadata for files of a first software application differs from the type of information in the metadata for files of a second software application; storing content from the plurality of files; and searching, by the data processing system, the stored metadata and the stored content in response to a single command from a user, wherein the single command is entered into a system-wide user interface available on the data processing system for the plurality of different software applications, wherein an output of the searching is displayed as the user enters a search query, wherein the output of the searching includes executable applications, and wherein the executable applications are configured to be launchable from the displayed output.
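The following Swift sketch illustrates the idea of a single query that runs over both per-file metadata and file content, with application entries included in the same index. The `Item` struct, field names, and matching rules are invented for illustration and do not reflect the actual indexing scheme.

```swift
import Foundation

/// One indexed item: either a document with app-specific metadata, or an
/// executable application that can be launched from the result list.
struct Item {
    let name: String
    let metadata: [String: String]   // keys differ per creating application
    let content: String              // extracted text content ("" for apps)
    let isApplication: Bool
}

/// A single query searches metadata values, content, and names together,
/// so results can be refreshed on every keystroke as the user types.
func search(_ query: String, in items: [Item]) -> [Item] {
    let q = query.lowercased()
    guard !q.isEmpty else { return [] }
    return items.filter { item in
        item.name.lowercased().contains(q)
            || item.content.lowercased().contains(q)
            || item.metadata.values.contains { $0.lowercased().contains(q) }
    }
}

// Example index: a photo and an email carry different metadata keys,
// and an application entry sits in the same index as the documents.
let index = [
    Item(name: "Beach.jpg",
         metadata: ["camera": "DSLR", "location": "Hawaii"],
         content: "", isApplication: false),
    Item(name: "Trip notes",
         metadata: ["from": "alice@example.com", "subject": "Hawaii itinerary"],
         content: "Flights and hotel for the Hawaii trip.", isApplication: false),
    Item(name: "Photos.app", metadata: [:], content: "", isApplication: true),
]

// Results are recomputed for each query string as the user types.
for typed in ["haw", "photos"] {
    print(typed, search(typed, in: index).map { $0.name })
}
```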
Abstract:
A computer-implemented method, for use in conjunction with a portable electronic device with a touch screen display, comprises displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content, and detecting a first gesture at a location on the displayed portion of the structured electronic document. A first box in the plurality of boxes at the location of the first gesture is determined. The first box on the touch screen display is enlarged and substantially centered.
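A minimal sketch of the enlarge-and-center step, assuming the tapped box and the display are both plain rectangles. The `Rect` type, the margin value, and the returned scale/offset representation are illustrative assumptions.

```swift
import Foundation

/// Axis-aligned rectangle in document coordinates.
struct Rect {
    var x: Double, y: Double, width: Double, height: Double
    var midX: Double { x + width / 2 }
    var midY: Double { y + height / 2 }
}

/// A zoom that scales the document and translates it so a chosen box
/// fills the display width and sits centered on screen.
struct Zoom {
    let scale: Double
    let offsetX: Double
    let offsetY: Double
}

/// Compute the zoom that enlarges `box` to (roughly) the display width,
/// leaving a small margin, and centers it on the display.
func zoomToBox(_ box: Rect, display: Rect, margin: Double = 8) -> Zoom {
    let scale = (display.width - 2 * margin) / box.width
    // After scaling, shift the document so the box's center lands on the
    // display's center.
    let offsetX = display.midX - box.midX * scale
    let offsetY = display.midY - box.midY * scale
    return Zoom(scale: scale, offsetX: offsetX, offsetY: offsetY)
}

// Example: tapping a 300-point-wide text column on a 320×480 display
// scales it up to nearly the full screen width and centers it.
let column = Rect(x: 40, y: 600, width: 300, height: 900)
let screen = Rect(x: 0, y: 0, width: 320, height: 480)
print(zoomToBox(column, display: screen))
```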
Abstract:
A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture.
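Below is a small sketch of one way the "move the unlock image to a predefined location along a predefined path" check could be expressed. The channel geometry, tolerance values, and function names are assumptions for illustration only.

```swift
import Foundation

/// A 2D point on the touch-sensitive display, in points.
struct Point {
    var x: Double, y: Double
}

/// Decide whether a completed drag of the unlock image unlocks the device:
/// the contact must stay close to a horizontal channel (the predefined path)
/// and finish at or beyond the channel's end (the predefined location).
func shouldUnlock(dragPath: [Point],
                  channelY: Double,
                  channelEndX: Double,
                  verticalTolerance: Double = 30) -> Bool {
    guard let last = dragPath.last else { return false }
    // The contact must not stray too far from the channel at any point.
    let stayedOnPath = dragPath.allSatisfy { abs($0.y - channelY) <= verticalTolerance }
    // The unlock image must reach the end of the channel.
    let reachedTarget = last.x >= channelEndX
    return stayedOnPath && reachedTarget
}

// Example: a steady left-to-right slide unlocks; a short drag does not,
// and the unlock image would snap back to its starting position instead.
let goodDrag = [Point(x: 20, y: 400), Point(x: 150, y: 405), Point(x: 290, y: 398)]
let shortDrag = [Point(x: 20, y: 400), Point(x: 120, y: 402)]
print(shouldUnlock(dragPath: goodDrag, channelY: 400, channelEndX: 280))   // true
print(shouldUnlock(dragPath: shortDrag, channelY: 400, channelEndX: 280))  // false
```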