Abstract:
A portable electronic device with a touch screen display for photo management is disclosed. One aspect of the invention involves a computer-implemented method in which the portable electronic device displays an array of thumbnail images corresponding to a set of photographic images. The device replaces the displayed array of thumbnail images with a user-selected photographic image upon detecting a user contact with a corresponding thumbnail image in the array. The user-selected photographic image is displayed at a larger scale than the corresponding thumbnail image. In accordance with a scrolling gesture, the portable device replaces the user-selected photographic image with a different photographic image. The scrolling gesture comprises a substantially horizontal movement of user contact with the touch screen display.
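The navigation behavior described in this abstract can be modeled as a small state machine. The sketch below is purely illustrative; the class name, method names, and the rule for deciding that a movement is "substantially horizontal" (comparing the magnitudes of the horizontal and vertical deltas) are assumptions, not details taken from the patent.

```python
# Illustrative model of the photo-management flow in the abstract:
# a tap on a thumbnail replaces the grid with the selected photo at a
# larger scale, and a substantially horizontal swipe shows a different
# photo. All names and thresholds here are assumed for the sketch.
class PhotoBrowser:
    def __init__(self, photos):
        self.photos = photos   # ordered set of photographic images
        self.current = None    # index of the photo shown full-screen, if any

    def tap_thumbnail(self, index):
        """Replace the thumbnail array with the selected photo, enlarged."""
        self.current = index
        return self.photos[index]

    def swipe(self, dx, dy):
        """A substantially horizontal movement scrolls to an adjacent photo."""
        if self.current is None or abs(dx) <= abs(dy):
            return None        # not horizontal, or still viewing the grid
        step = -1 if dx > 0 else 1    # swipe right shows the previous photo
        self.current = max(0, min(len(self.photos) - 1, self.current + step))
        return self.photos[self.current]
```

Clamping the index at both ends models the simplest possible behavior at the first and last photo; the abstract itself does not say what happens at the boundaries.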
Abstract:
In some embodiments, an electronic device expands an item of content in accordance with detection of a user's gaze. In some embodiments, an electronic device scrolls text of a content item in accordance with a determination that the user is reading the content item. In some embodiments, an electronic device navigates between user interfaces in accordance with detection of movement of the user's head and detection of the user's gaze. In some embodiments, an electronic device displays augmented content related to a portion of content in accordance with detection of movement of the user's head and detection of the user's gaze.
Abstract:
Providing a bridge interface for managing virtual workspaces is disclosed. A plurality of workspace images is presented in a user interface, each workspace image corresponding to a different virtual workspace available to a user of a computer system. A user input indicating a selection of a presented workspace image is received. The user interface is updated to display a plurality of application windows associated with the selected virtual workspace in addition to displaying the plurality of workspace images.
Abstract:
Methods and systems for implementing gestures with sensing devices are disclosed. More particularly, methods and systems related to gesturing with multipoint sensing devices are disclosed.
Abstract:
A portable electronic device (100) having a touch screen display (112) detects a first finger-down event at a first position (5805) on the touch screen display (112). The first position (5805) is adjacent to first and second user interface objects (5806, 5802). The portable device (100) detects a second finger event at a second position (5808, 5812, 5809, 5817, 5807) on the touch screen display (112). The second finger event is either a finger-dragging event or a finger-up event. The portable device (100) determines a type of the second finger event and a distance between the first position (5805) and the second position (5808, 5812, 5809, 5817, 5807). The portable device (100) performs a first action associated with the first user interface object (5806) if the distance is greater than a predefined threshold and performs a second action associated with the second user interface object (5802) if the distance is equal to or less than the predefined threshold and the second finger event is a finger-up event.
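The decision rule in this abstract is a distance-plus-event-type disambiguation. A minimal sketch of that rule, assuming a hypothetical pixel threshold and function name (neither is specified in the patent):

```python
# Illustrative model (not the patent's actual implementation) of the
# disambiguation described above: the second finger event's type, and its
# distance from the first finger-down position, decide which of two
# adjacent user interface objects receives an action.
import math

THRESHOLD = 20.0  # hypothetical predefined threshold, in pixels

def handle_second_finger_event(first_pos, second_pos, event_type):
    """Return which action fires: 'first_object', 'second_object', or None.

    event_type is 'drag' (finger-dragging event) or 'up' (finger-up event).
    """
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    distance = math.hypot(dx, dy)
    if distance > THRESHOLD:
        # Movement beyond the threshold: the first object's action
        # (e.g. a slide gesture) wins regardless of event type.
        return "first_object"
    if event_type == "up":
        # Finger lifted close to the initial contact: a tap on the
        # second object.
        return "second_object"
    return None  # a short drag commits to neither object yet
```

Note that per the abstract, a drag that stays within the threshold triggers neither action; only a finger-up event near the first position selects the second object.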
Abstract:
A user interface for handling multiple calls includes displaying an image associated with a first party on a first call and an image associated with a second party on a second call. When one call is active and the other call is on hold, the image associated with the party that is on the active call is visually highlighted to make it more visually prominent relative to the other image. When both calls are joined into a conference call, both images are displayed adjacent to each other and neither is visually highlighted relative to the other.
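The highlighting rule described above reduces to a small function of call state. The following sketch is an assumed model for illustration; the state names and function signature are not from the patent.

```python
# Minimal sketch of the multi-call highlighting rule in the abstract:
# with one call active and the other on hold, only the active party's
# image is visually highlighted; once the calls are joined into a
# conference, neither image is highlighted relative to the other.
def image_highlights(call_states):
    """call_states: {party: 'active' | 'hold' | 'conference'}.
    Returns {party: True if that party's image is visually highlighted}."""
    in_conference = all(s == "conference" for s in call_states.values())
    return {party: (not in_conference and state == "active")
            for party, state in call_states.items()}
```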
Abstract:
The present disclosure relates to user interfaces for receiving user input. In some examples, a device determines which user input technique a user has accessed most recently, and displays the corresponding user interface. In some examples, a device scrolls through a set of information on the display. When a threshold criteria is satisfied, the device displays an index object fully or partially overlaying the set of information. In some examples, a device displays an emoji graphical object, which is visually manipulated based on user input. The emoji graphical object is transmitted to a recipient. In some examples, a device displays paging affordances that enlarge and allow a user to select a particular page of a user interface. In some examples, the device displays user interfaces for various input methods, including multiple emoji graphical objects. In some examples, a keyboard is displayed for receiving user input.
Abstract:
In some implementations, a method for managing virtual workspaces is described. In some implementations, workspace images corresponding to different virtual workspaces can be displayed on a user interface of a computing device. When an application window is moved onto one of the workspace images, the window can be scaled down to fit within the workspace image. In some implementations, a window grouping or cluster can be moved onto one of the workspace images and scaled down accordingly. In some implementations, a method for generating a new virtual workspace is described. In some implementations, a placeholder workspace image that has no corresponding virtual workspace can be displayed on a user interface of a computing device. In response to an application window being moved onto the placeholder workspace image, a new virtual workspace (and new workspace image) can be generated that includes the windows that were moved onto the placeholder workspace image.
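The scale-down step described above can be sketched as a simple fit computation. The function name and the choice of a uniform (aspect-preserving) scale are assumptions for illustration; the abstract only says the window "can be scaled down to fit within the workspace image."

```python
# Hedged sketch of scaling an application window to fit a workspace
# thumbnail: the window is shrunk by the same ratio that maps the full
# workspace onto its thumbnail image.
def scale_window_to_thumbnail(window_size, workspace_size, thumb_size):
    """Scale a window's (width, height) so it fits the workspace thumbnail."""
    sx = thumb_size[0] / workspace_size[0]
    sy = thumb_size[1] / workspace_size[1]
    s = min(sx, sy)  # uniform scale preserves the window's aspect ratio
    return (window_size[0] * s, window_size[1] * s)
```

A window grouping or cluster dropped onto a workspace image would scale each member window by the same factor, keeping their relative sizes intact.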
Abstract:
Methods and graphical user interfaces for editing on a portable multifunction device with a touch screen display are disclosed. While displaying an application interface of an application, the device detects a multitouch edit initiation gesture on the touch screen display. In response to detection of the multitouch edit initiation gesture, the device displays a plurality of user-selectable edit option icons in an area of the touch screen display that is independent of a location of the multitouch edit initiation gesture. The device also displays a start point object and an end point object to select content displayed by the application in the application interface.
Abstract:
A portable multifunction device displays a first icon and a second icon on its touch screen display. In response to a sequence of finger movements across the first and second icons, wherein the finger stays in contact with the touch screen display during the movements, the portable device highlights the first icon for at least a predefined time period if a parameter associated with the finger's position relative to the touch screen display meets a first predefined condition and then highlights the second icon for at least the predefined time period if the parameter associated with the finger's position relative to the touch screen display meets a second predefined condition.
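The abstract above leaves the "predefined conditions" open; one plausible reading is that each condition is the finger's position falling within an icon's bounds. The sketch below models the sequence under that assumption, with an assumed minimum highlight duration; none of these names or values come from the patent.

```python
# Illustrative sketch of sequential icon highlighting during a continuous
# finger movement: each icon is highlighted when the finger's position
# satisfies its condition (here assumed to mean entering its bounding
# box), and the previous highlight is held for at least a predefined
# minimum time before switching.
MIN_HIGHLIGHT = 0.2  # seconds; hypothetical predefined time period

def highlight_sequence(samples, icons):
    """samples: list of (t, x, y) finger positions while in contact.
    icons: {name: (x0, y0, x1, y1) bounding box}.
    Returns the ordered list of icons highlighted, honoring MIN_HIGHLIGHT."""
    result = []
    last_switch = None
    for t, x, y in samples:
        for name, (x0, y0, x1, y1) in icons.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                if result and result[-1] == name:
                    break  # already highlighting this icon
                # hold the previous highlight for at least MIN_HIGHLIGHT
                if last_switch is not None and t - last_switch < MIN_HIGHLIGHT:
                    break
                result.append(name)
                last_switch = t
                break
    return result
```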