Abstract:
In one exemplary embodiment, a portable computer has a display assembly coupled to a base assembly such that the two can alternate between a closed position and an open position. Palm rest areas are formed by a touchpad disposed on the surface of the base assembly. In an alternative embodiment, a touchpad disposed on the base assembly has a width that extends substantially into the palm rest areas of the base assembly.
Abstract:
A computer-implemented method for displaying and managing lists on a portable multifunction device (100) with a touch screen display (112) includes displaying (602) a list of items, detecting (608) a finger contact (2724) on a moving-affordance icon (2722), detecting (610) movement of the finger contact on the touch screen display, and in response to detecting the movement of the finger contact, moving (612) the moving-affordance icon and the corresponding item in the list in accordance with the movement of the finger contact. In some embodiments, at least some of the items have corresponding moving-affordance icons (2722).
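The reordering behavior described above can be sketched in a minimal, platform-agnostic form. The `reorder` helper below is a hypothetical illustration of the effect, not the patented implementation: the item whose move-affordance icon the finger drags is removed from its old position and reinserted at the index implied by the finger's final location.

```python
def reorder(items, from_index, to_index):
    """Move the item at from_index to to_index, shifting the rest.

    Models the effect of dragging a list item by its move-affordance
    icon: the item follows the finger and the list reflows around it.
    """
    items = list(items)          # work on a copy
    item = items.pop(from_index)  # lift the dragged item out
    items.insert(to_index, item)  # drop it at the new position
    return items

# Dragging "Milk" (index 2) up to the top of the list:
groceries = ["Eggs", "Bread", "Milk", "Tea"]
print(reorder(groceries, 2, 0))  # ['Milk', 'Eggs', 'Bread', 'Tea']
```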
Abstract:
The present disclosure relates to user interfaces for receiving user input. Some techniques for receiving user input using electronic devices, however, are generally cumbersome and inefficient. For example, composing or preparing a response to a message requires navigating a complex and time-consuming user interface, which may include multiple key presses or keystrokes. Existing techniques require longer than necessary, wasting user time and device energy. This latter consideration is particularly important in battery-operated devices. Accordingly, there is a need for electronic devices with faster, more efficient methods and interfaces for receiving user input. Such methods and interfaces optionally complement or replace conventional methods for receiving user input. Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface. For battery-operated computing devices, such methods and interfaces conserve power and increase the time between battery charges. The above deficiencies and other problems associated with user interfaces for computing devices for receiving user input are reduced or eliminated by the disclosed devices.
Abstract:
A portable electronic device, comprising: a touch screen display; one or more processors; memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for: displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content; detecting a first gesture at a location on the displayed portion of the structured electronic document; determining a first box in the plurality of boxes at the location of the first gesture, the first box having a first size; enlarging and translating the structured electronic document so that the first box is substantially centered on the touch screen display at a second size greater than the first size; while the first box is enlarged, detecting a second gesture on the enlarged first box; and in response to detecting the second gesture, reducing in size the displayed portion of the structured electronic document.
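The enlarge-and-center step in the abstract above amounts to computing a scale and translation from the tapped box's geometry. The sketch below is a hypothetical geometric model of that step (the function name and coordinate convention are assumptions, not the device's actual code): scale so the box width fills the viewport, then translate so the box center lands at the viewport center.

```python
def zoom_to_box(viewport_w, viewport_h, box):
    """Return (scale, tx, ty) that enlarge the document so the tapped
    box spans the viewport width and is centered in it.

    box is (x, y, w, h) in document coordinates.
    """
    x, y, w, h = box
    scale = viewport_w / w            # second size > first size when w < viewport_w
    box_cx = x + w / 2                # box center in document coordinates
    box_cy = y + h / 2
    tx = viewport_w / 2 - scale * box_cx  # shift so the scaled center
    ty = viewport_h / 2 - scale * box_cy  # sits at the viewport center
    return scale, tx, ty

# A 100-pt-wide box on a 320x480 display is enlarged 3.2x and centered:
print(zoom_to_box(320, 480, (40, 200, 100, 50)))  # (3.2, -128.0, -480.0)
```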
Abstract:
A user interface for handling multiple calls includes displaying an image associated with a first party on a first call and an image associated with a second party on a second call. When one call is active and the other call is on hold, the image associated with the party that is on the active call is visually highlighted to make it more visually prominent relative to the other image. When both calls are joined into a conference call, both images are displayed adjacent to each other and neither is visually highlighted relative to the other.
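The highlighting rule described above can be expressed as a small state-to-presentation mapping. This is a hypothetical model for illustration only (the state names and function are assumptions): with one call active and one on hold, only the active party's image is highlighted; once the calls are merged into a conference, neither image is highlighted over the other.

```python
def highlighted_parties(call_states):
    """Given a dict mapping party -> state ('active', 'hold', or
    'conference'), return the set of parties whose images should be
    visually highlighted."""
    if all(state == "conference" for state in call_states.values()):
        return set()  # conference: images adjacent, neither highlighted
    return {party for party, state in call_states.items()
            if state == "active"}

print(highlighted_parties({"Alice": "active", "Bob": "hold"}))  # {'Alice'}
print(highlighted_parties({"Alice": "conference", "Bob": "conference"}))  # set()
```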
Abstract:
In some embodiments, a device displays content on a touch screen display and detects input by finger gestures. In response to the finger gestures, the device selects content, visually distinguishes the selected content, and/or updates the selected content based on detected input. In some embodiments, the device displays a command display area that includes one or more command icons; detects activation of a command icon in the command display area; and, in response to detecting activation of the command icon in the command display area, performs a corresponding action with respect to the selected content. Exemplary actions include cutting, copying, and pasting content.
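The select-then-command flow above can be sketched as a minimal selection model whose commands act on the currently selected range. All names here are hypothetical illustrations, not the device's actual API: `select` stands in for the finger gestures that adjust the selection, and `cut`, `copy`, and `paste` correspond to activating command icons in the command display area.

```python
class TextSelection:
    """A minimal model: text content, a selected range, and
    cut/copy/paste commands that act on the selection."""

    def __init__(self, text):
        self.text = text
        self.start = 0
        self.end = 0
        self.clipboard = ""

    def select(self, start, end):
        # Finger gestures update the selected (visually distinguished) range.
        self.start, self.end = start, end

    def copy(self):
        self.clipboard = self.text[self.start:self.end]

    def cut(self):
        self.copy()
        self.text = self.text[:self.start] + self.text[self.end:]
        self.end = self.start

    def paste(self, at):
        self.text = self.text[:at] + self.clipboard + self.text[at:]

sel = TextSelection("hello world")
sel.select(0, 5)   # select "hello"
sel.cut()          # text becomes " world", clipboard holds "hello"
sel.paste(6)       # insert at the end
print(sel.text)    # " worldhello"
```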
Abstract:
In accordance with some embodiments, a computer-implemented method for use in conjunction with a device (100) with a touch screen display (112) is disclosed. In the method, a movement (3925) of an object on or near the touch screen display (112) is detected. In response to detecting the movement (3925), an electronic document (3912) displayed on the touch screen display (112) is translated in a first direction (3928-2). If an edge of the electronic document (3912) is reached while translating the electronic document (3912) in the first direction (3928-2) while the object is still detected on or near the touch screen display (112), an area (3930) beyond the edge of the document (3912) is displayed. After the object is no longer detected on or near the touch screen display (112), the document (3912) is translated in a second direction (3928-1) until the area (3930) beyond the edge of the document is no longer displayed.
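The two-phase behavior in the abstract above (overscroll while the finger is down, snap back after it lifts) can be sketched in one dimension. This is a hypothetical model for illustration, not the device's actual scrolling code: during the drag the offset may exceed the document edge, and on release it is translated in the opposite direction until no area beyond the edge remains visible.

```python
def translate_with_overscroll(offset, delta):
    """While the object is still detected, apply the drag delta
    directly; the offset may pass the document edge, exposing the
    area beyond it."""
    return offset + delta

def snap_back(offset, doc_len, view_len):
    """After the object is no longer detected, clamp the offset so the
    area beyond the document edge is no longer displayed."""
    max_offset = max(doc_len - view_len, 0)
    return min(max(offset, 0), max_offset)

# Drag 120 pt past the end of a 1000-pt document in a 400-pt view:
offset = translate_with_overscroll(600, 120)
print(offset)                        # 720: area beyond the edge shows
print(snap_back(offset, 1000, 400))  # 600: translated back to the edge
```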