Abstract:
A portable multifunction device (100) with a touch screen display (112) performs a method that includes: displaying a phone call user interface (3000F) on the touch screen display (112), wherein the phone call user interface (3000F) includes: a first informational item (3033) associated with an active phone call between a user of the device and a first party, a second informational item (3033) associated with a suspended phone call between the user and a second party, and a merge call icon (3038); upon detecting a user selection (3040) of the merge call icon (3038), merging the active phone call and the suspended phone call into a conference call between the user, the first party, and the second party, and replacing the phone call user interface (3000F) with a conference call user interface (3000G). The conference call user interface (3000G) includes: a third informational item (3042) associated with the conference call, and a conference call management icon (3044).
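The merge step described above amounts to collapsing two call records, one active and one suspended, into a single conference entry and swapping the associated screen. The following Swift sketch models that transition under assumed names (Call, CallScreen, and mergeCalls are illustrative, not terms from the abstract):

```swift
import Foundation

// Minimal, hypothetical model of merging an active call and a suspended call
// into a conference call and replacing the phone call UI with the conference UI.

struct Call {
    let party: String
    var isOnHold: Bool
}

enum CallScreen {
    // Phone call UI: one active call, one suspended call (merge icon available).
    case phoneCall(active: Call, suspended: Call)
    // Conference call UI: the merged call (management icon available).
    case conference(parties: [String])
}

// Invoked when the user selects the merge call icon.
func mergeCalls(screen: CallScreen) -> CallScreen {
    guard case let .phoneCall(active, suspended) = screen else { return screen }
    return .conference(parties: [active.party, suspended.party])
}

// Example: an active call with Alice and a suspended call with Bob become a conference.
let before = CallScreen.phoneCall(
    active: Call(party: "Alice", isOnHold: false),
    suspended: Call(party: "Bob", isOnHold: true)
)
print(mergeCalls(screen: before))   // conference(parties: ["Alice", "Bob"])
```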
Abstract:
A computer-implemented method, for use in conjunction with a portable electronic device with a touch screen display, comprises displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content, and detecting a first gesture at a location on the displayed portion of the structured electronic document. A first box in the plurality of boxes at the location of the first gesture is determined. The first box is enlarged and substantially centered on the touch screen display.
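As an illustration of the tap-to-zoom behavior, the sketch below hit-tests the gesture location against a set of content boxes and computes a scale and offset that enlarge the selected box to the display width and roughly center it. The ContentBox type, the zoomTransform helper, and the fit-to-width policy are assumptions for the sketch, not the patented implementation:

```swift
import Foundation

// Hypothetical sketch: determine which content box was tapped and compute
// a transform that enlarges and substantially centers it on the display.

struct ContentBox {
    let id: Int
    let frame: CGRect   // box position in document coordinates
}

// Returns the first box whose frame contains the gesture location, if any.
func box(at location: CGPoint, in boxes: [ContentBox]) -> ContentBox? {
    boxes.first { $0.frame.contains(location) }
}

// Scale and offset that enlarge `box` to the display width and center it.
func zoomTransform(for box: ContentBox, displaySize: CGSize) -> (scale: CGFloat, offset: CGPoint) {
    let scale = displaySize.width / box.frame.width
    // Offset that places the scaled box center at the display center.
    let offset = CGPoint(x: displaySize.width / 2 - box.frame.midX * scale,
                         y: displaySize.height / 2 - box.frame.midY * scale)
    return (scale: scale, offset: offset)
}

// Example: a tap at (120, 300) lands in box 2, which is then zoomed.
let boxes = [
    ContentBox(id: 1, frame: CGRect(x: 0, y: 0, width: 320, height: 200)),
    ContentBox(id: 2, frame: CGRect(x: 20, y: 220, width: 280, height: 400)),
]
if let hit = box(at: CGPoint(x: 120, y: 300), in: boxes) {
    let t = zoomTransform(for: hit, displaySize: CGSize(width: 320, height: 480))
    print("zoom box \(hit.id) by \(t.scale), offset \(t.offset)")
}
```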
Abstract:
A computer-implemented method for management of voicemail messages, performed at a portable electronic device with a touch screen display, includes: displaying a list of voicemail messages; detecting selection by a user of a respective voicemail message in the list; responding to the user selection of the respective voicemail message by initiating playback of the user-selected voicemail message; displaying a progress bar for the user-selected voicemail message, wherein the progress bar indicates the portion of the user-selected voicemail message that has been played; detecting movement of a finger of the user from a first position on the progress bar to a second position on the progress bar; and responding to the detection of the finger movement by restarting playback of the user-selected voicemail message at a position within the user-selected voicemail message corresponding substantially to the second position on the progress bar.
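The scrubbing step reduces to a proportional mapping: the finger's horizontal position along the progress bar is converted to a fraction of the bar's width, which selects the corresponding time offset in the message. A minimal Swift sketch, with an assumed ProgressBar type and clamping at the bar's ends:

```swift
import Foundation

// Hypothetical sketch of the scrubbing behavior: a finger position along the
// progress bar is mapped to a playback position within the voicemail message.

struct ProgressBar {
    let minX: Double     // left edge of the bar on screen
    let width: Double    // bar width on screen
}

// Maps a horizontal finger position on the bar to a time offset (seconds)
// within a message of the given duration, clamped to the bar's extent.
func playbackPosition(forFingerX x: Double,
                      on bar: ProgressBar,
                      messageDuration: TimeInterval) -> TimeInterval {
    let fraction = (x - bar.minX) / bar.width
    let clamped = min(max(fraction, 0), 1)
    return clamped * messageDuration
}

// Example: the finger moves from 40 pt to 220 pt on a 280 pt wide bar,
// so playback restarts near the corresponding point in a 90 s message.
let bar = ProgressBar(minX: 20, width: 280)
let newPosition = playbackPosition(forFingerX: 220, on: bar, messageDuration: 90)
print("restart playback at \(newPosition) s")   // ≈ 64.3 s
```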
Abstract:
A computer system displays a first and second user interface object in a three-dimensional environment. The first and second user interface objects have a first and second spatial relationship to a first and second anchor position corresponding to a location of a user's hand in a physical environment, respectively. While displaying the first and second user interface objects in the three-dimensional environment, the computer system detects movement of the user's hand in the physical environment, corresponding to a translational movement and a rotational movement of the user's hand relative to a viewpoint, and in response, translates the first and second user interface objects relative to the viewpoint in accordance with the translational movement of the user's hand, and rotates the first user interface object relative to the viewpoint in accordance with the rotational movement of the user's hand without rotating the second user interface object.
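A compact way to see the asymmetry is that both objects inherit the hand's translation, while only the first inherits its rotation. The Swift sketch below models this with a per-object followsHandRotation flag; the vector type, the yaw-only rotation, and the flag name are simplifying assumptions:

```swift
import Foundation

// Hypothetical sketch: two objects are anchored to the user's hand. Both
// follow the hand's translation, but only the first follows its rotation.

struct Vector3 {
    var x, y, z: Double
    static func + (a: Vector3, b: Vector3) -> Vector3 {
        Vector3(x: a.x + b.x, y: a.y + b.y, z: a.z + b.z)
    }
}

struct AnchoredObject {
    var position: Vector3      // relative to the viewpoint
    var yaw: Double            // rotation about the vertical axis, in radians
    let followsHandRotation: Bool
}

// Applies a hand movement (translation + rotation) to the anchored objects.
func applyHandMovement(translation: Vector3,
                       rotation: Double,
                       to objects: [AnchoredObject]) -> [AnchoredObject] {
    objects.map { object in
        var updated = object
        updated.position = object.position + translation   // both objects translate
        if object.followsHandRotation {
            updated.yaw += rotation                         // only the first rotates
        }
        return updated
    }
}

// Example: the hand moves 0.1 m to the right and rotates 0.3 rad.
let objects = [
    AnchoredObject(position: Vector3(x: 0, y: 0, z: -0.5), yaw: 0, followsHandRotation: true),
    AnchoredObject(position: Vector3(x: 0.2, y: 0, z: -0.5), yaw: 0, followsHandRotation: false),
]
let moved = applyHandMovement(translation: Vector3(x: 0.1, y: 0, z: 0), rotation: 0.3, to: objects)
print(moved.map(\.yaw))   // [0.3, 0.0]
```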
Abstract:
An example method includes: concurrently displaying, via a display generation component: a first view of a first application in a first display mode; and a display mode affordance; while displaying the first view of the first application, receiving a sequence of one or more inputs including a first input selecting the display mode affordance; and in response to detecting the sequence of one or more inputs: ceasing to display at least a portion of the first view of the first application while maintaining display of a representation of the first application; displaying at least a portion of a home screen that includes multiple application affordances; receiving a second input selecting an application affordance associated with a second application; and in response to receiving the second input, concurrently displaying, via the display generation component: a second view of the first application and a first view of the second application.
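The sequence of inputs can be read as a three-state flow: a full-screen first application, then the home screen with a retained representation of the first application, then the two applications shown together. A hypothetical Swift sketch of that state machine (the state names and app identifiers are illustrative, not from the abstract):

```swift
import Foundation

// Hypothetical sketch of the described flow as a small state machine:
// selecting the display mode affordance shrinks the first app to a
// representation over the home screen; picking a second app then shows
// both apps concurrently.

enum DisplayState {
    case fullScreen(app: String)                      // first view in the first display mode
    case homeScreenWithRepresentation(app: String)    // home screen + representation of the first app
    case sideBySide(first: String, second: String)    // both apps displayed concurrently
}

// Handles selection of the display mode affordance.
func selectDisplayModeAffordance(_ state: DisplayState) -> DisplayState {
    guard case let .fullScreen(app) = state else { return state }
    return .homeScreenWithRepresentation(app: app)
}

// Handles selection of an application affordance on the home screen.
func selectApplication(_ second: String, in state: DisplayState) -> DisplayState {
    guard case let .homeScreenWithRepresentation(first) = state else { return state }
    return .sideBySide(first: first, second: second)
}

// Example: Notes is full screen, the user taps the affordance, then taps Safari.
var state = DisplayState.fullScreen(app: "Notes")
state = selectDisplayModeAffordance(state)
state = selectApplication("Safari", in: state)
print(state)   // sideBySide(first: "Notes", second: "Safari")
```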
Abstract:
A computing system, while displaying a first view of a first computer-generated three-dimensional environment including a representation of a respective portion of a physical environment, and a first representation of one or more projections of light in a first portion of the first computer-generated three-dimensional environment, detects, from a first user, a query directed to a virtual assistant. In response, the computing system displays animated changes of the first representation of the one or more projections of light in the first portion of the first computer-generated three-dimensional environment, including displaying a second representation of the one or more projections of light that is focused on a first sub-portion of the first portion, and then displays content responding to the query at a position corresponding to the first sub-portion of the first portion.
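One way to picture the response sequence is as an ordered set of visual states: diffuse light over the first portion, light focused on a sub-portion, then the response content placed at that sub-portion. The Swift sketch below encodes that ordering; the Region type, the sample query, and the concrete case names are assumptions for illustration only:

```swift
import Foundation

// Hypothetical sketch of the response sequence: when a query is detected,
// the light representation is animated from a diffuse state to one focused
// on a sub-portion, and the response content is then placed at that
// sub-portion's position.

struct Region {
    var center: (x: Double, y: Double, z: Double)
    var radius: Double
}

enum AssistantVisual {
    case diffuseLight(over: Region)            // first representation of the light projections
    case focusedLight(on: Region)              // second representation, focused on a sub-portion
    case content(text: String, at: Region)     // response displayed at the focused sub-portion
}

// Produces the ordered visual states shown in response to a query.
func respond(to query: String, lightRegion: Region, focusRegion: Region) -> [AssistantVisual] {
    [
        .diffuseLight(over: lightRegion),
        .focusedLight(on: focusRegion),
        .content(text: "Answer to: \(query)", at: focusRegion),
    ]
}

// Example: the light narrows onto a small region, where the answer appears.
let room = Region(center: (x: 0, y: 1, z: -2), radius: 1.5)
let spot = Region(center: (x: 0.4, y: 1, z: -2), radius: 0.2)
for step in respond(to: "What's the weather?", lightRegion: room, focusRegion: spot) {
    print(step)
}
```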
Abstract:
While displaying a virtual object with a first spatial location in a three-dimensional environment, a computer system detects a first hand movement performed by a user. In accordance with a determination that the first hand movement meets first gesture criteria, the computer system performs a first operation in accordance with the first hand movement, without moving the virtual object away from the first spatial location; and in accordance with a determination that the first hand movement meets second gesture criteria, the computer system displays a first visual indication that the virtual object has transitioned into a reconfiguration mode, and further detects a second hand movement performed by the user. In accordance with a determination that the second hand movement meets the first gesture criteria, the computer system moves the virtual object from the first spatial location to a second spatial location in accordance with the second hand movement.
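The behavior is effectively gated by a mode bit: movements meeting the first gesture criteria act on the object in place until a movement meeting the second criteria puts the object into the reconfiguration mode, after which the same first-criteria movement repositions it. A hypothetical Swift sketch, in which pinchDrag and longPinchHold stand in for the unspecified first and second gesture criteria:

```swift
import Foundation

// Hypothetical sketch of the two-stage behavior: a hand movement meeting the
// first gesture criteria acts on the object in place; one meeting the second
// criteria enters a reconfiguration mode; only then does a subsequent
// first-criteria movement reposition the object.

struct Object3D {
    var location: (x: Double, y: Double, z: Double)
    var inReconfigurationMode = false
}

enum HandMovement {
    case pinchDrag(dx: Double, dy: Double, dz: Double)   // assumed to meet the first gesture criteria
    case longPinchHold                                   // assumed to meet the second gesture criteria
}

// Applies a hand movement to the object according to its current mode.
func handle(_ movement: HandMovement, on object: inout Object3D) {
    switch movement {
    case .longPinchHold:
        // Second gesture criteria: show the reconfiguration indication.
        object.inReconfigurationMode = true
    case let .pinchDrag(dx, dy, dz):
        if object.inReconfigurationMode {
            // First gesture criteria while reconfiguring: move the object.
            object.location = (x: object.location.x + dx,
                               y: object.location.y + dy,
                               z: object.location.z + dz)
        } else {
            // First gesture criteria outside reconfiguration: perform the
            // first operation without moving the object.
            print("perform first operation without moving the object")
        }
    }
}

// Example: a drag alone does not move the object; hold, then drag, does.
var panel = Object3D(location: (x: 0, y: 1, z: -1))
handle(.pinchDrag(dx: 0.2, dy: 0, dz: 0), on: &panel)   // stays at (0, 1, -1)
handle(.longPinchHold, on: &panel)                      // enters reconfiguration mode
handle(.pinchDrag(dx: 0.2, dy: 0, dz: 0), on: &panel)   // moves to (0.2, 1, -1)
print(panel.location)
```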
Abstract:
While displaying a view of a three-dimensional environment, a computer system detects movement of a user's thumb over the user's index finger using a camera. In accordance with a determination that the movement is a swipe of the thumb over the index finger in a first direction, the computer system performs a first operation; and in accordance with a determination that the movement is a tap of the thumb over the index finger at a first location on the index finger, the computer system performs a second operation that is different from the first operation.
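A minimal classifier for this micro-gesture scheme needs only the thumb's travel along the index finger and the contact location: sufficient travel means a swipe (first operation), otherwise a tap at a segment of the finger (second operation). The Swift sketch below is illustrative; the 1 cm travel threshold and the finger-segment granularity are assumptions, not values from the abstract:

```swift
import Foundation

// Hypothetical sketch of classifying camera-tracked micro-gestures of the
// thumb against the index finger: a swipe along the finger triggers one
// operation, a tap at a location on the finger triggers another.

enum ThumbGesture {
    case swipe(direction: Direction)
    case tap(segment: IndexFingerSegment)

    enum Direction { case towardFingertip, towardKnuckle }
    enum IndexFingerSegment { case tip, middle, base }
}

// Classifies tracked thumb movement. `travel` is the signed distance (m) the
// thumb moved along the index finger; `contactSegment` is where contact ended.
// The 1 cm threshold is an illustrative assumption.
func classify(travelAlongFinger travel: Double,
              contactSegment: ThumbGesture.IndexFingerSegment) -> ThumbGesture {
    if abs(travel) > 0.01 {
        return .swipe(direction: travel > 0 ? .towardFingertip : .towardKnuckle)
    }
    return .tap(segment: contactSegment)
}

// Dispatches the classified gesture to the two distinct operations.
func perform(_ gesture: ThumbGesture) {
    switch gesture {
    case .swipe(let direction):
        print("first operation (swipe \(direction))")
    case .tap(let segment):
        print("second operation (tap on \(segment))")
    }
}

// Example: a 3 cm slide toward the fingertip vs. a stationary touch at the middle segment.
perform(classify(travelAlongFinger: 0.03, contactSegment: .middle))
perform(classify(travelAlongFinger: 0.0, contactSegment: .middle))
```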
Abstract:
An electronic device displays a field of view of a camera at a first magnification and updates the displayed field of view over time based on changes detected by the camera. The field of view includes a view of a three-dimensional space. In response to a first touch input, the device adds a measurement point at a first location in the displayed field of view that corresponds to a first location in the three-dimensional space. As the camera moves, the device displays the measurement point at a location in the displayed field of view that corresponds to the first location in the three-dimensional space. In response to a second touch input corresponding to a current location of the measurement point in the displayed field of view, the device enlarges display of the displayed field of view around the measurement point from the first magnification to a second, greater magnification.
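Two pieces of bookkeeping drive this behavior: the measurement point is stored as a fixed 3D location and reprojected into the live view as the camera moves, and a touch on its current screen position raises the view's magnification around that projected point. The Swift sketch below uses a deliberately fake projection function and a 2x zoom step as stand-ins for the real camera transform and the unspecified magnification levels:

```swift
import Foundation

// Hypothetical sketch of the measurement-point behavior: a point is anchored
// to a 3D location, reprojected into the live camera view as the camera
// moves, and the view is magnified around it on a second touch.

struct MeasurementPoint {
    let worldLocation: (x: Double, y: Double, z: Double)   // fixed location in the 3D space
}

struct CameraView {
    var magnification: Double
    var zoomCenter: (x: Double, y: Double)?   // screen point the zoom is centered on
}

// Placeholder projection from a 3D location to screen coordinates for the
// current camera pose; a real implementation would use the camera transform.
func project(_ point: MeasurementPoint, cameraYaw: Double) -> (x: Double, y: Double) {
    (x: 160 + 100 * (point.worldLocation.x - cameraYaw), y: 240)
}

// Second touch at the point's current screen location: enlarge the view
// around the measurement point to a greater magnification.
func zoom(on point: MeasurementPoint, in view: inout CameraView, cameraYaw: Double) {
    view.zoomCenter = project(point, cameraYaw: cameraYaw)
    view.magnification *= 2   // from the first magnification to a greater one
}

// Example: the point stays pinned to its 3D location as the camera turns,
// then a touch on it magnifies the view around that location.
let point = MeasurementPoint(worldLocation: (x: 0.5, y: 0, z: -1))
var view = CameraView(magnification: 1, zoomCenter: nil)
print(project(point, cameraYaw: 0))     // screen position at the first camera pose
print(project(point, cameraYaw: 0.2))   // shifted screen position after camera movement
zoom(on: point, in: &view, cameraYaw: 0.2)
print(view.magnification, view.zoomCenter ?? (x: 0, y: 0))
```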
Abstract:
Audio inputs are detected at an electronic device and translated into electronic communications for playback at an external electronic device.