Abstract:
In any context where a user can view multiple different content items, switching among content items is provided using an array mode. In a full-frame mode, one content item is visible and active, but other content items may also be open. In response to user input the display can be switched to an array mode, in which all of the content items are visible in a scrollable array. Selecting a content item in array mode can result in the display returning to the full-frame mode, with the selected content item becoming visible and active. Smoothly animated transitions between the full-frame and array modes and a gesture-based interface for controlling the transitions can also be provided.
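A minimal Swift sketch of the mode-switching behavior this abstract describes; the type and method names (DisplayMode, ContentSwitcher, enterArrayMode, select) are illustrative assumptions rather than terms from the source. Animated transitions and gesture recognition would sit on top of a state model like this.

// Sketch of switching between a full-frame mode and a scrollable array mode.
enum DisplayMode {
    case fullFrame(activeItem: Int)   // one content item visible and active
    case array                        // all open content items shown in a scrollable array
}

struct ContentSwitcher {
    private(set) var openItems: [String]
    private(set) var mode: DisplayMode

    init(openItems: [String], activeIndex: Int = 0) {
        self.openItems = openItems
        self.mode = .fullFrame(activeItem: activeIndex)
    }

    // User input (e.g. a gesture) switches the display to array mode.
    mutating func enterArrayMode() {
        mode = .array
    }

    // Selecting an item while in array mode returns to full-frame mode,
    // with the selected item becoming visible and active.
    mutating func select(itemAt index: Int) {
        guard case .array = mode, openItems.indices.contains(index) else { return }
        mode = .fullFrame(activeItem: index)
    }
}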
Abstract:
A computing device is disclosed. The computing device includes a housing having an illuminable portion. The computing device also includes a light device disposed inside the housing. The light device is configured to illuminate the illuminable portion.
Abstract:
A wireless communication device may wirelessly control an object, such as a physical device, directly or through interaction with a virtual representation (or placeholder) of the object situated at a predefined physical location. In particular, the wireless communication device may identify an intent gesture performed by a user that indicates intent to control the object. For example, the intent gesture may involve pointing or orienting the wireless communication device toward the object, with or without additional input. Then, the wireless communication device may determine the object associated with the intent gesture using wireless ranging and/or device orientation. Moreover, the wireless communication device may interpret sensor data from one or more sensors associated with the wireless communication device to determine an action gesture corresponding to a command or a command value. The wireless communication device may then transmit the command value to control the object.
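A hypothetical Swift sketch of the intent-gesture and action-gesture flow described above; the object model, gesture names, command values, and angular tolerance are assumptions for illustration only.

struct ControllableObject {
    let identifier: String
    let bearing: Double      // direction from the device, in degrees
    let distance: Double     // estimated via wireless ranging, in meters
}

enum ActionGesture {
    case flickUp, flickDown, tap
}

// Smallest difference between two headings, accounting for wrap-around.
func angularDifference(_ a: Double, _ b: Double) -> Double {
    let d = abs(a - b).truncatingRemainder(dividingBy: 360)
    return min(d, 360 - d)
}

// Resolve which object the intent gesture refers to, using the device's
// orientation together with wireless-ranging distance estimates.
func objectForIntentGesture(deviceHeading: Double,
                            candidates: [ControllableObject],
                            tolerance: Double = 15.0) -> ControllableObject? {
    candidates
        .filter { angularDifference($0.bearing, deviceHeading) <= tolerance }
        .min(by: { $0.distance < $1.distance })
}

// Map an interpreted action gesture to a command value to transmit.
func commandValue(for gesture: ActionGesture) -> Int {
    switch gesture {
    case .flickUp:   return +1   // e.g. increase a setting
    case .flickDown: return -1   // e.g. decrease a setting
    case .tap:       return 0    // e.g. toggle power
    }
}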
Abstract:
An electronic device displays a first user interface that corresponds to a first application, and detects on a touch-sensitive surface a first gesture that includes movement of a contact in a respective direction on the touch-sensitive surface. In response to detecting the first gesture, the device, in accordance with a determination that the movement of the contact is in a first direction, replaces display of the first user interface with display of a second user interface that corresponds to a second application; and in accordance with a determination that the movement of the contact is in a second direction, distinct from the first direction, displays a first system user interface for interacting with a system-level function.
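An illustrative Swift sketch of the direction-dependent gesture handling in this abstract; the enum cases and handler name are assumptions, not the device's actual API.

enum SwipeDirection {
    case first    // e.g. horizontal movement of the contact
    case second   // e.g. vertical movement, distinct from the first direction
}

enum GestureResult {
    case replaceWithApplication(named: String)   // show a second application's UI
    case showSystemUserInterface                 // UI for a system-level function
}

// Decide what to display in response to the gesture, based on the
// direction in which the contact moved.
func handleGesture(direction: SwipeDirection,
                   nextApplication: String) -> GestureResult {
    switch direction {
    case .first:
        return .replaceWithApplication(named: nextApplication)
    case .second:
        return .showSystemUserInterface
    }
}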
Abstract:
An example method is performed at a device with a display and a biometric sensor. While the device is in a locked state, the method includes displaying a log-in user interface that is associated with logging in to a first user account and a second user account. While displaying the log-in user interface, the method includes receiving biometric information, and in response to receiving the biometric information: when the biometric information is consistent with biometric information for the first user account and the first user account does not have an active session, displaying a prompt to input a log-in credential for the first user account; and when the biometric information is consistent with biometric information for the second user account and the second user account does not have an active session on the device, displaying a prompt to input a log-in credential for the second user account.
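A minimal Swift sketch of the biometric log-in branching in this abstract; the account and session representation is an assumption, and the string comparison stands in for real biometric matching.

struct UserAccount {
    let name: String
    let biometricTemplate: String   // stand-in for enrolled biometric data
    var hasActiveSession: Bool
}

enum LogInPrompt {
    case credentialPrompt(forAccount: String)
    case none
}

// While the device is locked and showing the log-in UI, decide which
// credential prompt (if any) to display for the received biometric data.
func promptForBiometric(_ received: String,
                        first: UserAccount,
                        second: UserAccount) -> LogInPrompt {
    if received == first.biometricTemplate, !first.hasActiveSession {
        return .credentialPrompt(forAccount: first.name)
    }
    if received == second.biometricTemplate, !second.hasActiveSession {
        return .credentialPrompt(forAccount: second.name)
    }
    return .none
}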
Abstract:
An electronic device displays a user interface that includes a first adjustable control and a second adjustable control and detects movement of a first contact across a touch-sensitive surface in a drag gesture. In accordance with a determination that the drag gesture is performed while a focus selector is at a location that corresponds to the first adjustable control: the device outputs a first plurality of tactile outputs that has a first distribution of tactile outputs. In accordance with a determination that the drag gesture is performed while the focus selector is at a location that corresponds to the second adjustable control, the device outputs a second plurality of tactile outputs. The second plurality of tactile outputs has a second distribution of tactile outputs that is different from the first distribution of tactile outputs.
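A Swift sketch of per-control tactile-output distributions, as described above; the spacing and intensity values are illustrative assumptions only.

struct TactileOutputDistribution {
    let spacing: Double     // drag distance between successive tactile outputs
    let intensity: Double
}

enum AdjustableControl {
    case first
    case second
}

// Each adjustable control is associated with a different distribution
// of tactile outputs for a drag gesture performed over it.
func distribution(for control: AdjustableControl) -> TactileOutputDistribution {
    switch control {
    case .first:
        return TactileOutputDistribution(spacing: 10.0, intensity: 0.6)
    case .second:
        return TactileOutputDistribution(spacing: 25.0, intensity: 1.0)
    }
}

// Number of tactile outputs emitted for a drag of a given length.
func tactileOutputCount(dragDistance: Double,
                        over control: AdjustableControl) -> Int {
    Int(dragDistance / distribution(for: control).spacing)
}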
Abstract:
An electronic device, while displaying representations of a plurality of collections of media items, detects a swipe input that starts at a location corresponding to a first representation of a first collection of media items in the plurality of collections of media items. In response to detecting the swipe input: in accordance with a determination that the swipe input is in a first direction, the device scrolls the representations of the plurality of collections of media items in the first direction; and, in accordance with a determination that the swipe input is in a different, second direction, the device: ceases to display a representation of a first item in the first collection of media items, and displays a representation of a second item in the first collection of media items, without scrolling; and generates a tactile output corresponding to displaying the representation of the second item.
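An illustrative Swift sketch of the direction-dependent swipe handling for collections of media items described above; the state representation and names are assumptions for this example.

enum SwipeAxis {
    case first    // e.g. vertical: scroll the list of collections
    case second   // e.g. horizontal: step to another item within one collection
}

struct MediaBrowserState {
    var scrollOffset = 0
    var visibleItemIndex: [Int: Int] = [:]   // collection index -> displayed item index
    var tactileOutputs = 0                   // count of tactile outputs generated
}

func handleSwipe(on collection: Int,
                 axis: SwipeAxis,
                 itemCount: Int,
                 state: inout MediaBrowserState) {
    switch axis {
    case .first:
        // Scroll the representations of the collections.
        state.scrollOffset += 1
    case .second:
        // Replace the displayed item with the next item in the same
        // collection, without scrolling, and generate a tactile output.
        let current = state.visibleItemIndex[collection] ?? 0
        state.visibleItemIndex[collection] = (current + 1) % max(itemCount, 1)
        state.tactileOutputs += 1
    }
}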