Abstract:
Systems and methods are provided for displaying a portion of a map on a mobile device of a user while the user is traveling along a route. The mobile device can use a selected route and a current location of the device to load map tiles for parts of the map that are upcoming along the route. In this manner, the user can have quick access to the portions of the map that the user likely will want to view. For example, the map tiles can be loaded for the next 50 km, and then when the stored tiles reach only 25 km ahead, another 25 km of tiles can be retrieved. The number of tiles loaded (e.g., minimum and maximum amounts) can vary based on a variety of factors, such as network state, distance traveled along the route, and whether the mobile device is charging.
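The look-ahead prefetch policy described above can be sketched as follows. This is a minimal illustration in Python; the 50 km/25 km thresholds come from the abstract's example, while the function name and buffer representation are assumptions, not the patented implementation.

```python
# Illustrative sketch of look-ahead tile prefetching along a route.
# Thresholds follow the abstract's example; names are hypothetical.

LOOKAHEAD_KM = 50   # initially load tiles this far ahead
REFILL_AT_KM = 25   # when buffered tiles reach only this far, fetch more
REFILL_KM = 25      # amount of additional route to fetch tiles for

def tiles_to_fetch(current_km: float, buffered_until_km: float) -> tuple:
    """Return the (start_km, end_km) span of the route for which tiles
    should now be fetched, or (0.0, 0.0) if the buffer is sufficient."""
    if buffered_until_km == 0:
        # Initial load: prefetch the full look-ahead window.
        return (current_km, current_km + LOOKAHEAD_KM)
    remaining = buffered_until_km - current_km
    if remaining <= REFILL_AT_KM:
        # Buffer has shrunk to the refill threshold: top it up.
        return (buffered_until_km, buffered_until_km + REFILL_KM)
    return (0.0, 0.0)  # nothing to fetch yet
```

In practice the thresholds themselves would vary with the factors the abstract lists (network state, charging status, and so on) rather than being constants.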
Abstract:
Systems and methods for rendering 3D maps may highlight a feature in a 3D map while preserving depth. A map tool of a mapping or navigation application that detects the selection of a feature in a 3D map (e.g., by touch) may perform a ray intersection to determine the feature that was selected. The map tool may capture the frame to be displayed (with the selected feature highlighted) in several steps. Each step may translate the map about a pivot point of the selected map feature (e.g., in three or four directions) to capture a new frame. The captured frames may be blended together to create a blurred map view that depicts 3D depth in the scene. A crisp version of the selected feature may then be rendered within the otherwise blurred 3D map. Color, brightness, contrast, or saturation values may be modified to further highlight the selected feature.
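The capture-and-blend highlight described above can be sketched with plain lists of grayscale pixel values standing in for rendered frames. The averaging and masked compositing below are a simplified assumption of the approach, not the patent's rendering pipeline.

```python
# Hedged sketch: blend frames captured at small translations about the
# selected feature's pivot, then composite a crisp feature on top.
# Frames are flat lists of grayscale values; all names are illustrative.

def blend_frames(frames: list) -> list:
    """Average per-pixel across frames to produce the blurred map view."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

def composite(blurred: list, crisp: list, feature_mask: list) -> list:
    """Render the crisp selected feature within the otherwise blurred map:
    where the mask is True, take the crisp pixel; elsewhere, the blur."""
    return [c if m else b for b, c, m in zip(blurred, crisp, feature_mask)]
```

A real renderer would also adjust color, brightness, contrast, or saturation at the composite step, as the abstract notes.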
Abstract:
Devices, methods, and machine-readable media to facilitate intuitive comparison and selection of calculated navigation routes are disclosed. An electronic device for navigation includes a touch-sensitive screen and a processing module for displaying a map, calculating a number of navigation routes simultaneously on the touch-sensitive screen, and receiving a selection of a route. Callouts, or markers for presenting key information about each route, are also displayed discretely on the map. Navigation tiles including key route information and route pictorials can also be created and displayed for each calculated route.
Abstract:
At least certain embodiments of the present disclosure include an environment with a framework of software code interacting with a plurality of applications to provide gesture operations in response to user inputs detected on a display of a device. A method for operating through an application programming interface (API) in this environment includes displaying a user interface that includes a respective view that is associated with a respective application of the plurality of applications. The method includes, while displaying the respective view, detecting, via the software code, a user input within a region of the touch-sensitive surface that corresponds to the respective view, and, in response, in accordance with a determination that the user input is an inadvertent user input, ignoring the user input. The determination that the user input is an inadvertent user input is made based on an inadvertent user input call transferred through the API.
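The flow above can be sketched as follows: the framework detects the touch, then asks the application (through an API call) whether the input is inadvertent before delivering it. The class, predicate, and touch fields below are invented for illustration; the abstract does not specify the API's shape.

```python
# Hypothetical sketch of the inadvertent-input determination. The
# application supplies the predicate via the API; the framework consults
# it before delivering the touch to the view.

class View:
    def __init__(self, name, ignores=None):
        self.name = name
        # The "inadvertent user input call" transferred through the API.
        self._ignores = ignores or (lambda touch: False)

    def should_ignore(self, touch) -> bool:
        return self._ignores(touch)

def deliver(view: View, touch) -> str:
    """Framework-side dispatch: ignore inadvertent input, else deliver."""
    if view.should_ignore(touch):
        return "ignored"
    return "delivered to " + view.name
```

For example, an application might mark very large contact areas (e.g., a resting palm) as inadvertent while letting fingertip-sized touches through.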
Abstract:
A device, method, and graphical user interface for providing maps, directions, and location-based information on a touch screen display are disclosed.
Abstract:
Embodiments may include determining a navigation route between an origination and a destination; the route may span multiple portions of a map. Embodiments may also include receiving an order of priority in which to receive the multiple portions of the map; the order may be generated based on distinct levels of expected signal strength for each of the multiple portions. For instance, within the order of priority, map portions associated with areas of low signal strength may be ranked higher than areas of higher signal strength. Embodiments may also include acquiring at least some of the portions of the map according to the order of priority, and generating a map display comprising the multiple portions of the map. For instance, map portions associated with areas of poor reception may be downloaded first whereas map portions associated with strong signal strength may be downloaded on-the-fly during route navigation.
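The priority ordering described above reduces to sorting the route's map portions by expected signal strength, weakest first. A minimal sketch, assuming each portion carries a tile identifier and an expected signal level in dBm (both field names are assumptions):

```python
# Illustrative sketch: rank map portions so those covering low-signal
# areas are downloaded before the device reaches them; high-signal
# portions can be fetched on-the-fly. Field names are hypothetical.

def download_order(portions: list) -> list:
    """Return tile IDs sorted weakest expected signal first
    (more negative dBm = weaker signal = higher download priority)."""
    ranked = sorted(portions, key=lambda p: p["expected_signal_dbm"])
    return [p["tile_id"] for p in ranked]
```

Portions ranked low (strong expected signal) need not be prefetched at all, since they remain reachable over the network during navigation.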
Abstract:
Some embodiments of the invention provide a navigation application that allows a user to peek ahead or behind during a turn-by-turn navigation presentation that the application provides while tracking a device (e.g., a mobile device, a vehicle, etc.) traversal of a physical route. As the device traverses along the physical route, the navigation application generates a navigation presentation that shows a representation of the device on a map traversing along a virtual route that represents the physical route on the map. While providing the navigation presentation, the navigation application can receive user input to look ahead or behind along the virtual route. Based on the user input, the navigation application moves the navigation presentation to show locations on the virtual route that are ahead or behind the displayed current location of the device on the virtual route. This movement can cause the device representation to no longer be visible in the navigation presentation. Also, the virtual route often includes several turns, and the peek ahead or behind movement of the navigation presentation passes the presentation through one or more of these turns. In some embodiments, the map can be presented as a two-dimensional (2D) or a three-dimensional (3D) scene.
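The peek behavior above can be sketched as moving the presentation's focus to a point offset along the virtual route, and checking whether the device marker remains in view. The units, clamping, and viewport size below are illustrative assumptions:

```python
# Hedged sketch of peeking ahead/behind along a virtual route, with
# positions measured as distance along the route in km (an assumption).

def peek_target(route_length_km: float, current_km: float,
                offset_km: float) -> float:
    """Route position the presentation should show after a peek gesture;
    positive offset peeks ahead, negative peeks behind. Clamped to the
    route's extent."""
    target = current_km + offset_km
    return max(0.0, min(route_length_km, target))

def device_visible(target_km: float, current_km: float,
                   viewport_km: float = 2.0) -> bool:
    """Peeking far enough moves the device representation out of view."""
    return abs(target_km - current_km) <= viewport_km / 2
```

Because the focus can travel an arbitrary distance along the route, a single peek may carry the presentation through several upcoming (or previous) turns, as the abstract notes.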
Abstract:
Methods, systems and apparatus are described to dynamically generate map textures. A client device may obtain map data, which may include one or more shapes described by vector graphics data. Along with the one or more shapes, embodiments may include texture indicators linked to the one or more shapes. Embodiments may render the map data. For one or more shapes, a texture definition may be obtained. Based on the texture definition, a client device may dynamically generate a texture for the shape. The texture may then be applied to the shape to render a current fill portion of the shape. In some embodiments the rendered map view is displayed.
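The indicator-to-definition-to-texture flow above can be sketched as follows. The texture definitions, pattern scheme, and field names are invented examples; the abstract does not specify a definition format.

```python
# Hypothetical sketch: a shape carries a texture indicator; the client
# resolves it to a texture definition and generates the fill pattern at
# render time. Definitions and fields here are illustrative only.

TEXTURE_DEFINITIONS = {
    "stripes": {"colors": ["green", "white"]},
    "solid":   {"colors": ["blue"]},
}

def generate_texture(indicator: str, width: int) -> list:
    """Dynamically generate a one-row texture from its definition."""
    colors = TEXTURE_DEFINITIONS[indicator]["colors"]
    return [colors[x % len(colors)] for x in range(width)]

def fill_shape(shape: dict) -> dict:
    """Apply the generated texture as the shape's current fill."""
    shape["fill"] = generate_texture(shape["texture_indicator"],
                                     shape["width"])
    return shape
```

Generating textures on the client in this way avoids shipping rasterized fills with every shape; only the compact indicator travels with the vector data.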
Abstract:
Some embodiments provide a device that automatically orients and displays a map of a region according to the natural viewing orientation of the map. In some embodiments, the device examines data associated with the map to determine whether it can identify a natural viewing orientation of the map that differs from the geographic orientation of the map. When the device is able to identify such a natural viewing orientation, it displays the map according to this natural viewing orientation instead of the geographic orientation of the map. On the other hand, when the device is not able to identify a natural viewing orientation that differs from the geographic orientation, the device displays the map according to its geographic orientation. In some embodiments, the geographic orientation of the map is north-up orientation (where north is up (e.g., top center of the page), south is down, west is left, and east is right). In other embodiments, the geographic orientation of the map can be another orientation that is set by one of the geographic directions, such as south-up map orientation, where south is up, north is down, east is left and west is right.
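The orientation decision above is a simple fallback: use the natural viewing orientation when one can be identified from the map data, otherwise display at the geographic orientation. A minimal sketch, with the data field and degree convention as assumptions:

```python
# Illustrative sketch of the orientation choice. Orientations are in
# degrees clockwise from north-up (0.0); the field name is hypothetical.

def display_orientation(map_data: dict, geographic_deg: float = 0.0) -> float:
    """Rotation at which to display the map: the identified natural
    viewing orientation if it exists and differs from the geographic
    orientation, else the geographic orientation itself."""
    natural = map_data.get("natural_orientation_deg")  # None if unidentified
    if natural is not None and natural != geographic_deg:
        return natural
    return geographic_deg
```

With the default north-up convention, a theme-park map whose data yields a natural orientation of 35° would be shown rotated, while a map with no identifiable natural orientation stays north-up.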