Abstract:
One or more techniques and/or systems are provided for monitoring interactions by an input object with an interactive interface projected onto an interface object. That is, an input object (e.g., a finger) and an interface object (e.g., a wall, a hand, a notepad, etc.) may be identified and tracked in real-time using depth data (e.g., depth data extracted from images captured by a depth camera). An interactive interface (e.g., a calculator, an email program, a keyboard, etc.) may be projected onto the interface object, such that the input object may be used to interact with the interactive interface. For example, the input object may be tracked to determine whether the input object is touching or hovering above the interface object and/or a projected portion of the interactive interface. If the input object is in a touch state, then a corresponding event associated with the interactive interface may be invoked.
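The touch/hover distinction above can be sketched as a comparison between the input object's depth reading and the interface object's surface depth. This is a minimal illustration, not the patented method: the function name, threshold values, and state labels are assumptions for the example.

```python
def classify_input_state(fingertip_depth_mm, surface_depth_mm,
                         touch_threshold_mm=5.0, hover_threshold_mm=50.0):
    """Classify an input object's state relative to a projected surface.

    Depths are distances from the depth camera, so the fingertip sits
    above the surface when its depth reading is smaller.
    """
    gap = surface_depth_mm - fingertip_depth_mm
    if gap < 0:
        return "occluded"   # reading behind the surface: likely sensor noise
    if gap <= touch_threshold_mm:
        return "touch"      # close enough to invoke an interface event
    if gap <= hover_threshold_mm:
        return "hover"
    return "away"
```

A touch event would then be dispatched only when the classifier returns `"touch"`, with hysteresis typically added in practice to avoid flicker near the threshold.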
Abstract:
Architecture that combines multiple depth cameras and multiple projectors to cover a specified space (e.g., a room). The cameras and projectors are calibrated, allowing the development of a multi-dimensional (e.g., 3D) model of the objects in the space, as well as the ability to project graphics in a controlled fashion on the same objects. The architecture incorporates the depth data from all depth cameras, as well as color information, into a unified multi-dimensional model in combination with calibrated projectors. In order to provide visual continuity when transferring objects between different locations in the space, the user's body can provide a canvas on which to project this interaction. As the user moves body parts in the space, without any other object, the body parts can serve as temporary “screens” for “in-transit” data.
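Projecting graphics onto objects sensed by a calibrated depth camera reduces to mapping a 3-D point from camera coordinates into projector pixel coordinates. A minimal sketch, assuming a standard pinhole projector model with known extrinsics (`R`, `t`) and intrinsics (`fx`, `fy`, `cx`, `cy`); the specific calibration pipeline in the patent is not described here.

```python
def project_point(point_cam, R, t, fx, fy, cx, cy):
    """Map a 3-D point in camera space to a projector pixel (u, v)."""
    x, y, z = point_cam
    # Rigid transform: rotate and translate into the projector's frame.
    xp = R[0][0] * x + R[0][1] * y + R[0][2] * z + t[0]
    yp = R[1][0] * x + R[1][1] * y + R[1][2] * z + t[1]
    zp = R[2][0] * x + R[2][1] * y + R[2][2] * z + t[2]
    # Pinhole projection into projector pixel coordinates.
    u = fx * xp / zp + cx
    v = fy * yp / zp + cy
    return u, v
```

With multiple cameras and projectors, each device pair carries its own `(R, t)` from the joint calibration, and the unified 3-D model is expressed in one shared world frame.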
Abstract:
Various technologies described herein pertain to sensing cardiovascular risk factors of a user. A chair includes one or more sensors configured to output signals indicative of conditions at site(s) on a body of a user. A seat of the chair, a back of the chair, and/or arms of the chair can include the sensor(s). Moreover, the chair includes a collection circuit configured to receive the signals from the sensor(s). A risk factor evaluation component is configured to detect a pulse wave velocity of the user based on the signals from the sensor(s). The risk factor evaluation component is further configured to perform a pulse wave analysis of the user based on a morphology of a pulse pressure waveform of the user, and the pulse pressure waveform is detected based on the signals from the sensor(s).
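Pulse wave velocity itself is conventionally computed as the path length between two sensor sites divided by the pulse transit time between them. A minimal sketch under that textbook definition (the patent's actual signal processing from the chair sensors is not reproduced here):

```python
def pulse_wave_velocity(arrival_proximal_s, arrival_distal_s, path_length_m):
    """PWV (m/s) = distance between sensor sites / pulse transit time.

    Arrival times are the detected pulse onsets at each site, in seconds.
    """
    transit_time_s = arrival_distal_s - arrival_proximal_s
    if transit_time_s <= 0:
        raise ValueError("distal arrival must follow proximal arrival")
    return path_length_m / transit_time_s
```

In a chair-based setup, the proximal and distal sites might be the back and an armrest; higher PWV values are associated with stiffer arteries.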
Abstract:
Three-dimensional (3-D) spatial image data may be received that is associated with at least one arm motion of an actor based on free-form movements of at least one hand of the actor, based on natural gesture motions of the at least one hand. A plurality of sequential 3-D spatial representations that each include 3-D spatial map data corresponding to a 3-D posture and position of the hand at sequential instances of time during the free-form movements may be determined, based on the received 3-D spatial image data. An integrated 3-D model may be generated, via a spatial object processor, based on incrementally integrating the 3-D spatial map data included in the determined sequential 3-D spatial representations and comparing a threshold time value with model time values indicating numbers of instances of time spent by the hand occupying a plurality of 3-D spatial regions during the free-form movements.
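The incremental integration step above amounts to counting, per 3-D spatial region, how many time instances the hand occupied that region, then keeping regions whose count meets a threshold. A simplified voxel-grid sketch (cell size, data layout, and the frame-based counting are assumptions for illustration):

```python
from collections import Counter

def integrate_hand_frames(frames, cell_size, threshold_count):
    """Integrate per-frame hand point sets into voxel occupancy counts.

    `frames` is a sequence of point lists, one list per instant of time;
    each point is an (x, y, z) tuple. Returns the set of voxel cells the
    hand occupied in at least `threshold_count` frames.
    """
    counts = Counter()
    for points in frames:
        # Count each cell at most once per frame, regardless of point density.
        cells = {tuple(int(c // cell_size) for c in p) for p in points}
        counts.update(cells)
    return {cell for cell, n in counts.items() if n >= threshold_count}
```

Thresholding on dwell time lets deliberate, repeated hand positions accumulate into the model while transient pass-through motion is discarded.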
Abstract:
Various technologies described herein pertain to controlling performance of a health assessment of a user in an entertainment venue. Data in a health record of the user is accessed, where the health record is retained in computer-readable storage. The user is located at the entertainment venue, and the entertainment venue includes an attraction. A health parameter of the user to be measured as part of the health assessment performed in the entertainment venue is selected based on the data in the health record of the user. Further, an interaction between the user and the attraction of the entertainment venue is controlled based on the health parameter to be measured. Data indicative of the health parameter of the user is computed based on a signal output by a sensor. The signal is output by the sensor during the interaction between the user and the attraction of the entertainment venue.
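Selecting a health parameter from record data can be sketched as a simple rule lookup. The field names, rules, and parameter labels below are purely illustrative assumptions, not taken from the patent:

```python
def select_health_parameter(health_record):
    """Choose which parameter to measure at the venue from record data.

    `health_record` is a dict of illustrative, hypothetical fields.
    """
    if health_record.get("family_history_hypertension"):
        return "blood_pressure"
    if health_record.get("bmi", 0) >= 30:
        return "heart_rate_recovery"
    return "resting_heart_rate"
```

The selected parameter would then drive how the attraction interaction is controlled, e.g. pacing an active ride segment so a recovery measurement can be taken afterward.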
Abstract:
Various technologies described herein pertain to adjusting recommended dosages of a medication for a user in a non-clinical environment. The medication can be identified and an indication of a symptom of the user desirably managed by the medication can be received. An initial recommended dosage of the medication can be determined based on static data of the user and the symptom. Dynamic data indicative of efficacy of the medication for the user over time in the non-clinical environment can be collected from sensor(s) in the non-clinical environment. The dynamic data indicative of the efficacy of the medication can include data indicative of the symptom and data indicative of a side effect experienced by the user resulting from the medication. A subsequent recommended dosage of the medication can be refined based on the static data of the user and the dynamic data indicative of the efficacy of the medication for the user.
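The refinement step can be sketched as a feedback rule over sensed efficacy data: back off when side effects dominate, step up when the symptom persists, otherwise hold. Score ranges, thresholds, step sizes, and the cap are illustrative assumptions, not values from the patent:

```python
def refine_dosage(current_dose_mg, symptom_score, side_effect_score,
                  step_mg=5.0, max_dose_mg=40.0):
    """Refine a recommended dose from sensed efficacy data.

    Scores are assumed normalized to [0, 1]: higher symptom_score means
    the symptom persists; higher side_effect_score means worse tolerance.
    """
    if side_effect_score > 0.5:
        return max(current_dose_mg - step_mg, 0.0)          # reduce: poorly tolerated
    if symptom_score > 0.5:
        return min(current_dose_mg + step_mg, max_dose_mg)  # increase: under-treated
    return current_dose_mg                                  # hold: effective and tolerated
```

In the described setting, the two scores would be derived from the sensor-collected dynamic data, and any recommendation would remain subject to clinical limits set with the prescriber.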