Abstract:
In an illustrative embodiment a display control device is provided. The display control device includes a display control section to control display of a virtual object according to a positional relationship between an imaging device and a display device.
Abstract:
Provided is an information processing device including an image acquisition unit configured to acquire a captured image, a change detection unit configured to detect a change in a state of a subject in a network service recognized from the captured image, and a depiction change unit configured to change a depiction of the subject shown in the captured image in a case where the change detection unit detects the change in the state.
Abstract:
According to an illustrative embodiment, an information processing apparatus is provided. The information processing apparatus includes a communication device to receive plural pieces of tag information corresponding to respective positions within a target area, the target area having a position defined by the position of the apparatus; and an output device to output a plurality of sounds such that for each sound at least a portion of the sound overlaps with at least a portion of another of the sounds, each of the sounds being indicative of a respective piece of tag information.
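The overlapping playback described above can be sketched as a scheduling problem: each tag sound must begin before the previous one ends. The following is a minimal illustration, assuming each sound has a known duration and a fixed overlap fraction; the function and parameter names are hypothetical, not from the abstract.

```python
# Hypothetical sketch of overlapping-playback scheduling: given tag sounds
# (name, duration in seconds), choose start times so that each sound begins
# before the previous one finishes. All names/values are illustrative.

def schedule_overlapping(sounds, overlap=0.3):
    """Return (name, start, end) tuples where each sound starts before the
    previous one ends, so portions of the sounds overlap during playback."""
    schedule = []
    t = 0.0
    for name, duration in sounds:
        schedule.append((name, t, t + duration))
        # the next sound starts `overlap * duration` seconds before this one ends
        t += duration * (1.0 - overlap)
    return schedule

if __name__ == "__main__":
    tags = [("cafe", 2.0), ("museum", 1.5), ("station", 2.5)]
    for name, start, end in schedule_overlapping(tags):
        print(f"{name}: {start:.2f}s - {end:.2f}s")
```

With the default overlap of 0.3, each sound's final 30% plays simultaneously with the start of the next, which matches the requirement that at least a portion of each sound overlaps another.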
Abstract:
[Object] To provide a client apparatus, control method, system, and program capable of presenting emotion expression data that shows the viewing reactions of other users or of the user himself or herself. [Solution] There is provided a client apparatus including: an acquisition unit that acquires a viewing user's reaction to content; an emotion estimation unit that estimates the viewing user's emotion based on the reaction information acquired by the acquisition unit; a determination unit that determines emotion expression data representing the emotion estimated by the emotion estimation unit; and an output unit that outputs the emotion expression data determined by the determination unit.
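The four claimed units form a pipeline: acquisition, estimation, determination, output. A minimal sketch of that flow follows; the reaction features, emotion labels, and expression table are all assumptions for illustration, since the abstract does not specify them.

```python
# Minimal sketch of the claimed pipeline (acquisition -> estimation ->
# determination -> output). Feature names, thresholds, and the emoji
# table are illustrative assumptions, not taken from the patent.

def acquire_reaction(raw_event):
    """Acquisition unit: extract reaction info from a raw viewing event."""
    return {"smile_score": raw_event.get("smile_score", 0.0),
            "voice_volume": raw_event.get("voice_volume", 0.0)}

def estimate_emotion(reaction):
    """Emotion estimation unit: map reaction info to an emotion label."""
    if reaction["smile_score"] > 0.5:
        return "joy"
    if reaction["voice_volume"] > 0.8:
        return "surprise"
    return "neutral"

def determine_expression(emotion):
    """Determination unit: choose emotion expression data (here, an emoticon)."""
    table = {"joy": ":D", "surprise": ":O", "neutral": ":|"}
    return table[emotion]

def output_expression(expression):
    """Output unit: present the expression data alongside the content."""
    return f"[reaction] {expression}"

if __name__ == "__main__":
    event = {"smile_score": 0.9}
    print(output_expression(determine_expression(
        estimate_emotion(acquire_reaction(event)))))
    # prints "[reaction] :D"
```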
Abstract:
There is provided an information processing apparatus that controls display of a virtual object displayed in an extended work space in which both a real object and the virtual object are operable, the information processing apparatus including: an operation deciding unit configured to decide an operation process for the virtual object displayed in the extended work space on the basis of a result of analyzing information input to the extended work space, the analysis being based on position information of an information terminal detected in the extended work space and on display control trigger information for changing display of the virtual object; and a display control unit configured to execute a display control process for the virtual object on the basis of the decided operation process.
Abstract:
There is provided an information processing apparatus including a processing unit configured to control combining of a captured image and an operation target image so as to generate a combined image for feeding back gesture recognition to a user. A degree of visualization of the captured image appears to be changed in the combined image.
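Changing the "degree of visualization" of the captured image in a combined image is naturally expressed as an alpha blend whose weight can vary. The sketch below illustrates this idea on one grayscale scanline; the function name, the single blending weight, and the grayscale simplification are assumptions, not details from the abstract.

```python
# Sketch of varying the captured image's degree of visualization when
# combining it with an operation-target image: a per-pixel alpha blend.
# Grayscale rows and the single `visibility` weight are assumptions.

def combine(captured_row, target_row, visibility):
    """Blend one row of grayscale pixels. visibility=1.0 shows the captured
    image fully; visibility=0.0 shows only the operation-target image."""
    return [round(visibility * c + (1.0 - visibility) * t)
            for c, t in zip(captured_row, target_row)]
```

Raising `visibility` during gesture input and lowering it afterwards would give the appearance, described above, of the captured image's degree of visualization changing within the combined image.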
Abstract:
A method is provided for enabling sharing of data. The method comprises defining a sharing region corresponding to a portion of a surface of an apparatus. The method further comprises enabling sharing of data with an external device in response to user input associating the data with the sharing region.
Abstract:
PROBLEM TO BE SOLVED: To provide a gesture recognition device, a gesture recognition method, and a program capable of recognizing a gesture that blocks the sensor surface of an imaging sensor, without using a special device.
SOLUTION: A gesture recognition device 1 for recognizing a gesture that blocks the front surface of an imaging sensor 3 comprises: a first detection unit (a feature point detection section 15 and a feature point processing section 17) that detects changes between the captured images Pa and Pb in the unblocked and blocked states of the front surface of the imaging sensor; and a second detection unit (a histogram calculation section 21 and a histogram processing section 23) that detects, in the captured image taken while the front surface of the imaging sensor is blocked, an area in which the slope of the luminance values is less than a threshold.
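The second detection unit's idea can be illustrated on a single scanline: find the runs of pixels where the luminance slope stays below a threshold (e.g. the flat, dark area where a hand covers the sensor). This 1-D simplification and the threshold value are assumptions for illustration only.

```python
# Sketch of the second detection unit's idea on one scanline: find maximal
# runs where the absolute luminance slope between neighbouring pixels is
# below a threshold. The 1-D simplification is an assumption.

def flat_region(luminance, threshold):
    """Return (start, end) index pairs of maximal runs where the absolute
    luminance slope between neighbours is below `threshold`."""
    regions = []
    start = None
    for i in range(len(luminance) - 1):
        slope = abs(luminance[i + 1] - luminance[i])
        if slope < threshold:
            if start is None:
                start = i          # a flat run begins here
        else:
            if start is not None:
                regions.append((start, i))   # the run ended at pixel i
                start = None
    if start is not None:
        regions.append((start, len(luminance) - 1))
    return regions
```

On a row such as `[10, 12, 11, 80, 81, 82]` with threshold 5, the sharp jump at index 3 splits the line into two flat regions, the first of which could correspond to the blocked (shadowed) area.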
Abstract:
PROBLEM TO BE SOLVED: To improve the operability of a GUI displayed as a virtual three-dimensional space.
SOLUTION: An information processing device 100 comprises: a display unit 110 that displays an object 500p in a virtual three-dimensional space including the depth direction of a display screen 112; an operation unit 120 that acquires an operation for moving the object 500p; and a control unit 130 that moves the display of the object 500p according to the acquired operation, executes first processing in a first state where the object 500p overlaps a first overlap determination area 510a set near a display area 502a of another object 500a displayed on the display screen 112, and executes second processing, different from the first processing, in a second state where the object 500p overlaps a second overlap determination area 520a set near the display area 502a. The first overlap determination area 510a is an area obtained by extending the second overlap determination area 520a in at least the depth direction.
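The two-stage overlap test above can be sketched with axis-aligned boxes: the first area is the second area extended along the depth (z) axis, so an object triggers the first processing while still distant in depth and the second processing only once it is close. The `Box` class, coordinates, and margin are illustrative assumptions.

```python
# Sketch of the two-stage overlap test: the first area extends the second
# area in the depth (z) direction only. Box geometry is an assumption.
from dataclasses import dataclass

@dataclass
class Box:
    x0: float; x1: float
    y0: float; y1: float
    z0: float; z1: float

    def contains(self, x, y, z):
        return (self.x0 <= x <= self.x1 and
                self.y0 <= y <= self.y1 and
                self.z0 <= z <= self.z1)

def extend_in_depth(box, margin):
    """Build the first overlap area by extending the second overlap
    determination area along the depth axis only."""
    return Box(box.x0, box.x1, box.y0, box.y1,
               box.z0 - margin, box.z1 + margin)

def classify(pos, second_area, depth_margin):
    """Decide which processing applies for an object at `pos`."""
    first_area = extend_in_depth(second_area, depth_margin)
    if second_area.contains(*pos):
        return "second processing"
    if first_area.contains(*pos):
        return "first processing"
    return "no overlap"
```

Because the first area is strictly larger in depth, an object moving toward another object's display area passes through "first processing" before reaching "second processing", which is the graded feedback the abstract describes.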
Abstract:
PROBLEM TO BE SOLVED: To provide a gesture recognition apparatus, a gesture recognition method, and a program that can give appropriate gesture feedback using forecast information about a gesture.
SOLUTION: A gesture recognition apparatus includes: a recognition processing unit 17 that recognizes a gesture based on a series of gesture information items input within a predetermined input period; a gesture forecasting unit 19 that forecasts a gesture based on the gesture information item currently being input, among the series of gesture information items; and a forecast information notifying unit 21 that notifies a user of forecast information about the forecast result of the gesture. By continuing to input gesture information in accordance with the notified forecast information, a user U can confirm which gesture can be recognized.
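One common way to realize such forecasting is prefix matching: as gesture information items arrive, report every registered gesture whose sequence still begins with what has been entered. The gesture names and stroke encoding below are assumptions for illustration; the abstract does not specify a matching method.

```python
# Sketch of gesture forecasting by prefix matching: match the gesture
# information entered so far (here, a sequence of stroke directions)
# against registered gestures. Names and strokes are assumptions.

GESTURES = {
    "swipe-right": ["R", "R", "R"],
    "circle":      ["R", "D", "L", "U"],
    "zigzag":      ["R", "D", "R", "D"],
}

def forecast(partial):
    """Return names of gestures whose stroke sequence starts with `partial`,
    i.e. the gestures that can still be recognized from the input so far."""
    n = len(partial)
    return sorted(name for name, strokes in GESTURES.items()
                  if strokes[:n] == partial)
```

After one "R" stroke all three gestures remain possible; after "R", "D" the forecast narrows to `circle` and `zigzag`, which is the kind of progressive feedback the notifying unit would present.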