Abstract:
A movable olfactory robot dog (1) includes an IMS unit (100) that acquires chemical substance-related information on chemical substances contained in external air (19) drawn in through left and right nostrils (12L) and (12R), and an event monitoring unit (30) that determines the occurrence of an event, and the direction of the event relative to the robot dog (1), based on changes in the chemical substance-related information respectively acquired at the left and right nostrils (12L) and (12R).
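The differential-sensing idea behind this abstract (an event and its side inferred from which nostril registers the larger concentration rise) can be sketched as follows; the function name, threshold, and two-sample interface are illustrative assumptions, not taken from the patent:

```python
def detect_event(left_prev, left_now, right_prev, right_now, threshold=0.1):
    """Infer an odor event and its rough direction from two nostril readings.

    A concentration rise beyond `threshold` at either nostril signals an
    event; the nostril with the larger rise indicates the event's side.
    Returns "left", "right", or None when no event occurred.
    """
    d_left = left_now - left_prev
    d_right = right_now - right_prev
    if max(d_left, d_right) < threshold:
        return None  # change too small to count as an event
    return "left" if d_left > d_right else "right"
```

For example, a sharp rise at the left nostril with little change at the right would yield `"left"`, steering the robot dog toward the odor source.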
Abstract:
A robot adapted to operate in association with an interface surface having disposed therein or thereon coded data indicative of a plurality of reference points of the interface surface, the robot comprising: movement means to allow the robot to move over the interface surface; a sensing device which senses at least some of the coded data and generates indicating data indicative of a position of the robot on the interface surface; communication means to transmit the indicating data to a computer system running a computer application, and to receive movement instructions from the computer application; and, a marking device adapted to selectively mark the interface surface in response to marking instructions received from the computer application.
Abstract:
A system for providing communication of position information between moving bodies (105, 105) navigating in proximity to each other. Messages can be communicated via the same system. Orientation information is provided by transmitting infrared digital signals that are specific to individual zones around the moving body; from the known relation between the positions of the zones and the specific signals, a receiving body can deduce an orientation. Distance information is provided by transmitting infrared digital signals from a transmitter at respective power levels, each signal comprising information identifying its power level; from the known relation between the range of each power level and the specific signals, the distance from a receiving body to the transmitter can be deduced. Direction information is provided by knowledge of the positions of the reception zones and the signals received.
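The power-level distance scheme described above can be sketched as follows: hearing a signal that identifies a given power level places the receiver within that level's range, and *not* hearing the next-weaker level places it beyond that level's range. The range table and all names are illustrative assumptions:

```python
# Nominal reach (in metres) of each IR power level; hypothetical values.
RANGE_BY_LEVEL = {1: 0.5, 2: 1.0, 3: 2.0, 4: 4.0}

def estimate_distance(received_levels):
    """Bound the distance to the transmitter from the set of power-level
    IDs decoded by the receiver. The weakest level heard gives the upper
    bound; the strongest level *not* heard gives the lower bound.
    Returns (lower_bound, upper_bound) in metres, or None if nothing
    was received."""
    if not received_levels:
        return None
    weakest = min(received_levels)
    upper = RANGE_BY_LEVEL[weakest]
    lower = 0.0 if weakest == 1 else RANGE_BY_LEVEL[weakest - 1]
    return (lower, upper)
```

For instance, a receiver that decodes levels 2, 3, and 4 but not level 1 must lie beyond the 0.5 m reach of level 1 yet within the 1.0 m reach of level 2.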
Abstract:
Systems and methods are disclosed herein for determining relative orientation between a self-propelled device and a mobile computing device by utilizing the asymmetric radiation pattern of communication link emissions by the self-propelled device. Upon establishing the communication link, the self-propelled device may perform a spin, thereby enabling the self-propelled device and/or the mobile computing device to detect radiated pulses due to the asymmetry in the link. A direction may be determined based on such pulses, which may be utilized for calibration purposes.
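The spin-based calibration above reduces to finding the heading at which the asymmetric link is strongest while the device rotates. A minimal sketch, assuming samples of (heading in degrees, signal strength) are collected during the spin:

```python
def heading_of_peak(samples):
    """Given (heading_degrees, signal_strength) pairs sampled while the
    self-propelled device spins, return the heading at which the link to
    the mobile computing device was strongest. The peak marks the lobe of
    the asymmetric radiation pattern pointing toward the other device."""
    return max(samples, key=lambda s: s[1])[0]
```

With RSSI-style readings such as `[(0, -70), (90, -55), (180, -68), (270, -72)]`, the peak at heading 90 would be taken as the calibration direction.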
Abstract:
A robotic device (200) includes a housing (202) configured to house a mobile device (210). The robotic device (200) also includes an articulating image director (204) aligned with a field of view of a camera of the mobile device (210). The housing (202) of the robotic device (200) is positioned at an angle to provide a forward view or rear facing view to the camera via the articulating image director (204).
Abstract:
There is provided a self-propelled electronic device for use on a base surface. The electronic device includes a chassis disposable over the base surface. The chassis includes a first surface facing the base surface when the chassis is disposed thereover. A light meter is configured to detect light incident toward the first surface and determine a luminance level of the detected light. A light source is configured to transition between ON and OFF states depending on the luminance level of the detected light. A line sensor is coupled to the chassis and is configured to sense a line segment on the base surface. A movement mechanism is coupled to the chassis and is placeable on the base surface. The movement mechanism is in operative communication with the line sensor to move on the base surface in a pattern corresponding to the line sensed on the base surface.
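The two control loops this abstract describes (a light source switched on the measured luminance, and a movement mechanism steered by the line sensor) can be sketched as follows; the threshold, the two-element sensor layout, and the command names are illustrative assumptions:

```python
def light_source_state(luminance, on_below=20.0):
    """Turn the light source ON when the luminance detected toward the
    first surface falls below a threshold (e.g. the chassis is over a
    dark region); threshold and units are hypothetical."""
    return "ON" if luminance < on_below else "OFF"

def steer(line_left, line_right):
    """Map a two-element line-sensor reading (True = element sees the
    line segment) to a steering command so the movement mechanism tracks
    the line on the base surface."""
    if line_left and line_right:
        return "forward"
    if line_left:
        return "turn_left"
    if line_right:
        return "turn_right"
    return "search"  # line lost; sweep until it is reacquired
```

Called once per control cycle, these two functions reproduce the abstract's behavior: the device follows the sensed line while its light source reacts to the luminance under the chassis.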
Abstract:
An environment identifying apparatus (400) is adapted to be mounted in a robot apparatus that moves in an identifiable unique environment in which a plurality of landmarks are located, so as to identify the current environment from among a plurality of registered environments. The environment identifying apparatus comprises an environment map building section (402) for recognizing the landmarks in the current environment, computing the movement/state quantity of the robot apparatus itself and building an environment map of the current environment containing information on the positions of the landmarks on the basis of the landmarks and the movement/state quantity; an environment map storage section (403) having a database of registered environment maps containing positional information on the landmarks and environment IDs; an environment identifying section (404) for identifying the current environment on the basis of the degree of similarity between the environment map of the current environment and each of the registered environment maps; and an environment exploring section (405) for exploring a new environment.
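The identification step above can be sketched as a similarity search over registered maps. The patent does not specify the degree-of-similarity computation, so the metric below (fraction of landmarks matched by ID and position) and all names and thresholds are illustrative assumptions:

```python
import math

def map_similarity(map_a, map_b, tol=0.5):
    """Fraction of landmarks in map_a that have a match in map_b: same
    landmark ID, position within `tol`. Maps are {landmark_id: (x, y)}.
    A stand-in for the patent's unspecified similarity measure."""
    if not map_a:
        return 0.0
    hits = sum(
        1 for lid, (x, y) in map_a.items()
        if lid in map_b and math.hypot(x - map_b[lid][0], y - map_b[lid][1]) <= tol
    )
    return hits / len(map_a)

def identify_environment(current, registered, min_sim=0.6):
    """Return the environment ID of the best-matching registered map, or
    None when no map is similar enough, in which case the exploring
    section would treat the surroundings as a new environment."""
    best_id, best = None, 0.0
    for env_id, env_map in registered.items():
        s = map_similarity(current, env_map)
        if s > best:
            best_id, best = env_id, s
    return best_id if best >= min_sim else None
```

Returning `None` is the hand-off point to the environment exploring section (405): an unrecognized landmark layout triggers exploration and registration of a new map.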
Abstract:
A method of controlling a robot (1102) having detection means (1103, 1104) for detecting an object (1109) in one of a number of zones relative to the robot, and processing means for selecting and performing a predetermined action in response to said detection, the action corresponding to the detected zone. The method comprises presenting to a user via a graphical user interface (1101) a number of area symbols (1106-1108) each representing a corresponding one of the zones relative to the robot; presenting via the graphical user interface a plurality of action symbols (1124-1127) each representing at least one respective action of the robot; receiving a user command indicating a placement of an action symbol in a predetermined relation to a first one of said area symbols corresponding to a first zone; and generating an instruction for controlling the robot to perform the corresponding action in response to detecting an object in the first zone.
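At run time, the GUI placements described above amount to a zone-to-action lookup: each placement of an action symbol on an area symbol binds that zone to that action. A minimal sketch, with all zone and action names hypothetical:

```python
def make_controller(bindings):
    """bindings: {zone_name: action_name}, one entry per placement of an
    action symbol on an area symbol in the GUI. Returns a function that
    maps the zone in which an object was detected to the action the robot
    should perform (None when the user bound nothing to that zone)."""
    def on_detection(zone):
        return bindings.get(zone)
    return on_detection
```

For example, `make_controller({"front": "reverse", "left": "turn_right"})` yields a controller that answers a detection in the front zone with the reverse action and ignores zones the user left unbound.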