Abstract:
A robot for marking an encoded surface is provided. The encoded surface has coded data identifying a plurality of locations thereon. The robot has an image sensor for sensing the coded data, and a processor for generating indicating data using the coded data sensed by the image sensor. The indicating data includes data regarding a position of the robot on the encoded surface. The robot uses a communication means to transmit the indicating data to a computer system and to receive instructions from the computer system. A steerable drive system moves the robot over the encoded surface in response to movement instructions received from the computer system, and a marking device selectively marks the encoded surface in response to marking instructions received from the computer system.
Abstract:
A robot for marking an interface surface is provided. The interface surface has coded data identifying a plurality of locations on the interface surface printed thereon. The robot comprises: an image sensor for sensing at least some of the coded data; a processor for generating indicating data using the sensed coded data, the indicating data comprising data regarding a position of the robot on the interface surface; communication means for transmitting the indicating data to a computer system and receiving instructions from the computer system; a steerable drive system for moving the robot over the interface surface in response to movement instructions received from the computer system; and a marking device for selectively marking the interface surface in response to marking instructions received from the computer system.
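The closed loop described above (sense the coded data, report a decoded position, receive instructions, then move or mark) can be sketched roughly as follows. All function and class names, and the message format, are illustrative assumptions, not the patent's actual implementation.

```python
def decode_position(coded_data):
    """Decode an (x, y) location from sensed surface coding (stub)."""
    return coded_data["x"], coded_data["y"]

def control_step(coded_data, computer):
    """One cycle of the robot's sense -> report -> act loop."""
    # Generate indicating data from the coded data sensed by the image sensor.
    position = decode_position(coded_data)
    # Transmit the indicating data and receive instructions back.
    instructions = computer.exchange({"position": position})
    # Act on any movement and marking instructions received.
    actions = []
    if "move" in instructions:
        actions.append(("drive", instructions["move"]))
    if "mark" in instructions:
        actions.append(("mark", instructions["mark"]))
    return position, actions

class FakeComputer:
    """Stand-in for the remote computer system (assumption for the sketch)."""
    def exchange(self, indicating_data):
        return {"move": (1.0, 0.0), "mark": "dot"}
```

In this sketch the robot is a thin client: all path planning and marking decisions live in the computer system, and the robot only localizes itself and executes instructions.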
Abstract:
An apparatus and method for controlling a mobile body that travels around a sound source that generates sound. The apparatus includes a traveling information producer for producing traveling information, which is information about the traveling of the mobile body; a direction estimator for estimating a direction in which the mobile body is located with respect to the sound source; and a position determiner for determining a position of the mobile body using the traveling information and the estimated direction of the mobile body.
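Combining traveling information with direction estimates can be illustrated by a simple triangulation: if the body travels a known straight-line distance between two bearing measurements of the source, the law of sines yields its range from the source. This is a minimal sketch of the general idea, under assumed names, not the patent's actual position determiner.

```python
import math

def range_from_bearings(d, theta1, theta2):
    """Range to the sound source at the second pose.

    d       -- straight-line distance traveled between the two measurements
    theta1  -- bearing to the source (relative to travel direction) at the start
    theta2  -- bearing to the source at the end

    Law of sines in the triangle (start, end, source):
    range / sin(theta1) = d / sin(theta2 - theta1).
    """
    return d * math.sin(theta1) / math.sin(theta2 - theta1)
```

The range plus the second bearing fixes the body's position relative to the source; a real system would fuse many such measurements to suppress odometry and bearing noise.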
Abstract:
An environment identifying apparatus (400) is adapted to be mounted in a robot apparatus that moves in an identifiable unique environment in which a plurality of landmarks are located, so as to identify the current environment from among a plurality of registered environments. The environment identifying apparatus comprises: an environment map building section (402) for recognizing the landmarks in the current environment, computing the movement/state quantity of the robot apparatus itself, and building an environment map of the current environment containing information on the positions of the landmarks on the basis of the landmarks and the movement/state quantity; an environment map storage section (403) having a database of registered environment maps containing positional information on the landmarks and environment IDs; an environment identifying section (404) for identifying the current environment on the basis of the degree of similarity between the environment map of the current environment and each of the registered environment maps; and an environment exploring section (405) for exploring a new environment.
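A toy version of the similarity-based identification step might score each registered map by the fraction of current landmarks lying near some registered landmark, then pick the best-scoring environment ID. The matching radius, data layout, and names below are assumptions for illustration, not the patent's actual similarity measure.

```python
def similarity(current_map, registered_map, radius=0.5):
    """Fraction of current landmarks within `radius` of a registered landmark."""
    matched = 0
    for lx, ly in current_map:
        if any((lx - rx) ** 2 + (ly - ry) ** 2 < radius ** 2
               for rx, ry in registered_map):
            matched += 1
    return matched / len(current_map)

def identify_environment(current_map, database):
    """Return (environment ID with the highest similarity, its score)."""
    best_id, best_score = None, 0.0
    for env_id, registered_map in database.items():
        score = similarity(current_map, registered_map)
        if score > best_score:
            best_id, best_score = env_id, score
    return best_id, best_score
```

A low best score would be the cue for the exploring section to treat the surroundings as a new, unregistered environment.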
Abstract:
A robot apparatus 1 charges itself autonomously. An electrical charging device 100 is provided with two markers, namely a main marker 118 and a sub-marker 119, whose heights are pre-stored in the robot apparatus. When the robot apparatus 1 needs to find the direction and distance to the electrical charging device 100, a CCD camera 20 finds the direction vector of a marker from the photographed image. This direction vector is transformed into a position vector in the camera coordinate system {c} and further into a position vector in the robot coordinate system {b}. The coordinate in the height-wise direction in the robot coordinate system {b} is compared with the pre-stored height to find the distance between the markers and the robot apparatus and the direction of the robot apparatus.
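The geometry behind this can be sketched simply: rotate the camera-frame direction vector into the robot frame, then scale the ray so its height coordinate matches the pre-stored marker height; the scaled ray gives the marker's position and ground distance. The rotation matrix, camera height, and function names here are assumptions, not the patent's actual computation.

```python
import math

def locate_marker(dir_cam, R_bc, cam_height, marker_height):
    """Locate a marker of known height from a unit direction vector.

    dir_cam       -- direction vector of the marker in camera frame {c}
    R_bc          -- 3x3 rotation taking {c} vectors into robot frame {b}
    cam_height    -- camera height above the ground in {b}
    marker_height -- pre-stored height of the marker
    """
    # Rotate the direction vector from {c} into {b}.
    dx = sum(R_bc[0][i] * dir_cam[i] for i in range(3))
    dy = sum(R_bc[1][i] * dir_cam[i] for i in range(3))
    dz = sum(R_bc[2][i] * dir_cam[i] for i in range(3))
    # The ray starts at the camera height; solve cam_height + t*dz == marker_height.
    t = (marker_height - cam_height) / dz
    x, y = t * dx, t * dy
    # Marker position in {b}, and its horizontal (ground) distance.
    return (x, y, marker_height), math.hypot(x, y)
```

Doing this for both the main marker and the sub-marker fixes not only the distance but also the charging device's orientation relative to the robot.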
Abstract:
A charging system for a mobile robot includes a battery-driven mobile robot that moves in a self-controlled way within a work space, and a charging station that accommodates the mobile robot for a battery charging operation. The charging system includes visible recognition data arranged at a predetermined location on the charging station, an image pickup unit mounted on the mobile robot, a calculating unit for calculating a range and a bearing from the mobile robot to the charging station based on an image picked up by the image pickup unit, and a searching unit for causing the mobile robot to search for the charging station based on the calculation result provided by the calculating unit. Since the mobile robot searches for the charging station using a camera to recognize the visible recognition data, the charging operation is automated.
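A minimal pinhole-camera sketch of the range-and-bearing calculation: range follows from similar triangles on the marker's known physical width versus its apparent pixel width, and bearing from the marker's horizontal offset from the image center. Parameter names and values are assumptions for illustration, not the patent's calculating unit.

```python
import math

def range_and_bearing(pix_x, pix_width, img_center_x, focal_px, marker_width_m):
    """Estimate range (m) and bearing (rad) to the station's visible marker.

    pix_x          -- horizontal pixel position of the marker center
    pix_width      -- apparent width of the marker in pixels
    img_center_x   -- horizontal pixel position of the image center
    focal_px       -- camera focal length expressed in pixels
    marker_width_m -- known physical width of the recognition marker
    """
    rng = focal_px * marker_width_m / pix_width          # similar triangles
    bearing = math.atan2(pix_x - img_center_x, focal_px)  # offset from optical axis
    return rng, bearing
```

The searching unit can then steer to zero the bearing while driving until the range is small enough to dock.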
Abstract:
A vehicle is controlled by a sensor such as an EyeTap device or a headworn camera, so that the vehicle drives in whatever direction the driver looks. The vehicle may be a small radio-controlled car, airplane, or helicopter driven or flown by a person outside it, or it may be a car, plane, helicopter, or the like driven or flown by a person sitting inside it. A differential direction system compares the person's head orientation with the orientation of the vehicle so as to bring the difference in orientations to zero, and a near-zero difference may be endowed with a deliberate drift toward a zero difference. Preferably at least one of the sensors (preferably a headworn sensor) is a video camera. Preferably the sensor difference drifts toward zero when the person is traveling along a straight path, so that the head position for going straight ahead does not drift away from being straight ahead. The invention can be used with a wide range of toy cars, model aircraft, and full-size vehicles, airplanes, fighter jets, and the like.
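The differential direction idea (steer on the head/vehicle yaw difference, while letting a bias absorb residual difference during straight travel so "straight ahead" stays calibrated) can be sketched as a tiny controller. Gains, the straight-path detector, and all names are assumptions, not the patent's control law.

```python
class DifferentialSteering:
    """Steering command proportional to the head/vehicle yaw difference."""

    def __init__(self, gain=0.5, drift=0.02):
        self.gain = gain    # steering gain on the yaw difference
        self.drift = drift  # how fast the bias absorbs residual difference
        self.bias = 0.0     # learned offset for "looking straight ahead"

    def command(self, head_yaw, vehicle_yaw, going_straight):
        diff = head_yaw - vehicle_yaw - self.bias
        if going_straight:
            # Deliberate drift toward zero difference: slowly fold any
            # residual difference into the bias during straight travel.
            self.bias += self.drift * diff
        return self.gain * diff
```

Without the drift term, small calibration errors in the headworn sensor would accumulate into a permanent steering offset.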
Abstract:
In an aspect, a robotic system is provided that includes at least two digital servo modules, each of which includes a position-controlled motor and a position sensor for sensing a servo position; a plurality of building block elements that are connectable with the digital servo modules to create position-controlled joints of a robotic figure; at least two wheel modules enabling wheeled movement of the robotic figure; and a central controller communicating with and controlling the digital servo modules and the wheel modules. The central controller operatively places a selected group of digital servo modules in a learned motion mode, wherein a corresponding group of position-controlled joints can be manually manipulated and each of the selected digital servo modules periodically transmits its servo position to the central controller. The central controller can steer the robotic figure's wheel modules based on the servo positions of the selected digital servo modules.
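The learned motion mode amounts to recording the manually posed joint positions each reporting period and later driving the servos back through the recorded trajectory. The sketch below assumes simple stand-in classes; it is not the product's firmware interface.

```python
class ServoModule:
    """Stand-in for a position-controlled digital servo (names assumed)."""

    def __init__(self):
        self.position = 0.0

    def read_position(self):
        return self.position

    def drive_to(self, target):
        self.position = target

def record_motion(servos, pose_source, steps):
    """Learned motion mode: manually posed joints report positions each tick."""
    trajectory = []
    for t in range(steps):
        # Manual manipulation moves each joint; pose_source simulates the hand.
        for servo, pose in zip(servos, pose_source(t)):
            servo.position = pose
        trajectory.append([servo.read_position() for servo in servos])
    return trajectory

def replay_motion(servos, trajectory):
    """Drive the joints back through the recorded frames."""
    for frame in trajectory:
        for servo, target in zip(servos, frame):
            servo.drive_to(target)
```

The same periodic position reports let the controller reuse a manipulated joint as a steering input for the wheel modules.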
Abstract:
A robotic creature system includes an actuatable head defining eyes, an eyelid mechanism, a body, a drivetrain, and a set of sensors. A method for robotic creature operation includes: detecting an event associated with a technological imperfection and automatically performing a set of expressive actions associated with the event.
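The event-to-behavior mapping described can be sketched as a small lookup from detected imperfection events to sets of expressive actions. The specific event names and actions below are illustrative assumptions; the abstract does not enumerate them.

```python
# Hypothetical mapping from detected technological imperfections to
# expressive actions (event and action names are assumptions).
EXPRESSIVE_ACTIONS = {
    "low_battery":    ["droop_eyelids", "slow_drivetrain"],
    "sensor_dropout": ["blink", "tilt_head"],
}

def on_event(event):
    """Return the expressive actions for a detected imperfection, if any."""
    return EXPRESSIVE_ACTIONS.get(event, [])
```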