Abstract:
A robot has a base with a moving mechanism configured to move the robot according to control instructions, a controller configured to move the robot within a predetermined area along a pathway area, a body supported on the base, and a camera mounted on the body and controlled to capture photo images while the robot travels along the pathway area. The robot is controlled to rotate its body about a point in the pathway area and use the camera to take a 360-degree photo image of the surroundings at that location. The photo image is analyzed to identify information related to the location.
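A minimal sketch of the capture step described above, assuming hypothetical `camera` and `rotate` interfaces (the abstract does not specify them): the body turns in equal increments about a point and collects frames covering a full 360 degrees before analysis.

```python
def capture_panorama(camera, rotate, steps=8):
    """Rotate the body in equal increments about a point and collect
    frames covering a full 360 degrees (hypothetical camera/rotate APIs)."""
    frames = []
    for _ in range(steps):
        rotate(360.0 / steps)     # turn by one increment
        frames.append(camera())   # capture one frame at this heading
    return frames

# Stand-in hardware for illustration: the "camera" returns the heading it saw.
heading = 0.0
def rotate(deg):
    global heading
    heading = (heading + deg) % 360.0
def camera():
    return heading

frames = capture_panorama(camera, rotate)
```

A real system would stitch the frames and run the location-identification analysis on the stitched image; that stage is omitted here.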
Abstract:
An autonomous moving body whereby annular route issues in environment maps can be solved and division of environment maps can be automated. In a teaching travel mode, the autonomous moving body: outputs a motor control amount from travel commands input by an operator; estimates the position of the autonomous moving body on the environment map; obtains position information for obstructions in the vicinity of the autonomous moving body; associates the position information for obstructions with the time at which it was obtained; stores the result in the storage unit as data for environment map reconstruction; creates a travel schedule; and stores it in the storage unit. In a reproduction travel mode, the autonomous moving body: estimates the position of the autonomous moving body on the environment map; obtains position information for obstructions in the vicinity of the autonomous moving body; reads the data for environment map reconstruction that corresponds to the estimated position of the autonomous moving body; updates the environment map; creates a control amount for the motor so as to travel on the updated environment map in accordance with the schedule; and inputs it to the travel unit.
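The two modes can be sketched as follows, assuming a pose-keyed store of obstacle observations standing in for the timestamped reconstruction data and a set of occupied cells standing in for the environment map (all names are illustrative, not the patent's data structures).

```python
recon_store = {}   # pose -> obstacle positions recorded there (teaching mode)
env_map = set()    # occupied cells of the environment map

def teach(pose, obstacles):
    """Teaching travel: record obstacle positions against the estimated pose
    (a stand-in for timestamped map-reconstruction data)."""
    recon_store[pose] = list(obstacles)
    env_map.update(obstacles)

def reproduce(pose):
    """Reproduction travel: read the reconstruction data matching the
    estimated pose and update the map before computing motor control."""
    for obs in recon_store.get(pose, []):
        env_map.add(obs)
    return env_map

teach((0, 0), [(1, 2), (3, 4)])
updated = reproduce((0, 0))
```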
Abstract:
A method of performing automated operations on a workpiece by at least one autonomous device is provided. The method includes sensing, by a first of the at least one autonomous device, a guidance pattern positioned on the workpiece along a first path. The method also includes traversing, by the first autonomous device, along the first path by following the sensed guidance pattern, to a first path location that is within a detection distance of a first precision target indicator positioned on the workpiece.
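The traversal step can be illustrated geometrically: follow sampled points of the guidance pattern until one falls within the detection distance of the precision target indicator. The path sampling and distances below are made up for illustration.

```python
import math

def traverse(path, target, detect_dist):
    """Follow sampled guidance-pattern points until the device is within
    the detection distance of the precision target indicator."""
    for pt in path:
        if math.dist(pt, target) <= detect_dist:
            return pt  # first path location inside detection range
    return None        # target indicator never came within range

stop = traverse([(0, 0), (2, 0), (4, 0), (6, 0)], target=(5, 0), detect_dist=1.5)
```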
Abstract:
A robot (100, 100a) includes a three-dimensional shape detecting sensor (12) to detect a three-dimensional shape of a travel surface existing in a forward traveling direction of the robot (100, 100a), a posture stabilizer (18) to stabilize a posture of a body (10) of the robot (100, 100a), a feature data generator (102) to generate feature data of the detected three-dimensional shape, an inclination angle prediction generator (106) to generate a prediction value of an inclination angle of the body (10) when the robot (100, 100a) is to reach a position on the travel surface in the forward traveling direction at a future time point, based on the feature data and a prediction model, and an overturn prevention controller (108) to control the posture stabilizer (18) to prevent an overturn of the robot (100, 100a) based on the prediction value.
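A toy sketch of the prediction-and-control chain, assuming the prediction model is linear in the surface feature data and the stabilizer engages above a fixed angle limit (both the model form and the 15-degree threshold are assumptions, not taken from the patent).

```python
def predict_inclination(features, weights, bias=0.0):
    """Hypothetical linear prediction model: inclination angle the body is
    expected to reach, from features of the detected 3-D surface shape."""
    return sum(f * w for f, w in zip(features, weights)) + bias

def overturn_control(pred_deg, limit_deg=15.0):
    """Engage the posture stabilizer when the predicted inclination
    exceeds a safe limit (threshold is an assumption)."""
    return "stabilize" if abs(pred_deg) > limit_deg else "normal"

angle = predict_inclination([0.4, 0.8], weights=[10.0, 15.0])  # 16.0 degrees
action = overturn_control(angle)
```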
Abstract:
Vector Field SLAM is a method for localizing a mobile robot in an unknown environment from continuous signals such as WiFi or active beacons. Disclosed is a technique for localizing a robot in relatively large and/or disparate areas. This is achieved by using and managing more signal sources to cover the larger area. One feature analyzes the complexity of Vector Field SLAM with respect to area size and number of signals, and then describes an approximation that decouples the localization map in order to keep memory and run-time requirements low. A tracking method for re-localizing the robot in areas already mapped is also disclosed. This allows the robot to resume after it has been paused or kidnapped, such as being picked up and moved by a user. Embodiments of the invention can comprise commercial low-cost products, including robots for the autonomous cleaning of floors.
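The decoupling idea can be sketched as partitioning the signal map by area, so that memory grows with the regions actually visited rather than with the whole environment. The cell size and the per-cell contents below are illustrative, not the patented scheme.

```python
CELL = 4.0  # metres per map cell (assumed partition size)

def cell_of(x, y):
    """Index of the map cell containing position (x, y)."""
    return (int(x // CELL), int(y // CELL))

signal_map = {}  # cell -> locally learned signal observations

def observe(x, y, signal):
    """Update only the local cell's signal model, leaving the rest of the
    decoupled localization map untouched."""
    signal_map.setdefault(cell_of(x, y), []).append(signal)

observe(1.0, 1.0, -40)  # beacon strength seen near the origin
observe(9.0, 1.0, -70)  # a reading two cells away
```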
Abstract:
Control units (10) for use with unmanned vehicles (12) include an input device (50) that moves in response to a user input, sensors (70) coupled to the input device (50), and a controller (16). The sensors (70) generate outputs related to the movement of the input device (50). The controller (16) determines a target displacement of the unmanned vehicle (12) based on the outputs of the sensors (70), and generates a control input related to the target displacement. The control input, when received by the unmanned vehicle (12), causes the unmanned vehicle (12) to substantially attain the target displacement. The position of the vehicle (12) is thus controlled by directly controlling the displacement of the vehicle (12).
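A minimal sketch of the displacement-control idea: sensor outputs from the input device are scaled into a target displacement, and the control input sent to the vehicle is the position it should attain. The gain and units are assumptions for illustration.

```python
def target_displacement(sensor_counts, gain=0.01):
    """Map input-device sensor counts to a commanded vehicle
    displacement in metres (gain is an assumed scale factor)."""
    return sensor_counts * gain

def control_input(current_pos, sensor_counts):
    """Control input = the position the unmanned vehicle should
    substantially attain after applying the target displacement."""
    return current_pos + target_displacement(sensor_counts)

setpoint = control_input(current_pos=2.0, sensor_counts=150)  # 2.0 + 1.5
```

Commanding displacement rather than velocity means the vehicle's position is controlled directly, as the abstract describes.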
Abstract:
In one embodiment, an autonomously navigated mobile platform includes a support frame, a projector supported by the frame, a sensor supported by the frame, a memory including a plurality of program instructions stored therein for: generating an encoded signal using a phase shifting algorithm; emitting the encoded signal with the projector; detecting the emitted signal with the sensor after the emitted signal is reflected by a detected body; associating the detected signal with the emitted signal; identifying an x-axis dimension, a y-axis dimension, and a z-axis dimension of the detected body, and one or more of a range and a bearing to the detected body, based upon the associated signal; identifying a present location of the mobile platform; and navigating the mobile platform based upon the identified location; and a processor operably connected to the memory, to the sensor, and to the projector for executing the program instructions.
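The phase-shifting step can be illustrated with the classic N-step recovery formula: given the reflected intensities under N equally phase-shifted patterns, the encoded phase (which carries the depth information) is recovered with an arctangent. This is the standard textbook formula, not the patent's specific implementation.

```python
import math

def wrapped_phase(intensities):
    """Classic N-step phase-shifting recovery: given reflected intensities
    I_k = A + B*cos(phi + 2*pi*k/N), return the wrapped phase phi."""
    n = len(intensities)
    num = sum(I * math.sin(2 * math.pi * k / n) for k, I in enumerate(intensities))
    den = sum(I * math.cos(2 * math.pi * k / n) for k, I in enumerate(intensities))
    return math.atan2(-num, den)

# Four-step example with a known phase of 0.5 radians.
phi = 0.5
I = [100 + 50 * math.cos(phi + 2 * math.pi * k / 4) for k in range(4)]
recovered = wrapped_phase(I)
```

Recovering the phase per pixel is what lets such a system triangulate x, y, and z dimensions of the detected body.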
Abstract:
Methods of remote control of a mobile robot and an intuitive user interface for remotely controlling a mobile robot are provided. Using a point-and-click device (405), the user is able to choose a target location (430) within a heads-up display (400) toward which to move a mobile robot. Additional graphical overlays (410, 412) are provided to aid the user in navigating even in systems with asynchronous communication.
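One common way to turn a click in a heads-up display into a drive target is to project the clicked pixel onto the floor plane, assuming a level camera at a known height. The focal length, principal point, and flat-floor assumption below are illustrative, not taken from the abstract.

```python
def click_to_ground(px_y, cam_height=1.0, focal=500.0, cy=240.0):
    """Project a clicked pixel row onto the ground plane in front of the
    robot (flat-floor assumption; camera parameters are made up).
    Returns the forward distance to the target, or None if the click
    is at or above the horizon line."""
    dy = px_y - cy                 # pixels below the horizon
    if dy <= 0:
        return None                # no ground-plane intersection
    return cam_height * focal / dy

dist = click_to_ground(px_y=490.0)  # 1.0 * 500 / 250 = 2.0 m ahead
```

Computed this way, the target location can be sent once and driven to autonomously, which is what makes the interface usable over an asynchronous link.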
Abstract:
The invention relates to a robot system including a base station and a robot. The base station includes a wireless transceiver capable of communicating TCP/IP transmissions over a local wireless protocol, a wired Ethernet connector for communicating TCP/IP transmissions over a local wired Ethernet accessing the Internet, and an access point circuit for transferring TCP/IP transmissions between the local wired Ethernet and the local wireless protocol, limited to a predetermined IP address locked to the robot, predetermined shell-level encryption locked to the robot, and predetermined ports to the Internet open only to the robot. The robot includes a wireless transceiver capable of communicating TCP/IP transmissions over a local wireless protocol and a client circuit for transferring TCP/IP transmissions over the local wireless protocol.
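The access-point restriction can be sketched as a relay filter: traffic is forwarded only when it comes from the robot's locked IP address and targets one of the ports opened for the robot. The addresses and ports below are made-up examples, and the encryption check is omitted.

```python
ROBOT_IP = "192.168.1.50"   # predetermined IP address locked to the robot (example)
OPEN_PORTS = {443, 8883}    # predetermined ports open only to the robot (example)

def relay_allowed(src_ip, dst_port):
    """True only for TCP/IP traffic locked to the robot over permitted ports;
    everything else is dropped by the access point circuit."""
    return src_ip == ROBOT_IP and dst_port in OPEN_PORTS

ok = relay_allowed("192.168.1.50", 443)       # robot traffic on an open port
blocked = relay_allowed("192.168.1.99", 443)  # any other host is refused
```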