Abstract:
Device for optically scanning and measuring an environment, said device being configured to be mobile and being provided with a laser scanner (10) or the like, which creates 3D scans, and with an autonomously moving robot (2) on which the laser scanner (10) or the like is mounted.
Abstract:
An unmanned autonomous vehicle, UAV, for inspection of fluid transportation means (FTM), said unmanned autonomous vehicle (1) comprising a navigation system (2) adapted to localize automatically said fluid transportation means (FTM) and to navigate said unmanned autonomous vehicle (1) along said fluid transportation means (FTM) for its inspection.
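A minimal sketch of the kind of tracking such a navigation system might perform, assuming the fluid transportation means has already been localized as planar waypoints; the function names and the proportional gain are illustrative assumptions, not the claimed method (Python).

    import math

    def heading_to(waypoint, position):
        # Bearing (radians) from the UAV position to the next pipeline waypoint.
        return math.atan2(waypoint[1] - position[1], waypoint[0] - position[0])

    def steering_correction(current_heading, position, next_waypoint, gain=0.8):
        # Proportional yaw correction that keeps the UAV tracking the localized pipeline.
        error = heading_to(next_waypoint, position) - current_heading
        error = math.atan2(math.sin(error), math.cos(error))  # wrap to [-pi, pi]
        return gain * error

    # UAV at the origin heading east; the next waypoint lies to the north-east.
    print(steering_correction(0.0, (0.0, 0.0), (10.0, 10.0)))  # ~0.63 rad turn to the left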
Abstract:
Techniques are provided for discovery and monitoring of an environment using a plurality of robots. A plurality of robots navigate an environment by determining a navigation buffer for each of the robots; and allowing each of the robots to navigate within the environment while maintaining a substantially minimum distance from other robots, wherein the substantially minimum distance corresponds to the navigation buffer, and wherein a size of each of the navigation buffers is reduced over time based on a percentage of the environment that remains to be navigated. The robots can also navigate an environment by obtaining a discretization of the environment to a plurality of discrete regions; and determining a next unvisited discrete region for one of the plurality of robots to explore in the environment using a breadth-first search. The plurality of discrete regions can be, for example, a plurality of real or virtual tiles.
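A minimal sketch of the two ideas named above, under assumed data structures (a tile adjacency dictionary and a visited set); the shrink rule and the parameter values are illustrative, not the claimed implementation (Python).

    from collections import deque

    def buffer_radius(initial_radius, fraction_remaining, min_radius=0.5):
        # Shrink each robot's navigation buffer as less of the environment remains.
        return max(min_radius, initial_radius * fraction_remaining)

    def next_unvisited_tile(grid, start, visited):
        # Breadth-first search from the robot's tile to the nearest unvisited tile.
        # grid: dict mapping each tile to an iterable of neighbouring tiles.
        queue, seen = deque([start]), {start}
        while queue:
            tile = queue.popleft()
            if tile not in visited:
                return tile
            for nbr in grid[tile]:
                if nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        return None  # every tile has been visited

    # 2x2 grid of tiles; the robot at (0, 0) has already visited its own tile.
    grid = {(0, 0): [(0, 1), (1, 0)], (0, 1): [(0, 0), (1, 1)],
            (1, 0): [(0, 0), (1, 1)], (1, 1): [(0, 1), (1, 0)]}
    print(next_unvisited_tile(grid, (0, 0), visited={(0, 0)}))  # -> (0, 1)
    print(buffer_radius(5.0, fraction_remaining=0.75))          # -> 3.75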
Abstract:
A system is provided for processing container-grown plants positioned in a given area. The system includes a processing station positioned in the area for processing the container-grown plants. It also includes one or more autonomous mobile container handling robots configured to: (i) travel to a source location in the area and pick up a container-grown plant, (ii) transport the container-grown plant to the processing station where a process is performed on the container-grown plant, (iii) transport the container-grown plant from the processing station to a destination location in the area, (iv) deposit the container-grown plant at the destination location, and (v) repeat (i) through (iv) for a set of container-grown plants in the source location.
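A minimal sketch of the handling cycle (i) through (v), with a stand-in robot object whose method names are assumptions for illustration (Python).

    class LoggingRobot:
        # Stand-in robot that records the actions it would perform.
        def travel_to(self, where): print("travel to", where)
        def pick_up(self, plant): print("pick up", plant)
        def deposit(self, plant): print("deposit", plant)

    def handle_plants(robot, plants, source, station, destination, process):
        # Repeat steps (i) through (iv) for every container-grown plant in the set (v).
        for plant in plants:
            robot.travel_to(source)
            robot.pick_up(plant)          # (i)
            robot.travel_to(station)
            process(plant)                # (ii)
            robot.travel_to(destination)  # (iii)
            robot.deposit(plant)          # (iv)

    handle_plants(LoggingRobot(), ["plant-1", "plant-2"],
                  "bed A", "potting station", "bed B",
                  process=lambda plant: print("process", plant))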
Abstract:
The present teachings provide a method of controlling a remote vehicle having an end effector and an image sensing device. The method includes obtaining an image of an object with the image sensing device, determining a ray from a focal point of the image to the object based on the obtained image, positioning the end effector of the remote vehicle to align with the determined ray, and moving the end effector along the determined ray to approach the object.
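A minimal sketch of the ray construction, assuming a standard pinhole camera model with known intrinsics (the numbers below are placeholders); the approach toward the object is expressed as waypoints along the ray in camera coordinates (Python).

    import numpy as np

    def pixel_ray(u, v, fx, fy, cx, cy):
        # Unit ray, in camera coordinates, from the focal point through image pixel (u, v).
        d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
        return d / np.linalg.norm(d)

    def approach_waypoints(ray, step, n_steps):
        # End-effector waypoints that advance along the ray toward the imaged object.
        return [step * k * ray for k in range(1, n_steps + 1)]

    # Assumed intrinsics (fx = fy = 600 px, principal point 320 x 240); object seen at pixel (400, 300).
    ray = pixel_ray(400, 300, fx=600, fy=600, cx=320, cy=240)
    for p in approach_waypoints(ray, step=0.05, n_steps=3):
        print(np.round(p, 3))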
Abstract:
A mobile robotic system allows multiple users to visit authentic places without physically being there. The users can take part in controlling the robot's movement according to their interests. A system administrator selects and defines criteria for the robot's movement. The mobile robot, carrying video and audio devices, is remotely controlled by a server that selects the robot's movement according to the users' and the system administrator's criteria. The server provides information to the users; the robot's location influences the content of that information. Such a robotic system may be used for shopping and for visiting museums and public tourist attractions over the internet.
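A minimal sketch of how such a server might pick the next movement, assuming users submit votes and the administrator supplies a set of allowed commands; the voting rule is an illustrative assumption (Python).

    from collections import Counter

    def select_movement(user_votes, allowed_commands):
        # Pick the most requested movement that the administrator's criteria allow.
        tally = Counter(vote for vote in user_votes if vote in allowed_commands)
        return tally.most_common(1)[0][0] if tally else "stop"

    # The administrator restricts the robot to three commands; users vote on the next move.
    print(select_movement(["left", "forward", "forward", "right"],
                          {"forward", "left", "right"}))  # -> "forward"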
Abstract:
A method of operating a mobile robot (100) that includes driving the robot according to a drive direction, determining a driven path (1012) of the robot from an origin (1013), and displaying a drive view (1010) on a remote operator control unit (400) in communication with the robot. The drive view shows the driven path of the robot from the origin. The method further includes obtaining global positioning coordinates of a current location of the robot and displaying a map (1014) in the drive view using the global positioning coordinates. The driven path of the robot is displayed on the map.
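A minimal sketch of keeping the driven path from the origin and anchoring it to the robot's current GPS fix for display, assuming planar odometry steps and a GPS position already projected into the same map coordinates (Python).

    def driven_path(origin, odometry_steps):
        # Accumulate the driven path from the origin using planar odometry displacements.
        path = [origin]
        x, y = origin
        for dx, dy in odometry_steps:
            x, y = x + dx, y + dy
            path.append((x, y))
        return path

    def anchor_to_map(path, current_gps_xy):
        # Shift the path so its latest point coincides with the current GPS fix,
        # letting the drive view draw the driven path on the map.
        off_x = current_gps_xy[0] - path[-1][0]
        off_y = current_gps_xy[1] - path[-1][1]
        return [(x + off_x, y + off_y) for x, y in path]

    path = driven_path((0.0, 0.0), [(1.0, 0.0), (1.0, 1.0)])
    print(anchor_to_map(path, current_gps_xy=(105.0, 42.0)))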
Abstract:
In one embodiment, an autonomously navigated mobile platform includes a support frame, a projector supported by the frame, a sensor supported by the frame, a memory including a plurality of program instructions stored therein for: generating an encoded signal using a phase shifting algorithm; emitting the encoded signal with the projector; detecting the emitted signal with the sensor after the emitted signal is reflected by a detected body; associating the detected signal with the emitted signal; identifying an x-axis dimension, a y-axis dimension, and a z-axis dimension of the detected body, and one or more of a range and a bearing to the detected body, based upon the associated signal; identifying a present location of the mobile platform; and navigating the mobile platform based upon the identified location; and a processor operably connected to the memory, to the sensor, and to the projector for executing the program instructions.
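A minimal sketch of the phase-shifting step named above, assuming the standard N-step relation with shifts 2*pi*n/N; recovering the x, y, z dimensions and range of the detected body would then follow by triangulating the decoded phase against the projector-sensor geometry, which is not shown (Python).

    import math

    def wrapped_phase(intensities):
        # Wrapped phase at one pixel from N phase-shifted pattern intensities I_n,
        # assuming the standard N-step relation I_n = A + B*cos(phase + 2*pi*n/N).
        n_steps = len(intensities)
        s = sum(i * math.sin(2 * math.pi * n / n_steps) for n, i in enumerate(intensities))
        c = sum(i * math.cos(2 * math.pi * n / n_steps) for n, i in enumerate(intensities))
        return -math.atan2(s, c)

    # Synthetic check: intensities generated with a known phase of 1.0 rad are recovered.
    true_phase = 1.0
    samples = [0.5 + 0.4 * math.cos(true_phase + 2 * math.pi * n / 4) for n in range(4)]
    print(round(wrapped_phase(samples), 3))  # -> 1.0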
Abstract:
The invention relates to an arrangement (2) which is designed to sense an environment for a movable device (4) and which has at least one sensor (6) for visually sensing the environment and, in each case, at least one sensor (8, 10) for sensing the direction of movement and the orientation of the device (4), wherein the arrangement (2) is designed to process information provided by the sensors (6, 8, 10). The invention further relates to a method by which an environment is sensed for a movable device (4).
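A minimal sketch of combining the three sensor streams, assuming a planar pose, a motion-direction sensor giving heading and speed, and a pose estimate derived from the visual sensor; the complementary-filter weight is an illustrative assumption (Python).

    import math

    def predict_pose(pose, speed, heading, dt):
        # Dead-reckoning update from the motion-direction and orientation sensors.
        x, y, _ = pose
        return (x + speed * dt * math.cos(heading),
                y + speed * dt * math.sin(heading),
                heading)

    def fuse_with_visual(predicted, visual_estimate, weight=0.3):
        # Blend the predicted pose with the visual pose estimate
        # (simple complementary filter; angles assumed close, no wrap handling).
        return tuple((1 - weight) * p + weight * v
                     for p, v in zip(predicted, visual_estimate))

    pose = predict_pose((0.0, 0.0, 0.0), speed=1.0, heading=0.1, dt=0.5)
    print(fuse_with_visual(pose, visual_estimate=(0.52, 0.04, 0.12)))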