Abstract:
Provided is a nondestructive inspection (“NDI”) system that includes an unmanned aerial vehicle (“UAV”) comprising a body structure, the body structure comprising one or more support structures, where each of the one or more support structures comprises a releasable end structure; and one or more NDI sensors integrated with a respective releasable end structure. The NDI system can also include a location tracking system that can determine a position, an orientation, or both of the UAV and/or the one or more NDI sensors relative to a structure being inspected.
Abstract:
An improved mechanism for calibrating a local positioning system through the use of passive or retro-reflective markers is described herein. A plurality of imaging targets with the passive or retro-reflective markers may be attached or affixed to a surface of an object. The local positioning system may then capture a first image of the imaging targets in a non-illuminated state and further capture a second image of the imaging targets in an illuminated state. A difference image between the first and second captured images may be computed and then segmented. The local positioning system may then identify the plurality of imaging targets based on the segmented difference image and position itself to extract information. The extracted information may then be used to help calibrate the local positioning system.
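A minimal sketch of the difference-image step described above, using OpenCV; the function name, BGR frame format, and threshold value are illustrative assumptions rather than details taken from the abstract.

```python
import cv2  # OpenCV for image differencing and segmentation


def find_marker_centroids(dark_frame, lit_frame, threshold=40):
    """Locate retro-reflective imaging targets by differencing two frames.

    dark_frame: image captured with the markers in a non-illuminated state
    lit_frame:  image captured with the markers in an illuminated state
    Returns a list of (x, y) pixel centroids, one per detected marker.
    """
    # Difference image: retro-reflective markers appear bright only in the
    # illuminated frame, so they dominate the difference.
    diff = cv2.absdiff(lit_frame, dark_frame)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)

    # Segment the difference image into candidate marker blobs.
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)

    centroids = []
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```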
Abstract:
A method for electronically pairing a plurality of control units with a plurality of objects in an aircraft is provided. The method includes identifying a selected control unit from the plurality of control units that will control a selected object from the plurality of objects, placing a hand-held scanner in close proximity to a first machine-readable tag on the selected control unit to acquire a first unique ID for only the selected control unit, placing the hand-held scanner in close proximity to a second machine-readable tag on the selected object to acquire a second unique ID for only the selected object, and associating the first unique ID with the second unique ID to pair the selected control unit with the selected object.
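A minimal sketch of the ID-association step, assuming the hand-held scanner returns each machine-readable tag's unique ID as a string; the class name and identifier values below are hypothetical.

```python
class PairingTable:
    """Associates scanned control-unit IDs with scanned object IDs."""

    def __init__(self):
        self._pairs = {}  # control-unit unique ID -> object unique ID

    def pair(self, control_unit_id, object_id):
        # First scan acquires the selected control unit's unique ID,
        # second scan acquires the selected object's unique ID;
        # associating the two pairs the control unit with the object.
        self._pairs[control_unit_id] = object_id

    def object_for(self, control_unit_id):
        return self._pairs.get(control_unit_id)


# Example: pair a cabin control unit with the reading light at seat 12A.
table = PairingTable()
table.pair("CU-0421", "LIGHT-12A")
assert table.object_for("CU-0421") == "LIGHT-12A"
```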
Abstract:
Systems and methods for supplying an open interface (e.g., web pages) for viewpoint navigation control of a three-dimensional (3-D) visualization of an object, the interface being simple to create and fast and easy to use. This viewpoint navigation control application allows users to control the viewpoint in a 3-D environment by interacting with (e.g., clicking on) a 2-D hyperlink layout within a web browser (or other 2-D viewer with hyperlink capability). When a user selects a hyperlink from a web page displayed by the 2-D layout application, position and orientation data for the selected viewpoint are transferred as part of a command message sent to the 3-D visualization application through an application programming interface. The 3-D visualization application then retrieves data and displays a view of at least a portion of the 3-D model of the object from the predefined viewpoint specified in the command message.
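A minimal sketch of the command message that could carry a selected viewpoint's position and orientation to the 3-D visualization application; the JSON field names, socket transport, and port are assumptions, since the abstract only specifies that the data are sent through an application programming interface.

```python
import json
import socket


def send_viewpoint_command(host, port, position_xyz, orientation_ypr):
    """Send a predefined viewpoint to the 3-D visualization application.

    position_xyz:    (x, y, z) in the 3-D model's coordinate system
    orientation_ypr: (yaw, pitch, roll) angles in degrees
    """
    message = {
        "command": "set_viewpoint",
        "position": list(position_xyz),
        "orientation": list(orientation_ypr),
    }
    with socket.create_connection((host, port)) as sock:
        sock.sendall(json.dumps(message).encode("utf-8"))


# A hyperlink click in the 2-D layout might trigger, for example:
# send_viewpoint_command("localhost", 5005, (12.0, 3.5, 1.8), (90.0, -10.0, 0.0))
```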
Abstract:
Systems and methods for determining locations of a device in an environment where features are present. Passive code pattern markers are used as unique location landmarks to provide on-demand location information to the user of the device in an abstract, landmark-based reference system that can then be mapped into an underlying physical 3-D coordinate system to give location coordinates that can be used by other tools to determine a viewpoint. For example, a 3-D visualization system can be configured to set a viewpoint so that an image concurrently generated by a computer system presents a scene which approximates the scene being viewed by the user in the physical world at that moment in time.
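One way the landmark-to-coordinate mapping might look in code, assuming each passive code pattern decodes to a landmark ID that has been surveyed into the physical 3-D coordinate system; the table contents and function names are placeholders, not details from the abstract.

```python
# Survey table mapping landmark IDs (decoded from passive code pattern
# markers) to coordinates in the underlying physical 3-D coordinate system.
# The entries below are placeholders, not real survey data.
LANDMARK_TABLE = {
    "MARKER-A17": (10.2, 4.5, 1.6),   # (x, y, z) in meters
    "MARKER-B03": (22.8, 4.5, 1.6),
}


def location_from_landmark(landmark_id, offset=(0.0, 0.0, 0.0)):
    """Return physical 3-D coordinates for a decoded marker, optionally
    offset by the device's displacement from the marker; returns None if
    the landmark is unknown."""
    base = LANDMARK_TABLE.get(landmark_id)
    if base is None:
        return None
    return tuple(b + o for b, o in zip(base, offset))


# Example: a device that scans MARKER-A17 and stands 0.5 m in front of it.
print(location_from_landmark("MARKER-A17", offset=(0.0, -0.5, 0.0)))
```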
Abstract:
A self-contained, holonomic motion tracking solution for supplementing the acquisition of inspection information on the surface of a structure, thereby enabling the real-time production of two-dimensional images from hand-held and automated scanning by holonomic-motion non-destructive inspection (NDI) sensor units (e.g., NDI probes). The systems and methods disclosed enable precise tracking of the position and orientation of a holonomic-motion NDI sensor unit (hand-held or automated) and conversion of the acquired tracking data into encoder pulse signals for processing by an NDI scanning system.
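A minimal sketch of how tracked displacement might be converted into encoder pulse counts for the NDI scanning system; the pulses-per-millimeter resolution is an assumed configuration value, not one given in the abstract.

```python
def displacement_to_pulse_counts(delta_x_mm, delta_y_mm, pulses_per_mm=100):
    """Convert a tracked X/Y displacement of the holonomic-motion NDI sensor
    unit into simulated encoder pulse counts.

    pulses_per_mm is an assumed scan-system resolution; a real system would
    use the resolution configured in its NDI scanning software.
    """
    return (round(delta_x_mm * pulses_per_mm),
            round(delta_y_mm * pulses_per_mm))


# Example: a 2.5 mm move along X and 0.4 mm along Y becomes (250, 40) pulse
# counts fed to the scanning system's encoder inputs.
x_pulses, y_pulses = displacement_to_pulse_counts(2.5, 0.4)
```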
Abstract:
A system comprising a multi-functional boom subsystem integrated with a holonomic-motion boom base platform. The boom base platform may comprise: Mecanum wheels with independently controlled motors; a pair of sub-platforms coupled by a roll-axis pivot to maintain four-wheel contact with the ground surface; and twist reduction mechanisms to minimize any yaw-axis twisting torque exerted on the roll-axis pivot. A computer with motion control software may be embedded on the boom base platform. The motion control function can be integrated with a real-time tracking system. The motion control computer may have multiple platform motion control modes: (1) a path following mode in which the boom base platform matches the motion path of the surface crawler (i.e., integration with crawler control); (2) a reactive mode in which the boom base platform moves based on the pan and tilt angles of the boom arm; and (3) a collision avoidance mode using sensors distributed around the perimeter of the boom base platform to detect obstacles.
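A minimal sketch of how the three platform motion control modes might be arbitrated in the motion control software; the velocity representation (vx, vy, yaw rate) and the rule that collision avoidance overrides the other modes are illustrative assumptions.

```python
from enum import Enum, auto


class PlatformMode(Enum):
    PATH_FOLLOWING = auto()       # match the surface crawler's motion path
    REACTIVE = auto()             # move based on boom-arm pan and tilt angles
    COLLISION_AVOIDANCE = auto()  # react to perimeter proximity sensors


def select_velocity(mode, crawler_path_velocity, pan_tilt_velocity,
                    obstacle_detected):
    """Return a (vx, vy, yaw_rate) command for the boom base platform."""
    if obstacle_detected:
        # Collision avoidance overrides the other modes: stop the platform.
        return (0.0, 0.0, 0.0)
    if mode is PlatformMode.PATH_FOLLOWING:
        return crawler_path_velocity
    if mode is PlatformMode.REACTIVE:
        return pan_tilt_velocity
    return (0.0, 0.0, 0.0)
```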
Abstract:
A method, system, and apparatus for visually presenting a virtual environment relative to a physical workspace. An output device visually presents a view of the virtual environment to guide a human operator in performing a number of operations within the physical workspace. A mounting structure holds the output device and is movable with at least one degree of freedom relative to the physical workspace. A sensor system measures movement of the output device relative to the physical workspace to generate sensor data. A controller computes a transformation matrix and a set of scale factors to align the virtual environment and the physical workspace. The controller then changes the view of the virtual environment based on the sensor data, such that the view changes in correspondence with the movement of the output device relative to the physical workspace.
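A minimal sketch of applying the computed alignment, assuming a 4x4 homogeneous transformation matrix and per-axis scale factors; the abstract does not specify this representation, so it is an illustrative choice.

```python
import numpy as np


def map_physical_to_virtual(alignment_matrix, scale_factors, physical_point):
    """Map a point measured in the physical workspace into the virtual
    environment using per-axis scale factors and a 4x4 homogeneous
    transformation, both assumed to have been computed during alignment."""
    scaled = np.asarray(physical_point, dtype=float) * np.asarray(scale_factors)
    homogeneous = np.append(scaled, 1.0)
    virtual = alignment_matrix @ homogeneous
    return virtual[:3]


# With an identity alignment and unit scale factors, the virtual viewpoint
# simply tracks the output device's measured physical position.
T = np.eye(4)
print(map_physical_to_virtual(T, (1.0, 1.0, 1.0), (0.5, 1.2, 0.0)))
```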
Abstract:
Systems and methods for inspecting a surface are disclosed. A source, a detector, a base, a controller, and a processing device are used to collect image data related to the surface and information relating to the location of the image data on the surface. The image data and the location information are correlated and stored in the processing device to create a map of the surface condition.
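A minimal sketch of correlating image data with surface location to build such a map; the record layout and lookup function are assumptions for illustration.

```python
# Each record pairs captured image data with the surface location at which
# it was acquired, forming a simple map of surface condition.
surface_map = []


def record_measurement(image_data, x_mm, y_mm):
    """Store image data together with its location on the inspected surface."""
    surface_map.append({"x_mm": x_mm, "y_mm": y_mm, "image": image_data})


def measurements_near(x_mm, y_mm, radius_mm):
    """Retrieve stored measurements within a radius of a surface location."""
    return [m for m in surface_map
            if (m["x_mm"] - x_mm) ** 2 + (m["y_mm"] - y_mm) ** 2
            <= radius_mm ** 2]
```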
Abstract:
An automated process uses a local positioning system to acquire location (i.e., position and orientation) data for one or more movable target objects. In cases where the target objects have the capability to move under computer control, this automated process can use the measured location data to control the position and orientation of such target objects. The system leverages the measurement and image capture capability of the local positioning system, and integrates controllable marker lights, image processing, and coordinate transformation computation to provide tracking information for vehicle location control. The resulting system enables position and orientation tracking of objects in a reference coordinate system.
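A minimal sketch of the coordinate transformation step, estimating a target object's position and orientation from measured marker-light locations; the abstract does not name an algorithm, so a standard least-squares (Kabsch/SVD) rigid-registration solution is used here as an illustrative choice.

```python
import numpy as np


def estimate_pose(measured_points, reference_points):
    """Estimate the rotation R and translation t that map marker positions
    defined in the target object's frame (reference_points) onto positions
    measured by the local positioning system (measured_points).

    Both inputs are N x 3 arrays of corresponding 3-D points, N >= 3.
    """
    P = np.asarray(reference_points, dtype=float)  # target object frame
    Q = np.asarray(measured_points, dtype=float)   # reference coordinate system
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)

    # Kabsch algorithm: SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t  # orientation (R) and position (t) of the target object
```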