Abstract:
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for an unmanned aerial system inspection system. One of the methods is performed by a UAV and includes receiving, by the UAV, flight information describing a job to perform an inspection of a rooftop. The UAV ascends to a particular altitude and performs an inspection of the rooftop, including obtaining sensor information describing the rooftop. Location information identifying a damaged area of the rooftop is received. The UAV travels to the damaged area and performs an inspection of it, including obtaining detailed sensor information describing the damaged area, before traveling to a safe landing location.
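The abstract describes a fixed mission sequence (ascend, survey, receive a damage location, re-inspect, land). A minimal Python sketch of that sequence, assuming hypothetical flight-control helpers (ascend_to, capture, goto, receive_damage_location, land_at) that are not named in the patent:

```python
from dataclasses import dataclass

@dataclass
class Job:
    rooftop_points: list          # (lat, lon) points covering the rooftop
    inspection_altitude_m: float  # altitude for the coarse survey
    safe_landing_location: tuple

def run_rooftop_inspection(uav, job: Job):
    """Mission sequence mirroring the abstract's steps (hypothetical UAV API)."""
    uav.ascend_to(job.inspection_altitude_m)

    # Coarse survey: sweep the rooftop, collecting sensor information.
    survey = [uav.capture(point) for point in job.rooftop_points]

    # A damaged area, if any, is reported back (e.g. by a ground station).
    damaged_area = uav.receive_damage_location()
    if damaged_area is not None:
        uav.goto(damaged_area)
        survey.append(uav.capture(damaged_area, detailed=True))

    uav.land_at(job.safe_landing_location)
    return survey
```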
Abstract:
Systems and methods for controlling an unmanned aerial vehicle within an environment are provided. In one aspect, a system comprises one or more sensors carried by the unmanned aerial vehicle and configured to provide sensor data, and one or more processors. The one or more processors can be individually or collectively configured to: determine, based on the sensor data, an environment type for the environment; select a flight mode from a plurality of different flight modes based on the environment type, wherein each of the plurality of different flight modes is associated with a different set of operating rules for the unmanned aerial vehicle; and cause the unmanned aerial vehicle to operate within the environment while conforming to the set of operating rules of the selected flight mode.
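The mode-selection step amounts to mapping a detected environment type to a set of operating rules. A sketch under assumed environment types and rule values, none of which are specified in the abstract:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OperatingRules:
    max_speed_mps: float      # speed ceiling while the mode is active
    max_altitude_m: float     # altitude ceiling while the mode is active
    obstacle_avoidance: bool  # whether active avoidance must run

# Illustrative environment types and rule values only.
FLIGHT_MODES = {
    "indoor":     OperatingRules(2.0,  5.0,   True),
    "urban":      OperatingRules(8.0,  60.0,  True),
    "open_field": OperatingRules(15.0, 120.0, False),
}

def classify_environment(sensor_data: dict) -> str:
    """Toy stand-in for the sensor-based environment-type determination."""
    if sensor_data["gps_satellites"] < 4:
        return "indoor"
    if sensor_data["obstacle_density"] > 0.5:
        return "urban"
    return "open_field"

def select_flight_mode(sensor_data: dict) -> OperatingRules:
    return FLIGHT_MODES[classify_environment(sensor_data)]
```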
Abstract:
The present invention discloses a heading generation method for an unmanned aerial vehicle including the following steps: performing a preliminary flight to select points of view and record flight waypoints, the waypoints including positioning data and flight altitude information of the unmanned aerial vehicle; receiving and recording the flight waypoints of the unmanned aerial vehicle; generating a flight trajectory from the waypoints of the preliminary flight; editing the flight trajectory to obtain a new flight trajectory; and transmitting the edited new flight trajectory to the unmanned aerial vehicle to cause the unmanned aerial vehicle to fly according to the new flight trajectory. The present invention further relates to a heading generation system for an unmanned aerial vehicle.
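The waypoint-recording and trajectory-editing steps can be illustrated with a small sketch; the Waypoint fields and the altitude-editing operation shown here are assumptions, not taken from the patent:

```python
from dataclasses import dataclass, replace
from typing import List

@dataclass(frozen=True)
class Waypoint:
    lat: float
    lon: float
    altitude_m: float

def record_waypoints(position_fixes) -> List[Waypoint]:
    """Build the preliminary-flight trajectory from recorded positioning data."""
    return [Waypoint(f["lat"], f["lon"], f["alt"]) for f in position_fixes]

def edit_altitude(trajectory: List[Waypoint], index: int,
                  new_altitude_m: float) -> List[Waypoint]:
    """Return a new trajectory with one waypoint's altitude changed."""
    edited = list(trajectory)
    edited[index] = replace(edited[index], altitude_m=new_altitude_m)
    return edited

# The edited trajectory would then be uploaded to the UAV, e.g.
# uplink.send(edit_altitude(record_waypoints(fixes), 3, 45.0))  # hypothetical
```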
Abstract:
A videography drone can communicate with a microphone device. The videography drone can receive spatial information and audio data from a remote microphone device (e.g., a remote tracker, a mobile device running a drone control application, and/or a standalone audio recording device separate from the videography drone without drone control functionalities). The videography drone can utilize the spatial information to navigate itself to follow the remote microphone device. The videography drone can stitch a video segment captured by its camera with an audio segment from the received audio data to generate an audio/video (A/V) segment. The stitching can be performed by matching spatial or temporal information (e.g., from the received spatial information) associated with the audio segment against spatial or temporal information associated with the video segment.
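The stitching step matches segments by spatial or temporal information. A sketch of a purely temporal variant, pairing each video segment with the audio segment that overlaps it most on a shared clock (the Segment structure is an assumption):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Segment:
    start: float    # seconds on a clock shared by drone and microphone device
    end: float
    payload: bytes

def overlap(a: Segment, b: Segment) -> float:
    return max(0.0, min(a.end, b.end) - max(a.start, b.start))

def best_audio_match(video: Segment,
                     audio_segments: List[Segment]) -> Optional[Segment]:
    """Pick the audio segment with the largest temporal overlap with the video."""
    best = max(audio_segments, key=lambda a: overlap(video, a), default=None)
    return best if best is not None and overlap(video, best) > 0 else None

def stitch(video: Segment, audio: Segment) -> dict:
    """Stand-in for muxing the matched streams into one A/V segment."""
    return {"video": video.payload, "audio": audio.payload,
            "start": max(video.start, audio.start)}
```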
Abstract:
A security system is provided. An unmanned flying vehicle is remotely controlled to collect environment information of a target environment, and whether a prompt signal is output is determined according to an environment variation obtained by comparing the environment information with reference environment information.
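The decision of whether to output a prompt signal reduces to comparing collected readings with reference readings against some variation threshold. A sketch with illustrative field names and thresholds not given in the abstract:

```python
def should_prompt(current: dict, reference: dict, thresholds: dict) -> bool:
    """Return True (output a prompt signal) when any monitored reading
    deviates from the reference by more than its threshold."""
    for key, threshold in thresholds.items():
        variation = abs(current.get(key, 0.0) - reference.get(key, 0.0))
        if variation > threshold:
            return True
    return False

# Illustrative readings: temperature and ambient light.
# should_prompt({"temp_c": 31.0, "lux": 5.0},
#               {"temp_c": 22.0, "lux": 120.0},
#               {"temp_c": 5.0,  "lux": 50.0})   # -> True
```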
Abstract:
An interior length of a confined space is inspected by autonomously flying an unmanned aerial vehicle having a sensor pod. The sensor pod can be tethered to the unmanned aerial vehicle and lowered into the confined space from above, for example by an electromechanical hoist. An altitude or heading of the sensor pod can be measured. The confined space can be the flue of a chimney.
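A sketch of the lowering-and-logging loop implied by the abstract, with hypothetical hoist and pod interfaces (lower, raise_by, read_altitude, read_heading, capture):

```python
def lower_pod_and_log(hoist, pod, target_depth_m: float, step_m: float = 0.5):
    """Lower the tethered sensor pod in small increments, recording its
    altitude and heading at each stop, then reel it back in."""
    log = []
    lowered = 0.0
    while lowered < target_depth_m:
        hoist.lower(step_m)
        lowered += step_m
        log.append({"depth_m": lowered,
                    "altitude_m": pod.read_altitude(),
                    "heading_deg": pod.read_heading(),
                    "scan": pod.capture()})
    hoist.raise_by(lowered)   # retrieve the pod after the scan
    return log
```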
Abstract:
The flight route and altitude of a wireless aircraft are determined by specifying, on a map, an imaging area too large to be imaged in a single shot. A controller terminal 100, which communicates with a wireless aircraft 200 that takes images with a camera, stores imaging area data for an imaging area specified by a user, and determines the flight route and the altitude at which the wireless aircraft 200 images the imaging area with the camera based on the stored imaging area data.
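Covering an imaging area too large for a single shot typically means planning a sweep of overlapping camera footprints. A sketch of one such route calculation over a rectangular area; the boustrophedon pattern and overlap value are assumptions, not taken from the patent:

```python
import math

def plan_coverage_route(area_w_m: float, area_h_m: float,
                        footprint_w_m: float, footprint_h_m: float,
                        overlap: float = 0.2):
    """Boustrophedon sweep over a rectangular imaging area. Returns (x, y)
    offsets in metres from one corner at which to trigger the camera."""
    step_x = footprint_w_m * (1.0 - overlap)
    step_y = footprint_h_m * (1.0 - overlap)
    cols = max(1, math.ceil(area_w_m / step_x))
    rows = max(1, math.ceil(area_h_m / step_y))
    route = []
    for r in range(rows):
        columns = range(cols) if r % 2 == 0 else reversed(range(cols))
        for c in columns:
            route.append((c * step_x, r * step_y))
    return route

# The flight altitude follows from the camera's field of view; for a ground
# footprint w and horizontal FOV theta:  altitude = w / (2 * tan(theta / 2)).
```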
Abstract:
A method of home inspection comprising guiding a drone through a home along a selected inspection path, transmitting signals from the drone to establish a flight path through the home, storing the flight path on a server, accessing the flight path from a programmed interactive digital device, launching the drone using said programmed interactive digital device, directing the drone through the home along the flight path, transmitting video signals from the drone, and employing the video signals to provide a visual view of the property on a display of the interactive digital device. In another embodiment, the buyer can guide the drone along a flight path determined by the buyer in real time.
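The record-then-replay flow can be sketched as two small routines, one run while the inspector guides the drone and one run when the buyer replays the stored path; the drone and display interfaces are hypothetical:

```python
import json

def record_flight_path(drone, path_file: str) -> None:
    """Store the inspector-guided path so it can be replayed later."""
    path = []
    while drone.is_being_guided():       # inspector flies the drone manually
        path.append(drone.read_pose())   # e.g. {"x": ..., "y": ..., "z": ..., "yaw": ...}
    with open(path_file, "w") as f:
        json.dump(path, f)

def replay_flight_path(drone, path_file: str, display) -> None:
    """Fly the stored path and stream video frames to the viewer's device."""
    with open(path_file) as f:
        path = json.load(f)
    for pose in path:
        drone.goto(pose)
        display.show(drone.capture_frame())
```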
Abstract:
A surveying system having a total station integrated into an unmanned aerial vehicle communicates with a plurality of mobile communication stations located at known site coordinates. Because the mobile communication stations are located at known coordinates, the location of the aerial vehicle can be precisely triangulated and controlled. Construction drawings are loaded into the system, thereby allowing the vehicle to locate itself at specific points designated in the drawings for the marking of on-site construction grid lines.
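The position fix from stations at known site coordinates can be illustrated with a range-based least-squares solve (multilateration); the use of range measurements here is an assumption standing in for whatever measurement the vehicle and stations actually exchange:

```python
import numpy as np

def trilaterate_2d(stations, ranges):
    """Least-squares position fix from ranges to stations at known coordinates.

    stations -- (n, 2) array of known site coordinates, n >= 3
    ranges   -- (n,) measured distances from the vehicle to each station
    """
    stations = np.asarray(stations, dtype=float)
    d = np.asarray(ranges, dtype=float)
    # Subtract the first station's circle equation to linearise the system.
    A = 2.0 * (stations[1:] - stations[0])
    b = (np.sum(stations[1:] ** 2, axis=1) - np.sum(stations[0] ** 2)
         - (d[1:] ** 2 - d[0] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example: stations at three site corners, vehicle actually near (10, 5).
# trilaterate_2d([(0, 0), (100, 0), (0, 100)],
#                [np.hypot(10, 5), np.hypot(90, 5), np.hypot(10, 95)])
```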
Abstract:
Various embodiments provide methods for controlling landings of a UAV in a landing zone including a plurality of landing bays. Various embodiments include a method implemented on a computing device for receiving continuous real-time sensor data from a transceiver and from sensors onboard the UAV, and detecting, based on the continuous real-time sensor data, a target landing bay that is available for landing among the plurality of landing bays within the landing zone. Orientation and position coordinates for landing in the target landing bay may be calculated based on the continuous real-time sensor data. Information regarding positions and flight vectors of a plurality of autonomous UAVs may be obtained, and a flight plan for landing in the target landing bay may be generated based on the orientation and position coordinates, the positions and flight vectors of the plurality of autonomous UAVs, and a current orientation and position of the UAV.
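The bay-selection and flight-plan steps can be sketched as follows; the data structures, the nearest-free-bay rule, and the two-waypoint approach are illustrative assumptions, and a real planner would also deconflict against the other UAVs' tracks:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LandingBay:
    bay_id: str
    position: tuple   # (x, y, z) in a local landing-zone frame
    occupied: bool    # availability as seen in the latest sensor data

@dataclass
class TrafficTrack:
    position: tuple   # another autonomous UAV's position
    velocity: tuple   # ... and its flight vector

def pick_target_bay(bays: List[LandingBay], uav_pos) -> Optional[LandingBay]:
    """Choose the nearest bay currently reported free."""
    free = [b for b in bays if not b.occupied]
    if not free:
        return None
    return min(free, key=lambda b: sum((p - q) ** 2
                                       for p, q in zip(b.position, uav_pos)))

def plan_landing(uav_pos, bay: LandingBay, traffic: List[TrafficTrack],
                 approach_altitude_m: float = 10.0):
    """Coarse plan: fly to a point above the bay, then descend. A full planner
    would adjust these waypoints using the supplied traffic tracks."""
    bx, by, bz = bay.position
    return [uav_pos, (bx, by, bz + approach_altitude_m), bay.position]
```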