Abstract:
Disclosed are a mobile terminal that provides guidance and feedback for a lock release operation, so that the lock release gesture can be performed at an arbitrary location and in various directions, and a method of releasing a user interface lock state thereof. A method of releasing a user interface lock state in a mobile terminal having a touch sensing display according to an embodiment of the present disclosure may include setting the mobile terminal to a user interface lock state; detecting a contact with the touch sensing display at an arbitrary location on the touch sensing display; displaying a lock release region corresponding to the arbitrary location while the contact with the touch sensing display is maintained; displaying the movement of the contact in the lock release region; and switching the mobile terminal into a user interface lock release state when the movement of the contact in the lock release region satisfies a predetermined condition.
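The unlock flow described above can be sketched as a small touch-event state machine. This is a hypothetical illustration only: the "predetermined condition" is assumed here to be a minimum drag distance, and the handler names are inventions for the sketch, not taken from the abstract.

```python
# Hypothetical sketch of the described unlock flow: a lock-release region is
# shown at the arbitrary touch point, contact movement is tracked inside it,
# and the UI unlocks once the movement satisfies a predetermined condition
# (assumed here: a minimum drag distance from the initial contact point).
import math

MIN_DRAG_DISTANCE = 100.0  # assumed threshold, in pixels

class LockScreen:
    def __init__(self):
        self.locked = True
        self.region_origin = None  # where the lock-release region is displayed

    def on_touch_down(self, x, y):
        # Contact detected at an arbitrary location: display the region there.
        if self.locked:
            self.region_origin = (x, y)

    def on_touch_move(self, x, y):
        # Movement of the contact is tracked within the release region.
        if self.locked and self.region_origin is not None:
            dx = x - self.region_origin[0]
            dy = y - self.region_origin[1]
            if math.hypot(dx, dy) >= MIN_DRAG_DISTANCE:
                self.locked = False  # predetermined condition satisfied

    def on_touch_up(self):
        # Releasing contact before the condition is met keeps the lock.
        if self.locked:
            self.region_origin = None
```

Because the region is anchored at wherever the finger lands, the drag can start anywhere and proceed in any direction, matching the "arbitrary location and various directions" language of the abstract.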
Abstract:
A mobile terminal including a wireless communication unit configured to transceive data with a drone via wireless communication; a display unit; a memory configured to store data received from the drone; a location information collecting unit configured to collect location information of the mobile terminal; and a controller configured to control the wireless communication unit to receive image data captured by a camera of the drone and sensing data corresponding to the drone and the camera of the drone, and obtain capturing information including flight information corresponding to a flight path of the drone and camera motion information corresponding to a motion of the camera of the drone based on the collected location information of the mobile terminal and the received sensing data.
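The capturing-information step above, deriving a flight path and camera motion from the terminal's location plus the drone's sensing data, can be sketched as follows. All field names (relative offsets, gimbal angles) are assumptions for illustration; the abstract does not specify the sensing-data format.

```python
# Hypothetical sketch: combining the terminal's collected location with the
# drone's received sensing data to build "capturing information" (the drone's
# flight path plus its camera motion). Field names are illustrative only.
from dataclasses import dataclass

@dataclass
class SensingSample:
    rel_x: float        # drone position relative to the terminal (east, m)
    rel_y: float        # drone position relative to the terminal (north, m)
    altitude: float     # drone altitude (m)
    gimbal_pitch: float # camera pitch angle (degrees)
    gimbal_yaw: float   # camera yaw angle (degrees)

def build_capturing_info(terminal_location, samples):
    # Anchor the drone's relative positions at the terminal's location to
    # recover an absolute flight path; collect gimbal angles as camera motion.
    tx, ty = terminal_location
    flight_path = [(tx + s.rel_x, ty + s.rel_y, s.altitude) for s in samples]
    camera_motion = [(s.gimbal_pitch, s.gimbal_yaw) for s in samples]
    return {"flight": flight_path, "camera": camera_motion}
```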
Abstract:
An electronic device (10) and a terminal communicating with the electronic device (10) are provided. The electronic device (10) includes a memory (14), a communication module (12) connected to at least one external device (31-39) having a camera to exchange data with the at least one external device (31-39), and a controller (17) configured to, when a specific event occurs, enable the camera included in at least one of the at least one external device (31-39) to capture an image through the communication module (12), and obtain the captured image and store the obtained image in the memory (14). When a specific event occurs, an image captured using a camera of an external device may be obtained and stored.
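The event-driven capture loop described above can be sketched minimally. The device and memory interfaces here are assumptions; the abstract only states that on a specific event, a connected device's camera is triggered and the obtained image is stored.

```python
# Hypothetical sketch of the event-driven flow: when a specific event occurs,
# the controller asks each connected external device that has a camera to
# capture an image, then stores the obtained image in memory.
class Controller:
    def __init__(self, devices, memory):
        self.devices = devices  # external devices reachable via the comm module
        self.memory = memory    # storage for obtained images

    def on_event(self, event_name):
        for device in self.devices:
            if device.get("has_camera"):
                image = device["capture"]()              # trigger remote capture
                self.memory.append((event_name, image))  # store obtained image
```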
Abstract:
A glasses-type terminal including a main body configured to be worn as glasses on a user's head; a first microphone configured to detect a vibration signal of an input voice propagating through a skull portion of the user; a second microphone configured to detect a voice signal of the same input voice propagating over the air; a memory configured to store a pre-registered vibration signal and a pre-registered voice signal corresponding to a pre-registered voice; and a controller configured to switch a locked state to an unlocked state when the vibration signal detected through the first microphone matches the pre-registered vibration signal and the voice signal detected through the second microphone matches the pre-registered voice signal.
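The key property of this unlock scheme is the conjunction: both the bone-conduction signal and the air-conduction signal must match their pre-registered counterparts. A minimal sketch, with a naive placeholder standing in for real signal matching:

```python
# Hypothetical sketch of the dual-signal unlock. signals_match is a naive
# placeholder (mean absolute difference under a tolerance), not a real
# voice/vibration matching algorithm.
def signals_match(detected, registered, tolerance=0.1):
    if len(detected) != len(registered):
        return False
    diff = sum(abs(a - b) for a, b in zip(detected, registered)) / len(detected)
    return diff <= tolerance

def try_unlock(vibration, voice, reg_vibration, reg_voice):
    # Unlock requires BOTH channels to match; either one alone is not enough.
    return (signals_match(vibration, reg_vibration)
            and signals_match(voice, reg_voice))
```

Requiring both channels makes a replayed recording (which produces a voice signal but no matching skull vibration) insufficient to unlock the terminal.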
Abstract:
An unmanned aerial vehicle system according to the present invention includes a housing (2000) mounted on a vehicle (10) and having an inner space, the housing provided with a launching unit, an unmanned aerial vehicle (1000) accommodated in the housing and configured to be launched from the housing when a driving state of the vehicle meets a preset condition, wing units (1210) mounted to the unmanned aerial vehicle and configured to allow the flight of the unmanned aerial vehicle in response to the launch from the housing, an output unit disposed on the unmanned aerial vehicle, and a controller configured to control the wing units to move the unmanned aerial vehicle to a position set based on information related to the driving state when the unmanned aerial vehicle is launched, and control the output unit to output warning information related to the driving state.
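The launch-and-position logic above can be sketched as two small decision functions. The preset condition and the positioning heuristic here are assumptions chosen for illustration (a stopped vehicle with hazard lights on, warned from behind); the abstract leaves both unspecified.

```python
# Hypothetical sketch of the launch decision: the UAV is launched from the
# vehicle-mounted housing when the driving state meets a preset condition
# (assumed here: vehicle stopped with hazard lights on), then flown to a
# position set from that state to output warning information.
def should_launch(driving_state):
    # Preset condition (assumption): stopped with hazard lights active.
    return driving_state["speed_kmh"] == 0 and driving_state["hazard_lights"]

def plan_warning_position(driving_state):
    # Place the UAV behind the vehicle, farther back if the vehicle was
    # recently travelling fast (illustrative heuristic, not from the abstract).
    distance_m = 30 + 2 * driving_state.get("last_speed_kmh", 0)
    return {"offset_behind_m": distance_m, "altitude_m": 3.0}
```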