-
公开(公告)号:KR101710667B1
公开(公告)日:2017-02-27
申请号:KR1020120147600
申请日:2012-12-17
Applicant: 한국전자통신연구원
CPC classification number: G06F17/30867 , G06Q30/06
Abstract: The present invention relates to an apparatus and method for providing service applications using a robot. By exploiting the robot's mobility to collect surrounding environment information and user information in real time, service applications needed by each user can be provided, and the feedback generated through Human Robot Interaction between the user and the robot allows various evaluations of the service applications to be carried out.
Abstract translation: An apparatus for providing service applications using a robot includes: a sensing unit configured to generate environment sensing information and user state information around the movement path of the robot; and a user situation determination unit configured to determine the situation and intent of the user, generate user identification information, and search for and download a service application. In addition, the apparatus includes: a service provision unit configured to search for service presentation devices around the movement path of the robot and move the service corresponding to the service application to at least one of the found service presentation devices; and a user feedback management unit configured to manage feedback information corresponding to the interaction between the user and the robot.
-
公开(公告)号:KR1020150061398A
公开(公告)日:2015-06-04
申请号:KR1020130145482
申请日:2013-11-27
Applicant: 한국전자통신연구원
CPC classification number: G05D1/0297 , A47L11/24 , A47L2201/04 , G05D1/0219 , G05D1/0274 , G05D2201/0203 , Y10S901/01 , Y10S901/30
Abstract: A cooperative cleaning method and control apparatus using a plurality of cleaning robots are provided. The cooperative cleaning method can perform cleaning by automatically moving a cleaning robot to a space that requires cleaning while monitoring the overall cleaning state of a large space. In addition, once a cleaning zone is fixed according to the cooperative cleaning method, data on the amount of waste generated in that zone and on its cleaning state can be accumulated, which facilitates cleaning management.
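The dispatch idea in the abstract above (move robots to the spaces that need cleaning most, using accumulated waste data) could be sketched as follows; the function name, its arguments, and the greedy policy are illustrative assumptions, not taken from the patent:

```python
def assign_cleaning(robots, zone_waste):
    """Greedy illustrative policy: dispatch each available cleaning robot to
    the not-yet-assigned zone with the largest accumulated waste amount.
    zone_waste maps a zone id to the waste measured there (hypothetical data)."""
    dirtiest_first = sorted(zone_waste, key=zone_waste.get, reverse=True)
    # Pair robots with zones until either list runs out.
    return dict(zip(dirtiest_first, robots))
```

For example, `assign_cleaning(["r1", "r2"], {"lobby": 3, "hall": 9, "office": 1})` dispatches `r1` to the hall and `r2` to the lobby, leaving the nearly clean office unassigned.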
-
公开(公告)号:KR1020150026526A
公开(公告)日:2015-03-11
申请号:KR1020130105508
申请日:2013-09-03
Applicant: 한국전자통신연구원
CPC classification number: H04B17/391 , H04B17/00 , G01S11/06
Abstract: A radio propagation environment modeling method and apparatus are provided, comprising the steps of: measuring the signal strength received from at least one follower robot; measuring the distance between the at least one follower robot and the leader robot of a swarm intelligence robot group; estimating environment parameters using a propagation model; comparing the environment parameters estimated in at least one grid cell with preset environment parameters and classifying them; and inferring the signal strength of the at least one follower robot receivable in the at least one grid cell.
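The parameter-estimation and inference steps can be illustrated with the standard log-distance path-loss model; this is a generic sketch under assumed names, not the patent's actual model:

```python
import math

def estimate_path_loss_exponent(measurements, rssi_d0, d0=1.0):
    """Least-squares fit of the path-loss exponent n in the log-distance model
    RSSI(d) = RSSI(d0) - 10*n*log10(d/d0), from (distance, rssi) pairs
    collected between the leader robot and its follower robots."""
    num = 0.0
    den = 0.0
    for d, rssi in measurements:
        x = 10.0 * math.log10(d / d0)  # model regressor
        y = rssi_d0 - rssi             # observed path loss relative to d0
        num += x * y
        den += x * x
    return num / den

def predict_rssi(n, rssi_d0, d, d0=1.0):
    """Infer the signal strength receivable at distance d (e.g. a grid cell)."""
    return rssi_d0 - 10.0 * n * math.log10(d / d0)
```

With synthetic free-space-like data the fit recovers an exponent near 2, and the fitted model can then be evaluated at each grid cell to infer receivable signal strength, as in the abstract's final step.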
-
公开(公告)号:KR1020140075408A
公开(公告)日:2014-06-19
申请号:KR1020120143709
申请日:2012-12-11
Applicant: 한국전자통신연구원
IPC: H05B37/02
CPC classification number: H05B37/029 , B25J9/00
Abstract: A dynamic emergency lighting system according to the present invention comprises: at least one subordinate robot which flashes emergency lights along evacuation routes and creates a map of the surrounding areas; a leader robot which directs the at least one subordinate robot to generate the map and flash the emergency lights, and receives the map information from the at least one subordinate robot; and a task server which divides a structure into a plurality of areas, constitutes a robot group comprising the leader robot and the at least one subordinate robot, and assigns the robot group to one of the plurality of areas, wherein the evacuation route is specified by the task server using the map information.
-
公开(公告)号:KR101322239B1
公开(公告)日:2013-10-25
申请号:KR1020090106220
申请日:2009-11-04
Applicant: 한국전자통신연구원
Abstract: The present invention relates to a motion-based character input apparatus and method, and more particularly to a motion-based character input apparatus and method that input characters using motion recognition based on an inertial sensor. The present invention presents a method for improving the efficiency of character input using motion, and a method for diversifying the form of input content using attributes inherent to motion. A motion-based character input apparatus according to an embodiment of the present invention comprises: an input unit which detects changes in a user's motion as signals using a sensor; and a motion interpretation application means which recognizes characters by segmenting meaningful motions from the signals of the input unit and notifies the user of the point at which a motion was segmented.
Inertial sensor, motion sensing, motion segmentation, character input
-
公开(公告)号:KR1020130070394A
公开(公告)日:2013-06-27
申请号:KR1020110137695
申请日:2011-12-19
Applicant: 한국전자통신연구원
IPC: G06T7/20
CPC classification number: G06K9/00664 , G06K9/2018 , G06K9/3241 , G06T7/215 , G06T2207/30201 , H04N7/18
Abstract: PURPOSE: An object tracking system using a robot and a tracking method are provided to allow the robot to detect a motion pattern of an object by using a laser sensor, find the position of the object, and detect the object based on an image by moving a camera when the object deviates from the viewing angle of the camera. CONSTITUTION: An object detection unit(120) detects an object from an image which is obtained by an image obtainment unit(110). A motion pattern calculation unit(130) calculates a motion pattern for the object. When the object deviates from the image, a current position calculation unit(150) calculates the current position of the object based on the motion pattern. An object tracking system(100) tracks the object by moving the image obtainment unit based on the current position. [Reference numerals] (110) Image obtainment unit; (120) Object detection unit; (130) Motion pattern calculation unit; (140) Motion pattern storage unit; (150) Current position calculation unit
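The out-of-view prediction step described above could be sketched with a constant-velocity motion model; the class and its update rule are assumptions for illustration, not the patent's actual motion pattern calculation:

```python
class MotionPatternTracker:
    """Constant-velocity sketch: while the object is visible, update its
    motion pattern from consecutive detections; when it leaves the frame,
    extrapolate its current position so the camera can be re-aimed."""

    def __init__(self):
        self.last_pos = None
        self.velocity = (0.0, 0.0)

    def observe(self, pos, dt):
        # Update the motion pattern from two consecutive detections.
        if self.last_pos is not None and dt > 0:
            self.velocity = ((pos[0] - self.last_pos[0]) / dt,
                             (pos[1] - self.last_pos[1]) / dt)
        self.last_pos = pos

    def predict(self, elapsed):
        # Extrapolate after the object has left the camera's field of view.
        x, y = self.last_pos
        return (x + self.velocity[0] * elapsed,
                y + self.velocity[1] * elapsed)
```

A more robust implementation would typically use a Kalman filter, but the extrapolation idea is the same.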
-
公开(公告)号:KR1020110071220A
公开(公告)日:2011-06-29
申请号:KR1020090127726
申请日:2009-12-21
Applicant: 한국전자통신연구원
CPC classification number: B25J13/006 , B25J9/1689 , B25J13/06 , G05B2219/33192 , G05B2219/36159 , G05D3/12 , Y10S901/01
Abstract: PURPOSE: A remote control apparatus and method for a tele-presence robot are provided to save the time and cost of map building or of installing artificial landmarks, and to remotely control a robot using information on its current state. CONSTITUTION: A remote control apparatus for a tele-presence robot comprises a robot state information receiving unit(110), a control unit(130), and a control command transmitting unit(190). The robot state information receiving unit receives robot status information transmitted from a tele-presence robot. The control unit displays at least part of the robot status information and creates a robot control command based on user input. The control command transmitting unit transmits the robot control command to the tele-presence robot.
-
公开(公告)号:KR1020110069505A
公开(公告)日:2011-06-23
申请号:KR1020090126264
申请日:2009-12-17
Applicant: 한국전자통신연구원
Abstract: PURPOSE: A recognition system and method based on multi-step data fusion are provided so that each module of the recognition device can be partly replaced or extended, and so that the recognition rate of gesture information is increased by choosing an appropriate data fusion method rather than relying on individual sensors. CONSTITUTION: A recognition system based on multi-step data fusion comprises: an interface device(100), a recognition device(200), and a robot controller(300). The recognition device comprises a data processor(210), a data binder(220), a feature information combiner(230), an operation information combiner(240), and a recognition information processor(250). The data binder fuses the sensor data at the lowest step. The feature information combiner fuses, at the middle step, the feature information derived from the data fusion result. The operation information combiner fuses, at the top step, the operation information derived from the feature information combiner. The recognition information processor processes the recognition result. The robot controller controls the robot by using the recognition information of the recognition device.
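The three fusion steps described above could be arranged as a pipeline like the following minimal sketch; the function names and the toy fusion rules are assumptions for illustration only:

```python
def fuse_low(sensor_streams):
    """Lowest step: bind time-aligned raw samples from several sensors
    into combined records (the data binder's role)."""
    return list(zip(*sensor_streams))

def fuse_mid(records, feature_fn):
    """Middle step: draw and fuse a feature from each bound record
    (the feature information combiner's role)."""
    return [feature_fn(rec) for rec in records]

def fuse_top(features, decide_fn):
    """Top step: fuse the feature-level results into one recognition
    outcome (the operation information combiner's role)."""
    return decide_fn(features)
```

Chaining the steps, `fuse_top(fuse_mid(fuse_low([[1, 2], [3, 4]]), sum), max)` binds the two streams into records `(1, 3)` and `(2, 4)`, fuses each into a feature, and decides on the strongest one. Keeping the steps as separate functions mirrors the stated goal that each module can be swapped independently.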
-
公开(公告)号:KR1020110066007A
公开(公告)日:2011-06-16
申请号:KR1020090122729
申请日:2009-12-10
Applicant: 한국전자통신연구원
IPC: G06F3/02
CPC classification number: G06F17/273
Abstract: PURPOSE: A Korean/English typewriter system using a mediated interface device and a character string input method thereof are provided to improve character recognition performance by finding the desired word through correction of misrecognized characters. CONSTITUTION: A character recognition sequence set(200) arranges the recognition results of characters inputted through the mediated interface device according to their recognition sequence. A typewriter unit(300) combines a character string inputted through the mediated interface device with reference to the character recognition sequence set, and revises the combined character string based on a dictionary. The typewriter unit comprises a word combination device, which combines the inputted character string, and a partial word index map, which consists of partial words. The partial word index map comprises Korean and English partial word index maps.
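The dictionary-based revision step could look like this minimal sketch, which uses plain Levenshtein distance to pick the closest dictionary word; the patent's partial word index map is not reproduced here, and all names are illustrative:

```python
def edit_distance(a, b):
    """Levenshtein distance with a single rolling row,
    used here to score dictionary candidates."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            # prev holds the diagonal (old dp[j-1]) for the substitution cost.
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (ca != cb))  # substitution
    return dp[len(b)]

def revise(recognized, dictionary):
    """Replace a possibly misrecognized string with the closest dictionary word."""
    return min(dictionary, key=lambda w: edit_distance(recognized, w))
```

For instance, a recognized string "helo" is revised to "hello" when the dictionary contains it. A real system would prune candidates with an index (as the partial word index map suggests) instead of scanning the whole dictionary.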
-
公开(公告)号:KR1020110063075A
公开(公告)日:2011-06-10
申请号:KR1020090120014
申请日:2009-12-04
Applicant: 한국전자통신연구원
IPC: G06F3/01 , G06F3/0346
CPC classification number: G02B27/017 , G02B2027/0138 , G02B2027/014 , G02B2027/0178 , G02B2027/0187 , G06F3/012 , G06F3/017 , G06F3/0304 , G06F3/0346 , G06F3/038
Abstract: PURPOSE: A gesture input device and a gesture recognition method using the same are provided to perform related operations through gesture input by pointing. CONSTITUTION: A frame(100) is worn on the head of a user. An infrared camera(120) is installed on the frame. An image processing unit processes the images from the infrared camera. An inertial sensor senses the movement of the user's head. A communication processing unit transmits the signal of the image processing unit. An infrared pointer(200) is worn on a finger of the user.
-