1.
Publication Number: US11989977B2
Publication Date: 2024-05-21
Application Number: US17363365
Filing Date: 2021-06-30
Applicant: Purdue Research Foundation
Inventor: Karthik Ramani , Tianyi Wang , Xun Qian
CPC classification number: G06V40/28 , G06F3/011 , G06T7/215 , G06T13/40 , G06T19/006 , G06V20/20 , G06V40/103 , G06V40/23
Abstract: A system and method for authoring and implementing context-aware applications (CAPs) are disclosed. The system and method enable users to record their daily activities and then build and deploy customized CAPs onto augmented reality platforms, in which automated actions are performed in response to user-defined human actions. The system and method utilize an integrated augmented reality platform composed of multiple camera systems, which allows for non-intrusive recording of end users' activities and context detection while authoring and implementing CAPs. The system and method provide an augmented reality authoring interface for browsing, selecting, and editing recorded activities, and for creating flexible CAPs through spatial interaction and visual programming.
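The trigger-action pattern described in this abstract can be sketched as a minimal rule engine; all names here (`Rule`, `run_rules`, the example triggers and actions) are illustrative assumptions, not the patented system's API.

```python
# Minimal sketch of trigger-action logic for a context-aware application (CAP):
# a detected user-defined human action fires the automated actions bound to it.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Rule:
    trigger: str                 # a user-defined human action, e.g. "sits_on_couch"
    action: Callable[[], str]    # the automated response to perform


def run_rules(detected_action: str, rules: List[Rule]) -> List[str]:
    """Fire every rule whose trigger matches the detected human action."""
    return [rule.action() for rule in rules if rule.trigger == detected_action]


# Example rules a user might author visually in AR.
rules = [
    Rule("sits_on_couch", lambda: "turn_on_tv"),
    Rule("leaves_room", lambda: "lights_off"),
]
```

In the actual system the `detected_action` would come from camera-based context detection rather than a string passed in by hand.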
2.
Publication Number: US12288299B2
Publication Date: 2025-04-29
Application Number: US18480158
Filing Date: 2023-10-03
Applicant: Purdue Research Foundation
Inventor: Karthik Ramani , Zhengzhe Zhu , Ziyi Liu , Tianyi Wang
IPC: G06T19/00 , A63F13/577 , A63F13/63 , A63F13/655 , G06F3/0346 , G06F3/04815
Abstract: An augmented reality (AR) interaction authoring system is described. The AR interaction authoring system is configured to support the real-time creation of AR applications for AR-enhanced toys. The design of the AR interaction authoring system enables bidirectional interactions between the physical-virtual space of toys and AR. The AR interaction authoring system allows intuitive authoring of AR animations and toy actuations through programming by demonstration, with the physical toy serving as a contextual reference. Using a visual programming interface, users can create bidirectional interactions in which input on a toy triggers AR animations and vice versa. A plug-and-play IoT toolkit is also disclosed that includes hardware to actuate common toys. In this way, users can effortlessly integrate toys into the virtual world in an impromptu design process, without lengthy electronic prototyping.
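The bidirectional toy-AR interaction described above can be sketched with a simple event-bus model; `EventBus`, `bind_bidirectional`, and the event names are illustrative assumptions, not the toolkit's actual API.

```python
# Sketch of a bidirectional binding: a physical toy event triggers an AR
# animation, and an AR event actuates the toy, over one shared event bus.
class EventBus:
    def __init__(self):
        self.handlers = {}

    def on(self, event, handler):
        """Register a handler for an event name."""
        self.handlers.setdefault(event, []).append(handler)

    def emit(self, event):
        """Fire all handlers for an event and collect their results."""
        return [handler() for handler in self.handlers.get(event, [])]


def bind_bidirectional(bus, toy_event, ar_event, play_animation, actuate_toy):
    """Toy input triggers the AR animation; the AR event actuates the toy."""
    bus.on(toy_event, play_animation)
    bus.on(ar_event, actuate_toy)
```

In the disclosed system the toy-side events would come from the IoT toolkit's hardware rather than software callbacks.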
3.
Publication Number: US20240273949A1
Publication Date: 2024-08-15
Application Number: US18635112
Filing Date: 2024-04-15
Applicant: Purdue Research Foundation
Inventor: Karthik Ramani , Tianyi Wang , Xun Qian
CPC classification number: G06V40/28 , G06F3/011 , G06T7/215 , G06T13/40 , G06T19/006 , G06V20/20 , G06V40/103 , G06V40/23
Abstract: A system and method for authoring and implementing context-aware applications (CAPs) are disclosed. The system and method enable users to record their daily activities and then build and deploy customized CAPs onto augmented reality platforms, in which automated actions are performed in response to user-defined human actions. The system and method utilize an integrated augmented reality platform composed of multiple camera systems, which allows for non-intrusive recording of end users' activities and context detection while authoring and implementing CAPs. The system and method provide an augmented reality authoring interface for browsing, selecting, and editing recorded activities, and for creating flexible CAPs through spatial interaction and visual programming.
4.
Publication Number: US20240118786A1
Publication Date: 2024-04-11
Application Number: US18480134
Filing Date: 2023-10-03
Applicant: Purdue Research Foundation
Inventor: Karthik Ramani , Xun Qian , Tianyi Wang , Fengming He
IPC: G06F3/04815 , G06F3/01 , G06F3/04842 , G06F3/04845 , G06T7/194 , G06T7/73
CPC classification number: G06F3/04815 , G06F3/011 , G06F3/017 , G06F3/04842 , G06F3/04845 , G06T7/194 , G06T7/74 , G06T2207/20081 , G06T2207/30196
Abstract: A method and system for hand-object interaction dataset collection are described herein, configured to support user-specified collection of hand-object interaction datasets. Such datasets are useful, for example, for training 3D hand and object pose estimation models. The method and system adopt a sequential process of first recording hand and object pose labels by manipulating a virtual bounding box rather than a physical object. Naturally, hand-object occlusions do not occur during manipulation of the virtual bounding box, so these labels are provided with high accuracy. Subsequently, images of the hand-object interaction with the physical object are captured separately. These images are paired with the previously recorded hand and object pose labels.
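The final pairing step can be sketched as follows, assuming the pose labels and the later-captured images are frame-aligned sequences of equal length; `pair_dataset` and the field names are illustrative assumptions, not the described system's format.

```python
# Sketch of pairing separately captured images with previously recorded
# hand and object pose labels, aligned by frame index.
def pair_dataset(pose_labels, images):
    """Zip frame-indexed pose labels with the later-captured images."""
    if len(pose_labels) != len(images):
        raise ValueError("label and image sequences must have equal length")
    return [
        {"image": img, "hand_pose": lbl["hand"], "object_pose": lbl["object"]}
        for lbl, img in zip(pose_labels, images)
    ]
```

Each resulting record is one training sample: an occlusion-prone real image annotated with an occlusion-free label recorded earlier against the virtual bounding box.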
5.
Publication Number: US12153737B2
Publication Date: 2024-11-26
Application Number: US17814965
Filing Date: 2022-07-26
Applicant: Purdue Research Foundation
Inventor: Karthik Ramani , Tianyi Wang , Xun Qian , Fengming He
Abstract: An augmented reality (AR) application authoring system is disclosed. The AR application authoring system enables the real-time creation of interactive AR applications driven by freehand inputs. It enables intuitive authoring of customized freehand gesture inputs through embodied demonstration, using the surrounding environment as a contextual reference. A visual programming interface is provided with which users can define freehand interactions by matching freehand gestures with reactions of virtual AR assets. Thus, users can create personalized freehand interactions through simple trigger-action programming logic. Further, with the support of a real-time hand gesture detection algorithm, users can seamlessly test and iterate on the authored AR experience.
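One plausible core of such gesture matching is a nearest-template classifier over demonstrated gestures, assuming each gesture is summarized as a fixed-length feature vector (for example, joint angles); `classify_gesture` and the threshold are illustrative assumptions, not the patented detection algorithm.

```python
# Sketch: match a live hand pose against gesture templates recorded by
# demonstration, returning the closest gesture name or None if all are too far.
def classify_gesture(live, templates, threshold=1.0):
    """Return the name of the closest demonstrated gesture, or None."""
    def dist(a, b):
        # Euclidean distance between two equal-length feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    name, d = min(((n, dist(live, t)) for n, t in templates.items()),
                  key=lambda pair: pair[1])
    return name if d <= threshold else None
```

A recognized gesture name would then serve as the trigger in the visual programming interface, bound to a reaction of a virtual AR asset.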
6.
Publication Number: US12145267B2
Publication Date: 2024-11-19
Application Number: US17022216
Filing Date: 2020-09-16
Applicant: Purdue Research Foundation
Inventor: Karthik Ramani , Ke Huo , Yuanzhi Cao , Tianyi Wang
IPC: B25J9/16
Abstract: A system and method for authoring and performing Human-Robot-Collaborative (HRC) tasks are disclosed. The system and method adopt an embodied authoring approach in Augmented Reality (AR) for spatially editing actions and programming robots through demonstrative role-playing. The system and method utilize an intuitive workflow that externalizes the user's authoring as a demonstrative, editable AR ghost, allowing for spatially situated visual referencing, realistic animated simulation, and collaborative action guidance. The system and method utilize a dynamic time warping (DTW) based collaboration model that takes the real-time captured motion as input, maps it to the previously authored human actions, and outputs the corresponding robot actions to achieve adaptive collaboration.
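The DTW-based mapping from captured motion to authored actions can be sketched as follows for 1-D motion samples; `dtw_distance` and `match_action` are illustrative names and a simplification of the patented collaboration model, which would operate on full-body motion data.

```python
# Classic dynamic time warping (DTW): align two sequences that may differ in
# speed, and accumulate the minimal total distance along the alignment path.
def dtw_distance(captured, authored):
    """Return the DTW cost between two scalar motion sequences."""
    n, m = len(captured), len(authored)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(captured[i - 1] - authored[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a captured sample
                                 cost[i][j - 1],      # skip an authored sample
                                 cost[i - 1][j - 1])  # match the two samples
    return cost[n][m]


def match_action(captured, authored_actions):
    """Pick the authored action whose motion best matches the captured motion."""
    return min(authored_actions,
               key=lambda name: dtw_distance(captured, authored_actions[name]))
```

The matched action name would then index the corresponding robot action, so the robot adapts even when the human performs the motion at a different pace than it was authored.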
7.
Publication Number: US20240112424A1
Publication Date: 2024-04-04
Application Number: US18480158
Filing Date: 2023-10-03
Applicant: Purdue Research Foundation
Inventor: Karthik Ramani , Zhengzhe Zhu , Ziyi Liu , Tianyi Wang
IPC: G06T19/00 , A63F13/577 , A63F13/63 , A63F13/655 , G06F3/0346 , G06F3/04815
CPC classification number: G06T19/006 , A63F13/577 , A63F13/63 , A63F13/655 , G06F3/0346 , G06F3/04815 , G06T2200/24 , G06T2210/21
Abstract: An augmented reality (AR) interaction authoring system is described. The AR interaction authoring system is configured to support the real-time creation of AR applications for AR-enhanced toys. The design of the AR interaction authoring system enables bidirectional interactions between the physical-virtual space of toys and AR. The AR interaction authoring system allows intuitive authoring of AR animations and toy actuations through programming by demonstration, with the physical toy serving as a contextual reference. Using a visual programming interface, users can create bidirectional interactions in which input on a toy triggers AR animations and vice versa. A plug-and-play IoT toolkit is also disclosed that includes hardware to actuate common toys. In this way, users can effortlessly integrate toys into the virtual world in an impromptu design process, without lengthy electronic prototyping.
8.
Publication Number: US20230038709A1
Publication Date: 2023-02-09
Application Number: US17814965
Filing Date: 2022-07-26
Applicant: Purdue Research Foundation
Inventor: Karthik Ramani , Tianyi Wang , Xun Qian , Fengming He
Abstract: An augmented reality (AR) application authoring system is disclosed. The AR application authoring system enables the real-time creation of interactive AR applications driven by freehand inputs. It enables intuitive authoring of customized freehand gesture inputs through embodied demonstration, using the surrounding environment as a contextual reference. A visual programming interface is provided with which users can define freehand interactions by matching freehand gestures with reactions of virtual AR assets. Thus, users can create personalized freehand interactions through simple trigger-action programming logic. Further, with the support of a real-time hand gesture detection algorithm, users can seamlessly test and iterate on the authored AR experience.
9.
Publication Number: US20210406528A1
Publication Date: 2021-12-30
Application Number: US17363365
Filing Date: 2021-06-30
Applicant: Purdue Research Foundation
Inventor: Karthik Ramani , Tianyi Wang , Xun Qian
Abstract: A system and method for authoring and implementing context-aware applications (CAPs) are disclosed. The system and method enable users to record their daily activities and then build and deploy customized CAPs onto augmented reality platforms, in which automated actions are performed in response to user-defined human actions. The system and method utilize an integrated augmented reality platform composed of multiple camera systems, which allows for non-intrusive recording of end users' activities and context detection while authoring and implementing CAPs. The system and method provide an augmented reality authoring interface for browsing, selecting, and editing recorded activities, and for creating flexible CAPs through spatial interaction and visual programming.
10.
Publication Number: US20250058461A1
Publication Date: 2025-02-20
Application Number: US18937756
Filing Date: 2024-11-05
Applicant: Purdue Research Foundation
Inventor: Karthik Ramani , Ke Huo , Yuanzhi Cao , Tianyi Wang
IPC: B25J9/16
Abstract: A system and method for authoring and performing Human-Robot-Collaborative (HRC) tasks are disclosed. The system and method adopt an embodied authoring approach in Augmented Reality (AR) for spatially editing actions and programming robots through demonstrative role-playing. The system and method utilize an intuitive workflow that externalizes the user's authoring as a demonstrative, editable AR ghost, allowing for spatially situated visual referencing, realistic animated simulation, and collaborative action guidance. The system and method utilize a dynamic time warping (DTW) based collaboration model that takes the real-time captured motion as input, maps it to the previously authored human actions, and outputs the corresponding robot actions to achieve adaptive collaboration.