Invention Grant
- Patent Title: Systems and methods for a transformer network with tree-based attention for natural language processing
- Application No.: US16581035
- Application Date: 2019-09-24
- Publication No.: US11615240B2
- Publication Date: 2023-03-28
- Inventor: Xuan Phi Nguyen, Shafiq Rayhan Joty, Chu Hong Hoi
- Applicant: salesforce.com, inc.
- Applicant Address: San Francisco, CA, US
- Assignee: salesforce.com, inc.
- Current Assignee: salesforce.com, inc.
- Current Assignee Address: San Francisco, CA, US
- Agency: Haynes and Boone, LLP
- Main IPC: G06F40/205
- IPC: G06F40/205

Abstract:
Embodiments described herein provide an attention-based tree encoding mechanism. Specifically, the attention layer receives as input the pre-parsed constituency tree of a sentence and the lower-layer representations of all nodes. The attention layer then performs upward accumulation to encode the tree structure from leaves to the root in a bottom-up fashion. Afterwards, weighted aggregation is used to compute the final representations of non-terminal nodes.
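The two-stage flow the abstract describes, a bottom-up "upward accumulation" over a pre-parsed constituency tree followed by a weighted aggregation of child representations, can be illustrated with a minimal sketch. This is not the patented mechanism itself: the tree encoding, the mean-of-children query, and the function names here are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def upward_accumulate(tree, leaf_vecs, d):
    """Encode a constituency tree bottom-up (leaves to root).

    Each non-terminal's representation is a weighted aggregation
    (attention-style softmax) of its children's representations.
    `tree` maps each non-terminal to its list of children; any node
    absent from `tree` is a leaf with a vector in `leaf_vecs`.
    The mean-of-children query is an illustrative choice, not the
    patent's formulation.
    """
    vecs = {}

    def visit(node):
        if node not in tree:                  # leaf: use its input vector
            vecs[node] = leaf_vecs[node]
            return vecs[node]
        children = np.stack([visit(c) for c in tree[node]])
        query = children.mean(axis=0)         # hypothetical query vector
        scores = children @ query / np.sqrt(d)
        vecs[node] = softmax(scores) @ children
        return vecs[node]

    visit('ROOT')
    return vecs

# Usage: a toy tree for "the cat sat"
tree = {'ROOT': ['NP', 'VP'], 'NP': ['the', 'cat'], 'VP': ['sat']}
rng = np.random.default_rng(0)
leaves = {w: rng.normal(size=4) for w in ('the', 'cat', 'sat')}
vecs = upward_accumulate(tree, leaves, d=4)
```

After the pass, `vecs` holds one vector per node, so every non-terminal (including the root) has a representation composed only from the subtree beneath it, mirroring the leaves-to-root direction described in the abstract.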
Public/Granted literature
- US20210049236A1: SYSTEMS AND METHODS FOR A TRANSFORMER NETWORK WITH TREE-BASED ATTENTION FOR NATURAL LANGUAGE PROCESSING (published 2021-02-18)