Invention Grant
- Patent Title: Neural machine translation with latent tree attention
- Application No.: US15901722
- Application Date: 2018-02-21
- Publication No.: US10565318B2
- Publication Date: 2020-02-18
- Inventor: James Bradbury
- Applicant: salesforce.com, inc.
- Applicant Address: US CA San Francisco
- Assignee: salesforce.com, inc.
- Current Assignee: salesforce.com, inc.
- Current Assignee Address: US CA San Francisco
- Agency: Haynes and Boone, LLP
- Main IPC: G06F17/28
- IPC: G06F17/28 ; G06F17/27 ; G06N3/08 ; G06N3/04 ; G06N5/00

Abstract:
We introduce an attentional neural machine translation model that accomplishes a longstanding goal of natural language processing: taking advantage of the hierarchical structure of language without a priori annotation. The model pairs a recurrent neural network grammar (RNNG) encoder with a novel attentional RNNG decoder and applies policy gradient reinforcement learning to induce unsupervised tree structures over both the source and target sequences. When trained on character-level datasets with no explicit segmentation or parse annotation, the model learns a plausible segmentation and shallow parse, obtaining performance close to an attentional baseline.
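The mechanism described in the abstract, inducing latent tree structure with policy gradient reinforcement learning rather than parse annotation, can be illustrated with a minimal sketch. The sketch below is not the patented implementation: the class name LatentTreeEncoder, the SHIFT/REDUCE action set, the compose and policy layers, the dimensions, and the scalar reward standing in for the attentional RNNG decoder's translation score are all illustrative assumptions.

```python
# Minimal sketch (illustrative, not the patented system): an encoder that
# induces a latent binary tree over a character sequence via SHIFT/REDUCE
# actions and is trained with REINFORCE, so the tree structure is learned
# without parse annotation. The reward is a placeholder for the downstream
# translation objective produced by the attentional decoder.
import torch
import torch.nn as nn

class LatentTreeEncoder(nn.Module):
    def __init__(self, vocab_size, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.compose = nn.Linear(2 * dim, dim)   # REDUCE: merge top two stack items
        self.policy = nn.Linear(2 * dim, 2)      # logits for SHIFT vs REDUCE

    def forward(self, chars):
        buffer = list(self.embed(chars))          # character embeddings awaiting SHIFT
        stack, log_probs = [], []
        while buffer or len(stack) > 1:
            can_shift, can_reduce = bool(buffer), len(stack) >= 2
            if can_shift and can_reduce:
                # Only genuine choice points are sampled and credited by REINFORCE.
                state = torch.cat([stack[-1], stack[-2]])
                dist = torch.distributions.Categorical(logits=self.policy(state))
                action = dist.sample()
                log_probs.append(dist.log_prob(action))
            else:
                action = torch.tensor(0 if can_shift else 1)  # forced action
            if action.item() == 0:
                stack.append(buffer.pop(0))                   # SHIFT
            else:
                right, left = stack.pop(), stack.pop()
                stack.append(torch.tanh(self.compose(torch.cat([left, right]))))  # REDUCE
        return stack[0], log_probs                # root representation, policy log-probs

# Usage: in the full model the reward would be the attentional decoder's
# translation log-likelihood; here it is a placeholder scalar.
enc = LatentTreeEncoder(vocab_size=128)
root, log_probs = enc(torch.randint(0, 128, (10,)))
reward = torch.tensor(1.0)                                    # stand-in reward
policy_loss = -reward * torch.stack(log_probs).sum()          # REINFORCE term
policy_loss.backward()
```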
Public/Granted literature
- US20180300317A1, NEURAL MACHINE TRANSLATION WITH LATENT TREE ATTENTION, publication date: 2018-10-18