Invention Grant
- Patent Title: Implementing a whole sentence recurrent neural network language model for natural language processing
- Application No.: US15954399
- Application Date: 2018-04-16
- Publication No.: US10431210B1
- Publication Date: 2019-10-01
- Inventor: Yinghui Huang, Abhinav Sethy, Kartik Audhkhasi, Bhuvana Ramabhadran
- Applicant: International Business Machines Corporation
- Applicant Address: Armonk, NY, US
- Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
- Current Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
- Current Assignee Address: Armonk, NY, US
- Agent: Amy J. Pattillo; Feb Cabrasawan
- Main IPC: G10L15/197
- IPC: G10L15/197 ; G10L15/16 ; G06N3/08 ; G10L15/22 ; G06N7/00 ; G10L15/06

Abstract:
A whole sentence recurrent neural network (RNN) language model (LM) is provided for estimating a probability of likelihood of each whole sentence processed by natural language processing being correct. A noise contrastive estimation sampler is applied against at least one entire sentence from a corpus of multiple sentences to generate at least one incorrect sentence. The whole sentence RNN LM is trained, using the at least one entire sentence from the corpus and the at least one incorrect sentence, to distinguish the at least one entire sentence as correct. The whole sentence recurrent neural network language model is applied to estimate the probability of likelihood of each whole sentence processed by natural language processing being correct.
Public/Granted literature
- US20190318732A1 IMPLEMENTING A WHOLE SENTENCE RECURRENT NEURAL NETWORK LANGUAGE MODEL FOR NATURAL LANGUAGE PROCESSING Public/Granted day: 2019-10-17