Invention Grant
- Patent Title: Implementing a whole sentence recurrent neural network language model for natural language processing
- Application No.: US16549893
- Application Date: 2019-08-23
- Publication No.: US10692488B2
- Publication Date: 2020-06-23
- Inventor: Yinghui Huang , Abhinav Sethy , Kartik Audhkhasi , Bhuvana Ramabhadran
- Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
- Applicant Address: Armonk, NY, US
- Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
- Current Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
- Current Assignee Address: Armonk, NY, US
- Agent: Amy J. Pattillo; Feb Cabrasawan
- Main IPC: G10L15/197
- IPC: G10L15/197 ; G10L15/16 ; G06N3/08 ; G10L15/22 ; G06N7/00 ; G10L15/06

Abstract:
A computer selects a test set of sentences from among the sentences used to train a whole sentence recurrent neural network language model, which estimates the likelihood that each whole sentence processed by natural language processing is correct. The computer generates imposter sentences from the test set of sentences by substituting one word in each sentence of the test set. The computer generates, through the whole sentence recurrent neural network language model, a first score for each sentence of the test set and at least one additional score for each of the imposter sentences. The computer evaluates the accuracy of the natural language processing system in performing sequential classification tasks based on how well the first score reflects a correct sentence and the at least one additional score reflects an incorrect sentence.
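The evaluation described in the abstract can be illustrated with a short sketch. This is not the patented implementation; the whole sentence recurrent neural network language model is represented by a hypothetical score_sentence callable, and make_imposters, evaluate_accuracy, and the vocabulary parameter are illustrative names introduced here.

```python
# Sketch of the imposter-sentence evaluation outlined in the abstract:
# create imposters by substituting one word per test sentence, score the
# original and the imposters with the (assumed) whole-sentence LM scorer,
# and count how often the original sentence receives the highest score.
import random
from typing import Callable, List, Optional, Sequence


def make_imposters(sentence: List[str],
                   vocabulary: Sequence[str],
                   num_imposters: int = 1,
                   rng: Optional[random.Random] = None) -> List[List[str]]:
    """Create imposter sentences by substituting exactly one word."""
    rng = rng or random.Random(0)
    imposters = []
    for _ in range(num_imposters):
        imposter = list(sentence)
        position = rng.randrange(len(imposter))
        # Replace the chosen word with a different word from the vocabulary.
        candidates = [w for w in vocabulary if w != imposter[position]]
        imposter[position] = rng.choice(candidates)
        imposters.append(imposter)
    return imposters


def evaluate_accuracy(test_set: List[List[str]],
                      vocabulary: Sequence[str],
                      score_sentence: Callable[[List[str]], float]) -> float:
    """Fraction of test sentences scored higher than all of their imposters."""
    correct = 0
    for sentence in test_set:
        original_score = score_sentence(sentence)
        imposter_scores = [score_sentence(imp)
                           for imp in make_imposters(sentence, vocabulary)]
        if original_score > max(imposter_scores):
            correct += 1
    return correct / len(test_set)
```

In this reading, a high accuracy value indicates that the whole sentence model consistently assigns better scores to correct sentences than to their one-word-substituted imposters.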
Public/Granted literature
- US20200013393A1 IMPLEMENTING A WHOLE SENTENCE RECURRENT NEURAL NETWORK LANGUAGE MODEL FOR NATURAL LANGUAGE PROCESSING (Publication Date: 2020-01-09)