Invention Grant
- Patent Title: Method for training a linguistic model and electronic device
- Application No.: US17451380
- Application Date: 2021-10-19
- Publication No.: US11900918B2
- Publication Date: 2024-02-13
- Inventors: Liao Zhang, Zhengxiang Jiang, Xiaoyin Fu
- Applicant: BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD.
- Applicant Address: CN Beijing
- Assignee: BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD.
- Current Assignee: BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD.
- Current Assignee Address: CN Beijing
- Agency: Osha Bergman Watanabe & Burton LLP
- Priority: CN 2011165544.5, 2020-10-27
- Main IPC: G10L15/06
- IPC: G10L15/06 ; G06F40/253 ; G06F40/30

Abstract:
The present disclosure provides a method for training a linguistic model, related to the fields of speech, natural language processing, and deep learning technologies. The method includes: obtaining grammars corresponding to a plurality of sample texts and a slot value of a slot in each grammar through semantic analysis; generating a grammar graph corresponding to each grammar based on the corresponding grammar and the slot value of the slot in the corresponding grammar; obtaining a weight of each grammar, a weight of each slot, and a weight of each slot value in each grammar graph based on the sample texts; determining at least one grammar frequency of each order based on the weight of each grammar, the weight of each slot, and the weight of each slot value in each grammar graph; and training the linguistic model based on the at least one grammar frequency of each order.
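The abstract describes a five-step pipeline. Below is a minimal, illustrative Python sketch of such a pipeline under simplified assumptions: the grammar notation ("order a <food>"), the toy slot inventory, the uniform weighting scheme, and the bigram model stand-in are all assumptions made for illustration, not the patented implementation.

```python
from collections import Counter, defaultdict
from itertools import product

# Step 1: sample texts plus the grammars / slot values that semantic analysis
# would produce for them (hard-coded toy data here, purely illustrative).
sample_texts = ["order a pizza", "order a burger", "play some jazz"]
grammars = {
    "order a <food>": {"<food>": ["pizza", "burger"]},
    "play some <genre>": {"<genre>": ["jazz"]},
}

# Step 2: expand each grammar into its grammar graph, represented simply as
# the set of token paths through the graph.
def expand(grammar, slots):
    options = [slots.get(tok, [tok]) for tok in grammar.split()]
    return [list(path) for path in product(*options)]

# Step 3: estimate a weight for each grammar from how often its paths cover
# the sample texts (slot / slot-value weights are folded in uniformly here).
raw = {g: sum(" ".join(p) in sample_texts for p in expand(g, s))
       for g, s in grammars.items()}
total = sum(raw.values()) or 1
grammar_weight = {g: c / total for g, c in raw.items()}

# Step 4: accumulate weighted n-gram frequencies ("grammar frequency of each
# order") for orders n = 1 and n = 2.
ngram_freq = defaultdict(Counter)
for g, slots in grammars.items():
    paths = expand(g, slots)
    w = grammar_weight[g] / len(paths)          # spread weight over the paths
    for path in paths:
        for n in (1, 2):
            for i in range(len(path) - n + 1):
                ngram_freq[n][tuple(path[i:i + n])] += w

# Step 5: "train" the linguistic model; here, maximum-likelihood bigram
# probabilities derived from the weighted frequencies.
def bigram_prob(w1, w2):
    denom = ngram_freq[1][(w1,)]
    return ngram_freq[2][(w1, w2)] / denom if denom else 0.0

if __name__ == "__main__":
    print(f"P(a | order) = {bigram_prob('order', 'a'):.2f}")   # -> 1.00
```

The sketch only shows how grammar-graph expansion and grammar weights can feed weighted n-gram counts into a language model; the claimed method covers the specific weighting and training described in the specification.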
Public/Granted literature
- US20220036880A1 METHOD FOR TRAINING A LINGUISTIC MODEL AND ELECTRONIC DEVICE, Publication Date: 2022-02-03