Invention Grant
- Patent Title: Pretraining utilizing software dependencies
- Application No.: US16813778
- Application Date: 2020-03-10
- Publication No.: US11262985B2
- Publication Date: 2022-03-01
- Inventors: Yan Luo, Liujia Shao, Yan Xu, Sibin Fan
- Applicant: International Business Machines Corporation
- Applicant Address: Armonk, NY, US
- Assignee: International Business Machines Corporation
- Current Assignee: International Business Machines Corporation
- Current Assignee Address: Armonk, NY, US
- Agent: Randy E. Tejeda
- Main IPC: G06F9/44
- IPC: G06F9/44 ; G06F8/33 ; G06N3/02 ; G06N20/00 ; G06F8/73

Abstract:
In an approach to creating code snippet auto-commenting models utilizing a pre-training model leveraging dependency data, one or more computer processors create a generalized pre-training model trained with one or more dependencies and one or more associated dependency embeddings, wherein dependencies include frameworks, imported libraries, header files, and application programming interfaces associated with a software project. The one or more computer processors create a subsequent model with a model architecture identical to the created pre-training model. The one or more computer processors computationally reduce a training of the created subsequent model utilizing one or more trained parameters, activations, memory cells, and context vectors contained in the created pre-training model. The one or more computer processors deploy the subsequent model to one or more production environments.
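The transfer-learning flow the abstract describes can be illustrated with a minimal, self-contained sketch. All names here (`make_model`, `pretrain`, `transfer`, the toy dependency vocabulary) are illustrative assumptions, not taken from the patent; the point is only the shape of the approach: train dependency embeddings in a pre-training model, then initialize an identically structured subsequent model from those trained parameters so its own training is computationally reduced.

```python
# Hypothetical sketch of the patent's flow (all names are illustrative):
# a pre-training "model" maps dependency names (frameworks, imported
# libraries, header files, APIs) to embedding vectors; a subsequent
# model with an identical architecture reuses those trained parameters.

def make_model(vocab, dim=4):
    """Create a model: per-dependency embeddings plus an output head."""
    return {
        "embeddings": {dep: [0.0] * dim for dep in vocab},
        "head": [0.0] * dim,
    }

def pretrain(model, projects):
    """Toy stand-in for pre-training: accumulate a signal into each
    dependency's embedding for every project that uses it."""
    for deps in projects:
        for dep in deps:
            vec = model["embeddings"][dep]
            for i in range(len(vec)):
                vec[i] += 1.0 / (i + 1)
    return model

def transfer(pretrained, vocab):
    """Build the subsequent model with an identical architecture and
    initialize it from the pre-trained parameters (copied, not shared),
    reducing the training the subsequent model must perform."""
    subsequent = make_model(vocab)
    for dep, vec in pretrained["embeddings"].items():
        subsequent["embeddings"][dep] = list(vec)  # reuse trained weights
    return subsequent

# Example: two toy "software projects" described by their dependencies.
vocab = {"numpy", "torch", "stdio.h"}
projects = [["numpy", "torch"], ["numpy", "stdio.h"]]
base = pretrain(make_model(vocab), projects)
fine = transfer(base, vocab)
```

In a real system the models would be neural networks (the IPC codes G06N3/02 and G06N20/00 point to neural-network and machine-learning implementations) and the copied state would include the activations, memory cells, and context vectors the abstract lists; the dictionary-based model above only mirrors the parameter-reuse step.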
Public/Granted literature
- US20210286598A1 PRETRAINING UTILIZING SOFTWARE DEPENDENCIES, Publication Date: 2021-09-16