Invention Grant
- Patent Title: Depth-constrained knowledge distillation for inference on encrypted data
- Application No.: US16907578
- Application Date: 2020-06-22
- Publication No.: US11599806B2
- Publication Date: 2023-03-07
- Inventor: Kanthi Sarpatwar, Nalini K. Ratha, Karthikeyan Shanmugam, Karthik Nandakumar, Sharathchandra Pankanti, Roman Vaculin, James Thomas Rayfield
- Applicant: International Business Machines Corporation
- Applicant Address: Armonk, NY, US
- Assignee: International Business Machines Corporation
- Current Assignee: International Business Machines Corporation
- Current Assignee Address: Armonk, NY, US
- Agent: Jeffrey S. LaBaw; David H. Judson
- Main IPC: G06N5/04
- IPC: G06N5/04; H04L9/00; G06N3/04; G06K9/62

Abstract:
This disclosure provides a method, apparatus, and computer program product for creating a fully homomorphic encryption (FHE)-friendly machine learning model. The approach leverages a knowledge distillation framework in which the FHE-friendly (student) ML model closely mimics the predictions of a more complex (teacher) model that has been pre-trained on large datasets. The distillation framework uses the teacher model to facilitate training of the FHE-friendly student model, but with synthetically generated training data in lieu of the original datasets used to train the teacher.
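The distillation scheme summarized in the abstract can be sketched as follows. This is a minimal illustration, not the patented method: the model sizes, the square activation chosen as the FHE-friendly nonlinearity, the Gaussian synthetic-data generator, and the hand-written training loop are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "teacher": a small fixed ReLU network (illustrative only; in the
# patent the teacher is a complex model pre-trained on large datasets).
W1, b1 = rng.normal(size=(4, 8)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 2)), rng.normal(size=2)

def teacher(x):
    # ReLU involves comparisons, which are expensive under FHE.
    return np.maximum(x @ W1 + b1, 0.0) @ W2 + b2

# FHE-friendly "student": fixed low multiplicative depth, with a square
# activation so inference uses only additions and multiplications.
S1 = rng.normal(size=(4, 6)) * 0.5
S2 = rng.normal(size=(6, 2)) * 0.5

def student(x):
    z = x @ S1
    return (z * z) @ S2

def distill_loss(x):
    d = student(x) - teacher(x)          # mimic the teacher's soft outputs
    return float((d * d).mean())

x_eval = rng.normal(size=(256, 4))       # held-out synthetic batch
loss_init = distill_loss(x_eval)

# Distillation loop: the student trains on synthetically generated inputs,
# never touching the teacher's original (possibly private) training data.
lr = 1e-3
for _ in range(4000):
    x = rng.normal(size=(64, 4))         # synthetic training batch
    t = teacher(x)
    z = x @ S1
    h = z * z
    d = h @ S2 - t
    gy = 2.0 * d / d.size                # d(MSE)/d(student output)
    gz = (gy @ S2.T) * 2.0 * z           # backprop through square activation
    S2 -= lr * h.T @ gy
    S1 -= lr * x.T @ gz

loss_final = distill_loss(x_eval)        # lower than loss_init after training
```

Because the trained student evaluates only linear layers and squarings, its inference circuit has a small, fixed multiplicative depth and can be run directly on ciphertexts under an FHE scheme.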
Public/Granted literature
- US20210397988A1 DEPTH-CONSTRAINED KNOWLEDGE DISTILLATION FOR INFERENCE ON ENCRYPTED DATA Public/Granted day: 2021-12-23