Invention Grant
- Patent Title: System and method for privacy-preserving distributed training of machine learning models on distributed datasets
- Application No.: US17998120
- Application Date: 2020-05-08
- Publication No.: US12206758B2
- Publication Date: 2025-01-21
- Inventors: David Froelicher, Juan Ramon Troncoso-Pastoriza, Apostolos Pyrgelis, Sinem Sav, Joao Gomes De Sa E Sousa, Jean-Pierre Hubaux, Jean-Philippe Bossuat
- Applicant: Ecole Polytechnique Federale De Lausanne (EPFL)
- Applicant Address: Lausanne, Switzerland (CH)
- Assignee: Ecole Polytechnique Federale De Lausanne (EPFL)
- Current Assignee: Ecole Polytechnique Federale De Lausanne (EPFL)
- Current Assignee Address: Lausanne, Switzerland (CH)
- Agency: Hovey Williams LLP
- Agent: Kameron D. Kelly
- International Application: PCT/EP2020/062810 (WO) 2020-05-08
- International Publication: WO2021/223873 (WO) 2021-11-11
- Main IPC: H04L9/00
- IPC: H04L9/00 ; G06N3/098

Abstract:
A system for privacy-preserving distributed training of a global model on distributed datasets comprises a plurality of communicatively coupled data providers. Each data provider has a local model and a local training dataset for training the local model using an iterative training algorithm. Each data provider further holds a portion of a cryptographic distributed secret key and the corresponding collective cryptographic public key of a multiparty fully homomorphic encryption scheme. All models are encrypted with the collective public key. Each data provider trains its local model on its local training dataset and combines the local model with the current global model into a current local model. One data provider homomorphically combines the current local models into a combined model and updates the current global model based on the combined model. The updated global model is provided to at least a subset of the other data providers.
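The abstract describes one global training round: local training at each data provider, combination of each local model with the current global model, homomorphic aggregation of the current local models by a single data provider, and redistribution of the updated global model. The Python sketch below illustrates only that data flow under stated assumptions; `MockCiphertext`, `encrypt`, `collective_decrypt`, and `local_training_step` are illustrative stand-ins using plain arithmetic, not an actual multiparty fully homomorphic encryption scheme, and they do not reproduce the patented method in which all models remain encrypted throughout.

```python
# Minimal sketch of the round structure from the abstract (assumed names only).
# In the patented system the models stay encrypted under the collective public
# key; here plaintext stand-ins mimic the homomorphic combination step.

import random


class MockCiphertext:
    """Stand-in for a ciphertext under the collective public key.
    Exposes only the additive homomorphism used for model combination."""
    def __init__(self, values):
        self.values = list(values)

    def __add__(self, other):
        return MockCiphertext(a + b for a, b in zip(self.values, other.values))

    def scale(self, factor):
        return MockCiphertext(v * factor for v in self.values)


def encrypt(weights):
    # Stand-in for encryption under the collective public key.
    return MockCiphertext(weights)


def collective_decrypt(ciphertext):
    # Stand-in for decryption with all secret-key shares (mocked).
    return ciphertext.values


def local_training_step(weights, dataset, lr=0.1):
    """One iteration of a toy training algorithm (gradient step on squared error)."""
    grad = [0.0] * len(weights)
    for x, y in dataset:
        pred = sum(w * xi for w, xi in zip(weights, x))
        err = pred - y
        for i, xi in enumerate(x):
            grad[i] += 2 * err * xi / len(dataset)
    return [w - lr * g for w, g in zip(weights, grad)]


# --- one global training round over N data providers -----------------------
N, DIM = 3, 2
datasets = [[([random.random(), 1.0], random.random()) for _ in range(10)]
            for _ in range(N)]

global_model = encrypt([0.0] * DIM)  # global model kept under the collective key

# Each data provider trains its local model and combines it with the current
# global model into a current local model (plaintext stand-in for encrypted ops).
current_locals = []
for dp in range(N):
    local = collective_decrypt(global_model)
    local = local_training_step(local, datasets[dp])
    current_locals.append(encrypt(local))

# One data provider homomorphically combines the current local models ...
combined = current_locals[0]
for ct in current_locals[1:]:
    combined = combined + ct

# ... updates the current global model (here by averaging), and provides it
# to at least a subset of the other data providers.
global_model = combined.scale(1.0 / N)
print(collective_decrypt(global_model))
```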