System and method for decoding Reed-Muller codes

    Publication number: US11736124B2

    Publication date: 2023-08-22

    Application number: US17422923

    Application date: 2020-01-17

    CPC classification number: H03M13/136 H03M13/43 H03M13/253 H03M13/2948

    Abstract: Various embodiments are directed to Reed-Muller decoding systems and methods based on recursive projections and aggregations of cosets decoding, exploiting the self-similarity of RM codes, and extended with list-decoding procedures and with outer-code concatenations. Various embodiments are configured for decoding RM codes (and variants thereof) over binary input memoryless channels, such as by, for each received word of RM encoded data, projecting the received word onto each of a plurality of cosets of different subspaces to form thereby a respective plurality of projected words; recursively decoding each of the respective plurality of projected words to form a respective plurality of decoded projected words; and aggregating each of the respective decoded projected words to obtain thereby a decoding of the corresponding received word of RM encoded data.
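The project/decode/aggregate loop described in the abstract can be illustrated with a minimal hard-decision sketch of recursive projection-aggregation (RPA) decoding. This is not the patented implementation: it assumes projections onto cosets of the two-element subspaces {0, b}, a fast Walsh-Hadamard-transform base case for first-order codes, and simple majority-vote aggregation; all function names and parameters are illustrative.

```python
import numpy as np

def hadamard_decode_rm1(y, m):
    """ML-decode a hard-decision word as a first-order RM(1, m) codeword
    via the fast Walsh-Hadamard transform."""
    n = 1 << m
    h = (1 - 2 * y.astype(int)).copy()   # map bits {0,1} -> {+1,-1}
    step = 1
    while step < n:                      # in-place fast Walsh-Hadamard transform
        for i in range(0, n, 2 * step):
            for j in range(i, i + step):
                a, b = h[j], h[j + step]
                h[j], h[j + step] = a + b, a - b
        step *= 2
    a = int(np.argmax(np.abs(h)))        # best linear part <a, z>
    u0 = 1 if h[a] < 0 else 0            # affine constant term
    lin = np.array([bin(a & z).count("1") & 1 for z in range(n)])
    return (lin + u0) % 2

def rpa_decode(y, m, r, max_iter=3):
    """Hard-decision recursive projection-aggregation decoding sketch
    for RM(r, m): project onto cosets of each subspace {0, b}, recursively
    decode the projected words, then aggregate by majority vote."""
    if r == 1:
        return hadamard_decode_rm1(y, m)
    n = 1 << m
    y = y.copy()
    for _ in range(max_iter):
        votes = np.zeros(n)
        for b in range(1, n):
            # cosets of {0, b}: representative of {z, z^b} is min(z, z^b)
            reps = sorted({min(z, z ^ b) for z in range(n)})
            idx = {rep: i for i, rep in enumerate(reps)}
            proj = np.array([y[rep] ^ y[rep ^ b] for rep in reps])
            dec = rpa_decode(proj, m - 1, r - 1, max_iter)
            for z in range(n):
                votes[z] += 1 if (dec[idx[min(z, z ^ b)]] ^ y[z ^ b]) else -1
        new_y = (votes > 0).astype(int)
        if np.array_equal(new_y, y):     # fixed point reached
            break
        y = new_y
    return y

# demo: RM(2, 4) codeword (evaluations of z0*z1 + z3 over F_2^4), one bit flipped
m, r = 4, 2
c = np.array([((z & 1) & ((z >> 1) & 1)) ^ ((z >> 3) & 1) for z in range(16)])
y = c.copy()
y[5] ^= 1                                # single channel error
decoded = rpa_decode(y, m, r)
```

In the demo, each projection of the corrupted word contains at most one flipped bit, which the RM(1, 3) base case corrects exactly, so a single aggregation round restores the codeword; soft-decision and list-decoding variants refine this same recursion.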

System and method for privacy-preserving distributed training of machine learning models on distributed datasets

    Publication number: US20230188319A1

    Publication date: 2023-06-15

    Application number: US17998120

    Application date: 2020-05-08

    CPC classification number: H04L9/008 G06N3/098

    Abstract: A computer-implemented method and a distributed computer system (100) for privacy-preserving distributed training of a global model on distributed datasets (DS1 to DSn). The system has a plurality of communicatively coupled data providers (DP1 to DPn). Each data provider has a respective local model (LM1 to LMn) and a respective local training dataset (DS1 to DSn) for training the local model using an iterative training algorithm (IA). Further, each data provider has a portion of a cryptographic distributed secret key (SK1 to SKn) and a corresponding collective cryptographic public key (CPK) of a multiparty fully homomorphic encryption scheme, with the local and global models being encrypted with the collective public key. Each data provider (DP1) trains its local model (LM1) using the respective local training dataset (DS1) by executing gradient descent updates of its local model (LM1), and combining (1340) the updated local model (LM1′) with the current global model (GM) into a current local model (LM1c). At least one data provider homomorphically combines at least a subset of the current local models of at least a subset of the data providers into a combined model (CM1), and updates the current global model (GM) based on the combined model. The updated global model is provided to at least a subset of the other data providers.
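The training loop the abstract describes can be illustrated with a plaintext stand-in. The sketch below simulates only the federated data flow (local gradient steps, blending the updated local model with the global model, averaging across providers) on a toy linear-regression task in NumPy; in the patented scheme these model exchanges are carried out under the multiparty fully homomorphic encryption scheme with the collective public key, which this toy omits entirely. The dataset sizes, learning rate, 50/50 blend, and round count are illustrative assumptions, not values from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 data providers, each holding a private shard of a
# linear-regression dataset (stand-ins for DS1..DSn in the abstract).
TRUE_W = np.array([2.0, -1.0, 0.5])
providers = []
for _ in range(3):
    X = rng.normal(size=(200, 3))
    y = X @ TRUE_W + 0.01 * rng.normal(size=200)
    providers.append((X, y))

def local_update(w_global, X, y, lr=0.1, steps=5):
    """A provider runs a few gradient-descent steps on its own shard.
    In the patented scheme these updates happen on ciphertexts under the
    collective public key; here they are plaintext."""
    w = w_global.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

w_global = np.zeros(3)
for _ in range(30):
    # Each provider trains locally, then blends the updated local model
    # with the current global model (the abstract's "current local model").
    locals_ = [0.5 * local_update(w_global, X, y) + 0.5 * w_global
               for X, y in providers]
    # One provider homomorphically combines the encrypted current local
    # models; in this plaintext toy that collapses to a simple average,
    # which is then broadcast as the updated global model.
    w_global = np.mean(locals_, axis=0)
```

Since no provider ever shares raw data, only (in the real scheme, encrypted) model parameters cross the network; collective decryption of the final global model would require the key shares SK1 to SKn held by the parties.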
