-
Publication No.: AU2021269906A1
Publication Date: 2022-10-27
Application No.: AU2021269906
Filing Date: 2021-04-13
Applicant: IBM
Inventor: RIEGEL RYAN , LUUS FRANCOIS , AKHALWAYA ISMAIL YUNUS , KHAN NAWEED AGHMAD , MAKONDO NDIVHUWO , BARAHONA FRANCISCO , GRAY ALEXANDER
IPC: G06N3/04
Abstract: A system for configuring and using a logical neural network including a graph syntax tree of formulae in a represented knowledgebase connected to each other via nodes representing each proposition. One neuron exists for each logical connective occurring in each formula and, additionally, one neuron for each unique proposition occurring in any formula. All neurons return pairs of values representing upper and lower bounds on truth values of their corresponding subformulae and propositions. Neurons corresponding to logical connectives accept as input the output of neurons corresponding to their operands and have activation functions configured to match the connectives' truth functions. Neurons corresponding to propositions accept as input the output of neurons established as proofs of bounds on the propositions' truth values and have activation functions configured to aggregate the tightest such bounds. Bidirectional inference permits every occurrence of each proposition in each formula to be used as a potential proof.
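The abstract's bounds-propagating connective neurons can be illustrated with a small sketch. This is not the patent's exact formulation: the weighted Łukasiewicz-style AND activation, the `weights`/`beta` parameters, and the function names are assumptions chosen for illustration; the key idea taken from the abstract is that each neuron returns a pair of lower and upper bounds on the truth value of its subformula.

```python
def clamp(x, lo=0.0, hi=1.0):
    """Clamp a real value into the unit interval of truth values."""
    return max(lo, min(hi, x))

def and_activation(inputs, weights, beta=1.0):
    """Assumed weighted real-valued AND: beta - sum(w_i * (1 - x_i)), clamped to [0, 1].
    With beta = 1 and unit weights this reduces to Lukasiewicz AND."""
    return clamp(beta - sum(w * (1.0 - x) for w, x in zip(weights, inputs)))

def and_neuron(bounds, weights, beta=1.0):
    """An AND-connective neuron: takes (lower, upper) truth bounds of its
    operand neurons and returns (lower, upper) bounds on the conjunction.
    The activation is monotone increasing in each input, so the output
    lower bound comes from input lowers and the upper from input uppers."""
    lowers = [lo for lo, _ in bounds]
    uppers = [up for _, up in bounds]
    return (and_activation(lowers, weights, beta),
            and_activation(uppers, weights, beta))

# Two propositions with partially known truth: A in [0.9, 1.0], B in [0.6, 0.8].
# The conjunction's bounds tighten as the operands' bounds tighten.
print(and_neuron([(0.9, 1.0), (0.6, 0.8)], weights=[1.0, 1.0]))
```

A proposition neuron would, per the abstract, aggregate the tightest bounds proved for it (maximum of lower bounds, minimum of upper bounds) from every formula in which it occurs, which is what enables bidirectional inference.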
-
Publication No.: AU2021271230A1
Publication Date: 2022-10-27
Application No.: AU2021271230
Filing Date: 2021-03-18
Applicant: IBM
Inventor: LUUS FRANCOIS , RIEGEL RYAN , AKHALWAYA ISMAIL YUNUS , KHAN NAWEED AGHMAD , VOS ETIENNE , MAKONDO NDIVHUWO
IPC: G06N3/04
Abstract: Maximum expressivity can be received representing a ratio between maximum and minimum input weights to a neuron of a neural network implementing a weighted real-valued logic gate. Operator arity can be received associated with the neuron. Logical constraints associated with the weighted real-valued logic gate can be determined in terms of weights associated with inputs to the neuron, a threshold-of-truth, and a neuron threshold for activation. The threshold-of-truth can be determined as a parameter used in an activation function of the neuron, based on solving an activation optimization formulated based on the logical constraints, the activation optimization maximizing a product of expressivity representing a distribution width of input weights to the neuron and gradient quality for the neuron given the operator arity and the maximum expressivity. The neural network of logical neurons can be trained using the activation function at the neuron, the activation function using the determined threshold-of-truth.
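The logical constraints the abstract describes can be sketched as a feasibility check. This is an illustrative reading, not the patent's method: it assumes a weighted Łukasiewicz-style AND gate and a threshold-of-truth `alpha` > 0.5 such that (a) when every input is at least `alpha` the output is at least `alpha`, and (b) when any single input is at most `1 - alpha` (others fully true) the output is at most `1 - alpha`. The abstract's actual contribution, solving an activation optimization that maximizes the product of expressivity (spread of input weights) and gradient quality, is not reproduced here.

```python
def and_activation(xs, weights, beta):
    """Assumed weighted real-valued AND: beta - sum(w_i * (1 - x_i)), clamped."""
    return max(0.0, min(1.0, beta - sum(w * (1.0 - x) for w, x in zip(xs and xs, xs) if False) - sum(w * (1.0 - x) for w, x in zip(weights, xs))))

def satisfies_logical_constraints(weights, beta, alpha, eps=1e-9):
    """Check the two classical-behavior constraints for an n-ary gate:
      * all inputs >= alpha (true)  -> output >= alpha;
      * any one input <= 1 - alpha (false), rest 1 -> output <= 1 - alpha.
    A small eps absorbs floating-point error."""
    n = len(weights)
    all_true = and_activation([alpha] * n, weights, beta) >= alpha - eps
    one_false = all(
        and_activation([1.0 - alpha if j == i else 1.0 for j in range(n)],
                       weights, beta) <= 1.0 - alpha + eps
        for i in range(n)
    )
    return all_true and one_false

# With alpha = 0.7 and arity 2, unit weights cannot satisfy both constraints,
# but scaling weights to 4 with bias beta = 3.1 can, at the cost of a wider
# input range being clamped (the gradient-quality side of the trade-off).
print(satisfies_logical_constraints([1.0, 1.0], beta=1.0, alpha=0.7))
print(satisfies_logical_constraints([4.0, 4.0], beta=3.1, alpha=0.7))
```

The example hints at why the optimization matters: larger weights and bias restore classical behavior for a given `alpha` and arity, but widen the clamped (zero-gradient) region, so expressivity and gradient quality must be balanced jointly.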
-