FIRST-ORDER LOGICAL NEURAL NETWORKS WITH BIDIRECTIONAL INFERENCE

    Publication No.: ZA202100290B

    Publication Date: 2022-01-26

    Application No.: ZA202100290

    Application Date: 2021-01-15

    Applicant: IBM

    Abstract: A system for configuring and using a logical neural network including a graph syntax tree of formulae in a represented knowledgebase connected to each other via nodes representing each proposition. One neuron exists for each logical connective occurring in each formula and, additionally, one neuron for each unique proposition occurring in any formula. All neurons return pairs of values representing upper and lower bounds on truth values of their corresponding subformulae and propositions. Neurons corresponding to logical connectives accept as input the output of neurons corresponding to their operands and have activation functions configured to match the connectives' truth functions. Neurons corresponding to propositions accept as input the output of neurons established as proofs of bounds on the propositions' truth values and have activation functions configured to aggregate the tightest such bounds. Bidirectional inference permits every occurrence of each proposition in each formula to be used as a potential proof.
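The abstract's bounds mechanics can be sketched in a few lines: a connective neuron computes output bounds from its operands' bounds, and a proposition neuron keeps the tightest bounds among its proofs. This is an illustrative sketch only; the Łukasiewicz t-norm for AND and all function names are assumptions, not details taken from the patent.

```python
# Illustrative sketch (not the patented implementation): propagate
# [lower, upper] truth-value bounds through an AND neuron and
# aggregate a proposition's proofs to the tightest bounds.

def and_bounds(operands):
    """Łukasiewicz AND over (lower, upper) bound pairs.

    The t-norm max(0, sum(x) - (n - 1)) is monotone in each operand,
    so the output lower bound uses the operand lower bounds and the
    output upper bound uses the operand upper bounds.
    """
    n = len(operands)
    lo = max(0.0, sum(l for l, _ in operands) - (n - 1))
    hi = max(0.0, sum(u for _, u in operands) - (n - 1))
    return (lo, hi)

def aggregate_proofs(proofs):
    """A proposition neuron aggregates the tightest bounds among its
    proofs: the greatest lower bound and the least upper bound."""
    lo = max(l for l, _ in proofs)
    hi = min(u for _, u in proofs)
    return (lo, hi)
```

Because every connective is monotone in its operands, running the same bound computations in both directions (from operands to formula and back) is what makes each occurrence of a proposition usable as a proof.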

    First-order logical neural networks with bidirectional inference

    Publication No.: AU2021269906A1

    Publication Date: 2022-10-27

    Application No.: AU2021269906

    Application Date: 2021-04-13

    Applicant: IBM

    Abstract: A system for configuring and using a logical neural network including a graph syntax tree of formulae in a represented knowledgebase connected to each other via nodes representing each proposition. One neuron exists for each logical connective occurring in each formula and, additionally, one neuron for each unique proposition occurring in any formula. All neurons return pairs of values representing upper and lower bounds on truth values of their corresponding subformulae and propositions. Neurons corresponding to logical connectives accept as input the output of neurons corresponding to their operands and have activation functions configured to match the connectives' truth functions. Neurons corresponding to propositions accept as input the output of neurons established as proofs of bounds on the propositions' truth values and have activation functions configured to aggregate the tightest such bounds. Bidirectional inference permits every occurrence of each proposition in each formula to be used as a potential proof.

    GENERATIVE ONTOLOGY LEARNING AND NATURAL LANGUAGE PROCESSING WITH PREDICTIVE LANGUAGE MODELS

    Publication No.: ZA202102725B

    Publication Date: 2022-08-31

    Application No.: ZA202102725

    Application Date: 2021-04-23

    Applicant: IBM

    Abstract: An ontology topic is selected and a pretrained predictive language model is primed to create a predictive primed model based on one or more ontological rules corresponding to the selected ontology topic. Using the predictive primed model, natural language text is generated based on the ontology topic and guidance of a prediction steering component. The predictive primed model is guided in selecting text that is predicted to be appropriate for the ontology topic and the generated natural language text. The generated natural language text is processed to generate extracted ontology rules and the extracted ontology rules are compared to one or more rules of an ontology rule database that correspond to the ontology topic. A check is performed to determine if a performance of the ontology extractor is acceptable.
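The final acceptance check described above can be sketched as a comparison of extracted rules against the topic's reference rules. The Jaccard-overlap score and the acceptance threshold below are illustrative assumptions; the patent does not specify the metric.

```python
# Illustrative sketch (assumed metric, not from the patent): decide
# whether the ontology extractor's output is acceptable by comparing
# extracted rules to the rule database for the topic.

def extractor_acceptable(extracted_rules, reference_rules, threshold=0.8):
    """Return True when the Jaccard overlap between extracted and
    reference rule sets meets the acceptance threshold."""
    extracted, reference = set(extracted_rules), set(reference_rules)
    if not extracted and not reference:
        return True  # nothing to extract, nothing missed
    overlap = len(extracted & reference) / len(extracted | reference)
    return overlap >= threshold
```

In the described loop, a failed check would feed back into re-priming the language model or adjusting the prediction steering component before generating further text.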

    AUTOMATIC FORM COMPLETION FROM A SET OF FEDERATED DATA PROVIDERS

    Publication No.: ZA202004652B

    Publication Date: 2021-08-25

    Application No.: ZA202004652

    Application Date: 2020-07-28

    Applicant: IBM

    Abstract: One or more application forms can be hosted, which application forms are received from a service provider, the service provider having been authenticated by at least one data custodian of a set of data custodians. One or more application forms include at least form fields to be populated with information. A user selection of an application form to be automatically populated can be received, the user having been authenticated by at least one data custodian of the set of data custodians. A data request is sent to at least one data custodian of the set of data custodians for automatically populating at least some of the form fields. Received data from said at least one data custodian can be collated and used to populate one or more form fields. The populated form fields can be returned to the service provider, for instance, provided a minimum requirement is met.
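The collate-and-populate step can be sketched as follows. The first-response-wins collation policy and the "minimum requirement" expressed as a minimum count of filled fields are illustrative assumptions; the patent leaves both unspecified.

```python
# Illustrative sketch (assumed policies, not from the patent): collate
# responses from federated data custodians and populate form fields,
# returning a result only when the minimum requirement is met.

def populate_form(form_fields, custodian_responses, min_filled=1):
    """Collate custodian responses and fill matching form fields.

    Returns the populated fields only if at least `min_filled` fields
    were filled (a stand-in for the abstract's 'minimum requirement');
    otherwise returns None. Later responses do not overwrite earlier
    ones here; a real system would need a trust/precedence policy.
    """
    collated = {}
    for response in custodian_responses:
        for key, value in response.items():
            collated.setdefault(key, value)
    populated = {f: collated[f] for f in form_fields if f in collated}
    return populated if len(populated) >= min_filled else None
```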

    MACHINE LEARNING MODELS OF LIVESTOCK VALUE CHAIN

    Publication No.: ZA202103572B

    Publication Date: 2022-08-31

    Application No.: ZA202103572

    Application Date: 2021-05-26

    Applicant: IBM

    Abstract: Methods and systems for operating a machine learning system are described. In an example, a device can receive an image and assign a set of pixels in the image as a digital representation of a livestock. The device can further train a machine learning model using the digital representation. The device can further run the machine learning model to generate prediction data relating to the livestock. The device can further generate output data relating to at least one activity among a livestock value chain. The at least one activity can correspond to a process to generate a commodity based on the livestock.
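The first steps of the pipeline can be sketched with a toy stand-in: a binary mask over image pixels as the "digital representation" of the livestock, and a simple feature derived from it for a downstream model. Both the mask encoding and the weight-from-area heuristic are purely illustrative assumptions.

```python
# Illustrative sketch (toy stand-in, not the patented model): a 0/1
# mask marks the pixels assigned to the livestock; a derived feature
# feeds a trivially simple prediction step.

def segment_area(mask):
    """Count the pixels assigned to the livestock region."""
    return sum(sum(row) for row in mask)

def predict_weight_kg(mask, kg_per_pixel=0.5):
    """Toy stand-in for the trained model's prediction step: scale
    the segmented area by an assumed per-pixel factor."""
    return segment_area(mask) * kg_per_pixel
```

In the described system, predictions like this would feed output data for an activity in the livestock value chain, such as planning the commodity-producing process.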

    WORD SENSE DISAMBIGUATION USING A DEEP LOGICO-NEURAL NETWORK

    Publication No.: ZA202100291B

    Publication Date: 2022-02-23

    Application No.: ZA202100291

    Application Date: 2021-01-15

    Applicant: IBM

    Abstract: Word sense disambiguation using a glossary layer embedded in a deep neural network includes receiving, by one or more processors, input sentences including a plurality of words. At least two words in the plurality of words are homonyms. The one or more processors convert the plurality of words associated with each input sentence into a first vector including possible senses for the at least two words. The first vector is then combined with a second vector including a domain-specific contextual vector associated with the at least two words. The combination of the first vector with the second vector is fed into a recurrent deep logico-neural network model to generate a third vector that includes word senses for the at least two words. A threshold is set for the third vector to generate a fourth vector including a final word sense vector for the at least two words.
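The vector plumbing in the abstract can be sketched as two small steps: combining the sense vector with the domain-context vector, and thresholding the network's output scores into the final word-sense vector. Concatenation as the combination and the 0.5 cut-off are illustrative assumptions; the patent does not fix either choice.

```python
# Illustrative sketch (assumed operations, not from the patent):
# combine the first (sense) vector with the second (domain-context)
# vector, then threshold model scores into the final sense vector.

def combine(sense_vec, context_vec):
    """Combine the sense vector with the domain-specific contextual
    vector (here by concatenation)."""
    return sense_vec + context_vec

def threshold(scores, tau=0.5):
    """Produce the final word-sense vector: keep each candidate sense
    whose score clears the threshold."""
    return [1 if s >= tau else 0 for s in scores]
```

The combined vector is what the abstract feeds into the recurrent deep logico-neural network; `threshold` corresponds to the step that turns the third vector into the fourth.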

    Optimizing capacity and learning of weighted real-valued logic

    Publication No.: AU2021271230A1

    Publication Date: 2022-10-27

    Application No.: AU2021271230

    Application Date: 2021-03-18

    Applicant: IBM

    Abstract: Maximum expressivity can be received representing a ratio between maximum and minimum input weights to a neuron of a neural network implementing a weighted real-valued logic gate. Operator arity can be received associated with the neuron. Logical constraints associated with the weighted real-valued logic gate can be determined in terms of weights associated with inputs to the neuron, a threshold-of-truth, and a neuron threshold for activation. The threshold-of-truth can be determined as a parameter used in an activation function of the neuron, based on solving an activation optimization formulated based on the logical constraints, the activation optimization maximizing a product of expressivity representing a distribution width of input weights to the neuron and gradient quality for the neuron given the operator arity and the maximum expressivity. The neural network of logical neurons can be trained using the activation function at the neuron, the activation function using the determined threshold-of-truth.
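The roles of the weights, neuron threshold, and threshold-of-truth can be sketched with a weighted real-valued AND gate. In the patent the threshold-of-truth is derived by solving the described activation optimization; the fixed alpha, the clamped-linear activation, and the pre-activation form below are illustrative assumptions only.

```python
# Illustrative sketch (assumed parameterization, not the solved
# optimum): a weighted real-valued AND with a clamped activation and
# a threshold-of-truth alpha for classifying outputs as true.

def weighted_and(inputs, weights, beta=1.0):
    """Pre-activation of a weighted real-valued AND gate:
    beta - sum(w * (1 - x)), so each false input pulls the result
    down in proportion to its weight."""
    return beta - sum(w * (1.0 - x) for w, x in zip(weights, inputs))

def activation(z):
    """Clamp the pre-activation into the truth-value range [0, 1]."""
    return min(1.0, max(0.0, z))

def is_true(z, alpha=0.75):
    """Threshold-of-truth: an activation >= alpha counts as true."""
    return activation(z) >= alpha
```

The trade-off the abstract optimizes lives in choices like alpha: spreading input weights widely (high expressivity) flattens the usable gradient of the clamped activation, so the solved threshold-of-truth balances the two given the operator arity and maximum expressivity.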
