-
Publication No.: US11416712B1
Publication Date: 2022-08-16
Application No.: US17560474
Filing Date: 2021-12-23
Applicant: SAS Institute Inc.
Inventor: Amirhassan Fallah Dizche , Ye Liu , Xin Jiang Hunt , Jorge Manuel Gomes da Silva
Abstract: A computing device generates synthetic tabular data. Until a convergence parameter value indicates that training of an attention generator model is complete, conditional vectors are defined; latent vectors are generated using a predefined noise distribution function; a forward propagation of an attention generator model that includes an attention model integrated with a conditional generator model is executed to generate output vectors; transformed observation vectors are selected; a forward propagation of a discriminator model is executed with the transformed observation vectors, the conditional vectors, and the output vectors to predict whether each transformed observation vector and each output vector is real or fake; a discriminator model loss value is computed based on the predictions; the discriminator model is updated using the discriminator model loss value; an attention generator model loss value is computed based on the predictions; and the attention generator model is updated using the attention generator model loss value.
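The abstract above describes a standard adversarial training loop in which the generator contains an attention block and both networks are conditioned on the defined conditional vectors. Below is a minimal, hypothetical sketch of one such training step; the layer sizes, the multi-head self-attention stand-in for the attention model, the standard-normal noise distribution, and the binary cross-entropy losses are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch (not the patented implementation): one training step of a
# conditional GAN whose generator includes an attention layer, following the
# loop in the abstract. Sizes, noise distribution, and loss form are assumed.
import torch
import torch.nn as nn

latent_dim, cond_dim, data_dim = 64, 10, 32

class AttentionGenerator(nn.Module):
    """Conditional generator with a self-attention block (illustrative stand-in)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(latent_dim + cond_dim, 128)
        self.attn = nn.MultiheadAttention(embed_dim=128, num_heads=4, batch_first=True)
        self.out = nn.Linear(128, data_dim)

    def forward(self, z, c):
        h = self.embed(torch.cat([z, c], dim=1)).unsqueeze(1)  # (B, 1, 128)
        h, _ = self.attn(h, h, h)                              # self-attention
        return self.out(h.squeeze(1))

generator = AttentionGenerator()
discriminator = nn.Sequential(nn.Linear(data_dim + cond_dim, 128), nn.ReLU(), nn.Linear(128, 1))
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_x, cond):
    # Latent vectors from a predefined noise distribution (standard normal here).
    z = torch.randn(real_x.size(0), latent_dim)
    fake_x = generator(z, cond)

    # Discriminator predicts real vs. fake for transformed and generated vectors.
    d_real = discriminator(torch.cat([real_x, cond], dim=1))
    d_fake = discriminator(torch.cat([fake_x.detach(), cond], dim=1))
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Attention generator updated with its own loss from the discriminator's predictions.
    g_score = discriminator(torch.cat([generator(z, cond), cond], dim=1))
    g_loss = bce(g_score, torch.ones_like(g_score))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```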
-
Publication No.: US11436438B1
Publication Date: 2022-09-06
Application No.: US17559735
Filing Date: 2021-12-22
Applicant: SAS Institute Inc.
Inventor: Ruiwen Zhang , Weichen Wang , Jorge Manuel Gomes da Silva , Ye Liu , Hamoon Azizsoltani , Prathaban Mookiah
Abstract: (A) Conditional vectors are defined. (B) Latent observation vectors are generated using a predefined noise distribution function. (C) A forward propagation of a generator model is executed with the conditional vectors and the latent observation vectors as input to generate an output vector. (D) A forward propagation of a decoder model of a trained autoencoder model is executed with the generated output vector as input to generate a plurality of decoded vectors. (E) Transformed observation vectors are selected from transformed data based on the defined plurality of conditional vectors. (F) A forward propagation of a discriminator model is executed with the transformed observation vectors, the conditional vectors, and the decoded vectors as input to predict whether each transformed observation vector and each decoded vector is real or fake. (G) The discriminator and generator models are updated and (A) through (G) are repeated until training is complete.
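Steps (A) through (G) describe a conditional GAN in which the generator's output is passed through the decoder of a pre-trained autoencoder before the discriminator scores it. The following is a minimal, hypothetical sketch of one iteration of that loop; the network architectures, the frozen randomly initialized decoder stand-in, the standard-normal noise distribution, and the loss form are illustrative assumptions, not the patented method.

```python
# Hypothetical sketch (not the patented implementation) of steps (A)-(G):
# the generator emits a code that the autoencoder's decoder maps to data space
# before the discriminator scores it. Sizes and loss form are assumed.
import torch
import torch.nn as nn

latent_dim, cond_dim, code_dim, data_dim = 64, 10, 16, 32

generator = nn.Sequential(nn.Linear(latent_dim + cond_dim, 128), nn.ReLU(), nn.Linear(128, code_dim))
# Stand-in for the decoder of a trained autoencoder (kept frozen during GAN training).
decoder = nn.Sequential(nn.Linear(code_dim, 128), nn.ReLU(), nn.Linear(128, data_dim))
for p in decoder.parameters():
    p.requires_grad_(False)
discriminator = nn.Sequential(nn.Linear(data_dim + cond_dim, 128), nn.ReLU(), nn.Linear(128, 1))
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_x, cond):
    z = torch.randn(real_x.size(0), latent_dim)          # (B) latent observation vectors
    code = generator(torch.cat([z, cond], dim=1))        # (C) generator forward pass
    fake_x = decoder(code)                               # (D) decode to data space

    # (F) discriminator predicts real vs. fake for transformed and decoded vectors.
    d_real = discriminator(torch.cat([real_x, cond], dim=1))
    d_fake = discriminator(torch.cat([fake_x.detach(), cond], dim=1))
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()   # (G) update discriminator

    fake_x2 = decoder(generator(torch.cat([z, cond], dim=1)))
    g_score = discriminator(torch.cat([fake_x2, cond], dim=1))
    g_loss = bce(g_score, torch.ones_like(g_score))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()   # (G) update generator
    return d_loss.item(), g_loss.item()
```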
-
Publication No.: US11531907B2
Publication Date: 2022-12-20
Application No.: US17854264
Filing Date: 2022-06-30
Applicant: SAS Institute Inc.
Inventor: Afshin Oroojlooyjadid , Mohammadreza Nazari , Davood Hajinezhad , Amirhassan Fallah Dizche , Jorge Manuel Gomes da Silva , Jonathan Lee Walker , Hardi Desai , Robert Blanchard , Varunraj Valsaraj , Ruiwen Zhang , Weichen Wang , Ye Liu , Hamoon Azizsoltani , Prathaban Mookiah
IPC: G06N5/02
Abstract: A computing device trains a machine state predictive model. A generative adversarial network with an autoencoder is trained using a first plurality of observation vectors. Each observation vector of the first plurality of observation vectors includes state variable values for state variables and an action variable value for an action variable. The state variables define a machine state, wherein the action variable defines a next action taken in response to the machine state. The first plurality of observation vectors successively defines sequential machine states to manufacture a product. A second plurality of observation vectors is generated using the trained generative adversarial network with the autoencoder. A machine state machine learning model is trained to predict a subsequent machine state using the first plurality of observation vectors and the generated second plurality of observation vectors. A description of the machine state machine learning model is output.
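The workflow amounts to data augmentation: synthetic observation vectors sampled from the trained generative model are pooled with the real observations to train a predictive model of the next machine state. A minimal, hypothetical sketch follows; the `sample_synthetic` stub, the state and action dimensions, the placeholder data, and the MLP predictor are illustrative assumptions standing in for the trained GAN-with-autoencoder and the machine state machine learning model.

```python
# Hypothetical sketch (not the patented implementation) of the augmentation workflow:
# pool real and synthetic (state, action) -> next-state observations, then train a
# predictive model on the combined data.
import torch
import torch.nn as nn

state_dim, action_dim = 8, 1

def sample_synthetic(n):
    """Stand-in for sampling from the trained GAN-with-autoencoder."""
    return torch.randn(n, state_dim + action_dim), torch.randn(n, state_dim)

# Real sequences (placeholder random data): (state_t, action_t) -> state_{t+1}.
real_inputs = torch.randn(500, state_dim + action_dim)
real_targets = torch.randn(500, state_dim)

syn_inputs, syn_targets = sample_synthetic(500)
inputs = torch.cat([real_inputs, syn_inputs])    # pooled real + synthetic observations
targets = torch.cat([real_targets, syn_targets])

# Simple MLP predictor of the subsequent machine state (illustrative choice).
predictor = nn.Sequential(nn.Linear(state_dim + action_dim, 64), nn.ReLU(), nn.Linear(64, state_dim))
opt = torch.optim.Adam(predictor.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):
    opt.zero_grad()
    loss = loss_fn(predictor(inputs), targets)   # predict the next machine state
    loss.backward()
    opt.step()
```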
-
Publication No.: US20220374732A1
Publication Date: 2022-11-24
Application No.: US17854264
Filing Date: 2022-06-30
Applicant: SAS Institute Inc.
Inventor: Afshin Oroojlooyjadid , Mohammadreza Nazari , Davood Hajinezhad , Amirhassan Fallah Dizche , Jorge Manuel Gomes da Silva , Jonathan Lee Walker , Hardi Desai , Robert Blanchard , Varunraj Valsaraj , Ruiwen Zhang , Weichen Wang , Ye Liu , Hamoon Azizsoltani , Prathaban Mookiah
IPC: G06N5/02
Abstract: A computing device trains a machine state predictive model. A generative adversarial network with an autoencoder is trained using a first plurality of observation vectors. Each observation vector of the first plurality of observation vectors includes state variable values for state variables and an action variable value for an action variable. The state variables define a machine state, wherein the action variable defines a next action taken in response to the machine state. The first plurality of observation vectors successively defines sequential machine states to manufacture a product. A second plurality of observation vectors is generated using the trained generative adversarial network with the autoencoder. A machine state machine learning model is trained to predict a subsequent machine state using the first plurality of observation vectors and the generated second plurality of observation vectors. A description of the machine state machine learning model is output.