-
Publication Number: US11635988B1
Publication Date: 2023-04-25
Application Number: US17820952
Application Date: 2022-08-19
Applicant: SAS Institute Inc.
Inventor: Yan Gao , Joshua David Griffin , Yu-Min Lin , Yan Xu , Seyedalireza Yektamaram , Amod Anil Ankulkar , Aishwarya Sharma , Girish Vinayak Kolapkar , Kiran Devidas Bhole , Kushawah Yogender Singh , Jorge Manuel Gomes da Silva
Abstract: A computing device determines an optimal number of threads for a computer task. Execution of a computing task is controlled in a computing environment based on each task configuration included in a plurality of task configurations to determine an execution runtime value for each task configuration. An optimal number of threads value is determined for each set of task configurations having common values for a task parameter value, a dataset indicator, and a hardware indicator. The optimal number of threads value is an extremum value of an execution parameter value as a function of a number of threads value. A dataset parameter value is determined for a dataset. A hardware parameter value is determined as a characteristic of each distinct executing computing device in the computing environment. The optimal number of threads value for each set of task configurations is stored in a performance dataset in association with the common values.
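Below is a minimal illustrative sketch (not the patented implementation; all function names and the configuration key are invented) of the core idea: time a task at several thread counts and store the extremum (fastest) count in a performance dataset keyed by task, dataset, and hardware indicators.

```python
# Minimal sketch: measure runtime per thread count and record the best count.
# Note: for CPU-bound pure-Python work the GIL may favor fewer threads, which
# the measurement will simply reveal.
import time
from concurrent.futures import ThreadPoolExecutor

def run_task(n_threads, work_items):
    """Hypothetical compute task executed with a given number of threads."""
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        list(pool.map(lambda x: sum(i * i for i in range(x)), work_items))

def find_optimal_threads(work_items, candidate_threads=(1, 2, 4, 8)):
    runtimes = {}
    for n in candidate_threads:
        start = time.perf_counter()
        run_task(n, work_items)
        runtimes[n] = time.perf_counter() - start
    return min(runtimes, key=runtimes.get), runtimes   # extremum of the runtime curve

performance_dataset = {}  # keyed by (task parameter, dataset indicator, hardware indicator)
best, timings = find_optimal_threads([20_000] * 64)
performance_dataset[("example-task", "medium-data", "cpu-8core")] = best
print(best, timings)
```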
-
Publication Number: US11062219B1
Publication Date: 2021-07-13
Application Number: US17106488
Application Date: 2020-11-30
Applicant: SAS Institute Inc.
Inventor: Joshua David Griffin , Riadh Omheni , Yan Xu
Abstract: A computer solves a nonlinear optimization problem. An optimality check is performed for a current solution to an objective function that is a nonlinear equation with constraint functions on decision variables. When the performed optimality check indicates that the current solution is not an optimal solution, a barrier parameter value is updated, and a Lagrange multiplier value is updated for each constraint function based on a result of a complementarity slackness test. The current solution to the objective function is updated using a search direction vector determined by solving a primal-dual linear system that includes a dual variable for each constraint function and a step length value determined for each decision variable and for each dual variable. The operations are repeated until the optimality check indicates that the current solution is the optimal solution or a predefined number of iterations has been performed.
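A heavily simplified sketch of the barrier idea only (a plain log-barrier loop with gradient steps, not the primal-dual linear system or step-length rules described above; the problem and constants are invented): minimize (x - 2)^2 subject to x >= 1 while shrinking the barrier parameter.

```python
# Minimal log-barrier sketch: inner unconstrained minimization, then update
# the barrier parameter, repeating until the barrier parameter is tiny.
import math

def barrier_solve(mu=1.0, x=5.0, tol=1e-8):
    while mu > tol:
        for _ in range(200):                               # inner minimization loop
            grad = 2.0 * (x - 2.0) - mu / (x - 1.0)        # d/dx [(x-2)^2 - mu*log(x-1)]
            if abs(grad) < tol:                            # crude optimality check
                break
            x -= 0.01 * grad
        mu *= 0.2                                          # update the barrier parameter
    return x

print(barrier_solve())   # approaches the constrained minimizer x = 2
```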
-
Publication Number: US10949747B1
Publication Date: 2021-03-16
Application Number: US16950145
Application Date: 2020-11-17
Applicant: SAS Institute Inc.
Inventor: Majid Jahani , Joshua David Griffin , Seyedalireza Yektamaram , Wenwen Zhou
Abstract: A computer trains a neural network model. (A) Observation vectors are randomly selected from a plurality of observation vectors. (B) A forward and backward propagation of a neural network is executed to compute a gradient vector and a weight vector. (C) A search direction vector is computed. (D) A step size value is computed. (E) An updated weight vector is computed. (F) Based on a predefined progress check frequency value, second observation vectors are randomly selected, a progress check objective function value is computed given the weight vector, the step size value, the search direction vector, and the second observation vectors, and based on an accuracy test, the mini-batch size value is updated. (G) (A) to (F) are repeated until a convergence parameter value indicates training of the neural network is complete. The weight vector for a next iteration is the computed updated weight vector.
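A minimal sketch of the mini-batch progress-check idea (a linear model with plain gradient steps stands in for the neural network; the growth rule and constants are assumptions, not the patented procedure): train on random mini-batches and enlarge the batch when a freshly sampled check batch shows little progress.

```python
# Minimal sketch: mini-batch gradient training with a periodic progress check
# that grows the mini-batch size when the check objective stops improving.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=1000)

w = np.zeros(5)
batch_size, check_every, lr = 32, 25, 0.05
best_obj = np.inf

for it in range(1, 501):
    idx = rng.choice(len(X), size=batch_size, replace=False)     # (A) sample batch
    grad = 2 * X[idx].T @ (X[idx] @ w - y[idx]) / batch_size     # (B) gradient
    w -= lr * grad                                               # (C)-(E) update step
    if it % check_every == 0:                                    # (F) progress check
        chk = rng.choice(len(X), size=batch_size, replace=False)
        obj = np.mean((X[chk] @ w - y[chk]) ** 2)
        if obj > 0.99 * best_obj:       # little progress: reduce gradient noise
            batch_size = min(2 * batch_size, len(X))
        best_obj = min(best_obj, obj)

print(batch_size, np.round(w, 2))
```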
-
Publication Number: US20210264287A1
Publication Date: 2021-08-26
Application Number: US17081118
Application Date: 2020-10-27
Applicant: SAS Institute Inc.
Inventor: Steven Joseph Gardner , Joshua David Griffin , Yan Xu , Patrick Nathan Koch , Brett Alan Wujek , Oleg Borisovich Golovidov
Abstract: Tuned hyperparameter values are determined for training a machine learning model. When a selected hyperparameter configuration does not satisfy a linear constraint, it is determined whether a projection of the selected hyperparameter configuration is included in a first cache that stores previously computed projections. When the projection is included in the first cache, the projection is extracted from the first cache using the selected hyperparameter configuration, and the selected hyperparameter configuration is replaced with the extracted projection in the plurality of hyperparameter configurations. When the projection is not included in the first cache, a projection computation for the selected hyperparameter configuration is assigned to a session. A computed projection is received from the session for the selected hyperparameter configuration. The computed projection and the selected hyperparameter configuration are stored to the first cache, and the selected hyperparameter configuration is replaced with the computed projection.
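A minimal sketch of the projection-cache idea (a single halfspace constraint and in-process projection; the constraint, names, and dictionary cache are assumptions, not the patented session-based system): project infeasible configurations onto the constraint boundary and reuse previously computed projections.

```python
# Minimal sketch: cache projections of hyperparameter configurations that
# violate the linear constraint a.x <= b onto the constraint boundary.
import numpy as np

a, b = np.array([1.0, 1.0]), 1.0          # linear constraint: x[0] + x[1] <= 1
projection_cache = {}

def project(config):
    key = tuple(config)
    if a @ config <= b:                    # feasible: no projection needed
        return np.asarray(config, dtype=float)
    if key in projection_cache:            # cache hit: reuse stored projection
        return projection_cache[key]
    proj = config - ((a @ config - b) / (a @ a)) * a   # closest feasible point
    projection_cache[key] = proj
    return proj

configs = [np.array([0.3, 0.4]), np.array([0.9, 0.8]), np.array([0.9, 0.8])]
configs = [project(c) for c in configs]    # second violating config hits the cache
print(configs, len(projection_cache))
```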
-
Publication Number: US11055639B1
Publication Date: 2021-07-06
Application Number: US17064280
Application Date: 2020-10-06
Applicant: SAS Institute Inc.
Inventor: Pelin Cay , Nabaruna Karmakar , Natalia Summerville , Varunraj Valsaraj , Antony Nicholas Cooper , Steven Joseph Gardner , Joshua David Griffin
IPC: G06N20/00 , G06N3/08 , G06Q10/04 , G06F9/54 , G06N3/02 , G06N20/20 , G06N20/10 , G06N3/04 , G06F9/50
Abstract: Manufacturing processes can be optimized using machine learning models. For example, a system can execute an optimization model to identify a recommended set of values for configurable settings of a manufacturing process associated with an object. The optimization model can determine the recommended set of values by implementing an iterative process using an objective function. Each iteration of the iterative process can include selecting a current set of candidate values for the configurable settings from within a current region of a search space defined by the optimization model; providing the current set of candidate values as input to a trained machine learning model that can predict a value for a target characteristic of the object or the manufacturing process based on the current set of candidate values; and identifying a next region of the search space to use in a next iteration of the iterative process based on the value.
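A minimal sketch of the iterative region search (a closed-form stand-in plays the role of the trained machine learning model; the shrink factor and sample counts are assumptions): sample candidate settings inside the current region, score them with the surrogate, and recenter and contract the region around the best candidate.

```python
# Minimal sketch: surrogate-driven search over a shrinking region of the
# configurable-settings space.
import numpy as np

rng = np.random.default_rng(1)

def surrogate(settings):                 # hypothetical trained model's prediction
    return np.sum((settings - np.array([0.6, 0.3])) ** 2)

center, width = np.array([0.0, 0.0]), 1.0
for _ in range(20):
    candidates = center + width * rng.uniform(-1, 1, size=(16, 2))  # current region
    scores = [surrogate(c) for c in candidates]
    center = candidates[int(np.argmin(scores))]   # next region centers on best point
    width *= 0.7                                  # contract the search region

print(np.round(center, 3))   # approaches the settings the surrogate favors
```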
-
Publication Number: US10360517B2
Publication Date: 2019-07-23
Application Number: US15822462
Application Date: 2017-11-27
Applicant: SAS Institute Inc.
Inventor: Patrick Nathan Koch , Brett Alan Wujek , Oleg Borisovich Golovidov , Steven Joseph Gardner , Joshua David Griffin , Scott Russell Pope , Yan Xu
Abstract: A computing device automatically selects hyperparameter values based on objective criteria to train a predictive model. Each session of a plurality of sessions executes training and scoring of a model type using an input dataset in parallel with other sessions of the plurality of sessions. Unique hyperparameter configurations are determined using a search method and assigned to each session. For each session of the plurality of sessions, training of a model of the model type is requested using a training dataset and the assigned hyperparameter configuration, scoring of the trained model using a validation dataset and the assigned hyperparameter configuration is requested to compute an objective function value, and the received objective function value and the assigned hyperparameter configuration are stored. A best hyperparameter configuration is identified based on an extreme value of the stored objective function values.
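A minimal sketch of the parallel-session idea (a toy objective stands in for train-and-score; worker threads stand in for sessions; all names are invented): evaluate unique hyperparameter configurations concurrently and keep the configuration with the extreme objective function value.

```python
# Minimal sketch: evaluate hyperparameter configurations in parallel "sessions"
# and identify the best configuration by the extreme objective value.
from concurrent.futures import ThreadPoolExecutor
import random

def train_and_score(config):
    """Hypothetical train-then-validate step returning an objective value."""
    lr, depth = config["learning_rate"], config["max_depth"]
    return (lr - 0.1) ** 2 + 0.01 * (depth - 6) ** 2   # stand-in validation error

search = [{"learning_rate": random.uniform(0.01, 0.5),
           "max_depth": random.randint(2, 12)} for _ in range(32)]

with ThreadPoolExecutor(max_workers=4) as sessions:    # one "session" per worker
    results = list(zip(search, sessions.map(train_and_score, search)))

best_config, best_value = min(results, key=lambda r: r[1])
print(best_config, round(best_value, 4))
```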
-
Publication Number: US12299503B1
Publication Date: 2025-05-13
Application Number: US19000697
Application Date: 2024-12-24
Applicant: SAS Institute Inc.
Inventor: Xindian Long , Liping Cai , Xingqi Du , Steven Eric Krueger , Joshua David Griffin , Yan Xu , Scott Russell Pope , Lawrence Edmund Lewis
Abstract: A system, method, and computer-program product includes receiving, by a worker process, a plurality of chunks of data from a client process; deriving, by the worker process, an input pattern for feeding the plurality of chunks of data to a machine learning model; caching, by the worker process, a subset of data elements of the plurality of chunks of data specified by the input pattern based on a data caching policy; and training the machine learning model by feeding the subset of data elements cached by the worker process and a remainder of data elements in the plurality of chunks of data when requested by the input pattern.
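A minimal single-process sketch of the caching idea (the chunks, the input pattern, and the "cache repeatedly requested elements" policy are all invented stand-ins, not the patented worker/client protocol): keep only the elements the input pattern reuses and serve the remainder from the raw chunks on demand.

```python
# Minimal sketch: cache the subset of data elements an input pattern reuses;
# feed everything else directly from the received chunks.
import numpy as np

chunks = [np.arange(i * 10, i * 10 + 10, dtype=float) for i in range(4)]  # from client
data = np.concatenate(chunks)

input_pattern = [0, 3, 7, 3, 0, 21, 7]   # indices the model will request, in order
reused = {i for i in input_pattern if input_pattern.count(i) > 1}   # caching policy:
cache = {i: data[i] for i in reused}     # keep only elements requested repeatedly

def feed(index):
    return cache[index] if index in cache else data[index]   # cache hit vs. remainder

batch = np.array([feed(i) for i in input_pattern])            # feed the "model"
print(sorted(cache), batch)
```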
-
Publication Number: US20220198340A1
Publication Date: 2022-06-23
Application Number: US17523607
Application Date: 2021-11-10
Applicant: SAS Institute Inc.
Inventor: Yan Gao , Joshua David Griffin , Yu-Min Lin , Bengt Wisen Pederson , Ricky Dee Tharrington, Jr. , Pei-Yi Tan , Raymond Eugene Wright
Abstract: A computing device selects new test configurations for testing software. Software under test is executed with first test configurations to generate a test result for each test configuration. Each test configuration includes a value for each test parameter where each test parameter is an input to the software under test. A predictive model is trained using each test configuration of the first test configurations in association with the test result generated for each test configuration based on an objective function value. The predictive model is executed with second test configurations to predict the test result for each test configuration of the second test configurations. Test configurations are selected from the second test configurations based on the predicted test results to define third test configurations. The software under test is executed with the defined third test configurations to generate the test result for each test configuration of the third test configurations.
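A minimal sketch of the surrogate-guided test selection (a least-squares linear fit stands in for the predictive model, and a toy function stands in for the software under test; the "highest predicted value" selection rule is an assumption): fit on first-round results, predict for a candidate set, and run the selected configurations.

```python
# Minimal sketch: train a surrogate on first test results, predict results for
# second candidate configurations, select a third set to actually execute.
import numpy as np

rng = np.random.default_rng(2)

def run_software(cfg):                    # hypothetical software-under-test metric
    return cfg @ np.array([2.0, -1.0]) + rng.normal(scale=0.1)

first = rng.uniform(0, 1, size=(20, 2))                    # first test configurations
results = np.array([run_software(c) for c in first])
A = np.c_[first, np.ones(len(first))]
coef, *_ = np.linalg.lstsq(A, results, rcond=None)         # train predictive model

second = rng.uniform(0, 1, size=(100, 2))                  # candidate configurations
predicted = np.c_[second, np.ones(len(second))] @ coef
third = second[np.argsort(predicted)[-5:]]                 # select configs to execute
print(np.array([run_software(c) for c in third]))
```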
-
Publication Number: US10963802B1
Publication Date: 2021-03-30
Application Number: US17120340
Application Date: 2020-12-14
Applicant: SAS Institute Inc.
Inventor: Steven Joseph Gardner , Joshua David Griffin , Yan Xu , Yan Gao
Abstract: A computing device selects decision variable values. A lower boundary value and an upper boundary value are defined for a decision variable. (A) A plurality of decision variable configurations is determined using a search method. The value for the decision variable is between the lower boundary value and the upper boundary value. (B) A decision variable configuration is selected. (C) A model of the model type is trained using the decision variable configuration. (D) The model is scored to compute an objective function value. (E) The computed objective function value and the selected decision variable configuration are stored. (F) (B) through (E) are repeated for the plurality of decision variable configurations. (G) The lower boundary value and the upper boundary value are updated using the stored objective function values and decision variable configurations. (A) through (F) are repeated with the lower boundary value and the upper boundary value updated in (G).
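A minimal sketch of the boundary-refinement loop (a toy one-dimensional objective stands in for train-then-score; the sample counts and "keep the five best" rule are assumptions): sample values between the current bounds, score them, and tighten the bounds around the best values before repeating.

```python
# Minimal sketch: sample within [lower, upper], score, then shrink the bounds
# around the best-scoring decision variable values and repeat.
import numpy as np

rng = np.random.default_rng(3)
objective = lambda x: (x - 0.42) ** 2        # stand-in for train-then-score
lower, upper = 0.0, 1.0

for _ in range(8):
    candidates = rng.uniform(lower, upper, size=20)   # (A)-(F) sample, score, store
    values = objective(candidates)
    best = candidates[np.argsort(values)[:5]]
    lower, upper = best.min(), best.max()             # (G) update the boundary values

print(round(lower, 3), round(upper, 3))               # bounds tighten near 0.42
```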
-
Publication Number: US10769528B1
Publication Date: 2020-09-08
Application Number: US16590544
Application Date: 2019-10-02
Applicant: SAS Institute Inc.
Inventor: Ben-hao Wang , Joshua David Griffin , Seyedalireza Yektamaram , Yan Xu
Abstract: A computer trains a neural network model. (B) A neural network is executed to compute a post-iteration gradient vector and a current iteration weight vector. (C) A search direction vector is computed using a Hessian approximation matrix and the post-iteration gradient vector. (D) A step size value is initialized. (E) An objective function value is computed that indicates an error measure of the executed neural network. (F) When the computed objective function value is greater than an upper bound value, the step size value is updated using a predefined backtracking factor value. The upper bound value is computed as a sliding average of a predefined upper bound updating interval value number of previous upper bound values. (G) (E) and (F) are repeated until the computed objective function value is not greater than the upper bound value. (H) An updated weight vector is computed to describe a trained neural network model.
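A minimal sketch of the nonmonotone backtracking idea (a plain gradient direction replaces the Hessian-approximation search direction, a toy quadratic replaces the network's error measure, and the sufficient-decrease term is an added assumption): backtrack the step size until the objective falls below a sliding average of recent values.

```python
# Minimal sketch: backtracking line search against a sliding-average upper bound
# computed over a window of previous objective values.
import numpy as np
from collections import deque

target = np.array([1.0, -2.0])

def f(w):                                   # stand-in for the network's error measure
    return np.sum((w - target) ** 2)

def grad(w):
    return 2.0 * (w - target)

w = np.array([5.0, 5.0])
history = deque([f(w)], maxlen=5)           # window of previous upper bound values
backtrack = 0.5                             # predefined backtracking factor

for _ in range(50):
    g = grad(w)
    d = -g                                  # (C) search direction (gradient stand-in)
    step = 1.0                              # (D) initialize step size
    upper_bound = np.mean(history)          # sliding average over the window
    while f(w + step * d) > upper_bound + 1e-4 * step * (g @ d):   # (E)-(G) backtrack
        step *= backtrack
    w = w + step * d                        # (H) updated weight vector
    history.append(f(w))

print(np.round(w, 4))                       # approaches the minimizer [1, -2]
```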
-