Executing a machine learning model in an artificial intelligence infrastructure
Abstract:
Executing a machine learning model in an artificial intelligence infrastructure that includes one or more storage systems and one or more graphical processing unit (‘GPU’) servers, including: receiving, by a graphical processing unit (‘GPU’) server, a dataset transformed by a storage system that is external to the GPU server; and executing, by the GPU server, one or more machine learning algorithms using the transformed dataset as input.
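The abstract describes a division of work in which a storage system, external to the GPU server, transforms the dataset before the GPU server consumes it for training. Below is a minimal sketch of that data flow, not taken from the patent; the function names (transform_on_storage, train_on_gpu_server) and the specific transformation and learning algorithm are illustrative assumptions only.

```python
# Illustrative sketch of the claimed flow: a storage system transforms a
# dataset, and a GPU server executes a machine learning algorithm using
# the transformed dataset as input. Names and algorithms are assumed for
# illustration, not defined by the patent.

import numpy as np


def transform_on_storage(raw: np.ndarray) -> np.ndarray:
    """Stand-in for work performed by the storage system external to the
    GPU server (here, per-feature normalization), so the GPU server
    receives data already in model-ready form."""
    return (raw - raw.mean(axis=0)) / (raw.std(axis=0) + 1e-8)


def train_on_gpu_server(transformed: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Stand-in for the GPU server executing a machine learning algorithm
    on the transformed dataset (here, fitting a linear model by least
    squares)."""
    x = np.hstack([transformed, np.ones((transformed.shape[0], 1))])  # add bias column
    weights, *_ = np.linalg.lstsq(x, labels, rcond=None)
    return weights


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    raw_dataset = rng.normal(size=(128, 4))                      # data held by storage
    labels = raw_dataset @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.1

    transformed = transform_on_storage(raw_dataset)              # done by the storage system
    weights = train_on_gpu_server(transformed, labels)           # done by the GPU server
    print("learned weights:", weights)
```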