Cache management for multi-node databases
Abstract:
Techniques related to cache management for multi-node databases are disclosed. In some embodiments, a system comprises one or more computing devices including a training component, data store, cache, filtering component, and listening component. The training component produces a plurality of models based on user interaction data. The plurality of models are stored in the data store, which responds to requests from the cache when the cache experiences cache misses. The cache stores a first subset of the plurality of models. The filtering component selects a second subset of the plurality of models based on one or more criteria. Furthermore, the filtering component sends the second subset of the plurality of models to a messaging service. The listening component retrieves the second subset of the plurality of models from the messaging service. Furthermore, the listening component causes the second subset of the plurality of models to be stored in the cache.
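For illustration, the sketch below renders the data flow described in the abstract as a minimal Python example. It is a hypothetical rendering, not the disclosed implementation: the names DataStore, ModelCache, filter_and_publish, and listen_and_cache are assumptions, and a local queue.Queue stands in for the external messaging service.

```python
# Hypothetical sketch of the abstract's data flow (not the patent's actual code).
# A dict-backed DataStore holds the full plurality of models, ModelCache holds a
# first subset and falls back to the data store on misses, and a queue.Queue
# plays the role of the messaging service between filtering and listening.
import queue
from typing import Any, Callable, Dict, Optional


class DataStore:
    """Holds the full plurality of trained models keyed by model id."""
    def __init__(self) -> None:
        self.models: Dict[str, Any] = {}


class ModelCache:
    """Stores a first subset of models; a miss falls through to the data store."""
    def __init__(self, data_store: DataStore) -> None:
        self._store = data_store
        self._entries: Dict[str, Any] = {}

    def put(self, model_id: str, model: Any) -> None:
        self._entries[model_id] = model

    def get(self, model_id: str) -> Optional[Any]:
        if model_id not in self._entries:             # cache miss
            model = self._store.models.get(model_id)  # request from data store
            if model is not None:
                self._entries[model_id] = model
        return self._entries.get(model_id)


def filter_and_publish(data_store: DataStore,
                       bus: "queue.Queue[tuple]",
                       criterion: Callable[[str, Any], bool]) -> None:
    """Filtering component: select a second subset by criteria, send it to the bus."""
    for model_id, model in data_store.models.items():
        if criterion(model_id, model):
            bus.put((model_id, model))


def listen_and_cache(bus: "queue.Queue[tuple]", cache: ModelCache) -> None:
    """Listening component: retrieve models from the bus, store them in the cache."""
    while not bus.empty():
        model_id, model = bus.get()
        cache.put(model_id, model)


if __name__ == "__main__":
    store = DataStore()
    # Stand-in for the training component: models produced from interaction data.
    store.models = {"user_a": "model_a", "user_b": "model_b", "user_c": "model_c"}

    cache = ModelCache(store)
    bus: "queue.Queue[tuple]" = queue.Queue()

    # Proactively warm the cache with models matching an example criterion.
    filter_and_publish(store, bus, criterion=lambda mid, _m: mid != "user_c")
    listen_and_cache(bus, cache)

    print(cache.get("user_a"))  # served from the pre-populated cache
    print(cache.get("user_c"))  # cache miss, fetched from the data store
```

In this sketch, the filtering and listening steps pre-populate the cache so that selected models are served without first incurring a miss, while unselected models are still retrievable through the miss path.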