Abstract:
The present invention is directed to a network switch that intelligently load-balances flows while maintaining transaction integrity. The switch determines when specific content is hot and directs flows to one or more cache servers. The architecture of the present invention can include a cache and a digest generator, with predetermined objects stored in the cache based on the digest generated by the digest generator.
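The abstract describes the mechanism only at a high level. As a rough illustration, the sketch below models a switch-like director that counts requests per content digest and, once an object is deemed hot, consistently steers matching flows to a cache server. All names, the SHA-1 digest choice, the threshold, and the digest-based server selection are assumptions made for illustration; the patent does not specify these details.

```python
import hashlib
from collections import defaultdict

# Illustrative assumptions only; not taken from the patent.
HOT_THRESHOLD = 100                    # requests before content is treated as "hot"
CACHE_SERVERS = ["cache-1", "cache-2"]
ORIGIN_SERVER = "origin"


class FlowDirector:
    """Toy model of a switch that detects hot content by digest and
    directs matching flows to a cache server."""

    def __init__(self):
        self.request_counts = defaultdict(int)   # digest -> observed requests
        self.hot_digests = set()                 # digests marked as hot

    def digest(self, object_key: str) -> str:
        # Digest generator: identify content by a fixed-length hash.
        return hashlib.sha1(object_key.encode()).hexdigest()

    def route(self, object_key: str) -> str:
        d = self.digest(object_key)
        self.request_counts[d] += 1
        if self.request_counts[d] >= HOT_THRESHOLD:
            self.hot_digests.add(d)
        if d in self.hot_digests:
            # Direct the flow to a cache server chosen by digest, so the
            # same hot object always maps to the same cache.
            return CACHE_SERVERS[int(d, 16) % len(CACHE_SERVERS)]
        return ORIGIN_SERVER


if __name__ == "__main__":
    director = FlowDirector()
    for _ in range(150):
        server = director.route("/video/popular.mp4")
    print(server)  # once the object is hot, flows go to a cache server
```

The per-digest routing in this sketch keeps all flows for a given hot object on one cache server, which is one simple way to preserve transaction integrity while load-balancing; the patent's actual selection logic may differ.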