Abstract:
A new system and method of available bandwidth estimation applies even where the narrow link and the tight link in a network path are in different locations. In embodiments of the invention, a unique probe packet series structure and associated processing are employed to estimate available bandwidth. In an embodiment of the invention, the spacing between probe packets is adjusted at the source to account for dilation caused by the links leading to the tight link (311), so that the spacing is appropriate when the probes arrive at the tight link (311). Moreover, the multi-packet probe comprises a large packet (301) followed by two much smaller packets (303, 305). The large packet (301) is dropped once it has traversed the tight link (311). The two small packets (303, 305), which are affected little by subsequent narrow links, preserve the spacing set by the tight link (311) all the way to the destination, thereby encoding the delay induced by the tight link (311).
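The following is a minimal sketch of the probe-gap computation implied by this abstract, assuming the standard single-hop probe-gap model and that the tight-link capacity, the compensated input gap, and the output gap measured from the two small trailing packets are obtained elsewhere; the function and variable names are illustrative and not taken from the patent.

    # Minimal sketch of a probe-gap estimate (assumptions: single-hop probe-gap
    # model; tight-link capacity and the two gaps are measured out of band;
    # all names are illustrative, not from the patent).

    def estimate_available_bandwidth(gap_in_s: float,
                                     gap_out_s: float,
                                     capacity_bps: float) -> float:
        """Estimate available bandwidth at the tight link.

        gap_in_s     -- probe spacing as it enters the tight link, i.e. the
                        source spacing after compensating for upstream dilation
        gap_out_s    -- spacing preserved by the two small trailing packets
                        and measured at the destination
        capacity_bps -- capacity of the tight link in bits per second
        """
        if gap_out_s <= gap_in_s:
            # No extra queueing observed at the tight link: the path looks idle.
            return capacity_bps
        # Extra spacing is attributed to cross traffic served between the probes.
        cross_traffic_bps = capacity_bps * (gap_out_s - gap_in_s) / gap_in_s
        return max(0.0, capacity_bps - cross_traffic_bps)

    # Example: a 10 Mb/s tight link that stretches a 1 ms gap to 1.5 ms
    # suggests roughly 5 Mb/s of available bandwidth.
    print(estimate_available_bandwidth(0.001, 0.0015, 10e6))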
Abstract:
A method of reducing bandwidth limitations so that events can be sent to a set of interested clients within a pre-defined time period as quickly and fairly as possible. The clients can be redistributed among the servers in a network such that the delay due to server overloading is minimized, by moving clients from an overloaded server to a server with available bandwidth. In addition, the latency of client-server communications can be incorporated into an estimation of download times, and the servers can then initiate delivery to their respective clients based on those download times. By staggering the send times to account for heterogeneous latencies, more clients can receive the event at the same time, and a fairness of distribution can be achieved.
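Below is a minimal sketch of the latency-aware, staggered scheduling described above, assuming per-client latency and bandwidth estimates are available from measurement; the Client structure and function names are illustrative rather than taken from the patent.

    # Minimal sketch of staggered delivery (assumptions: per-client latency and
    # bandwidth estimates come from measurement; names are illustrative).

    from dataclasses import dataclass

    @dataclass
    class Client:
        name: str
        latency_s: float      # one-way latency from its server to the client
        bandwidth_bps: float  # estimated available bandwidth to the client

    def estimated_download_time(client: Client, event_size_bits: int) -> float:
        """Latency plus transfer time for the full event."""
        return client.latency_s + event_size_bits / client.bandwidth_bps

    def schedule_sends(clients: list[Client],
                       event_size_bits: int,
                       target_arrival_s: float) -> dict[str, float]:
        """Stagger send times so every download completes at target_arrival_s."""
        return {c.name: target_arrival_s - estimated_download_time(c, event_size_bits)
                for c in clients}

    # Example: the slower, higher-latency client is started first.
    clients = [Client("fast", 0.020, 10e6), Client("slow", 0.150, 1e6)]
    print(schedule_sends(clients, event_size_bits=8_000_000, target_arrival_s=100.0))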
Abstract:
To provide a fairness of distribution, an encrypted event, containing information not intended for release prior to a release time, can be sent to clients before the release time. In this manner the bulk of the information can be transferred to the clients without concern for the duration of the transfer. At the release time, a small decryption key can be sent, either from a central server or from multiple servers, utilizing multiple network paths to provide the greatest likelihood that each client will receive the decryption key with a minimum of delay. Each client is thereby provided access to the information at approximately the same time, regardless of the bandwidth available to each client. Additionally, trusted edge servers, which can be relied upon not to release information prior to an appropriate time, can send an unencrypted event, or decrypt the encrypted event and send the decrypted event, at a determined time, either prior to or after the release time, such that the decrypted or unencrypted event arrives at the clients that could not store and decrypt the encrypted event at approximately the same time as the key arrives at the other clients. Each client can thus receive the information at approximately the same time, regardless of the client's bandwidth or its ability to store and decrypt information.
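The following is a minimal sketch of the pre-distribution scheme, assuming a symmetric cipher from the third-party cryptography package and illustrative timing helpers; the patent does not prescribe a particular cipher, and none of the names below are taken from it.

    # Minimal sketch of pre-distributing the encrypted event and timing the
    # key / edge-server sends (assumptions: the third-party `cryptography`
    # package supplies the symmetric cipher; names are illustrative).

    from cryptography.fernet import Fernet

    def prepare_event(event_bytes: bytes) -> tuple[bytes, bytes]:
        """Encrypt the event so its bulk can be shipped before the release time."""
        key = Fernet.generate_key()
        ciphertext = Fernet(key).encrypt(event_bytes)
        return key, ciphertext

    def client_open(key: bytes, ciphertext: bytes) -> bytes:
        """At the release time the client receives only the small key and
        decrypts the previously stored event."""
        return Fernet(key).decrypt(ciphertext)

    def edge_send_time(release_time_s: float,
                       key_latency_s: float,
                       client_latency_s: float,
                       event_transfer_s: float) -> float:
        """When a trusted edge server should start sending the full (decrypted
        or unencrypted) event so it arrives at a client that cannot store and
        decrypt it at roughly the moment the key reaches the other clients."""
        key_arrival_s = release_time_s + key_latency_s
        return key_arrival_s - (client_latency_s + event_transfer_s)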