Abstract:
A network-based production service is configured to process client requests for the production service via a network, to capture production request data defining the requests, and to store the production request data in a data store. A test system comprising one or more controllers creates test jobs according to a test plan for testing the production service. The test plan specifies a test profile for using specified production request data to simulate a load on the production service, and each job created according to the test plan specifies a portion of the production request data. A job queue receives and queues test jobs from the one or more controllers, which add test jobs to the job queue according to the test plan. Workers retrieve jobs from the job queue, access the portion of production request data specified in each job from the data store, and replay that production request data against the production service.
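A minimal sketch of the queue-and-worker replay flow described above, assuming an in-process queue and an in-memory data store; the job fields, data-store layout, and service URL are hypothetical, not the patent's actual design.

```python
import queue
import urllib.request

job_queue = queue.Queue()  # stand-in for the shared job queue

def worker(data_store, service_url):
    """Pull jobs from the queue and replay the captured production
    requests that each job points at."""
    while True:
        job = job_queue.get()
        if job is None:          # sentinel: controller signals shutdown
            break
        # Each job specifies the portion of production request data to use.
        for record in data_store[job["data_key"]]:
            req = urllib.request.Request(
                service_url + record["path"],
                data=record.get("body"),
                method=record["method"],
            )
            urllib.request.urlopen(req)  # replay against the service
        job_queue.task_done()
```

A controller would put job dicts such as {"data_key": "shard-07"} on the queue, with several worker threads draining it in parallel.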
Abstract:
Provided is a management system comprising an interface, a processor, and a storage device. At least one maintenance target machine, identified by a piece of identification information and serving as the target of a maintenance operation, is coupled to the interface. The management system is configured to: store information indicating a scheduled start time and a scheduled finish time of the maintenance operation; determine, based on the scheduled start time, the scheduled finish time, and a login and logout for the maintenance operation detected on the at least one maintenance target machine, whether or not the maintenance operation is in execution on the at least one maintenance target machine; and determine, based on a result of that determination, whether or not to execute predetermined processing associated with an event transmitted from the at least one maintenance target machine.
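A sketch of the in-execution determination under one reading of the abstract: the operation counts as in execution only inside the scheduled window and while a detected login has no matching logout. All names are illustrative; times can be datetime objects or any comparable values.

```python
def in_execution(now, start, finish, login_seen, logout_seen):
    """Maintenance is in execution when `now` falls inside the scheduled
    window and a detected login has no matching logout yet."""
    return start <= now <= finish and login_seen and not logout_seen

def handle_event(event, now, schedule, login_seen, logout_seen):
    # Skip the predetermined processing (e.g. raising an alert) while
    # the source machine is under maintenance.
    if in_execution(now, schedule["start"], schedule["finish"],
                    login_seen, logout_seen):
        return None
    print("predetermined processing for:", event)  # stand-in
```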
Abstract:
A method of protecting data on a mobile computing device using a storage network by deploying a synchronization agent to the mobile computing device and then associating a synchronization policy with the synchronization agent. The mobile computing device is monitored for at least one threshold event. It is determined that the threshold event has occurred, which causes a request to initiate a data synchronization event to be transmitted. In response to the request, the mobile computing device is synchronized with the storage network.
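A minimal sketch of the monitor-then-synchronize step, using low free disk space as an assumed example of a threshold event; the threshold value and the sync callback are illustrative.

```python
import shutil

FREE_SPACE_THRESHOLD = 0.10  # assumed policy: sync when under 10% free

def check_and_sync(path, request_sync):
    """Determine whether the threshold event has occurred and, if so,
    transmit a request to initiate a data synchronization event."""
    usage = shutil.disk_usage(path)
    if usage.free / usage.total < FREE_SPACE_THRESHOLD:
        request_sync({"path": path, "reason": "low_free_space"})
```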
Abstract:
A specifying method executed by a computer, the specifying method includes: acquiring, at a specific time interval, a measurement value of a specific property from each of a plurality of devices having the specific property; calculating, for each of the plurality of devices, a variation between the measurement value and an estimated value based on a plurality of past measurement values acquired from the plurality of devices prior to the measurement value; and specifying, from among the plurality of devices, at least one device that exhibits behavior different from the other devices, based on the set of variations calculated for the plurality of devices.
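A sketch of the calculation, assuming the estimated value is the mean of each device's past measurements and outlying devices are picked with a z-score over the set of variations; the abstract does not fix either choice.

```python
import statistics

def specify_outlier_devices(history, latest, z_cutoff=3.0):
    """history: {device_id: [past measurement values]}
    latest:  {device_id: newest measurement value}
    Returns devices whose variation stands out from the set of
    variations across all devices."""
    variations = {
        dev: latest[dev] - statistics.mean(past)  # estimate = mean of past
        for dev, past in history.items()
    }
    mu = statistics.mean(variations.values())
    sigma = statistics.stdev(variations.values())
    return [dev for dev, v in variations.items()
            if sigma > 0 and abs(v - mu) / sigma > z_cutoff]
```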
Abstract:
During a test-generation technique, a test session with web flows associated with a set of users and a browser is recorded. Then, the test session is modified to generalize the web flows to a larger set of users. For example, data may be converted into variables, requests may be added or removed, and/or delays may be inserted between requests. In addition, the test session may be filtered using one or more filters to remove static content, images, and/or certain types of requests. After the test session has been generalized, it can be incorporated into a load test that accurately simulates interactions with multiple users.
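One way the generalization and filtering steps could look; the static-suffix list and the user-id URL pattern are assumptions for illustration only.

```python
import re

STATIC_SUFFIXES = (".png", ".jpg", ".gif", ".css", ".js")

def generalize_session(session, user_var="{{user_id}}"):
    """Drop static content from a recorded session and convert
    user-specific data into variables so the flow replays for any user."""
    generalized = []
    for request in session:
        if request["path"].endswith(STATIC_SUFFIXES):
            continue  # filter out static content and images
        request["path"] = re.sub(r"/users/\d+",
                                 "/users/" + user_var,
                                 request["path"])
        generalized.append(request)
    return generalized
```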
Abstract:
Performance information is gathered on a client and indicates the performance of a hosted service with respect to the client. A cross-origin resource sharing system shares the performance information with an analysis system that is separate from the hosted service.
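A sketch of the analysis-system side under one reading: it lives on a separate origin and opts in to receiving the client's performance reports via CORS response headers. The origin, port, and route are hypothetical.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

HOSTED_ORIGIN = "https://hosted-service.example.com"  # hypothetical

class AnalysisHandler(BaseHTTPRequestHandler):
    def do_OPTIONS(self):  # CORS preflight from the client page
        self.send_response(204)
        self.send_header("Access-Control-Allow-Origin", HOSTED_ORIGIN)
        self.send_header("Access-Control-Allow-Methods", "POST")
        self.send_header("Access-Control-Allow-Headers", "Content-Type")
        self.end_headers()

    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        # ... store the reported performance information for analysis ...
        self.send_response(204)
        self.send_header("Access-Control-Allow-Origin", HOSTED_ORIGIN)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), AnalysisHandler).serve_forever()
```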
Abstract:
According to embodiments of the present invention, one or more computer processors determine that a source for a performance indicator of a target service element is known. The one or more computer processors select, at run time, the source for the performance indicator that has the least amount of information gaps. The one or more computer processors determine information associated with the performance indicator from the source. The one or more computer processors determine whether the determined information is more current than a quality indicator periodicity value associated with the target service element. In response to determining that the determined information is more current than the quality indicator periodicity value, the one or more computer processors select the performance indicator.
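A compact sketch of the selection logic; the source record fields (known, gap_count, age_seconds, indicator) are invented for illustration and are not the patent's schema.

```python
def select_indicator(sources, periodicity_seconds):
    """Pick the known source with the fewest information gaps, then
    accept its indicator only if the information is more current than
    the quality-indicator periodicity for the target service element."""
    known = [s for s in sources if s["known"]]
    if not known:
        return None
    best = min(known, key=lambda s: s["gap_count"])
    if best["age_seconds"] < periodicity_seconds:
        return best["indicator"]
    return None
```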
Abstract:
A method and system for testing the end-to-end performance of cloud-based applications. A real workload is created for the cloud-based applications using synthetic users. The load and length of demand may be adjusted to match different traffic models, allowing the measurement and analysis of user performance metrics under specified conditions.
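A toy illustration of adjustable synthetic-user load, where the number of users and the test duration map to the "load and length of demand"; everything here is an assumption, not the patented system.

```python
import threading
import time

def synthetic_user(action, think_time, stop):
    while not stop.is_set():
        t0 = time.monotonic()
        action()  # one simulated user interaction
        print(f"latency={time.monotonic() - t0:.3f}s")
        time.sleep(think_time)

def run_load(action, users, duration_s, think_time=1.0):
    """Adjust `users` and `duration_s` to reproduce different traffic models."""
    stop = threading.Event()
    threads = [threading.Thread(target=synthetic_user,
                                args=(action, think_time, stop))
               for _ in range(users)]
    for t in threads:
        t.start()
    time.sleep(duration_s)
    stop.set()
    for t in threads:
        t.join()
```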
Abstract:
In one embodiment, a method is performed by a computer system. The method includes providing a performance-monitoring platform as a service, the performance-monitoring platform comprising at least one agent manager. The method further includes facilitating creation of a customized performance-monitoring application, the performance-monitoring application comprising an agent and at least one user dashboard. The agent is configured to collect performance data related to a specified monitored resource. The agent passes the collected performance data to a monitoring server for storage according to at least one standard data model. The at least one user dashboard allows users to view information related to the collected performance data. The facilitating includes configuring deployment attributes of the customized performance-monitoring application responsive to developer input. Moreover, the method includes deploying the customized performance-monitoring application on the performance-monitoring platform. The method also includes making the customized performance-monitoring application available to end users as a service.
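A sketch of one agent collection cycle, assuming a JSON record as the "standard data model" and an HTTP ingest endpoint on the monitoring server; the endpoint, field names, and the load-average metric are illustrative (os.getloadavg is Unix-only).

```python
import json
import os
import time
import urllib.request

MONITORING_SERVER = "https://monitoring.example.com/ingest"  # hypothetical

def agent_cycle(resource):
    """Collect performance data for the monitored resource and pass it
    to the monitoring server in a simple standard data model."""
    record = {
        "resource": resource,
        "metric": "load_avg_1m",
        "value": os.getloadavg()[0],  # Unix-only stand-in metric
        "timestamp": time.time(),
    }
    req = urllib.request.Request(
        MONITORING_SERVER,
        data=json.dumps(record).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```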
Abstract:
Load testing an online game server environment using a web-based interface includes: configuring a load test with configuration parameters including client behavior parameters and server parameters, wherein the client behavior parameters provide settings for behaviors such as cheating and aggressiveness, and wherein the server parameters provide a setup for server states and messages; building and deploying simulation client and game server binaries; scheduling and running the load test; and collecting test data output from the load test. Keywords: load test automation, load test service, load test resource management.
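What the configuration step might look like as data; every parameter name below is an assumption chosen to mirror the abstract, not the patent's actual schema.

```python
# Illustrative load-test configuration for an online game server.
load_test_config = {
    "client_behavior": {
        "clients": 5000,
        "cheat_rate": 0.02,       # fraction of simulated cheating clients
        "aggressiveness": 0.7,    # probability a client acts each tick
    },
    "server": {
        "initial_state": "lobby",
        "message_rate_limit": 100,  # messages per second per client
    },
    "schedule": {
        "start": "2024-06-01T00:00:00Z",
        "duration_minutes": 60,
    },
}
```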