WebSurge-Allow.txt and Robots.txt to run Load Tests

In order to run a load test against any non-local server, WebSurge requires one of two marker files that verify the site owner permits the site to be run under load.

The marker files are meant to prevent WebSurge from being used to generate denial-of-service attacks against Web properties you don't own.

Required Server Files

One of the following files is required for WebSurge to hit your site under load. The files are expected to live in the root folder of the Web site and have to be accessible via HTTP (e.g. https://yoursite.com/websurge-allow.txt):

  • websurge-allow.txt
You can place a websurge-allow.txt file in the root folder of your Web site. The content of the file is ignored as long as the request returns an HTTP 200 result.

  • robots.txt
If you already have a robots.txt file in the root of your site, you can use it by adding the following line to the file:

    Allow: WebSurge

WebSurge checks the robots.txt file in the root of the Web site for the above directive and, if it is found, enables access under load.
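As a quick sketch, either marker can be set up from a shell. The current directory here stands in for your site's web root; substitute your site's actual root folder:

```shell
# Option 1: create an empty websurge-allow.txt marker file.
# WebSurge ignores the content; only an HTTP 200 response matters.
touch websurge-allow.txt

# Option 2: append the WebSurge directive to an existing robots.txt.
printf 'Allow: WebSurge\n' >> robots.txt

# Show that the directive is in place.
grep 'Allow: WebSurge' robots.txt
```

You can then verify the file is actually reachable over HTTP with something like `curl -I https://yoursite.com/websurge-allow.txt` (using your own domain) and check that the response status is 200.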

Make Sure the Files are Accessible via HTTP

It's important that websurge-allow.txt or robots.txt are not just placed in the root of your site, but are also accessible via HTTP. This can be a problem for ASP.NET MVC applications, which typically don't serve static file content from disk.

To work around this you may have to set up a custom route to specifically allow the files to be served. For more info on how to handle this in ASP.NET MVC and IIS, please see this post.
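As an illustrative sketch (not necessarily the exact approach from the linked post), one common workaround in a default ASP.NET MVC project is to exclude the marker files from MVC routing so that IIS's static file handling can serve them from disk. The RouteConfig names below assume the standard MVC project template:

```csharp
using System.Web.Mvc;
using System.Web.Routing;

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        // Let the marker files bypass MVC routing so IIS can serve
        // them as plain static files from the site root.
        routes.IgnoreRoute("websurge-allow.txt");
        routes.IgnoreRoute("robots.txt");

        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index",
                            id = UrlParameter.Optional }
        );
    }
}
```

`IgnoreRoute` tells routing to skip matching requests entirely, handing them back to the server's static file pipeline.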

© West Wind Technologies, 2014-2019 • Updated: 04/23/19