WebSurge-Allow.txt and Robots.txt to run Load Tests
In order to run a load test against any non-local server, WebSurge requires one of two marker files to verify that the site owner permits the site to be tested under load.
The marker files are meant to prevent WebSurge from being used to generate denial-of-service attacks against Web properties you don't own.
One of the following files is required for WebSurge to hit your site under load. The files are expected to live in the root folder of the Web site and have to be accessible via HTTP (i.e. a GET request for the file returns an HTTP 200 response).
You can place a `websurge-allow.txt` file in the root folder of your Web site. The content of the file is ignored as long as the request returns an HTTP 200 result.
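As a minimal sketch, creating the marker file is a one-liner; the current directory stands in for your site's root folder, and the host name in the verification command is a placeholder for your own site:

```shell
# Create an empty websurge-allow.txt; the content is ignored,
# only an HTTP 200 response for the file matters.
touch websurge-allow.txt

# Then verify it is actually reachable over HTTP, e.g.:
#   curl -I http://yoursite.com/websurge-allow.txt
# and check for a 200 status in the response.
```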
If you already have a `robots.txt` file in the root of your site, you can use it instead and add the following line to the file:
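The directive lost in formatting here is, per the WebSurge documentation, the following; treat the exact spelling as an assumption and verify it against your WebSurge version:

```
Allow: WebSurge
```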
WebSurge requests the `robots.txt` file from the root of the Web site, scans it for the above directive, and if found enables access under load.
It's important that these marker files are not just placed in the root of your site, but are also accessible via HTTP. This can be a problem for ASP.NET MVC applications, which typically don't serve static file content from disk.
To work around this you may have to set up a custom route or routing exclusion that specifically allows these files to be served. For more info on how to handle this in ASP.NET MVC and IIS, please see this post.
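As one possible sketch of such an exclusion, assuming a classic ASP.NET MVC application with the usual `RouteConfig` class, you can tell MVC routing to ignore the marker files so that the IIS static file handler serves them from disk; your actual route setup may differ:

```csharp
using System.Web.Mvc;
using System.Web.Routing;

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        // Exclude the marker files from MVC routing so IIS can
        // serve them as plain static content (sketch, not the
        // only way to do this).
        routes.IgnoreRoute("robots.txt");
        routes.IgnoreRoute("websurge-allow.txt");

        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index",
                            id = UrlParameter.Optional }
        );
    }
}
```

After deploying a change like this, re-request both files in a browser or with an HTTP client and confirm they return a 200 status before starting a load test.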