WebSurge-Allow.txt and Robots.txt to run Load Tests
In order to run a load test, any non-localhost server you are accessing is required to have either an empty websurge-allow.txt file, or a robots.txt file containing Allow: WebSurge, in the root of the Web site being tested.
This requirement exists to prevent this tool from being used to generate denial-of-service attacks against Web properties you don't own.
Either one of the following will allow WebSurge to hit your site under load:
You can place a websurge-allow.txt file in the root folder of your Web site. The file just has to exist and doesn't have to have any content in it (although it can).
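As a minimal sketch, creating the empty marker file is a one-liner; here ./site-root is a placeholder for your actual Web root folder:

```shell
# Create an empty websurge-allow.txt in the Web site root.
# "./site-root" is a placeholder; substitute your real Web root path.
mkdir -p ./site-root
touch ./site-root/websurge-allow.txt
```

Since only the file's existence matters, its contents (if any) are ignored.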
If you already have a robots.txt file, you can add the following line to it:

Allow: WebSurge
WebSurge looks for the robots.txt file in the root of the Web site; if it finds the above directive, it enables access under load.
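The robots.txt route can be sketched the same way: append the directive to an existing file and verify it is present, which mirrors the check WebSurge performs (./site-root again stands in for your real Web root, and the initial robots.txt content is just an example):

```shell
# Start from an example robots.txt in the placeholder Web root.
mkdir -p ./site-root
printf 'User-agent: *\nDisallow: /admin/\n' > ./site-root/robots.txt

# Append the directive WebSurge looks for.
echo 'Allow: WebSurge' >> ./site-root/robots.txt

# Verify the directive is present, as WebSurge does before running under load.
grep -q 'Allow: WebSurge' ./site-root/robots.txt && echo 'WebSurge allowed'
```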