2 Replies Latest reply on Sep 24, 2015 2:30 AM by antnee777

    Which criteria to use for better performance / scalability




      I am deploying a fairly basic whitelist based ruleset, which will deal with quite a high traffic volume.


      Both criteria below fulfil my requirements, but I would like to know whether the use of one is more expensive than the use of the other in performance terms.



      URL.Categories (domains placed into custom categories), or

      Url.Host.BelongsToDomains (domains listed directly in the criterion).


      In terms of scalability and complexity of list maintenance, Url.Host.BelongsToDomains seems the better option, but I wonder if there is a price to pay for that.





        • 1. Re: Which criteria to use for better performance / scalability

          Hi Ant,


          Url.Host.BelongsToDomains is better. Using URL.Categories first requires looking up the category of the URL, and then matching that category against your whitelist. Url.Host.BelongsToDomains, on the other hand, iterates through the list of domains until it hits a match, at which point it stops processing and moves on to the next rule. Generally speaking, simpler is faster.
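          To illustrate the difference, here is a minimal Python sketch of the two strategies. This is only a generic analogy, not McAfee Web Gateway's actual implementation; the function names, the whitelist, and the category table are all made up for the example:

          ```python
          # Illustrative analogy only -- not MWG's internal logic.

          def belongs_to_domains(host, whitelist):
              """One-step scan: stop at the first matching domain suffix."""
              for domain in whitelist:
                  if host == domain or host.endswith("." + domain):
                      return True  # short-circuit: remaining entries are never checked
              return False

          def category_match(host, category_db, allowed_categories):
              """Two-step: look up the host's categories, then intersect with the whitelist."""
              categories = category_db.get(host, set())  # step 1: categorisation lookup
              return bool(categories & allowed_categories)  # step 2: match against whitelist

          whitelist = ["example.com", "trusted.org"]
          category_db = {"www.example.com": {"Custom-Allowed"}}

          print(belongs_to_domains("www.example.com", whitelist))                    # True
          print(category_match("www.example.com", category_db, {"Custom-Allowed"}))  # True
          ```

          Both return the same answer, but the category route always pays for the lookup step before any whitelist comparison happens, which is the extra cost described above.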

          If you are interested in seeing how long rules take to process, you can use rule tracing, which shows how long it took to process the rule for a specific cycle (one random example took 261 microseconds).


          Or you can use the StopWatch timers and put the results into either a block page or a log file to see how long rule evaluation is taking.

          To use the StopWatch:

          1. On the rule you want to start timing from, under Events, add a new event StopWatch.Start(Parameter).

          2. On the rule you want to stop timing on, under Events, add a new event StopWatch.Stop(Parameter).

          3. Finally, to put the result in the log file, edit the Log lines property and add Number.ToString(StopWatch.GetMicroSeconds(Parameter)).
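          The Start/Stop/GetMicroSeconds pattern above can be sketched in Python like this. The class and method names deliberately mirror the MWG events for readability, but this is an assumed stand-in, not MWG code:

          ```python
          # Generic stopwatch analogy for the MWG StopWatch events (not MWG code).
          import time

          class StopWatch:
              def __init__(self):
                  self._start = None
                  self._elapsed = 0.0

              def start(self):
                  # Corresponds to the StopWatch.Start(Parameter) event.
                  self._start = time.perf_counter()

              def stop(self):
                  # Corresponds to the StopWatch.Stop(Parameter) event.
                  self._elapsed = time.perf_counter() - self._start

              def get_microseconds(self):
                  # Corresponds to StopWatch.GetMicroSeconds(Parameter) in the log line.
                  return int(self._elapsed * 1_000_000)

          sw = StopWatch()
          sw.start()
          sum(range(10_000))  # stand-in for the rules being timed
          sw.stop()
          print(sw.get_microseconds())  # elapsed evaluation time in microseconds
          ```

          The key point is the same as in the rule set: the timer brackets only the rules between the Start and Stop events, so you can isolate exactly the evaluation you want to measure.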




          • 2. Re: Which criteria to use for better performance / scalability

            Thanks Tris, very helpful.