Stupid question I know, but I seem to be forgetting how to allow or block parts of sites. For example, there may be a URL such as www.test.com/stuff/thegoodstuff.
The default Whitelist rules use the "URL.Host matches in list" criterion. While the list seems to support wildcard expressions (heck, the column is even called "Wildcard Expression"), in reality it only takes top-level FQDN entries with wildcards. If you wish to allow or block at a deeper path level in the URL, it fails.
So is there a property we can use to inspect the full URL, including the path, and filter deeper than just the host?
For my whitelists, blacklists, and almost every other list that relates to a URL, I break it up into two types: hosts only and full URLs.
Names: "Global Whitelist: Domains" and "Global Whitelist: URLs"
URL.Host.BelongsToDomains (Global Whitelist: Domains) equals true OR
URL matches in list Global Whitelist: URLs
The string list "Global Whitelist: Domains" contains plain domain or host names. The wildcard list "Global Whitelist: URLs" is where I put full wildcard expressions.
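As a sketch of what goes into each list (the entries below are hypothetical, apart from the path from the original question), it might look like:

```
# Global Whitelist: Domains -- string list, host/domain names only
example.com
cdn.example.net

# Global Whitelist: URLs -- wildcard list, full URL patterns
www.test.com/stuff/thegoodstuff*
example.org/downloads/*
```

The first list is cheap to match and handles whole sites; the second handles the "allow only part of a site" cases the question asks about.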
For those who haven't come across it, it's important to note the difference between "Stop Cycle" and "Stop Rule Set". We use "Stop Cycle" for our global whitelist, which sits upstream of authentication, anti-malware scanning, and DLP, so we are cautious about what goes into the global whitelist.
We use a completely separate set of lists for sites that should not be blocked but still need the scanning and authentication.
On a separate note, there is a performance consideration. Wildcard patterns that start with a wildcard, e.g. *.google.com, may force the matcher to scan the entire string before giving up (unless there's some behind-the-scenes optimization). So if you need matches of that type, matching against URL.Host or URL.Domain can provide some performance improvement and can even eliminate some of those patterns that start with a wildcard.
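To make the point concrete, here's a small Python sketch (using `fnmatch` as a stand-in for the gateway's wildcard engine, and a hypothetical `belongs_to_domain` helper standing in for what URL.Host.BelongsToDomains roughly does):

```python
from fnmatch import fnmatchcase

url = "https://mail.google.com/mail/"
host = "mail.google.com"

# A pattern with a leading wildcard can force the matcher to scan
# (and backtrack over) the whole URL string before it matches or gives up.
leading_wildcard_hit = fnmatchcase(url, "*.google.com/*")

# Matching the already-extracted host against a domain suffix needs no
# leading wildcard at all: it's a cheap equality-or-suffix check.
def belongs_to_domain(host: str, domain: str) -> bool:
    """True if host is the domain itself or any subdomain of it."""
    return host == domain or host.endswith("." + domain)

host_hit = belongs_to_domain(host, "google.com")            # True
miss = belongs_to_domain("notgoogle.com", "google.com")     # False
```

Note the suffix check includes the leading dot, so "notgoogle.com" doesn't falsely match "google.com" the way a naive `endswith("google.com")` would.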