The users in our Finance Dept. have recently switched over to Wells Fargo. Their site https://wellsoffice.wellsfargo.com will sometimes display properly and at other times be missing the images from akamai.net (it changes from minute to minute). The Wells Fargo site seems to have rotating IP addresses, drawn from a pool of unknown size, that change about every 30-60 seconds. Does the web filter cache the IP address of the site, and could this be causing the problem?
Thanks in advance!
DNS caching does exist in the Web Gateway (Configuration > Proxies > DNS Settings), BUT there are a couple of things to understand about it.
There is a minimum TTL, and a maximum TTL.
-Minimum TTL is the minimum amount of time the Web Gateway will allow a DNS entry to be stored in cache.
Example: WG makes a DNS request for akamai.net, and the DNS server returns a response with a TTL of 1 second (effectively telling the resolver not to cache the record).
What WG does: assuming your minimum TTL is 1 second, it will not cache it. If your minimum TTL is 5 seconds, then the Web Gateway will cache it for 5 seconds.
-Maximum TTL is the maximum amount of time the Web Gateway will allow a DNS entry to be stored in cache.
Example: WG makes a DNS request for mcafee.com, and the DNS server returns a response with a TTL of 4500 seconds.
What WG does: assuming your maximum TTL is 3600 seconds (the default), WG will cache it for a maximum of 3600 seconds.
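The min/max behavior described above amounts to a simple clamp on the TTL the DNS server returns. A minimal Python sketch (the function name and defaults are illustrative, not the Web Gateway's actual code):

```python
# Illustrative sketch of how a resolver cache can clamp a DNS record's TTL
# between a configured minimum and maximum.

def effective_ttl(record_ttl: int, min_ttl: int = 1, max_ttl: int = 3600) -> int:
    """Clamp the TTL returned by the DNS server into [min_ttl, max_ttl]."""
    return max(min_ttl, min(record_ttl, max_ttl))

# Server returns TTL 1 (as in the akamai.net example):
print(effective_ttl(1, min_ttl=1))   # 1 second (effectively not cached)
print(effective_ttl(1, min_ttl=5))   # 5 seconds, despite the short server TTL

# Server returns TTL 4500 (as in the mcafee.com example), default max 3600:
print(effective_ttl(4500))           # capped at 3600 seconds
```

So with a raised minimum TTL, a site that rotates IPs every 30-60 seconds could be served stale addresses from cache.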
I hope this helps; let me know if you need further clarification. But essentially yes, the DNS cache could be a factor if some of your DNS settings were changed.
Narrowed it down to this.
Here is the Wells Fargo site:
When the user's IP is given full access to the internet, it works 100% of the time.
When the URLs are created as exceptions (*.wellsfargo.com*), it only works some of the time.
There are references to their content server at akamai.net. Here's an example:
Well, assuming your entry *.wellsfargo.com* would match implies that you are using the "URL" property in your rule. If you add *.wellsfargo.com* to a whitelist based on the "URL.Host" property, then no, it would not match. Check which property the rule you are whitelisting on is based on.
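Separately, note that the exception pattern can never cover the page's embedded images, since those are served from akamai.net rather than wellsfargo.com. A rough glob check (Python's fnmatch is only an approximation of the Web Gateway's wildcard semantics, and the URL paths below are illustrative):

```python
from fnmatch import fnmatch

pattern = "*.wellsfargo.com*"

# The bank's own pages match the exception pattern...
print(fnmatch("https://wellsoffice.wellsfargo.com/portal", pattern))  # True

# ...but the images the page pulls from akamai.net do not, so they can
# still be filtered even with the wellsfargo.com exception in place.
print(fnmatch("https://a248.e.akamai.net/some/image.gif", pattern))   # False
```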
You will see a lot of issues like this with HTTPS sites.
What is usually happening is that the requests come through as IP addresses rather than URLs.
Adding IPs to the rule sets always solves the problem with HTTPS sites, but for sites that have many IP addresses this causes a big issue and is much harder to correct.
That does cause a problem. I've requested a list or range of IPs from Wells Fargo and I am waiting for a response. Is there a reason why HTTPS sites come through as IPs instead of URLs?
If you are using the Web Gateway transparently (e.g. via WCCP) and NOT using SSL scanning, then the Web Gateway cannot see the request the client is making to the server (it is inside the SSL tunnel), so the Web Gateway will only see the destination IP to start with.
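For contrast: with an explicit proxy, the browser sends a plaintext CONNECT request before the tunnel is established, so the proxy does learn the hostname. A minimal sketch of parsing that request line (the hostname is illustrative):

```python
def parse_connect(request_line: str) -> tuple:
    """Extract (host, port) from an HTTP CONNECT request line."""
    method, target, _version = request_line.split()
    if method != "CONNECT":
        raise ValueError("not a CONNECT request")
    host, _, port = target.rpartition(":")
    return host, int(port)

# Explicit proxy: the hostname is visible before any TLS bytes flow.
print(parse_connect("CONNECT wellsoffice.wellsfargo.com:443 HTTP/1.1"))
# In a transparent (e.g. WCCP) deployment there is no CONNECT; the proxy
# only sees the destination IP of the intercepted TCP connection.
```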
There are options in the Web Gateway to allow whitelisting/blacklisting based on the host/FQDN in a transparent setup. Some of these whitelisting examples can be found within the default SSL Scanner ruleset.