There is a GTI problem today. Try this:
Policy – Settings – Engines – Anti-Malware [your AV setting name]
Expand the “Advanced Settings” and uncheck the option “Enable GTI file reputation queries”
Disable it in the URL Filter as well:
Policy – Settings – Engines – URL Filter [your URL Filter setting] – Rating Settings
First of all, uncheck the option “Enable the Dynamic Content Classifier if GTI web categorization yields no result”
Uncheck “Use online GTI web reputation and categorization services if local rating yields no result”
Speed tests are not helpful. They will not give you any indication of how fast MWG is capable of processing a request. Speed tests measure raw throughput, but MWG has to do some things to the data (e.g. keep it away from the client until scanning has finished) which the speed tests do not know about, so the result may vary from very bad performance to unbelievable performance... they don't give any usable indication of what is wrong.
Instead, use a tool on the browser side to measure how long a common page takes until it is displayed completely. The page should not be a news site with 300 images, and it also should not be a site with only one line of text. It may also make sense to use multiple sites for testing :-)
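If you want to script such a measurement instead of using a browser tool, here is a minimal sketch in Python. The proxy address 'http://mwg.example:9090' is a placeholder of my choosing, not from this thread; the demo at the bottom spins up a throwaway local server so the sketch runs anywhere, but in practice you would point it at a real page, once directly and once through MWG.

```python
import http.server
import threading
import time
import urllib.request

def measure_load(url, proxy=None):
    """Return (status, seconds) for one complete fetch of `url`.
    Pass proxy='http://mwg.example:9090' (placeholder) to go through MWG."""
    handlers = []
    if proxy:
        handlers.append(urllib.request.ProxyHandler({'http': proxy, 'https': proxy}))
    opener = urllib.request.build_opener(*handlers)
    start = time.perf_counter()
    with opener.open(url) as resp:
        resp.read()            # read the whole body, like a browser would
        status = resp.status
    return status, time.perf_counter() - start

# Demo against a throwaway local server so the sketch is self-contained.
server = http.server.HTTPServer(('127.0.0.1', 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
status, secs = measure_load(f'http://127.0.0.1:{server.server_port}/')
print(status, round(secs, 3))
server.shutdown()
```

Run the same URL a few times in a row and through different paths (direct, proxied, whitelisted) so a one-off DNS delay doesn't skew the comparison.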
There are usually two kinds of performance issues: those introduced by the rules (i.e. by MWG processing the request) and those introduced by the proxy itself (which could also include network or DNS trouble).
You can check this easily:
1.) Browse to the web site with MWG enabled
2.) Browse to the web site with MWG enabled but a white list entry at the top of the rule set that exempts the page from any scanning (make a white list entry for the client IP or username, not for the destination)
If the page is slow with MWG enabled but fast with the whitelist entry in place, it is a problem with the rules. Use rule engine tracing to find out where the delays occur (this could be authentication, for example, or GTI lookups).
If the page is slow in both cases, the problem is NOT introduced by the rules but by the proxy. Check DNS settings and network performance. Take a packet capture and have a look at what takes long, or involve support.
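The two-step check above boils down to a simple decision. As a sketch of that reasoning only (this is my own illustration, not an MWG feature):

```python
def diagnose(slow_with_policy, slow_when_whitelisted):
    """Classify a slowdown per the whitelist test described above:
    compare browsing with the full policy against browsing with a
    client-IP whitelist entry at the top of the rule set."""
    if slow_with_policy and not slow_when_whitelisted:
        return "rules"   # dig deeper with rule engine tracing (auth, GTI lookups, ...)
    if slow_with_policy and slow_when_whitelisted:
        return "proxy"   # check DNS, network performance, take a packet capture
    return "no issue"

print(diagnose(True, False))   # -> rules
print(diagnose(True, True))    # -> proxy
```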
That should resolve the issue.
Hi troja, I just did, but we're having these issues all the time. Do these problems occur often? If I leave it disabled, what consequences does that have? And how does the Web Gateway know about URL categories then – I guess through the local GTI database? These are our settings; should I also uncheck "search for and rate embedded URLs", since this could have an impact on performance?
This was a problem yesterday, and it is also the first trouble of this kind that I know of since MWG has been available.
If there are performance issues, I would suggest the following:
- Take a TCP dump directly on MWG to check whether there is any trouble on the wire.
- Add a rule set at the top of your policy where a source IP is whitelisted, then reproduce the problem. If the problem is gone, there is something wrong in the rule set.
- To check your policy you have several options:
-- Upload a backup of MWG directly to contentsecurity.mcafee.com. There is an option to check it (Online Tools – Consistency Check).
-- Use Rule Tracing Central to check directly on MWG.
-- Use the stopwatch criteria to measure specific parts of your rule set.
We built a rule set that writes a debug log file for one client. At the bottom you can also see the performance information.
Date: [02/Jun/2015:14:27:08 +0200]
Authenticated User: \
Client IP: x.x.x.x
User Agent: Google Update/126.96.36.199;winhttp;cup-ecdsa
HTTP Status Code: 200
URL Request Header first line: POST https://tools.google.com/service/update2?cup2key=5:1410025602&cup2hreq=064cb7fc2f0fb088df0dd3f940816924c6e8187a2e551de998227ffdd7e78b8e HTTP/1.1
URL Host: tools.google.com
URL Categories: Internet Services
URL Reputation: Minimal Risk
MediaType from HTTP Header: text/xml; charset=UTF-8
Media Type (by Signature - High Prop): text/xml
Media Type (by Signature - Low Prop):
Media Type from File Extension:
Body Filename: update2
Supported by Opener: true
Body is Encrypted: false
Body is Multipart Object: false
Body is Corrupted Object: false
Application Name: -
Security Engine Information:
Body changed by any engine: false
Current/Last Rule: SppCloud (Webhybrid) (with Template: -)
Fired Rules: Removed
Rule Set Processing Time from StopWatch: 0ms / 0micro sec.
Time Consumed by Rule Engine: 16 ms
Proxy(first) - Server(first): 31 ms
Proxy(last) - Server(first): 30 ms
Proxy(last) - Server(last): 33 ms
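If you collect such a log for many requests, the timing lines at the bottom can be pulled out mechanically for comparison. A small sketch (field labels copied from the excerpt above; reading the Proxy/Server deltas as proxy-to-server transfer times is my assumption):

```python
import re

# The performance lines from the debug log excerpt above.
log_excerpt = """\
Rule Set Processing Time from StopWatch: 0ms / 0micro sec.
Time Consumed by Rule Engine: 16 ms
Proxy(first) - Server(first): 31 ms
Proxy(last) - Server(first): 30 ms
Proxy(last) - Server(last): 33 ms
"""

# Pull every "<label>: <n> ms" pair into a dict.
timings = {m.group(1).strip(): int(m.group(2))
           for m in re.finditer(r'^(.*?):\s*(\d+)\s*ms', log_excerpt, re.M)}

engine = timings['Time Consumed by Rule Engine']
server = timings['Proxy(last) - Server(last)']
print(f"rule engine: {engine} ms, proxy-to-server: {server} ms")
```

A request where the rule engine time dwarfs the proxy-to-server time points at the rules (authentication, GTI lookups); the reverse points at network or server delays.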
Whitelisting speeds everything up because the NTLM authentication rule set is being skipped (I did the test with Rule Tracing Central). I see that authentication is required for every embedded URL – would it make sense to skip authentication for embedded URLs?
I guess Web Gateway is able to filter those, since you can choose whether embedded URLs should be checked or not:
You should only authenticate in the Request cycle, not in the Response and Embedded Object cycles. :-)
I've reconfigured our proxy to use Kerberos instead of NTLM for authentication. That helped reduce the number of authentication requests, and so it sped everything up a little bit...