This article gives a good explanation of what data trickling is:
It also talks about the other "progress indication" method, progress pages.
I don't have an example of a speedtest tool that would show better progress.
I think one important question is the kind of "slow performance" that is reported.
- Does it take a lot of time until a web site shows up?
- Does downloading of a big file take a lot of time?
- Is it only the speedtest.net result that shows slow throughput?
Starting with item 3: While I have to admit that I don't know exactly how those speedtest websites work, I can tell from experience that they are not accurate. They are generally designed for home use, where your network consists of your wifi router and a handful of computers. Usually the sites state that you need to rule out everything that might interfere with the test, such as other computers, VOIP phones and even wifi. Under such conditions the speedtest might give you a good result, but even then it is not guaranteed that the results are correct.
One example: At home I have a downstream of 100 MBit. Checking various speedtest sites with a laptop plugged directly into the wall, the results varied from 20 MBit to 110 MBit, depending on which speedtest site I asked. So what I did was look for a couple of really fast servers; I preferred those offering ISO images of various Linux distributions. Every download I made came in with very good performance, around 14 MByte/s. That was when I stopped worrying about speedtest websites.
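A quick sanity check, assuming decimal units, is to convert the observed download rate into line speed (1 MByte/s = 8 Mbit/s):

```shell
#!/bin/sh
# Convert a download rate in MByte/s to line speed in Mbit/s
# (decimal units; 1 MByte/s = 8 Mbit/s).
mbyte_to_mbit() {
    # $1 = rate in MByte/s (integer)
    echo $(( $1 * 8 ))
}

# 14 MByte/s from a fast Linux mirror corresponds to:
mbyte_to_mbit 14   # prints 112 (Mbit/s) - the 100 MBit line is fully used
```

So the real downloads already told me the line delivers its rated speed, whatever the speedtest sites claimed.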
Now at a company you are in a different situation. In a company the main goal should not be amazing performance for a single download; we want good performance for everyone. The company network is more complex than your home network. The devices you find there are optimized for serving thousands of users, and every router, switch, firewall, IPS, load balancer etc. will add some delay. Depending on which layer they operate at they add more or less delay, but usually - while browsing - you won't notice a difference.
What I want to state is: tools and websites that measure your internet throughput do not know anything about the complexity of your network. However they calculate the speed displayed after the test, it will not account for all the operations that were performed on your connection, so in the end I don't believe the results help in any way. If they show the expected value - fine. If they don't - it doesn't mean anything.
So, item 2: Since we have now excluded the speedtest as a valid performance indicator, go ahead and fetch a couple of large files. To measure "throughput" you may want to fetch files which MWG does not filter in depth, as otherwise filtering will take much longer than fetching the file. Put some URLs into the global allow list, skipping all the filters, and see what runs through. The performance might be worse than "without MWG", since MWG still influences the packet flow and is optimized for tons of users and compatibility rather than single connections, but the overall performance should be acceptable.
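To turn such a test into numbers, you can time the download yourself and compute the effective throughput from bytes and elapsed seconds. A minimal sketch - the proxy host, port and URL are placeholders, not real values:

```shell
#!/bin/sh
# Effective throughput in Mbit/s from bytes transferred and elapsed seconds
# (decimal units, integer arithmetic - good enough for a rough check).
throughput_mbit() {
    # $1 = bytes, $2 = seconds
    echo $(( $1 * 8 / $2 / 1000000 ))
}

# Example workflow for an allow-listed large file through the proxy
# (adjust proxy and URL to your environment):
# start=$(date +%s)
# wget -q -O /dev/null -e use_proxy=yes -e http_proxy=mwg.example.com:9090 \
#      http://cdn.example.com/bigfile.iso
# end=$(date +%s)
# throughput_mbit <bytes_downloaded> $(( end - start ))

throughput_mbit 99983374 20   # prints 39 (Mbit/s) for ~100 MB in 20 s
```

Measuring this way keeps you independent of whatever averaging the download tool applies to its speed display.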
On MWG you can easily do a few tests for bigger files. SSH into the appliance and try how fast MWG is able to fetch the file like this:
wget -O /dev/null --no-check-certificate https://tau.mcafee.com/updates/datfile.test
It will give you an output like this: 99,983,374 5.00M/s in 20s
This means it took 20s to fetch the 100 MB file, most likely because my internet connection isn't any better at the moment.
Now I use MWG as a proxy:
First I see data trickling working (slow speed):
3.66K/s eta 6h 28m
But once MWG has finished downloading the file the speed goes up, and in the end I get a result like this: 99,983,374 23.5M/s in 28s
In the end, with filtering, the download took a few seconds longer. The speed readout may be misleading here: at the beginning you see slow speeds as MWG performs data trickling, but once the file has been scanned completely the speed may increase up to the speed of the LAN link. What matters is how long the download actually took, and you can see that MWG took a while longer to apply filters etc.
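Since the averaged speed the tool prints is skewed by trickling, elapsed time is the number to compare. A small sketch for putting the direct and proxied timings side by side:

```shell
#!/bin/sh
# Percentage overhead of a proxied download compared to a direct one,
# based on wall-clock seconds (speed readouts are misleading with
# data trickling - elapsed time is what counts).
overhead_pct() {
    # $1 = direct seconds, $2 = proxied seconds
    echo $(( ($2 - $1) * 100 / $1 ))
}

overhead_pct 20 28   # the example above: prints 40 (percent overhead)
```

A one-off 40% on a single large, fully scanned file is not alarming; what you want to watch for is this overhead growing much larger, or appearing on traffic that is supposed to bypass scanning.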
From the list above, item 1 is probably the one you are most likely to encounter. In such cases it is not the throughput that causes problems; rather, MWG sees delays in fetching the objects from the internet. If you have 300 ms of delay per object, a website with 100 objects (images, CSS, JS, etc.) will already take noticeably longer, and users will complain.
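Browsers fetch several objects in parallel, so the delays don't add up strictly sequentially, but they still hurt. A back-of-the-envelope sketch (ignoring bandwidth and keep-alive effects; the connection count is an assumption for illustration):

```shell
#!/bin/sh
# Rough lower bound for page load time: objects fetched over a limited
# number of parallel connections, each costing a fixed per-object delay.
# Only illustrates how per-object latency adds up; real browsers
# pipeline, reuse connections and cache.
page_delay_ms() {
    # $1 = number of objects, $2 = delay per object in ms,
    # $3 = parallel connections (rounds up the number of batches)
    echo $(( ($1 + $3 - 1) / $3 * $2 ))
}

page_delay_ms 100 300 1   # sequential: prints 30000 (ms)
page_delay_ms 100 300 6   # 6 parallel connections: still prints 5100 (ms)
```

Even with six parallel connections, 300 ms per object turns a snappy page into a five-second wait - which is exactly the kind of complaint users report.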
For comparison it might be useful to get a browser and an appropriate plugin that helps you measure the time it takes to load and render a page. I use Firefox + Firebug, for example. First I go to www.mcafee.com without a proxy configured, so I can see exactly how long it took to show the web site.
Then I enable the proxy in my browser configuration, wipe the cache and try it again.
I could see that the website was even a little quicker through MWG, which may be because it was in the cache, or just coincidence. For measuring performance you may want to try multiple pages, and try them multiple times at different times of day. In any case, if MWG were causing slow performance the page would have taken, say, 30 seconds, so you would clearly see the impact of MWG filtering your requests.
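If you prefer the command line over a browser plugin, curl can report the same kind of timings via its `-w` write-out variables. A sketch - the proxy address is a placeholder:

```shell
#!/bin/sh
# Print DNS, connect, and total time (in seconds) for a URL.
# Run once directly and once with -x pointing at MWG, then compare.
timings() {
    curl -o /dev/null -s \
         -w 'dns=%{time_namelookup} connect=%{time_connect} total=%{time_total}\n' \
         "$@"
}

# Direct, then through the proxy (placeholder host/port):
# timings http://www.mcafee.com/
# timings -x mwg.example.com:9090 http://www.mcafee.com/
```

This only times a single object, not the full page render, but it cleanly separates DNS, connect and transfer time, which is useful when hunting for where the delay comes from.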
If you find websites run much slower through MWG, the first thing to do should be putting your client IP into an allow list which exempts you from filtering. That way you can easily find out whether a rule in the policy is causing the delay, or whether the issue is more "proxy" or "network" related. At that point advanced tools such as rule engine traces or packet captures come into play - this is where you may want to get in touch with support.
I hope this gives you some idea how to proceed.