Hi,
this might become difficult; I don't think there is any relation between the control connection on port 21 and the high-port data connection.
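For background: in FTP passive mode the server announces the data-connection port inside the control channel (the 227 PASV reply), so once that channel is TLS-encrypted a proxy never sees the reply and cannot learn which high port the data connection will use. A minimal Python sketch of how a client derives the data port from a plain-text PASV reply (the address and port values are made-up examples):

import re

# In FTP passive mode the server announces the data endpoint on the
# control channel, e.g.: 227 Entering Passive Mode (192,168,1,2,195,80)
# The last two numbers encode the port as high_byte*256 + low_byte.
def parse_pasv_reply(reply: str):
    m = re.search(r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", reply)
    if m is None:
        raise ValueError("not a 227 PASV reply")
    h1, h2, h3, h4, p_hi, p_lo = map(int, m.groups())
    return f"{h1}.{h2}.{h3}.{h4}", p_hi * 256 + p_lo

# Example (made-up values): yields ('192.168.1.2', 50000)
print(parse_pasv_reply("227 Entering Passive Mode (192,168,1,2,195,80)"))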
Do you have a way to create a rule engine trace (or something similar) of the data connection, so I can take a look at whether there is a way to build a generic whitelist approach?
Andre
Hello Andre,
Thanks for replying. Attached are the rule traces from one attempt, as requested. Currently we have the destinations in the SSL whitelist, but we are still looking for a general rule to avoid administrative work for every new connection; this was possible with the Symantec SGs we used previously. Thanks,
Hello,
thank you for the rule trace. As mentioned, it is hard to build a useful rule here. My idea would be a rule like:
Request.Header.Get("User-Agent") matches FileZilla/*
AND
URL.Port > 1024
to bypass SSL inspection for these requests. Combined with some URL filtering rules that make sure category and reputation are good, you should still have some protection in place.
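A rough illustration of that bypass logic in Python (the user-agent pattern and port threshold are just the values from the rule above; MWG evaluates this in its own rule engine, not in Python):

from fnmatch import fnmatch

# Sketch of the bypass criteria: skip SSL inspection when the client
# identifies itself as FileZilla AND the request targets a high
# (ephemeral) port, as FTPS data connections do.
def should_bypass_ssl_inspection(user_agent: str, port: int) -> bool:
    return fnmatch(user_agent, "FileZilla/*") and port > 1024

# A FileZilla data connection on a high port is bypassed...
print(should_bypass_ssl_inspection("FileZilla/3.66.4", 50123))  # True
# ...while ordinary HTTPS traffic on port 443 is still inspected.
print(should_bypass_ssl_inspection("Mozilla/5.0", 443))         # False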
It might be better to check whether FileZilla can be re-configured to use MWG's FTP proxy, which would allow MWG to inspect the traffic.
Andre
Hi Andre,
Thank you for the analysis and input. If there is no single parameter that covers it, I guess we will just have to play with it a bit to cover as much as possible without affecting other services. Thank you,