One big problem I've had in the past is that when whitelisting specific URLs, and sometimes full websites (e.g. www.nytimes.com), there is content that is pulled from other URLs that are not in my whitelist, so it gets blocked. This can often be images, stylesheets etc., which sometimes means users can't properly view the site or URL. Is there a nifty way in V7 to allow URLs on other sites that are referenced by the initial URL? It has to be dynamic too, because I often have to add individual URLs to this whitelist and I don't want to have to research each one every time to determine what other URLs are needed and manually add them.
You may utilize the Referer header for this. Imagine you have whitelisted www.nytimes.com (the example above). If you now access the site, some parts will work, but CSS and images are loaded from other hosts and URLs. Those objects will not be whitelisted, because they are not explicitly listed. All those requests which are caused by the original request contain a "Referer" header pointing back at the whitelisted page.
What you can do is design a Rule Set which does not only say
If URL matches in List "Whitelisted URLs" -> then stop filtering
but extend the criteria and say
If URL matches in List "Whitelisted URLs" OR Header.Request.Get(Referer) matches in List "Whitelisted URLs" -> then stop filtering
This would magically whitelist those embedded objects.
BUT: the Referer header can be injected by experienced users, and there is no way to protect against this kind of abuse, so it may be possible for insiders to effectively whitelist websites which are not originally on your whitelist.
If I were to do this, I would isolate the host name in the Referer, use host names in the whitelist, and then use that for the match.
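A small sketch of what "isolate the host name" means in practice, using Python's standard URL parser (the function name is just illustrative):

```python
from urllib.parse import urlparse

def referer_host(referer: str) -> str:
    """Isolate just the host name from a full Referer value."""
    return urlparse(referer).hostname or ""

# A deep link with path and query still reduces to the whitelistable host:
print(referer_host("http://www.nytimes.com/world/index.html?ref=home"))
# -> www.nytimes.com
```

Matching on the host rather than the full Referer string keeps the whitelist small and avoids having to enumerate every page path.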
Yes, this is a huge security hole, especially for search engine results. I wouldn't necessarily recommend this method.
Thanks, that might work. The idea is that the users only have a very small whitelist to begin with, so could I create a rule with the condition that the URL be listed in the whitelist? That would only match if the user requests the specific URL, and then could I use your suggestion to find the Referer URLs and dynamically allow them?
I understand your point about search engine results, but luckily they don't have access to any search engines, so that shouldn't be a huge issue, and I'm 99.9999999% sure these users are not technically savvy enough to inject a header and get around this.
Also, I tried to get your suggested config running to see if it would work for me but couldn't get it to work. Could you do up an XML file I could import to see how it works?
Brilliant, it almost worked for me. What I did first of all was to set some group membership criteria (which will never change) to identify the users who are to be restricted. The rule set then applied the whitelist and loaded the content via the Referer, but I found that rules following this were also being applied and allowed more access. I think this is because the users' group memberships also triggered the lower rules, but I didn't want them to have that extra access.
So what I did in your rule set was add a final rule that blocks all URLs that are not in the whitelist, and that works great. Can you suggest a different way to do this, or is the method I used OK?
It is possible that the Referer will be present even when you access a different URL, so in this case the "final rule to block all URLs" should be a suitable approach. I think that should do the trick.
Yep, it seems to be working the way I would like. There are some links on various sites which go out to other sites I don't want them accessing but which carry that Referer header, so the final block-all rule at the bottom cleans that up nicely.
Thanks for the help.