Snort, Help Blocking Anonymous Proxy Usage
-
Having had so many issues with students getting around the Squid + squidGuard configuration I have in one of the schools I manage, I started looking for other ways to prevent them from bypassing Squid via the growing number of proxy websites. In my search I found an article describing ways to block such websites using Snort (link at the bottom of my post), so my adventure with Snort began. I proceeded to download, install, and configure it, as well as download the latest rules from snort.org. My main issue is that I cannot seem to find rules specific to blocking these types of websites (PHProxy, Glype, CGIProxy, and so on), so I have turned my attention to adding the rules myself, but I am unable to find the option to add my own rules. The upload custom rules option just refreshes the GUI.
Any help would be appreciated,
Thanks!
Article: http://www.sans.org/reading_room/whitepapers/detection/detecting-preventing-anonymous-proxy-usage_32943
-
Why don't you try using squidGuard with Squid to do your filtering against those proxy sites?
-
Why don't you try using squidGuard with Squid to do your filtering against those proxy sites?
That would require me to watch the users like a hawk. Since I am only there once a week, it's not exactly easy to go through the history on each computer to see which new site they are using. There are hundreds, if not thousands, of new proxy sites that come out every day; it would be next to impossible to block them all without a high level of filtering. The scripts these websites use get around the proxy's keyword filtering because the page source does not contain any real words. It's all random letters and numbers. Here is an example: http://www.frozenshadow.com/
-
It's true that there are hundreds of thousands of proxy sites, although the vast majority of them use off-the-shelf proxy software (CGIProxy, PHProxy, etc.), which has very easily identified signatures in the URL stream. With a little effort and some regex-fu, you can probably come up with a decent list of expressions for a squidGuard configuration to block them. I suspect you might find someone who has done some serious work with Snort identifying and blocking these sites as well, but I haven't seen a set of signatures for this purpose yet.
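For example, something along these lines could be a starting point. This is just a Python sketch for testing the idea, not a finished squidGuard configuration; the patterns are assumptions based on the default URL formats CGIProxy, Glype, and PHProxy are known to produce, so check them against real traffic before relying on them.

import re

# Rough URL signatures for common off-the-shelf proxy scripts.
# These are assumptions based on each script's defaults; verify them
# against captured traffic before using them for blocking.
PROXY_SIGNATURES = [
    re.compile(r"/nph-proxy\.(cgi|pl)", re.I),                    # CGIProxy's default script name
    re.compile(r"/browse\.php\?u=[A-Za-z0-9%]+&b=", re.I),        # Glype-style browse.php?u=...&b=...
    re.compile(r"\.php\?[A-Za-z0-9]*q=[A-Za-z0-9%]{20,}", re.I),  # PHProxy-style encoded ?q= query
]

def looks_like_proxy(url):
    # True if the URL matches any of the signature patterns above.
    return any(p.search(url) for p in PROXY_SIGNATURES)

# Quick self-test with made-up example URLs.
if __name__ == "__main__":
    samples = [
        "http://example.com/nph-proxy.cgi/010110A/http/www.facebook.com",
        "http://example.com/browse.php?u=aHR0cDovL3d3dy5mYWNlYm9vay5jb20%3D&b=5",
        "http://example.com/index.php?q=aHR0cDovL3d3dy5mYWNlYm9vay5jb20lMkY0a2Jw",
        "http://example.com/news/article.php?id=42",
    ]
    for url in samples:
        print(looks_like_proxy(url), url)

If the expressions hold up, roughly the same regexes should translate into a squidGuard expressionlist for a blocked destination, though the exact regex flavour squidGuard accepts is worth double-checking.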
-
I'm not familiar with regex-fu. I did some research with the site http://highdirt.com, and there are approximately eight different combinations of the word "bypassing" that it uses. I have looked at the source code every time I refresh, and it changes. Here are the ones I managed to nab before the source began to repeat itself.
' by' 'pa' 'ss' 'ing'
' by' 'pa' 'ssi' 'ng '
' by' 'pas' 'sin' 'g '
'by' 'pa' 'ss' 'in' 'g I'
'by' 'pas' 'si' 'ng '
'byp' 'as' 'sin' 'g I'
'byp' 'ass' 'in' 'g '
'byp' 'ass' 'in' 'g I'
I tried adding these to the expressions of a destination that has been set to deny, then saved and applied the settings, but it did not seem to block the sites, even though these expressions show up multiple times in the loaded page's source code. Here is what I entered:
' by'|'pa'|'ss'|'ing'
' by'|'pa'|'ssi'|'ng '
' by'|'pas'|'sin'|'g '
'by'|'pa'|'ss'|'in'|'g I'
'by'|'pas'|'si'|'ng '
'byp'|'as'|'sin'|'g I'
'byp'|'ass'|'in'|'g '
'byp'|'ass'|'in'|'g I'
Now correct me if I'm wrong, but this is because the proxy server is looking at URLs and not within the source code of the page. If that's the case, then should I be looking for similarities in the URL string of the page when I use it to go to, say, facebook.com?
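As a side note, here is a quick Python check (using a few of the fragment combinations captured above) that the pieces really do reassemble into "bypassing" once whatever markup sits between them is stripped. That seems to be how the script hides the word from any filter that scans the raw source for whole keywords.

# A few of the fragment combinations captured from the page source above.
combinations = [
    [' by', 'pa', 'ss', 'ing'],
    [' by', 'pa', 'ssi', 'ng '],
    ['byp', 'ass', 'in', 'g '],
]

# Joining each combination and trimming the padding gives the same word back,
# even though "bypassing" never appears as a single string in the source.
for combo in combinations:
    print(''.join(combo).strip())   # prints "bypassing" each time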
-
As you mentioned, a pattern is definitely emerging with these websites. Just using the PHProxy option on the 20 anonymous proxy sites I was using to gather information, I have come up with the following results:
http://domain/(encryptedtext).php?(encryptedtext)A=(encryptedtext)
http://domain/(encryptedtext).php?(encryptedtext)A=(encryptedtext)g=(encryptedtext)
http://domain/(encryptedtext).php?(encryptedtext)g=(encryptedtext)
http://domain/(encryptedtext).php?(encryptedtext)Q=(encryptedtext)
http://domain/(encryptedtext).php?(encryptedtext)Q=(encryptedtext)g=(encryptedtext)
http://domain/(encryptedtext).php?(encryptedtext)Q=(encryptedtext)Q=(encryptedtext)
http://domain/(encryptedtext).php?(encryptedtext)w=(encryptedtext)g=(encryptedtext)
http://domain/images/(encryptedtext).php?u=(encryptedtext)b=5&f=norefer
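Something like the following Python sketch might be a starting point for turning the first few of those shapes into a single expression. The character classes and minimum lengths here are guesses rather than anything tested against the real captures, and the images/...?u=... shape at the end would need its own pattern.

import re

# Rough pattern for the PHProxy-style URLs listed above: some encoded
# text, ".php?", a run of encoded text, then a short flag letter
# (A/g/Q/w) followed by "=" and more encoded text.  The classes and
# length thresholds are assumptions; tune them against real captures,
# since broad patterns like this can also hit legitimate .php query strings.
phproxy_like = re.compile(
    r"/[A-Za-z0-9]{4,}\.php\?"        # (encryptedtext).php?
    r"[A-Za-z0-9%]{8,}"               # (encryptedtext)
    r"[AgQw]=[A-Za-z0-9%]+",          # A=/g=/Q=/w= followed by more encoded text
)

# Made-up examples standing in for the real captured URLs.
tests = [
    "http://domain/a1b2c3d4.php?9f8e7d6c5b4aA=0a1b2c3d4e5f",    # should match
    "http://domain/a1b2c3d4.php?9f8e7d6c5b4aQ=0a1b2c3dg=4e5f",  # should match
    "http://domain/article.php?id=42",                          # should not match
]
for url in tests:
    print(bool(phproxy_like.search(url)), url)
-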
Have you also tried using OpenDNS for your forwarders? Then just set a firewall rule so that only OpenDNS can be used.
I find OpenDNS quite effective, and it's another layer for them to try to overcome.