E2guardian package for pfSense - $??
-
Does E2guardian have "Use SafeSearch engine"?
I haven't found that as an option anywhere. Google has changed the way it implements forced SafeSearch, though; it's DNS-based now.
https://support.google.com/websearch/answer/186669?hl=en
I'm sure that the filter/proxy could do a redirect of some sort also, but I just implemented it through DNS as Google suggests.
-
It would be nice if it did it automatically for all search engines with this feature.
I am having trouble finding a step-by-step guide for setting this up with pfSense; do you have a link?
-
It would be nice if it did it automatically for all search engines with this feature.
I am having trouble finding a step-by-step guide for setting this up with pfSense; do you have a link?
Most of the step-by-step guides talk about using nossl.google.com, which doesn't work anymore.
Basically you just have to be running DNS Resolver with all your computers pointing to it as their primary DNS (easily done through DHCP), then at the bottom of DNS Resolver add a Host Override for www.google.com pointing to 216.239.38.120, which is what forcesafesearch.google.com resolves to. (Hint: you can also do this for www.youtube.com.)
It's not the best solution, since it would be better to use the FQDN forcesafesearch.google.com, but you can't do CNAME records in DNS Resolver, so using the IP address works.
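If you'd rather paste config than click through the GUI, the Host Override boils down to Unbound directives roughly like these (my approximation of what pfSense generates; the YouTube entry just reuses the same forcesafesearch IP mentioned above):

server:
  # "redirect" makes the override cover anything under the name as well
  local-zone: "www.google.com" redirect
  local-data: "www.google.com A 216.239.38.120"
  local-zone: "www.youtube.com" redirect
  local-data: "www.youtube.com A 216.239.38.120"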
Also, if this is in a Microsoft domain environment, you want all Windows boxes pointing to your DNS server running on the domain controller. To work around that, you can add a forwarder on the MS DNS server pointing to your pfSense box running DNS Resolver. (You can't add a CNAME record for www.google.com to MS DNS, because Server 2008 R2 and above won't allow it.)
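For example, on the domain controller (the forwarder IP is a placeholder for your pfSense LAN address):

dnscmd /ResetForwarders 192.168.1.1

You could also add a conditional forwarder for just the Google zones if you don't want every unresolved query going through pfSense.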
Also, to make sure things are locked down, block all DNS requests from going out through your firewall except the firewall's own (the Resolver itself still needs to query out). That way people can't point at a DNS server outside your control to look up www.google.com.
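In raw pf terms the lockdown is roughly the two rules below (a sketch only; on pfSense you'd build these in the GUI under Firewall > Rules on the LAN tab, and the interface name em1 is an assumption):

# allow clients to query the Resolver on the firewall itself
pass in quick on em1 proto { tcp, udp } from em1:network to (em1) port 53
# block every other destination on port 53
block in quick on em1 proto { tcp, udp } from em1:network to any port 53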
https://doc.pfsense.org/index.php/Blocking_DNS_queries_to_external_resolvers
Hope that's clear enough...
-
It would be nice if it did it automatically for all search engines with this feature.
I am having trouble finding a step-by-step guide for setting this up with pfSense; do you have a link?
I just found this, which looks promising... I don't know if it would work with HTTPS though, or at least not until you set up HTTPS filtering.
https://groups.google.com/forum/#!topic/e2guardian/1ieUvbsNDkU
In urlregexplist
"(^http://[0-9a-z]+.google.[a-z]+[-/%.0-9a-z]/images?)"->"\1safe=vss&"
"(^http://[0-9a-z]+.google.[a-z]+[-/%.0-9a-z]/search?)"->"\1safe=vss&"
"(^http://[0-9a-z]+.google.[a-z]+[-/%.0-9a-z]*/s?)"->"\1safe=vss&"vss = very safe search
-
I am trying to enable ClamAV in the e2guardian package, but alas, I get this error in the system log:
Aug 17 20:56:44 e2guardian[86361]: Unable to load plugin config /usr/local/etc/e2guardian/contentscanners/clamdscan.conf
I'm guessing this is still a work in progress? I'm glad that squid3 comes with ClamAV, but I like the e2guardian (formerly DansGuardian) access-denied page when it finds a virus on a site, like the EICAR test file.
I'll post back if I find a solution (even a temporary one).
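In case it helps anyone else poking at this: the file it can't load is the content-scanner plugin config, which in the stock DansGuardian/e2guardian samples looks something like the sketch below (the socket path is a guess for the pfSense package, so check where the clamav package actually puts it):

# /usr/local/etc/e2guardian/contentscanners/clamdscan.conf
plugname = 'clamdscan'
# unix domain socket that clamd listens on (path is an assumption)
clamdudsfile = '/var/run/clamav/clamd.sock'

It then gets enabled from e2guardian.conf with a line like:

contentscanner = '/usr/local/etc/e2guardian/contentscanners/clamdscan.conf'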
-
"(^http://[0-9a-z]+.google.[a-z]+[-/%.0-9a-z]/images?)"->"\1safe=vss&"
"(^http://[0-9a-z]+.google.[a-z]+[-/%.0-9a-z]/search?)"->"\1safe=vss&"
"(^http://[0-9a-z]+.google.[a-z]+[-/%.0-9a-z]*/s?)"->"\1safe=vss&"vss = very safe search
That could be a starting point; maybe even add YouTube, Yahoo, Bing and other search engines (this would be a nice option in the E2guardian menu). I will try out the manual way once E2guardian is out in the package.
I got it to work on YouTube (it now shows Restricted Mode on at the bottom).
Spoke too soon, it stopped working.
OK, YouTube is working again now; however, it did not work for Google search (in search settings, "turn on safe mode" remains unticked).
-
Not sure if this is known, but it seems that SSL MITM is not currently working. I had SSL MITM set up on Squid and it was working fine; I added E2Guardian to the mix, turned on SSL MITM, and it stopped working. Maybe it was chaining the two together, I don't know... I did notice that E2Guardian 3.2.0 was just released and the notes say "SSLMITM is now fully functional in explicit proxy mode". I'm not sure if that means the current version is broken or... I'm guessing it's going to be a little while before the E2Guardian update makes its way over to FreeBSD...
Does anyone know the correct config for SSL MITM? I'm using explicit mode, so this isn't a transparent-proxy issue... My question is: should I have problems with E2Guardian doing the SSL man-in-the-middle AND Squid also doing the SSL man-in-the-middle? Should one be on and one be off? Or is there some other setting?
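For reference, my (untested) reading of the 3.2 docs is that the e2guardian side is switched on per filter group, roughly as below; treat every option name and path here as an assumption until the FreeBSD build actually ships with SSL support:

# e2guardianf1.conf
sslmitm = on

# e2guardian.conf - the CA that signs the generated certs (clients must trust it)
caprivatekeypath = '/usr/local/etc/e2guardian/private/ca.key'
cacertificatepath = '/usr/local/etc/e2guardian/private/ca.pem'
generatedcertpath = '/usr/local/etc/e2guardian/private/generatedcerts/'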
-
Not sure if this is known, but it seems that SSL MITM is not currently working.
The current e2guardian version for FreeBSD does not have the SSL MITM code.
-
And that's a good thing. :P
-
And that's a good thing. :P
So are you suggesting that there is a better way? Or that you just don't like filtering HTTPS communication at all? If it's the latter, I totally get that stance and agree, but unfortunately at a school it's just not possible to ignore HTTPS traffic and not filter it... Not to mention the need for caching (low bandwidth)...
Is there a way, for now, to pass SSL on to Squid and have Squid do the SSL MITM so I can at least still cache?
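For what it's worth, the Squid side I had working was the standard 3.4-era ssl-bump setup, roughly this (the CA and helper paths are from my box and may differ on yours):

# squid.conf
http_port 3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/usr/local/etc/squid/ca.pem
sslcrtd_program /usr/local/libexec/squid/ssl_crtd -s /var/squid/ssl_db -M 4MB
ssl_bump server-first all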
-
MITM is evil. Period.
-
caching is fairly useless for the majority of the popular sites; dynamic content is a pain to cache. perhaps after some tweaking you'll manage a 5% hit rate at a typical school.
mitm ssl is essentially destroying ssl altogether. you say it's impossible to ignore https in schools? i personally know of 30+ schools that don't mitm their pupils... they have rights too, and privacy is one of them.
what about the evil_x_url? you do have teachers present, right?
-
Look, I get it... I'm a privacy advocate myself, but not in this case. First off, caching is way higher than 5% for my use case, because 90% of the sites they visit have a ton of static images, and the sites are visited by multiple classes because they are education related. Second, these are kids under the age of 15 using school-provided computers for school-related activities. They get no privacy because the computers are not to be used for private purposes. Teachers are present, but no one teacher can watch all computers all the time. There are reasons for having to man-in-the-middle SSL connections. MITM hacks are not good, I get that, and I don't like it either. But until they come up with a standard that allows for voluntary interception of secured communication (such as an HTTPS proxy where the man in the middle is expected and requested), MITM "hacks" will have to do.
Oh and https://www.fcc.gov/guides/childrens-internet-protection-act
-
have you considered ip blacklists? the paid subscriptions generally filter out most of the crap.
afaik squid3 can be made to work with ssl … once you get the falsified certs deployed to every client, it might work.
then you'll have to go around and install the certs on every cellphone/tablet/laptop of every student ... good luck with that. governments are always wrong.
-
have you considered ip blacklists? the paid subscriptions generally filter out most of the crap.
afaik squid3 can be made to work with ssl … once you get the falsified certs deployed to every client, it might work.
then you'll have to go around and install the certs on every cellphone/tablet/laptop of every student ... good luck with that. governments are always wrong.
Governments are often wrong, I agree :) The network is locked down to only authorized devices, so no problems with the certs either... Google itself is another reason I have to do this; I wish they had a better method... https://support.google.com/a/answer/1668854?hl=en
I have Squid working with SSL, but does anyone know if it will be an issue having the SSL traffic run encrypted through E2Guardian first? I'm not sure Squid will still be able to deal with the SSL traffic correctly if it's second in the chain (first E2Guardian doing a basic URL filter, then Squid for caching)... I thought about pushing SSL traffic directly to Squid through a PAC script, but then I lose URL filtering... I could add SquidGuard for that, but that just sounds like a mess of stuff to go wrong...
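The PAC split I was considering would look roughly like this (the address and ports are placeholders):

function FindProxyForURL(url, host) {
    // HTTPS straight to Squid for ssl-bump and caching;
    // everything else through E2Guardian for URL filtering
    if (url.substring(0, 6) == "https:")
        return "PROXY 192.168.1.1:3128";
    return "PROXY 192.168.1.1:8080";
}

But as I said, anything matching the first branch would bypass the URL filter entirely.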
-
Can't a WPAD be used here? I'm not having an issue filtering HTTPS sites with it (only Google's and other search engines' image results are a pain to filter).
-
Can't a WPAD be used here? I'm not having an issue filtering HTTPS sites with it (only Google's and other search engines' image results are a pain to filter).
I was thinking along those lines, but I would be skipping the URL filter completely for any HTTPS sites, such as anonymous-proxy HTTPS sites, etc... I'll play some more with it; it's possible that I can go through E2Guardian encrypted and then have Squid decrypt for caching... Unless I'm misunderstanding your intended use of WPAD...
Google is taken care of by using the DNS redirect, and I registered our school's IP address with Bing, so both are locked to safe search. All other search engines are just blocked...
-
Such as anonymous-proxy HTTPS sites, etc.
I block anonymous-proxy websites in SquidGuard to solve that.
Google is taken care of by using the DNS redirect
What method did you use to redirect all the Google domains? See https://forum.pfsense.org/index.php?topic=97948.0
and I registered our school's IP address with Bing so both are locked to safe search
Home users are out of luck here :(
search engines are just blocked
How is this achieved? It must be a big list, as there are many search engines (and each search engine could have many domains); what if new ones come up?
-
I block anonymous-proxy websites in SquidGuard to solve that.
That's what I'm using E2Guardian for. But I'm worried I won't be able to both use SSL interception on Squid and keep traffic running through E2Guardian first... I'll let you know how it goes.
What method did you use to redirect all the Google domains? See https://forum.pfsense.org/index.php?topic=97948.0
I used the method I described a couple of pages back in this thread: the DNS override.
How is this achieved? It must be a big list, as there are many search engines (and each search engine could have many domains); what if new ones come up?
Again, since I'm using E2Guardian, I just added search engines to the site block list. I made specific exceptions for Google and Bing since they are locked to safe search.
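For example, the relevant list entries look something like this (the engines named are just examples):

In bannedsitelist:
duckduckgo.com
search.yahoo.com
ask.com

In exceptionsitelist:
google.com
bing.com

Since the e2guardian site lists match subdomains as well, one entry per engine domain covers most of it; new engines do have to be added by hand as they come up.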
-
When/How does this package make it to the 'menu' of available packages?