E2Guardian package for pfSense - $??
-
Not sure if this is known, but it seems that SSL MITM is not currently working. I had SSL MITM set up on Squid and it was working fine; I added E2Guardian to the mix, turned on SSL MITM, and it stopped working. Maybe it was chaining the two together, I don't know… I did notice that E2Guardian 3.2.0 was just released, and the notes say "SSL MITM is now fully functional in explicit proxy mode". I'm not sure if that means the current version is broken or.... I'm guessing it's going to be a little bit of time before the E2Guardian update makes its way over to FreeBSD...
Does anyone know the correct config for SSL MITM? I'm using explicit mode, so this isn't a transparent-proxy issue... My question is: should I have problems with E2Guardian doing the SSL man-in-the-middle AND Squid also doing it? Should one be on and one be off? Or is there some other setting?
-
Not sure if this is known, but it seems that SSL MITM is not currently working.
The current e2guardian version for FreeBSD does not include the SSL MITM code.
-
And that's a good thing. :P
-
And that's a good thing. :P
So are you suggesting there is a better way? Or do you just not like filtering HTTPS communication at all? If it's the latter, I totally get that stance and agree, but unfortunately at a school it's just not possible to ignore HTTPS traffic and not filter it… Not to mention the need for caching (low bandwidth)....
Is there a way, for now, to pass SSL on to Squid and have Squid do the SSL MITM so I can at least still cache?
-
MITM is evil. Period.
-
Caching is fairly useless for the majority of popular sites; dynamic content is a pain to cache. Perhaps after some tweaking you'll manage a 5% hit rate at a typical school.
MITM'ing SSL essentially destroys SSL altogether. You say it's impossible to ignore HTTPS in schools? I personally know of 30+ schools that don't MITM their pupils… They have rights too, and privacy is one of them.
What about the evil_x_url? You do have teachers present, right?
-
Look, I get it… I'm a privacy advocate myself, but not in this case... First off, caching is way higher than 5% for my use case, because 90% of the sites they visit have a ton of static images, and the sites get visited by multiple classes since they are education related. Second, these are kids under the age of 15 using school-provided computers for school-related activities. They get no privacy because the computers are not to be used for private use. Teachers are present, but no one teacher can watch all computers all the time.
There are reasons for having to man-in-the-middle SSL connections. MITM hacks are not good, I get that, and I don't like it either. But until they come up with a standard that allows for voluntary interception of secured communication (such as an HTTPS proxy where the man in the middle is expected and requested), MITM "hacks" will have to do.
Oh and https://www.fcc.gov/guides/childrens-internet-protection-act
-
Have you considered IP blacklists? The paid subscriptions generally filter out most of the crap.
AFAIK squid3 can be made to work with SSL… once you get the falsified certs deployed to every client, it might work.
Then you'll have to go around and install the certs on every cellphone/tablet/laptop of every student... good luck with that. Governments are always wrong.
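For what it's worth, creating the falsified CA is the trivial part; a single openssl command like this does it (the key/cert names and subject are just examples). Getting every client to trust it is the real battle:

  openssl req -x509 -newkey rsa:2048 -sha256 -days 365 -nodes \
    -keyout proxyCA.key -out proxyCA.crt -subj "/CN=Example School Proxy CA"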
-
Have you considered IP blacklists? The paid subscriptions generally filter out most of the crap.
AFAIK squid3 can be made to work with SSL… once you get the falsified certs deployed to every client, it might work.
Then you'll have to go around and install the certs on every cellphone/tablet/laptop of every student... good luck with that. Governments are always wrong.
Governments are often wrong, I agree :) The network is locked down to only authorized devices, so no problems with the certs either… Google itself is another reason I have to do this; wish they had a better method... https://support.google.com/a/answer/1668854?hl=en
I have Squid working with SSL, but does anyone know if it will be an issue having the SSL traffic running encrypted through E2Guardian first? Not sure if Squid will still be able to deal with the SSL traffic correctly if it's second in the chain (first E2Guardian doing a basic URL filter, then Squid for caching)... I thought about pushing SSL traffic directly to Squid through a PAC script, but then I lose URL filtering... I could add SquidGuard for that, but that just sounds like a mess of stuff to go wrong...
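In case it helps anyone, here is roughly the Squid side of my setup (a minimal sketch in Squid 3.4 ssl-bump syntax; the cert path and cache sizes are examples from my box, adjust for yours):

  # squid.conf: explicit proxy with SSL interception
  # CA cert+key that every client has been told to trust (example path)
  http_port 3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/usr/local/etc/squid/proxyCA.pem
  # helper that mints per-site certificates signed by that CA
  sslcrtd_program /usr/local/libexec/squid/ssl_crtd -s /var/squid/lib/ssl_db -M 4MB
  # decrypt everything so it can be cached
  ssl_bump server-first all

The open question is still whether that behaves with E2Guardian sitting in front of it.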
-
Can't a WPAD be used here? I'm not having an issue filtering HTTPS sites with it (only Google and other search engines' image results are a pain to filter).
-
Can't a WPAD be used here? I'm not having an issue filtering HTTPS sites with it (only Google and other search engines' image results are a pain to filter).
I was thinking along those lines, but then I would be skipping the URL filter completely for any HTTPS sites… Such as anonymous proxy HTTPS sites, etc... I'll play some more with it; it's possible that I can go through E2Guardian encrypted and then have Squid decrypt for caching... Unless I'm misunderstanding your intended use of WPAD...
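Something like this is what I had in mind for the PAC route (the address and ports are examples for my network, and it shows exactly why HTTPS would bypass the URL filter):

  // proxy.pac: send HTTPS straight to Squid, everything else through E2Guardian
  function FindProxyForURL(url, host) {
      if (url.substring(0, 6) == "https:") {
          return "PROXY 192.168.1.1:3128"; // Squid: caching + SSL interception
      }
      return "PROXY 192.168.1.1:8080"; // E2Guardian: URL filtering
  }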
Google is taken care of by using the DNS redirect, and I registered our school's IP address with Bing, so both are locked to safe search. All other search engines are just blocked...
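The redirect itself is nothing fancy: host overrides pointing the Google domains at the SafeSearch address from the support.google.com link above. In the pfSense DNS Resolver that boils down to unbound entries like these (which country domains you cover is up to you):

  server:
    # forcesafesearch.google.com, per Google's SafeSearch documentation
    local-data: "www.google.com. A 216.239.38.120"
    local-data: "www.google.ca. A 216.239.38.120"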
-
Such as anonymous proxy HTTPS sites, etc
I block anonymous proxy websites in SquidGuard to solve that.
Google is taken care of by using the DNS redirect
What method did you use to redirect all the Google domains? See https://forum.pfsense.org/index.php?topic=97948.0
and I registered our school's IP address with Bing, so both are locked to safe search
Home users are out of luck here :(
search engines are just blocked
How is this achieved? It must be a big list, as there are many search engines (and each search engine can have many domains). What if new ones come up?
-
I block anonymous proxy websites in SquidGuard to solve that.
That's what I'm using E2Guardian for. But I'm worried I won't be able to use both SSL interception on Squid and keep the traffic running through E2Guardian first… I'll let you know how it goes.
What method did you use to redirect all the Google domains? See https://forum.pfsense.org/index.php?topic=97948.0
I used the method I described a couple of pages back in this thread: a DNS override.
How is this achieved? It must be a big list, as there are many search engines (and each search engine can have many domains). What if new ones come up?
Again, since I'm using E2Guardian, I just added search engines to the site block list. I made specific exceptions for Google and Bing since they are set to safe search.
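In list terms it's only a couple of entries; E2Guardian keeps DansGuardian's list syntax, and the blacklist path here is just an example:

  # bannedsitelist: pull in a maintained search-engine category
  .Include</usr/local/etc/e2guardian/lists/blacklists/searchengines/domains>

  # exceptionsitelist: safe search is already enforced for these two
  google.com
  bing.com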
-
When/How does this package make it to the 'menu' of available packages?
-
Will this still be E2Guardian 2.2, or will it be a later version? They are up to version 3.2.0 now.
-
Will this still be E2Guardian 2.2, or will it be a later version? They are up to version 3.2.0 now.
For now, e2guardian 3.0.4
-
When/How does this package make it to the 'menu' of available packages?
Am I being dense? Do we wait for a new version of pfSense to be released before we see this in the menu of installable apps?
-
How is the performance of E2Guardian vs. DansGuardian in pfSense?
-
How do you want to compare performance between two packages when one no longer works (without tons of manual hacks) and the other doesn't work yet (without tons of manual hacks)? Can't see how that's even a factor here ATM. Not to mention, DG is dead code upstream.
-
Well, I got DansGuardian somewhat working (stopped using it though), and found it very slow and not very effective. I was looking forward to a faster, more effective filter in E2Guardian. I guess I was just looking for some reassurance that that will be the case. Anyway, keep up the good work :)