E2guardian package for pfSense - $??
-
I do appreciate the work you have been doing to clean up the packages! You have been committing cleanups and fixes like crazy, and I really do appreciate it! I will revoke my previous opinion only if the packages continue to be well maintained after this blitz :) . To me, the packages are one of the greatest strengths of pfSense!
Hopefully packages will be much more self-maintaining and up to date with 2.3. The PBI packaging format was one of the worst disasters that could have happened here. Hopefully, someone will also
- provide documentation
- provide some VM image with pre-built package server set up ready for testing
The current state with exactly ZERO documentation has led to all this shit; certainly NOT the way to go on!
P.S. Don't count on me maintaining ~100 packages. I'm about halfway done with the cleanup, considerably tired, disgusted by the shitty code, and annoyed by the absence of any docs. Even the very little we had (a howto on setting up a custom package repo) has vanished.
:( >:( >:(
I definitely don't expect that any one person can keep up with all the packages. I hope they implement the things you're suggesting and provide documentation on packages, so that you won't ever have to clean up a mess like the one you're dealing with right now.
-
Does E2guardian have "Use SafeSearch engine"?
-
It's been a while since I last used pfSense; my old setup died of capacitor failure on the memory. Currently, I have pfSense 2.2.4 i386 installed on another computer that I received from my grandpa a couple of months ago.
It's running on a Compaq Presario 6300us (6000 Series) Desktop with the following specs:
Intel Celeron (Pentium III/Pentium III Xeon) Socket 370, clocked at 1.4GHz (100MHz FSB, multiplier of 14)
512MB PC-133 SD-RAM (Max is 512MB unfortunately)
Intel i810 chipset (the reason the system maxes out at 512MB)
Intel Integrated Graphics (up to 11MB Memory)
Syba SATA PCI Controller Card
Syba USB 2.0 PCI Card (might remove)
Intel Pro/1000GT Desktop Gigabit Ethernet PCI Card (WAN)
Integrated Realtek RTL8139 FastEthernet Adapter (LAN)
1.44MB Floppy Drive (as fellow YouTuber uxwbill would like to say: "All real computers have floppy drives")
Mitsumi 48x CD ROM
Lite-On DVD Burner
200W ATX PSU
60GB SATA Slim HDD (pulled from my old Xbox 360 that had a graphics card failure; I copied all of my files from that drive to the new Xbox 360)
Vantec 2.5" Slim to 3.5" SATA HDD converterpfSense 2.2.4 i386
I have done the manual installation of E2guardian, and I have to say it's more stable than the Dansguardian package. However, I am running into two issues. The first issue is with the Daemon tab (not too serious): clicking on the tab again brings the menu to its knees. The second: whenever I try to run clamav in the General settings, the service stops, and only starts back up after I disable clamav. (I want E2guardian's clamav because the one supplied with Squid3 causes the internet to crash, with page-not-found errors galore.) Other than that, the package is making great progress. So far, so good.
Just have to add the blacklist, make new ACLs for me, and test to see what would happen.
-
I went ahead and installed manually. First off, thank you Marcelloc for all your work on this, and everyone else too! Content filtering is such a HUGE option to have, in my opinion. I did hit a couple of issues, however… First, the Users and IPs tabs do not work. They both point to XML files, but the only files that match are template files. I could see where the two users template files could be combined into a working XML, but the IPs file is only half of an XML file. Second, and it's not a breaking issue, the first tab doesn't load correctly until you click on it and have it reload; I can see that the URL the original menu option loads is slightly off. Everything else seems to be working great, and I was able to get it all up and running without much work! Love it!
Thanks again for all your hard work on this!!!!
It looks like after a reboot the Users and IPs tabs both work. Unfortunately, the E2guardian service wouldn't start at all. After checking the logs, it appears that the blacklist I was using didn't have an AD folder, which for some reason threw everything off. I created it manually with "domains" and "URLs" files inside, and the service started back up. Not sure if that's an issue with the blacklist (www.shallalist.de) or E2guardian…
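In shell terms, what I did was roughly the following; note that the paths and the exact category folder name are from memory and an assumption based on the package's default layout, so adjust them to match whatever your E2guardian logs complain about:

# create the missing blacklist category with empty list files
mkdir -p /usr/local/etc/e2guardian/lists/blacklists/ads
touch /usr/local/etc/e2guardian/lists/blacklists/ads/domains
touch /usr/local/etc/e2guardian/lists/blacklists/ads/urls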
-
Does E2guardian have "Use SafeSearch engine"?
I haven't found that as an option anywhere. Google has changed the way it implements forced SafeSearch, though; it's DNS-based now.
https://support.google.com/websearch/answer/186669?hl=en
I'm sure that the filter/proxy could do a redirect of some sort also, but I just implemented it through DNS as Google suggests.
-
It would be nice if it did this automatically for all search engines with this feature.
I am having trouble finding a step-by-step guide for setting this up with pfSense; do you have a link?
-
It would be nice if it did this automatically for all search engines with this feature.
I am having trouble finding a step-by-step guide for setting this up with pfSense; do you have a link?
Most of the step-by-step guides talk about using nossl.google.com, which doesn't work anymore.
Basically you just have to be running DNS Resolver, with all your computers pointing to it as their primary DNS (easily done through DHCP). Then, at the bottom of DNS Resolver, add a host override for www.google.com and have it point to 216.239.38.120, which is what forcesafesearch.google.com resolves to. (Hint: you can also do this for www.youtube.com.)
It's not the best solution, since it would be better to use the FQDN forcesafesearch.google.com, but you can't do CNAME records in DNS Resolver, so using the IP address works.
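In plain unbound terms, the override boils down to something like this (just a sketch of what the Host Override generates; the YouTube line reuses the same IP as described above):

server:
  # answer www.google.com with forcesafesearch.google.com's address
  local-data: "www.google.com. IN A 216.239.38.120"
  # same trick for YouTube restricted mode
  local-data: "www.youtube.com. IN A 216.239.38.120"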
Also, if this is in a Microsoft domain environment, you want all Windows boxes pointing to your DNS server running on the domain controller. To work around that, you can add a forwarder to the MS DNS server that points at your pfSense box running DNS Resolver. (You can't add a CNAME record to MS DNS for www.google.com, because Server 2008 R2 and above won't allow it.)
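On the domain controller that forwarder is a one-liner (assuming 192.168.1.1 is your pfSense LAN address; substitute your own):

rem point the MS DNS server's forwarders at the pfSense DNS Resolver
dnscmd /ResetForwarders 192.168.1.1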
Also, to make sure things are locked down, block all DNS requests from going out through your firewall. That way people can't point at a DNS server outside your control to grab www.google.com from.
https://doc.pfsense.org/index.php/Blocking_DNS_queries_to_external_resolvers
Hope that's clear enough...
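For the curious, in raw pf terms the two LAN rules from that doc come down to roughly this ($lan_if and $lan_net are placeholders for your LAN interface and subnet):

# allow DNS to the firewall itself...
pass in quick on $lan_if proto { tcp, udp } from $lan_net to self port 53
# ...then block DNS to everywhere else
block in quick on $lan_if proto { tcp, udp } from $lan_net to any port 53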
-
It would be nice if it did this automatically for all search engines with this feature.
I am having trouble finding a step-by-step guide for setting this up with pfSense; do you have a link?
I just found this, which looks promising… I don't know if it would work with HTTPS, though. Or at least not until you set up HTTPS filtering.
https://groups.google.com/forum/#!topic/e2guardian/1ieUvbsNDkU
In urlregexplist:
"(^http://[0-9a-z]+\.google\.[a-z]+[-/%.0-9a-z]*/images\?)"->"\1safe=vss&"
"(^http://[0-9a-z]+\.google\.[a-z]+[-/%.0-9a-z]*/search\?)"->"\1safe=vss&"
"(^http://[0-9a-z]+\.google\.[a-z]+[-/%.0-9a-z]*/s\?)"->"\1safe=vss&"
(vss = very safe search)
-
I am trying to enable clamav in E2guardian, but alas, I get this error on my monitor:
Aug 17 20:56:44 e2guardian[86361]: Unable to load plugin config /usr/local/etc/e2guardian/contentscanners/clamdscan.conf
I'm guessing this is still a work in progress? However, I'm glad that Squid3 comes with clamav, but I like the E2guardian (formerly DansGuardian) access-denied error when it finds a virus on a site, like the EICAR antivirus test file.
I'll post back if I find a solution (a temporary one, at least).
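For what it's worth, the plugin config it is looking for is just a small file. A sketch of what could go in it, with the caveat that the option names below come from the old DansGuardian sample and the socket path depends entirely on how your clamd is set up:

# /usr/local/etc/e2guardian/contentscanners/clamdscan.conf
plugname = 'clamdscan'
# unix domain socket clamd listens on (check LocalSocket in clamd.conf)
clamdudsfile = '/var/run/clamav/clamd.sock'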
-
"(^http://[0-9a-z]+.google.[a-z]+[-/%.0-9a-z]/images?)"->"\1safe=vss&"
"(^http://[0-9a-z]+.google.[a-z]+[-/%.0-9a-z]/search?)"->"\1safe=vss&"
"(^http://[0-9a-z]+.google.[a-z]+[-/%.0-9a-z]*/s?)"->"\1safe=vss&"vss = very safe search
That could be a starting point; maybe even add YouTube, Yahoo, Bing and other search engines (this would be a nice option in the E2guardian menu). I will try the manual way once E2guardian is out as a package.
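For Bing and Yahoo, something along these lines might work in the same urlregexplist. This is untested on my side, and adlt=strict and vm=r are just the strict-mode URL parameters those engines used at the time, so they may have changed:

"(^http://[0-9a-z]+\.bing\.[a-z]+[-/%.0-9a-z]*/search\?)"->"\1adlt=strict&"
"(^http://[0-9a-z]+\.search\.yahoo\.[a-z]+[-/%.0-9a-z]*/search\?)"->"\1vm=r&"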
I got it to work on YouTube (it now shows Restricted Mode on at the bottom).
Spoke too soon, it stopped working.
OK, YouTube is working now; however, it did not work for Google search (in search settings, "turn on safe mode" is unticked).
-
Not sure if this is known, but it seems that SSL MITM is not currently working. I had SSL MITM set up on Squid and it was working fine; I added E2Guardian to the mix, turned on SSL MITM, and it stopped working. Maybe it was chaining the two together, I don't know… I did notice that E2Guardian 3.2.0 was just released, and the notes said "SSL MITM is now fully functional in explicit proxy mode". I'm not sure if that means the current version is broken or.... I'm guessing it's going to be a little while before the E2Guardian update makes its way over to FreeBSD...
Does anyone know the correct config for SSL MITM? I'm using explicit mode, so this isn't a transparent-proxy issue... My question is: should I have problems with E2Guardian doing the SSL man-in-the-middle AND Squid also doing the SSL man-in-the-middle? Should one be on and one be off? Or is there some other setting?
-
Not sure if this is known, but it seems that SSL MITM is not currently working.
The current e2guardian version for FreeBSD does not have the SSL MITM code.
-
And that's a good thing. :P
-
And that's a good thing. :P
So are you suggesting that there is a better way? Or that you just don't like filtering HTTPS communication at all? If it's the latter, I totally get that stance and agree, but unfortunately at a school it's just not possible to ignore HTTPS traffic and not filter it… Not to mention the need for caching (low bandwidth)....
Is there a way, for now, to pass SSL on to Squid and have Squid do the SSL MITM so I can at least still cache?
-
MITM is evil. Period.
-
caching is fairly useless for the majority of popular sites. dynamic content is a pain to cache. perhaps after some tweaking you'll manage a 5% hit rate at a typical school.
mitm ssl essentially destroys ssl altogether. you say it's impossible to ignore https in schools? i know personally of 30+ schools that don't mitm their pupils…. they have rights too, privacy is one of them.
what about the evil_x_url? you do have teachers present, right?
-
Look, I get it… I'm a privacy advocate myself, but not in this case... First off, caching is way higher than 5% for my use case, because 90% of the sites they visit have a ton of static images, and the sites are visited by multiple classes because they are education related. Second, these are kids under the age of 15 using school-provided computers for school-related activities. They get no privacy, because the computers are not to be used for private use. Teachers are present, but no one teacher can watch all computers all the time. There are reasons for having to man-in-the-middle SSL connections. MITM hacks are not good, I get that, and I don't like it either. But until they come up with a standard that allows for voluntary interception of secured communication (such as an HTTPS proxy where the man in the middle is expected and requested), MITM "hacks" will have to do.
Oh and https://www.fcc.gov/guides/childrens-internet-protection-act
-
have you considered ip blacklists? the paid subscriptions generally filter out most of the crap.
afaik squid3 can be made to work with ssl … once you get the falsified certs deployed to every client, it might work.
then you'll have to go around and install the certs on every cellphone/tablet/laptop of every student ... good luck with that.
governments are always wrong.
-
have you considered ip blacklists? the paid subscriptions generally filter out most of the crap.
afaik squid3 can be made to work with ssl … once you get the falsified certs deployed to every client, it might work.
then you'll have to go around and install the certs on every cellphone/tablet/laptop of every student ... good luck with that.
governments are always wrong.
Governments are often wrong, I agree :) The network is locked down to only authorized devices, so no problems with the certs either… Google itself is another reason I have to do this; I wish they had a better method... https://support.google.com/a/answer/1668854?hl=en
I have Squid working with SSL, but does anyone know if it will be an issue having the SSL traffic run, still encrypted, through E2Guardian first? I'm not sure Squid will still be able to deal with the SSL traffic correctly if it's second in the chain (E2Guardian first, doing a basic URL filter, then Squid for caching)... I thought about pushing SSL traffic directly to Squid through a PAC script, but then I lose URL filtering... I could add SquidGuard for that, but that just sounds like a mess of stuff waiting to go wrong...
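For reference, the chained setup I'm describing is just E2Guardian pointing at Squid as its parent proxy. A sketch of the relevant e2guardian.conf lines; the IPs and ports here are my own assumptions, not package defaults:

# e2guardian listens here (what the clients' explicit proxy setting points at)
filterip = 192.168.1.1
filterports = 8080
# ...and forwards everything to squid as the parent proxy
proxyip = 127.0.0.1
proxyport = 3128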
-
Can't a WPAD be used here? I'm not having an issue filtering HTTPS sites with it (only Google and other search engines' image results are a pain to filter).
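For example, a wpad.dat that splits traffic the way the PAC-script idea above describes could look roughly like this (the address and ports are placeholders: Squid on 3128, E2Guardian on 8080):

function FindProxyForURL(url, host) {
    // send HTTPS straight to squid so it can still cache it,
    // everything else through e2guardian's filtering port
    if (url.substring(0, 6) === "https:")
        return "PROXY 192.168.1.1:3128";
    return "PROXY 192.168.1.1:8080";
}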