Unofficial E2guardian package for pfSense
-
Unselect any authentication, save, and apply (and maybe restart the service). The brief documentation on proxy-header says that you must select an HTTP header field to identify users by, and then add them to the groups file:
# Proxy-header auth plugin
# FredB August 2016
# Identifies users with header;
# relies upon the upstream proxy.
# Eg: in groups file
# Mozilla/5.0 (Windows NT 6.1; rv:47.0) Gecko/20100101 Firefox/47.0=filter3
# here:
# header = 'user-agent'
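In other words, you end up with two small pieces of configuration, roughly like this (the exact file locations are assumptions based on a stock e2guardian layout; the user-agent value is just the example from the comment above):

# in authplugins/proxy-header.conf — choose the header that identifies users
header = 'user-agent'

# in the groups list it references — one 'value=filtergroup' entry per user
Mozilla/5.0 (Windows NT 6.1; rv:47.0) Gecko/20100101 Firefox/47.0=filter3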
-
Have updated; looks OK so far, except in the menus: if you go to IPs, then "ACLs" changes to "Access Lists". Will clear cache etc. and check again…
Don't believe sync has ever worked with the default 'Sync to configured system backup server' either - this is the log entry:
Apr 11 11:57:52 php-fpm 69222 /pkg_edit.php: [E2guardian] xmlrpc sync is enabled but there is no system backup hosts to push squid config.
Apr 11 11:57:52 php-fpm 69222 /pkg_edit.php: Reloading E2guardian
Every other package syncs fine with this setting.
-
Unselect any authentication, save, and apply (and maybe restart the service). The brief documentation on proxy-header says that you must select an HTTP header field to identify users by, and then add them to the groups file:
# Proxy-header auth plugin
# FredB August 2016
# Identifies users with header;
# relies upon the upstream proxy.
# Eg: in groups file
# Mozilla/5.0 (Windows NT 6.1; rv:47.0) Gecko/20100101 Firefox/47.0=filter3
# here:
# header = 'user-agent'
Is there a difference between "None" and not having any selected?
-
With no authentication set I now get an Access Denied page from Squid.
When I access a page, the Squid logs show:
1491926079.428 3 127.0.0.1 TCP_MISS/403 4185 GET http://forum.pfsense.com/ - HIER_NONE/- text/html
1491926079.455 248 127.0.0.1 TCP_MISS/403 4287 GET http://forum.pfsense.com/ - ORIGINAL_DST/127.0.0.1 text/html
1491926079.499 2 127.0.0.1 TCP_MEM_HIT/200 13066 GET http://localhost:3128/squid-internal-static/icons/SN.png - HIER_NONE/- image/png
1491926079.599 1 127.0.0.1 TAG_NONE/409 4118 CONNECT urs.microsoft.com:443 - HIER_NONE/- text/html
1491926079.601 1 127.0.0.1 TAG_NONE/409 4118 CONNECT urs.microsoft.com:443 - HIER_NONE/- text/html
/var/log/e2guardian/access.log shows:
2017.4.11 13:29:04 - 192.168.1.100 http://forum.pfsense.com/ GET 3849 0 1 403 text/html Default - -
2017.4.11 13:29:04 - 192.168.1.100 https://urs.microsoft.com:443 CONNECT 4118 0 1 200 - Default - -
2017.4.11 13:29:04 - 192.168.1.100 https://urs.microsoft.com:443 CONNECT 4118 0 1 200 - Default - -
No matter what page I go to, I get those same 5 lines. The first 2 vary depending on where I go, but the last 3 are exactly the same. If I point directly at the Squid proxy port, pages load. If I pass through transparently, it works. If I point at port 8080 for e2guardian, it fails. I've gone back and undone the changes I made to proxy-header.conf, and everything loads properly now. I've deselected all auth plugins on the General page.
EDIT: Well, not every time. I also get:
1491928118.800 86400440 192.168.1.157 TAG_NONE/200 0 CONNECT 216.58.218.2:443 - HIER_NONE/- -
1491928118.800 86400301 192.168.1.157 TAG_NONE_TIMEDOUT/409 0 CONNECT pagead2.googlesyndication.com:443 - HIER_NONE/- text/html;charset=utf-8
EDIT 2: Here's my config:
forcequicksearch = off
reverseaddresslookups = off
reverseclientiplookups = off
logclienthostnames = off
createlistcachefiles = on
prefercachedlists = off
maxcontentfiltersize = 256
maxcontentramcachescansize = 1000
maxcontentfilecachescansize = 2000
filecachedir = '/tmp'
deletedownloadedtempfiles = on
initialtrickledelay = 20
trickledelay = 20
downloadmanager = '/usr/local/etc/e2guardian/downloadmanagers/fancy.conf'
downloadmanager = '/usr/local/etc/e2guardian/downloadmanagers/trickle.conf'
downloadmanager = '/usr/local/etc/e2guardian/downloadmanagers/default.conf'
contentscannertimeout = 60
contentscanexceptions = off
mapauthtoports = off
recheckreplacedurls = off
forwardedfor = off
usexforwardedfor = off
logconnectionhandlingerrors = on
logsslerrors = off
logchildprocesshandling = off
maxchildren = 120
minchildren = 8
minsparechildren = 8
preforkchildren = 10
maxsparechildren = 64
maxagechildren = 500
maxips = 0
ipcfilename = '/tmp/.dguardianipc'
urlipcfilename = '/tmp/.dguardianurlipc'
ipipcfilename = '/tmp/.dguardianipipc'
nodaemon = off
nologger = off
logadblocks = off
loguseragent =
daemonuser = 'clamav'
daemongroup = 'nobody'
softrestart = on
cacertificatepath = '/etc/ssl/demoCA/cacert.pem'
caprivatekeypath = '/etc/ssl/demoCA/private/cakey.pem'
certprivatekeypath = '/etc/ssl/demoCA/private/serverkey.pem'
generatedcertpath = '/usr/local/etc/e2guardian/ssl/generatedcerts'
-
Wanted to say thanks for bringing this project back to life. I've been waiting an extremely long time for E2Guardian, as it seems to tick all my boxes by being able to scan HTTPS traffic properly without having to rely on a blacklist or DNS-based blocking. Most people don't seem to understand the importance of HTTPS scanning, even knowing that there are always new proxies and websites coming along to bypass blocks.
What is the status of the package now? I see Stewart is saying that it isn't working; is it working for everyone else?
-
I've tested normal proxy and SSL interception. I haven't started testing authentication and antivirus integration yet.
-
Wanted to say thanks for bringing this project back to life. I've been waiting an extremely long time for E2Guardian, as it seems to tick all my boxes by being able to scan HTTPS traffic properly without having to rely on a blacklist or DNS-based blocking. Most people don't seem to understand the importance of HTTPS scanning, even knowing that there are always new proxies and websites coming along to bypass blocks.
What is the status of the package now? I see Stewart is saying that it isn't working; is it working for everyone else?
I've wiped the box I was having problems with, since I could never get it to work properly, and will go back to testing once my other projects simmer down. I'll report back here once I get back on it.
-
So I got this set up at a basic level and have a phrase list in place. However, how do I get it to scan HTTPS sites? I have a phrase list for pornography, and it seems to be working for HTTP sites but not HTTPS. And it seems more and more websites are moving to HTTPS now.
I am purely trying to test the phrase-matching system and make sure that it is able to block those sites.
-
Create the CA on pfSense
Apply it in the e2guardian config
Install the CA certificate in the browser
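As a reference for step 1: in practice the CA is created in the pfSense GUI (System > Cert. Manager) and exported; below is a rough openssl equivalent of what that produces, reusing the demoCA paths from the config posted earlier (the CN is made up):

openssl genrsa -out /etc/ssl/demoCA/private/cakey.pem 2048
openssl req -x509 -new -key /etc/ssl/demoCA/private/cakey.pem -days 3650 -subj '/CN=E2guardian Filter CA' -out /etc/ssl/demoCA/cacert.pem

The resulting cacert.pem is the file you export and install into each browser's trusted root store in step 3.
-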
Create the CA on pfSense
Apply it in the e2guardian config
Install the CA certificate in the browser
I thought it could be done transparently in e2Guardian. Is that not so?
-
Create the CA on pfSense
Apply it in the e2guardian config
Install the CA certificate in the browser
Got it working by setting "Filter SSL sites forging SSL certificates" in the group settings. However, I am getting an error saying the common name is invalid in Chrome; it's working in Edge and Internet Explorer. Also, is there any way to modify the block page? The standard DansGuardian one looks pretty terrible.
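For anyone wiring this up outside the GUI: that checkbox appears to correspond to the per-group sslmitm option in the group config file (e2guardianf1.conf for the first group in a stock layout; this is an assumption about what the GUI writes):

# in the group config, e.g. e2guardianf1.conf — forge certificates for this group
sslmitm = on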
-
I thought it could be done transparently in e2Guardian. Is that not so?
I don't think so. To analyze the page content, it needs to be in the middle. The splice-all feature in Squid, if I'm not wrong, extracts the site names included in the remote IP's certificate and checks them against ACLs; no interception at all is done that way.
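To make the contrast concrete, here is a hedged squid.conf sketch of the two modes (ssl_bump and at_step are standard Squid 3.5+ syntax; the acl name step1 is arbitrary):

# Peek at the TLS handshake, then splice: Squid only learns the server
# names from the SNI/certificate and can match them against ACLs,
# but never decrypts the traffic.
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump splice all

# Full interception, which is what certificate forging for content
# filtering requires instead:
# ssl_bump bump all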
-
is there any way to modify the block page? The standard DansGuardian one looks pretty terrible.
Sure, you just need to edit the template file in the GUI. Try not to include any JavaScript in it; at least in the previous package version, the pfSense GUI was including some extra code from the active session.
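As an illustration, a minimal template along these lines should work (-URL- and -REASONGIVEN- are standard DansGuardian/e2guardian template placeholders; the surrounding markup is just an example, and deliberately has no JavaScript):

<html>
<head><title>Access Denied</title></head>
<body>
<h1>Access Denied</h1>
<p>Access to -URL- has been blocked.</p>
<p>Reason: -REASONGIVEN-</p>
</body>
</html>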
-
What browser is everyone else using for HTTPS inspection? Chrome just doesn't seem to want to accept the self-signed certificate, even though I have it installed on my system as a trusted root CA.
-
I use Chrome. What does Chrome complain about on the site? Is the generated certificate different from the site you are trying to access?
-
I use Chrome. What does Chrome complain about on the site? Is the generated certificate different from the site you are trying to access?
Chrome complains about a missing subject alternative name; it seems like it could be an issue related to the way Squid forges certificates.
Here's a link to what I found regarding this issue; it seems it's related to browsers tightening security. I know you can get around this for sure, since they do it at my school using Smoothwall.
http://stackoverflow.com/questions/43665243/chrome-invalid-self-signed-ssl-cert-subject-alternative-name-missing
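You can check this yourself from a client whose traffic is being intercepted, using standard openssl tooling; something like the following (the hostname is just an example) prints the SAN section of the forged certificate, and prints nothing if it is missing, which is exactly what Chrome 58+ rejects:

echo | openssl s_client -connect www.example.com:443 2>/dev/null | openssl x509 -noout -text | grep -A1 'Subject Alternative Name'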
-
Found a small issue when you block a category of domains using a blacklist: when you try to visit any domain in that category, it shows the category as N/A instead of showing the actual category.
-
Found a small issue when you block a category of domains using a blacklist: when you try to visit any domain in that category, it shows the category as N/A instead of showing the actual category.
I saw this while using DansGuardian in the past. Do you think it's an old bug or something with the report template file?
-
Found a small issue when you block a category of domains using a blacklist: when you try to visit any domain in that category, it shows the category as N/A instead of showing the actual category.
I saw this while using DansGuardian in the past. Do you think it's an old bug or something with the report template file?
Maybe an old bug, because it does seem to show the category if you use phrase lists.
-
e2guardian v4.1 is stable now. I'll start updating the package for this new version.
Looks like it will no longer need OS changes to allow more than 1024 clients. In the currently compiled version I've set it to 4096, IIRC.
https://github.com/e2guardian/e2guardian/releases
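For context, the 1024-client ceiling mentioned above typically comes from the default FD_SETSIZE that select() is compiled against; raising it is usually a build-time define, along these lines (an assumed sketch, not necessarily the package's actual build recipe):

# FreeBSD lets FD_SETSIZE be overridden at compile time,
# before sys/select.h is pulled in:
CPPFLAGS='-DFD_SETSIZE=4096' ./configure
make && make install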