I want to block all websites and allow only some
-
Hello,
I am new to pfSense. I have just installed it and added the Squid, SquidGuard and Lightsquid packages. I configured them as the manual says, but the problem is that there are a lot of websites the LAN side can still access, even though I have Deny on all categories in the blacklist. Can you help me block all websites and prevent users from searching with search engines?
Thank you in advance -
What are you doing to force your users to use the proxy? I assume you're blocking TCP 80/443 on LAN?
-
Actually I haven't blocked ports 80/443. But if I do that, the LAN side will still have access to the internet, right?
-
I have blocked traffic on TCP ports 80/443 and blocking all websites now works fine, but the problem is that the whitelist doesn't work.
I added a Whitelist in Target Categories, configured it as Whitelist in the Common ACL, and set all other categories to Deny.
Default access is Allow.
Also, under Squid Proxy Server > ACL I have configured some websites as a whitelist.
Can you help me solve this issue? -
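For reference, the whitelist-plus-deny-all policy described above maps, at the Squid level, to ACL directives roughly like the following sketch (the domain names are placeholder examples; squidGuard expresses this through its Common ACL and Target Categories, but the effective proxy policy is the same):

```
# sketch of a whitelist-only policy in squid.conf syntax (example domains)
acl whitelist dstdomain .gmail.com .example.com
http_access allow whitelist
http_access deny all
```

Squid evaluates `http_access` rules top to bottom and stops at the first match, so the allow line must come before the deny.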
… Can you help me block all websites and prevent users from searching with search engines?
All search engines?
As stated, block all connections that have a destination port of 80 or 443.
If you don't do that, maintaining a blacklist with all the search engines on it is close to impossible.
Actually I haven't blocked ports 80/443. But if I do that, the LAN side will still have access to the internet, right?
Closing 80/443 doesn't shut down your internet connection, just all connections with a destination port of 80 or 443. These are the two ports that web servers, and web browsers, use.
Mail, DNS, FTP, SSH, VPN, etc. will still work.
Edit: have a look at what OpenDNS can do for you.
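As a sketch in pf rule syntax, the setup being suggested looks roughly like this (pfSense generates the actual rules from the GUI; the interface macro names and the proxy port 3128 here are assumptions):

```
# let LAN clients reach the proxy itself (Squid's default port is 3128)
pass in quick on $lan_if proto tcp from $lan_net to $lan_if port 3128
# block direct web access so the proxy cannot be bypassed
block in quick on $lan_if proto tcp from $lan_net to any port { 80 443 }
```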
-
We use OpenDNS and want to move to pfSense, but the problem now is that the whitelist is not working.
Closing 80/443 doesn't shut down your internet connection, just all connections with a destination port of 80 or 443. These are the two ports that web servers, and web browsers, use. Mail, DNS, FTP, SSH, VPN, etc. will still work.
I know that other services will still work; I only want to stop users from reaching the websites that the company doesn't allow.
-
Post screens of your config. Remember that with SquidGuard, you must go back to the General settings tab and click Save, then Apply, for changes on any other tab to take effect.
By the way, there is a dedicated forum for Squid & SquidGuard: the Cache/Proxy forum.
-
Attached I have sent the config for the Squid proxy and SquidGuard. I have used http://www.shallalist.de/Downloads/shallalist.tar.gz as the blacklist.
After every change I click the Apply button :)
-
You need to set rotation on your Squid logs or they will eventually fill up your drive. You also might want to set the X-Forwarded-For header handling to delete, disable Via mode, and suppress the Squid version.
Next, show screens for SquidGuard: Common ACL and Target Categories.
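If you were editing squid.conf by hand, the suggestions above correspond to directives along these lines (the rotation count is illustrative; in pfSense you would normally set these through the Squid package GUI):

```
logfile_rotate 7                  # keep 7 rotated generations of each log
forwarded_for delete              # strip the X-Forwarded-For header
via off                           # do not add a Via header to requests
httpd_suppress_version_string on  # hide the Squid version in error pages
```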
-
Next, show screens for SquidGuard: Common ACL and Target Categories.
[screenshots of the Common ACL and Target Categories settings attached]
-
What happens if you change the Whitelist selector from Whitelist to Allow in the Target Rules List?
-
It's the same thing even when I use Allow.
-
And each time you're going back to the General settings tab, clicking Save then Apply?
It's been a while since I set up a whitelist, but it was working for me. Make sure that its order is first.
-
I have done everything as you said, but it still doesn't work (after every change I click Apply on the General settings tab). The websites added to the whitelist stay loading forever, and in the end the browser shows that the site took too long to respond. I don't know what to do; we are very confused at this point.
-
I have successfully done this entirely with aliases and pfSense rules only, on specific LAN interfaces. It is a very time-consuming task to do this effectively using just firewall rules.
To be successful, you must be running Wireshark on a workstation on the LAN. Set the filter in Wireshark to DNS only, and resolve names while you browse the target website. Observe ALL the coincident domains and Content Delivery Network (CDN) providers needed to deliver the target website (Akamai, Fastly, …).
Smaller independent sites are relatively easy to isolate. Anything hosted on AWS is virtually impossible to isolate.
In some cases the best solution is to derive IP lists in pfBlockerNG using the ASN lookup feature, to create "alias permit" lists which you can refer to from the firewall configuration screens. For example, Facebook has its own ASN, so it's very easy to filter it with either blocks or permits. (ASN = Autonomous System Number)
Anyway, that is the concept and methodology in broad terms to achieve your objective using just firewall rules. It's as much as I can help you with.
Good luck!!
-
UPDATE
I have created a CA and activated HTTPS/SSL Interception with this configuration:
SSL/MITM Mode: Splice All
SSL Intercept Interface(s): LAN
SSL Proxy Port: 3129
SSL Proxy Compatibility Mode: Modern
DHParams Key Size: 2048
CA: CA Filter (the certificate that I have created)
All other fields are at their defaults.
At this point everything is OK: the blacklist is blocked and the whitelist works. But after some minutes, some of the whitelisted sites get blocked, for example gmail.com. I have added it as gmail.com / mail.google.com both in Target Categories as a whitelist and in the Squid Proxy ACL whitelist.
I have exported the certificate and installed it on a Windows computer.
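For reference, pfSense's "Splice All" mode corresponds roughly to ssl_bump directives like this sketch in squid.conf (the port and certificate path are assumptions):

```
https_port 3129 intercept ssl-bump cert=/usr/local/etc/squid/serverkey.pem
acl step1 at_step SslBump1
ssl_bump peek step1   # read the TLS client hello / SNI at step 1
ssl_bump splice all   # then tunnel the session through without decrypting
```

Because splice never decrypts, filtering decisions can only be made on the server name (SNI) or IP address, which may be why some whitelisted sites behave inconsistently over time.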