Ensuring safe searches
Hello, I am trying to see if there is any way to ensure that web searches are rewritten to be 'safe' for the major search engines. SquidGuard claims to have this capability, but it has never worked for me. I have Squid configured as a transparent proxy, with HTTPS/SSL interception set to Splice All, so that SquidGuard can still filter HTTPS destinations (by hostname/SNI).
I don't know if this is the problem, but I don't want to go the full HTTPS man-in-the-middle route and install a CA certificate on every client. Has anybody out there successfully gotten this to work, and can you explain your method? Web searches on the topic don't turn up solutions, or they refer to rewriting google.com in DNS to point at nosslsearch.google.com, but it seems that approach is no longer available.
You either have to use MITM (SSL bump), so the proxy can see and rewrite the search query string, or you can enforce SafeSearch at the DNS level: Google replaced the old nosslsearch mechanism with forcesafesearch.google.com, and resolving Google's search hostnames to that VIP locks SafeSearch on. The DNS method also works perfectly fine and needs no client-side CA.
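As a minimal sketch of the DNS-level approach with dnsmasq: the idea is to answer queries for the search engines' normal hostnames with the address of their SafeSearch endpoint, so the TLS connection (with SNI `www.google.com`) lands on the enforcing VIP and Google serves results with SafeSearch locked on, no interception required. The IP addresses below are what `forcesafesearch.google.com` and `strict.bing.com` have commonly resolved to, but they are not guaranteed stable; verify them yourself before deploying.

```
# /etc/dnsmasq.d/safesearch.conf
# Force Google SafeSearch by resolving Google search hostnames
# to the SafeSearch VIP (forcesafesearch.google.com).
# Verify the current address first:  nslookup forcesafesearch.google.com
address=/www.google.com/216.239.38.120
address=/google.com/216.239.38.120

# Bing equivalent: strict.bing.com (again, confirm the address)
address=/www.bing.com/204.79.197.220
```

Note this only holds if clients actually use your resolver: you also need firewall rules forcing port 53 to your DNS server and blocking DNS-over-HTTPS endpoints, otherwise clients can simply bypass the rewrite. Other engines publish similar endpoints (e.g. `safe.duckduckgo.com`, `restrict.youtube.com` for YouTube Restricted Mode) that can be handled the same way.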