Improve Custom refresh pattern
-
Thank you!! This works great even in 2022
(Image: GUI Refresh added)
(Image: Showing a refresh is now marked as "miss" so it is ready for next time)
Side note: Have you tried blocking by regular expression in SquidGuard? I blocked mtalk.google (a sketch of the expression is below the reasons). I had been researching a way to do this, and last night I found a bit of information about it on Reddit.
Reason 1: I do not have mtalk installed or in use on any system.
Reason 2: It runs nonstop on any system that has Chrome installed.
Reason 3: It passed traffic even when Chrome was not in use.
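Roughly, the expression and an equivalent Squid ACL look like this; the ACL name is just an example, and the regex is only what I would try based on the hostname:

```
# Hypothetical squidGuard Target Category regex (pfSense "Regular Expression" field):
#   mtalk\.google\.com
# Roughly equivalent Squid ACL, if you would rather do it in squid.conf:
acl block_mtalk dstdom_regex -i mtalk\.google\.com
http_access deny block_mtalk
```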
(Image: SquidGuard regular expression)
(Image: Mtalk gone!!!)
-
Has anyone figured out a way to use this with Docker containers? A cache-and-guard setup for approved containers?
-
@jonathanlee what exactly is it you're wanting to cache this way for Docker images? Are you after the apt packages, or something else? If it's the apt packages, I'm fairly confident those already get cached whether or not they're pulled inside Docker, because of some of the refresh patterns covered earlier (a rough sketch is below). Also, my apologies, I haven't been messing with this for a little while due to other network complications; I actually had to uninstall Squid entirely, so I'm a bit late responding to any of this.
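For reference, the kind of refresh patterns I mean look roughly like this; the exact times and sizes vary between setups, so treat these values as illustrative:

```
# Illustrative refresh patterns for caching apt packages (values vary between setups).
refresh_pattern -i \.(deb|udeb)$ 129600 100% 129600
# Keep the index/metadata files fresh so apt still sees new releases.
refresh_pattern -i (Packages|Sources|Release|InRelease)(\.(gz|xz|bz2))?$ 0 0% 0 refresh-ims
```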
-
Hello, thank you for the response. Yes, I am trying to cache this way for Docker VM containers/images. Mainly, from a security standpoint, I would like to know when Kali's pen-testing toolkit/container is downloaded. Kali is an amazing pen-testing tool, but in the wrong hands it can be scary. I would like the ability to block that specific container from unapproved use, similar to what SquidGuard does with unwanted URLs.

The goal here is to start to compartmentalize Docker containers and other third-party VM containers with a tracking system that has labeling functions similar to SquidGuard's "blacklisting" options. I would really like to learn more about blocking specific Docker containers from being downloaded at all through the firewall, with the help of dynamic caching. Working with custom patterns is very new to me; however, for cybersecurity this is pure gold for current container-based security issues.

With containers now able to sandbox themselves on user or antivirus discovery, similar to Windows 10's Sandbox software, these unapproved containers are becoming a cybersecurity issue. Today's rapid internet speeds and full VM deployments inside containers are becoming the norm, as well as a real talking point for cybersecurity. Kali's Docker container would be the main item I want to block while I learn.

Once a container is downloaded and in use, the host sometimes cannot see it or even scan it with antivirus software, because on some versions the container/VM resides in RAM. Memory today has reached 16 GB and up on some laptops. I, for one, used to run an 8088 CPU with a 10 MB HDD and MS-DOS 3.11; with the volatile memory sizes of today, this paints a new issue that plagues our electronic devices if left unchecked.

The main goal is for the proxy to have container-based security options and measures. This proxy caching looks like a major start toward a solution for such a major issue. Older content accelerators of the past with some new software can fix a lot of container issues. Yes not all containers are a issue but like any there is always some bad ones we would like to keep a close eye on.
-
@jonathanlee said in Improve Custom refresh pattern:
Older content accelerators of the past with some new software can fix a lot of container issues. Yes not all containers are a issue but like any there is always some bad ones we would like to keep a close eye on.
"Older content accelerators used in the past with some new software can fix a lot of container issues. Yes not all containers are a issue but like any thing there is always some bugs we would like to keep a close eye on.
-
@jonathanlee assuming you're using pfSense AND Squid, and not just Squid by itself on something like Ubuntu, it might be possible to use the ACL tab to block or restrict access to the specific domains used for the Kali Docker image, or to find the URLs the Kali image hits for updates and block those. Though at this point I feel this is less a job for Squid refresh patterns and more a job for an actual firewall, since you're after blocking rather than caching. Unless I am mistaken, this would be better served by the firewall side and access control lists: if you only want certain authorized hosts to use Kali, then you want ACLs that lock access down to specific IP addresses, ranges, or hosts (rough sketch below).
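Something along these lines in squid.conf is what I'm picturing; the addresses and registry domains are just placeholders for your own network, and the URL match only helps with HTTPS pulls if you're already splicing/bumping SSL:

```
# Rough sketch: only let approved hosts pull the Kali image from Docker Hub.
# The source addresses and registry domains below are placeholders; adjust for your setup.
acl approved_hosts src 192.168.1.10 192.168.1.20
acl docker_registry dstdomain registry-1.docker.io auth.docker.io
acl kali_image url_regex -i kalilinux/kali
# Deny Kali image pulls from the registry unless the client is an approved host.
http_access deny docker_registry kali_image !approved_hosts
```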
-
@high_voltage spot on. I added custom URL blocking for Docker, Ruby gems, and Hotjar, as well as expressions and blocking of URL shorteners. Thanks for your reply. I also keep having Windows 10 try to update all of a sudden over HTTP, and the firewall blocks that as a policy violation.
It's weird, this update was working before. What's confusing is that I can click on that link and it will download directly, too.
I was able to direct the Raspberry Pi to update only over HTTPS using a different mirror, but Windows just keeps trying HTTP.
-
@jonathanlee have you made sure that you cleared out the Squid disk cache after making any and all changes (again, assuming you're running Squid on pfSense, which in hindsight I would hope you are, given you're posting on the pfSense support forum)? Though, in hindsight, you might not need to do that this time, since you're not messing with refresh patterns but with ACLs. Not sure if this is useful here, but if things are being weird, maybe try resetting the state table in pfSense? I have had oddball situations where, after paying far more attention than I should have, I finally said "F it" and reset the firewall state table to see if that cleared things up, and it did, so maybe try that (rough commands below).
Apologies for the hastily typed reply, but as I said, my mind is on 20 different projects at the moment.
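If you do end up wanting to nuke the cache or the state table from the shell, it's roughly this; the cache path assumes the default pfSense package location, and the GUI buttons do the same things:

```
# Rough commands from the pfSense shell (the GUI can do all of this too).
squid -k shutdown          # stop squid cleanly
rm -rf /var/squid/cache/*  # wipe the on-disk cache (assumes the default package path)
squid -z                   # rebuild the cache swap directories
# then start Squid again from Status > Services in the GUI
pfctl -F states            # flush the state table (same as Diagnostics > States > Reset States)
```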
-
@high_voltage thanks for the reply.
Yes, this Netgate 2100 MAX firewall is running Squid on pfSense with custom refresh patterns and SquidGuard. I also reset the disk cache, have WPAD set up for smartphones on and off the Wi-Fi, and even added SSL certificates to everything. I forced all traffic into the proxy ports 3128-3129, and everything works: Hulu, Disney Plus, Amazon Video. I have been watching a lot of X-Files over Hulu lately with this config.

What's weird is that I can take that link and manually download the update in the browser, but somehow the system won't download it; it just sits at 0, almost as if paused. My Raspberry Pi was doing this too, and after I changed the mirrors to use only HTTPS downloads it now updates fine. Windows 10 Pro, though, is still not using HTTPS for the antivirus signature updates right now. I am still new to these refresh options.

When traffic is forced over the proxy, port 80 still works for everything else: I can browse HTTP URLs, it shows SquidGuard blocks, and it flags viruses on ClamAV tests. This thing is a tank. This last configuration fix will make it work perfectly. Why does Microsoft use HTTP for updates when most of the internet has moved to HTTPS? It's weird, right? I am about to factory default it and try again. I tested NAT with a port redirect, and I tested using just transparent mode. Everything works except the Windows 10 Pro updates. It did work last night, but only because Microsoft was using HTTPS for a couple of hours.
-
@jonathanlee I'll be totally honest, part of why I recently disabled Squid was similar issues. For whatever reason, Linux updates broke recently when using Squid; not sure why, but this definitely seems to be an issue right now.
-
@high_voltage I got my Raspberry Pi to work with a different mirror: I edited the sources to point at one that allows HTTPS. Now when I run apt-get update it uses a different mirror; I use the constant.com one.
Edit this file:
/etc/apt/sources.list
Add an HTTPS source from the update mirrors. For example, on Raspberry Pi Linux I changed it to an HTTPS source (sketch below).
Check out other countries' mirror lists; some, like Germany's, are almost all HTTPS.
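The line ends up looking something like this; the mirror URL and suite below are placeholders, so pick a real one from the official mirror list:

```
# /etc/apt/sources.list - switch the mirror to one that serves HTTPS.
# The mirror URL and suite below are placeholders; use one from the official mirror list.
deb https://mirror.example.org/raspbian/ bullseye main contrib non-free rpi
```

After saving the file, run apt-get update again and it should pull everything over HTTPS.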
-
@jonathanlee it has got to work the same way for updates on other Linux flavors too.
-
@jonathanlee that was the only way I could get Linux updates to work with Squid; they were doing the same thing as the Windows updates. Squid would show an HTTP request, and when you looked at SquidGuard's live connections it would only show 0.
-
I made a new post for this specific issue:
https://forum.netgate.com/topic/169166/warning-possible-bypass-attempt-found-multiple-slashes-where-only-one-is-expected-http-dl-delivery-mp-microsoft-com-filestreamingservice-files/3?_=1642466910316
-
Here it is, per your request: a Windows 10 update cached and delivered to another machine. Notice the HIT.
(Image: Windows dynamic refresh patterns working)
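Patterns along these lines are what people commonly share for Windows Update caching; mine are similar, but treat the exact sizes, times, and domains here as illustrative rather than a copy of my config:

```
# Commonly shared refresh patterns for caching Windows Update payloads.
# Object size, times, and domains are illustrative, not an exact copy of my config.
range_offset_limit -1
maximum_object_size 6144 MB
refresh_pattern -i windowsupdate\.com/.*\.(cab|exe|msi|msu|psf|esd)$ 43200 100% 129600 reload-into-ims
refresh_pattern -i dl\.delivery\.mp\.microsoft\.com/.*\.(cab|exe|msi|msu|psf|esd)$ 43200 100% 129600 reload-into-ims
```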
-
@jonathanlee For whatever reason, it's worth noting that I only discovered two weeks ago that apparently a good chunk of my problems were due to transparent Squid plus ClamAV. Having ClamAV set to scan-all mode was causing random issues I cannot even begin to pinpoint. Setting it to scan web only fixed everything, but having it set to scan-all mode would, for whatever reason, cause apt packages to fail when trying to receive header information. Even HTTP connections failed because of this.
-
@high_voltage I think it is the same as if you were to run a ClamAV scan on Kali Linux: so many packages and tools come up as issues when they are in fact only pen-testing tools. In pfSense, curl and many other items included in packages may scan as false positives too, since they are not on a client machine but part of the firewall. It should have a "scan Squid cache" option; that is what should be scanned, right? Think about the number of items stored in the content accelerator that could be invasive. Why does Squid not include scanning the local cache as an option?
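In the meantime, something like this from the shell is the closest thing I can think of to a "scan the local cache" option; the path assumes the default pfSense cache location:

```
# Rough equivalent of a "scan local cache" option: run clamscan over the cache directory.
# The path assumes the default pfSense squid cache location.
clamscan -r -i /var/squid/cache
```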
-
@jonathanlee no, I mean it broke traffic entirely.
-
@high_voltage wow, that's different. I had issues where I needed to clear the cache before traffic would flow again, almost as if a container was stuck in the cache.
-
@jonathanlee Huh? The last time I posted in this thread was four years ago.