• Cache on squid installed on Pfsense not working

    5
    0 Votes
    5 Posts
    925 Views
    stephenw10S
    The reverse proxy is entirely separate, no need to enable that. Steve
  • [error] open() failed (2: No such file or directory)

    3
    0 Votes
    3 Posts
    1k Views
    kklouzalK
    I deleted the squid package since it was caching less than 5% of traffic anyway. But just in case you're curious, I revisited a bunch of the sites from the log that were throwing the error, and it never came up once.
  • Blocking file extensions not shown in URL

    3
    0 Votes
    3 Posts
    522 Views
    D
    Sure. HTTPS filtering ON, Bump, splice whitelist. The Diladele solution may work, but it's not free. That solution probably relies on Layer 7, since it promotes a "Content Filter". After every new virus outbreak, like today's Bad Rabbit, I almost faint just remembering that SOME content can't be blocked…
  • Waiting for Proxy Tunnel…

    8
    0 Votes
    8 Posts
    6k Views
    KOMK
    I don't bother caching dynamic content (YouTube, Microsoft updates, etc.); it seemed pointless, since most of the web is dynamic these days. I stopped caching two years ago when I realized that my hit rate was in the area of 3-7%, and it wasn't worth the hassle.
  • Unable to run ANY speed/ping tests

    7
    0 Votes
    7 Posts
    4k Views
    kklouzalK
    That's fine, I really appreciate all the help :) If I go back to the Flash-based version it seems to work without any issues. Going to other testing sites, they all fail. A Google search for "Speed Test" lets you run the Google speed tester (it fails to ever start). [image: fMfYG6r.png]
    http://speedtest.xfinity.com/ [image: FaM2DOB.png]
    https://www.speakeasy.net/speedtest/ [image: huhD3L7.png]
    https://www.verizon.com/speedtest/ [image: 8imRFTT.png]
    http://speedtest.att.com/speedtest/ - worked (we all know AT&T is horrid though; they probably lie about their speed tests to make U-verse customers feel better)
    https://fast.com/ - worked (no clue why)
    In all my previous attempts I never went down the Google results page and tried THIS many test sites. I appreciate all your help, thank you so much!
  • Squid Caching SSL

    5
    0 Votes
    5 Posts
    2k Views
    KOMK
    Not quite. That config will allow you to get the domain, but not the full URL or content. You can use explicit mode with WPAD to get the domain, or transparent mode with Splice All. Getting the full URL or content requires installing a certificate on every client, which is a major hassle.
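    For context, the two interception modes mentioned above correspond roughly to the following squid.conf directives. This is a sketch, not pfSense-generated output; the port number and ACL name are illustrative.

    ```
    # Transparent interception with "Splice All": peek at the TLS ClientHello
    # to read the SNI (the domain), then splice without decrypting, so only
    # the domain is visible to squid.
    http_port 3129 intercept ssl-bump
    acl step1 at_step SslBump1
    ssl_bump peek step1
    ssl_bump splice all

    # Logging the full URL or content would instead require bumping
    # (decrypting), which only works if every client trusts the proxy's CA:
    # ssl_bump bump all
    ```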
  • Moving from BlueCoat to PfSense and issue with Squidguard

    1
    0 Votes
    1 Posts
    410 Views
    No one has replied
• Let's Encrypt problem on 2.4

    9
    0 Votes
    9 Posts
    6k Views
    jimpJ
    A new version of the ACME package will be available later today which should correct this.
  • Haproxy Widget: Missing Actions Button

    2
    0 Votes
    2 Posts
    513 Views
    D
    It is working now, the problem was the following: "Your user does not have access to "WebCfg - Services: HAProxy package" so it does not have sufficient privileges to control the haproxy process." https://redmine.pfsense.org/issues/7987
• After 2.4.0 HAProxy no longer works with ACLs

    3
    0 Votes
    3 Posts
    587 Views
    P
    Under normal circumstances I would say yes, but because it is resolving a DNS entry that resolves to one IP address and gets routed based on some rules, I cannot have a "split-DNS" situation with pfSense. It would be nice to have pfSense give back two different IP addresses for one DNS entry depending on the subnet, but that isn't the case.
• Why select "allow" rather than "---" in squidguard ACLs?

    1
    0 Votes
    1 Posts
    410 Views
    No one has replied
  • After 2.4 Upgrade: Squid agonizingly slow

    2
    0 Votes
    2 Posts
    1k Views
    -flo-
    Lacking any sensible clue, I ended up iterating through all the options. Forcing IPv4 DNS resolution first seems to have solved the issue (option "Resolve DNS IPv4 First"). However, I do not understand why, or what actually changed since August (the 2.4 RC from mid-August); I did not have this setting enabled before. Any insight regarding this?
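    The pfSense checkbox mentioned here appears to map to squid's dns_v4_first directive (the mapping is an assumption; the directive itself is standard squid 3.x). A minimal sketch of the resulting squid.conf line:

    ```
    # Try IPv4 answers before IPv6 when contacting origin servers, avoiding
    # long connect timeouts when AAAA records resolve but IPv6 routing is broken.
    dns_v4_first on
    ```

    That would also explain the symptom: each request stalls on an IPv6 connect attempt before falling back to IPv4.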
  • [Solved] Clamav Custom Warning Page

    3
    0 Votes
    3 Posts
    575 Views
    G
    Thank you, doktornotor. :)
  • Nest Delay Pools

    1
    0 Votes
    1 Posts
    457 Views
    No one has replied
  • Delay Pool Buckets Status

    1
    0 Votes
    1 Posts
    362 Views
    No one has replied
  • HAPROXY and constant traffic on LAN

    3
    0 Votes
    3 Posts
    710 Views
    dragoangelD
    Create two firewall rules on the LAN interface to block IPv4 TCP traffic destined for the firewall itself: one with destination port HTTP, and a duplicate with destination port HTTPS. It works like a charm.
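    In raw pf syntax, the two GUI rules described above amount to something like the following sketch (the interface macro is a placeholder; pfSense generates its ruleset from the GUI, so this is only illustrative):

    ```
    # Block LAN hosts from reaching the firewall's own web ports
    block in quick on $lan_if inet proto tcp from any to (self) port 80
    block in quick on $lan_if inet proto tcp from any to (self) port 443
    ```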
  • Multiple wan / multiple squid running on the same pfsense

    3
    0 Votes
    3 Posts
    617 Views
    dragoangelD
    Why not use one Squid instance on localhost, NAT to it from the interfaces you want, and configure it properly?
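    A sketch of that single-instance approach in pf syntax (the interface macros and squid port are assumptions; in the pfSense GUI this would normally be done with port-forward rules on each internal interface):

    ```
    # Redirect outbound HTTP from each internal segment to the one squid
    # instance listening on the loopback interface
    rdr on $opt1_if inet proto tcp from any to any port 80 -> 127.0.0.1 port 3128
    rdr on $opt2_if inet proto tcp from any to any port 80 -> 127.0.0.1 port 3128
    ```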
  • HAProxy Certificate Transparency

    3
    0 Votes
    3 Posts
    875 Views
    M
    Thank you very much! It would be great if you could initiate steps towards including the required capabilities in the package, or recommend where or whom else I should ask. Michael
  • Issues with valid Dynamic REGEX

    1
    0 Votes
    1 Posts
    396 Views
    No one has replied
  • What is the regex that is accepted by the reverse proxy rules of squid ?

    3
    0 Votes
    3 Posts
    3k Views
    S
    I've figured out the problem. It's in two parts.
    1. The regex library used in this case does not seem to support negative lookaheads like "(?!word)" for some reason. I'm not sure which library is used, whether it's bundled with squid, or whether a local regex library is used. Maybe something can be done here?
    2. It is indeed an allow/deny config. If problem 1 cannot be solved, I think the only way to achieve what we're trying to do is to add some functionality to the reverse proxy GUI.
    Below is an excerpt from my squid.conf as generated by pfSense:
        acl rvm_server1 url_regex -i ^https?://(www.)?domain.com.$
        acl rvm_server2 url_regex -i ^https?://(www.)?domain.com/cloud($|/).$
        cache_peer_access rvp_server1 allow rvm_server1
        cache_peer_access rvp_server2 allow rvm_server2
        cache_peer_access rvp_server1 deny allsrc
        cache_peer_access rvp_server2 deny allsrc
        never_direct allow rvm_server1
        never_direct allow rvm_server2
        http_access allow rvm_server1
        http_access allow rvm_server2
    Adding a single line at the correct position solves the problem:
        cache_peer_access rvp_server1 deny rvm_server2
    Add that line before the allow line of rvp_server1, and presto. From the GUI, this is probably easier to achieve by adding another url_regex on the same mapping page and denying that, instead of the cross-referencing I'm doing above. Does anyone acquainted with the pfSense squid package have any input on this? Maybe the thread should be moved to Packages too.
Copyright 2025 Rubicon Communications LLC (Netgate). All rights reserved.