Squid caching website status messages
I currently have pfSense 2.0.2-RELEASE (i386) installed with Squid 2.7.9 pkg v.4.3.3, and all port-80 traffic passes through Squid. I've noticed that if a website becomes unavailable and I then fix the web server, I continue to get "website unavailable" messages until I restart the Squid service. Afterward, everything returns to normal. It's not truly caching web content, but it seems to be caching the HTTP status of pages. Is there a way to turn this off? Everything in the Cache Mgmt tab is left blank, Access Control has "127.0.0.1;192.168.0.1;" for the external cache managers, and no other settings are enabled that should cause this behavior. My custom options for Squid are:
strip_query_terms off; ident_lookup_access allow all; ident_timeout 3 seconds; http_port 192.168.0.1:8080 transparent;
Can anyone point me in the right direction?
Maybe there is a little more to this than I initially thought. I have two websites here: one "default" site that listens on all addresses and hostnames, and one that responds only to a specific URL. The "default" site currently serves nothing but a "directory listing denied" message. With both sites running, loading the URL returns the expected website. If I stop that website's service on the server and reload the URL, I get "directory listing denied". After turning the website back on, I continue to get "directory listing denied" until I restart the Squid service. Ideas?
Try with these custom options:
negative_ttl 1 seconds; negative_dns_ttl 60 seconds; positive_dns_ttl 6 hours; dns_timeout 30 seconds;
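For what it's worth, here is a commented sketch of what each of those directives does, based on my reading of the standard squid.conf documentation (the quoted defaults are for Squid 2.x and are worth double-checking against your version's documentation):

```
# How long Squid caches negative responses (errors such as connection
# failures or 5xx pages). The 2.x default is 5 minutes, which is why a
# fixed server kept showing the old error; 1 second retries almost
# immediately.
negative_ttl 1 seconds

# How long failed DNS lookups are cached before being retried.
negative_dns_ttl 60 seconds

# How long successful DNS lookups are cached (6 hours is the default).
positive_dns_ttl 6 hours

# How long Squid waits for a DNS reply before treating the lookup as
# failed.
dns_timeout 30 seconds
```

The key one for the behavior described above is negative_ttl: it is the error-status cache, separate from the content cache, which matches the "caching the HTTP status of pages" symptom.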
That resolves the problem. I've been trying to Google the meaning of each option, but the details I found are pretty vague.
Thank you very much for the tip! It makes my life easier.
On squid-cache.org you can find a description of nearly every config option, and you can also check how the values differ between Squid versions.