Squid SSL/HTTPS caching of PDF files
-
What do you want to debug here? The entire workflow is broken. How on earth is it sane to host a terabyte's worth of insanely huge, constantly changing PDF files in the cloud when users need to work with them locally all the time?
I agree ::)
-
This Confluence system is not in my hands; we (with our 200 people) are only one of x locations that access it. They won't change anything in the near future, so if this SSL-bump PDF cache doesn't work, I'll just leave it as it is. It's okay, in a few months we'll get a 100 Mbit internet line :)
-
Nevertheless, although it is not in your hands and perhaps not your own design choice, this is the drawback of such a strangely designed solution.
A cloud-based approach may have some added value in certain cases, either as the target design or as a transition solution while designing something else, but its constraints have to be clearly understood. Trying to bypass them, as you do, by introducing an unexpected component in the middle (here, your proxy cache) will just break the way it works, regardless of how the initial design is perceived.
-
I've never heard of that website (probably because I'm in the US). Anyway, I didn't have an account, but I did poke around the site until I found a PDF link to test - this cached successfully without a problem on my end, but then again, it's not really a large file.
https://www.atlassian.com/legal/privacy-policy/pageSections/0/contentFullWidth/00/content_files/file/document/2015-06-23-atlassian-privacy-policy.pdf
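In case anyone wants to reproduce the test: fetch the file twice through the proxy and check how Squid logged the second request. The proxy address/port and log path below are assumptions based on a default pfSense Squid install, so adjust them to your box.

  # fetch the same PDF twice through the proxy (-k because the bump CA is self-signed)
  URL=https://www.atlassian.com/legal/privacy-policy/pageSections/0/contentFullWidth/00/content_files/file/document/2015-06-23-atlassian-privacy-policy.pdf
  curl -x 192.168.1.1:3128 -skL -o /dev/null "$URL"
  curl -x 192.168.1.1:3128 -skL -o /dev/null "$URL"
  # the second entry should show TCP_HIT or TCP_MEM_HIT if the object was cached
  tail -n 2 /var/squid/logs/access.log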
Possibility #1 - If your caching currently works and your SSL is set up correctly, there might just be a limitation with the "Maximum object size" setting under the "Local Cache" tab of Squid. If you want to cache a 100 MB file, this setting should be at least "100000", as it represents kilobytes. I currently have mine set to 300000.
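For reference, that GUI field ends up as the maximum_object_size directive in squid.conf. A minimal sketch of the relevant lines (the values here are illustrative, not a recommendation):

  # "Maximum object size" from the Local Cache tab; plain numbers are KB,
  # so this allows objects up to ~300 MB
  maximum_object_size 300000 KB
  # the disk cache must also be large enough to actually hold such objects
  # (format: type path size-in-MB L1-dirs L2-dirs)
  cache_dir ufs /var/squid/cache 3000 16 256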
Possibility #2 - Perhaps you have a proxy exception rule applied to either an IP address or a URL which could be linked to a hosted CDN. If you don't use any proxy exception rules you can ignore this, but if you do, you might try disabling the rule temporarily and simply retest. I've personally set up two Aliases for this specific reason, "Proxy_Bypass_Hosts" and "Proxy_Bypass_Ranges". I use these specifically to whitelist sites, IPs, and/or IP ranges using ARIN and Robtex when addressing problem applications or services.
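In squid.conf terms, such bypass aliases typically turn into splice rules, which tunnel the traffic without decrypting it, so nothing from those hosts can ever be cached. A rough sketch of what that can look like (the hostname and address range below are placeholders):

  acl step1 at_step SslBump1
  acl Proxy_Bypass_Hosts ssl::server_name .some-cdn.example.com
  acl Proxy_Bypass_Ranges dst 203.0.113.0/24
  # peek first so the SNI is available for matching
  ssl_bump peek step1
  # matched traffic is spliced (passed through as-is) and never cached
  ssl_bump splice Proxy_Bypass_Hosts
  ssl_bump splice Proxy_Bypass_Ranges
  # everything else is decrypted and becomes cacheable
  ssl_bump bump all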
-
@JStyleTech:
Possibility #1 - If your caching currently works and your SSL is set up correctly, there might just be a limitation with the "Maximum object size" setting under the "Local Cache" tab of Squid. If you want to cache a 100 MB file, this setting should be at least "100000", as it represents kilobytes. I currently have mine set to 300000.
richie1985 already posted his squid.conf,
and "maximum_object_size" is set to 512000 KB.
@JStyleTech:
Possibility #2 - Perhaps you have a proxy exception rule applied to either an IP address or a URL which could be linked to a hosted CDN. If you don't use any proxy exception rules you can ignore this, but if you do, you might try disabling the rule temporarily and simply retest. I've personally set up two Aliases for this specific reason, "Proxy_Bypass_Hosts" and "Proxy_Bypass_Ranges". I use these specifically to whitelist sites, IPs, and/or IP ranges using ARIN and Robtex when addressing problem applications or services.
Can't find anything that points to an exception in the squid.conf.
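If the config looks clean, the access log usually shows why the PDFs still miss. Something like this (the log path assumes a default pfSense install) summarizes Squid's verdict per PDF request:

  # $4 is the result code (TCP_MISS/200, TCP_HIT/200, ...), $7 the URL
  grep -i '\.pdf' /var/squid/logs/access.log | awk '{print $4, $7}' | sort | uniq -c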