Netgate Discussion Forum

    Squid's new SslBump Peek and Splice for https caching?

    • aGeekhere

      Has anyone had any success caching https sites using the new SslBump Peek and Splice method?
      https://wiki.squid-cache.org/Features/SslPeekAndSplice
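
      For reference, the kind of squid.conf setup the wiki describes looks roughly like this (just a sketch on my part, not a tested pfSense config; the port, cert path and helper path are placeholders, and the pfSense Squid package generates its own squid.conf from the GUI):

          # Squid 3.5+ peek-and-splice skeleton; everything below is a placeholder example
          http_port 3129 intercept ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/usr/local/etc/squid/ca.pem
          sslcrtd_program /usr/local/libexec/squid/ssl_crtd -s /var/squid/ssl_db -M 4MB   # helper is named security_file_certgen in Squid 4+

          acl step1 at_step SslBump1   # step 1: the client's TLS hello has arrived
          ssl_bump peek step1          # read the SNI without altering the handshake
          ssl_bump splice all          # then tunnel the connection untouched...
          # ...or decrypt it instead (required for any caching or scanning):
          # ssl_bump bump all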


      • Gertjan @aGeekhere

        @aGeekhere said in Squid's new SslBump Peek and Splice for https caching?:

        SslBump Peek and Splice

        I had some time this morning, so I decided it was time to try to understand what this 'peek', 'splice' and 'bump' business is all about.

        The world's biggest search engine was willing to help me: what does bump or splice mean?

        The second result asked that very question and pointed back to the first one, which was your https://wiki.squid-cache.org/Features/SslPeekAndSplice ...
        I read the article three times, as explained.

        All I really retained was

        Pick your poison

        which at least was something I did understand.

        Btw: caching web pages is something from the past.
        These days a web page is unique, fabricated just for you, valid at that moment, and meaningless and useless afterwards. A browser doesn't cache https pages, and neither does a web server. Anyone in between has a hard time.

        edit: sorry, I'm not answering your question; I was just trying to understand it.

        No "help me" PM's please. Use the forum, the community will thank you.
        Edit : and where are the logs ??

        • High_Voltage

          To resurrect a slightly old thread: both @aGeekhere and I are trying to implement this with our respective pfSense Squid caches. To your point, yes, most SSL content these days is not highly useful for caching, but for the uses we're after there is still some relevance. A couple of examples:

          One example both of us are trying to accomplish: nvidia.com uses SSL, as most modern websites do, which is why everyone says caching is dead. But as home users who own every device on our respective networks (so the ethical problems don't apply here), we would like nvidia driver downloads to be cached. If we have multiple systems that use nvidia drivers, why on earth should we download every update multiple times when we could cache it once?
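
          What I picture, purely as a sketch (the domain names, sizes and ACL names are illustrative, and every client would have to trust the proxy's CA for any of this to work), is bumping just the download hosts and splicing everything else:

              acl step1 at_step SslBump1
              acl bump_dl ssl::server_name .nvidia.com .download.nvidia.com
              ssl_bump peek step1      # learn the server name from the TLS SNI
              ssl_bump bump bump_dl    # decrypt (and so allow caching of) only these hosts
              ssl_bump splice all      # everything else passes through untouched

              # and keep big installers around once fetched (values are guesses)
              maximum_object_size 1024 MB
              refresh_pattern -i \.(exe|run|zip)$ 10080 90% 43200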

          That's one example. Another, this one specific to me as a media hoarder:

          Say I want to browse one of my favorite artists' posts on (Twitter/Facebook/generic image site here). That site is most likely delivered over SSL, like most other sites these days. Why do I need to re-download every single site file for every page and every image I want to look at, when things like the CSS, JavaScript and static HTML could be cached so that only the new images have to be fetched on each page load?

          Caching can still prove its worth here. Plenty of files on modern SSL-driven websites can sensibly be cached even today, with the bulk of the web being SSL-driven. Yes, dynamic sites like WordPress and a lot of Facebook are useless for caching, but they are still built from chunks that are static and can be cached to speed up page loads and reduce bandwidth usage. With everyone and their cousins stuck at home, if one household can reduce the bandwidth its web browsing uses, why, even in the days of most sites being SSL-driven, should we not want to do exactly that?
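
          For the static pieces, assuming the site is being bumped at all, I'm thinking of something like these refresh_pattern lines (untuned guesses; origin Cache-Control / no-store headers still win unless you start overriding them):

              refresh_pattern -i \.(css|js)$                 1440 50% 10080
              refresh_pattern -i \.(png|jpe?g|gif|webp|ico)$ 4320 80% 43200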

          Call me weird, I know I'm weird, but dangit, I want to do exactly this.

          • Derelict (Netgate)

            Considering that with peek/splice Squid never sees unencrypted traffic, and all it ever sees is traffic encrypted with the ephemeral key negotiated between the client and the server, I have no idea how a cache of that data could ever be any good to anyone else ever again.
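
            Put in squid.conf terms (an illustrative sketch, not a recommendation):

                acl step1 at_step SslBump1
                ssl_bump peek step1
                ssl_bump splice all   # tunnel the encrypted bytes as-is: no plaintext, nothing usable in a cache
                # ssl_bump bump all   # only bumping terminates TLS at the proxy (with a forged certificate the
                #                     # clients must trust), which is the only way plaintext ever reaches the cache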


            • High_Voltage

              So you're telling me there is, officially or unofficially, no way at all to cache SSL traffic whatsoever? (I'm attempting a blanket statement here to make sure this is a fool's errand, since that's my interpretation of what you're stating.)

              I honestly don't care if it's a difficult task; I'm willing to put in the effort if it's possible at all. But I want to be totally clear about what you're saying: are you saying for sure that it is 100% impossible, blanket statement, across the board?

              I straight up admit I don't know much about what I'm trying to do, so it would be nice to know for sure, one way or another, whether it is possible at all anymore.

              To elaborate a bit more: my goal is not just to cache, but also to decrypt the SSL traffic so that it can be scanned with ClamAV.
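
              For the ClamAV half, what I have in mind (again just a sketch; it assumes c-icap running the squidclamav service at its default address, and it can only ever see traffic that is actually bumped, never spliced) is along these lines:

                  icap_enable on
                  icap_send_client_ip on
                  icap_service clamav_scan respmod_precache icap://127.0.0.1:1344/squidclamav
                  adaptation_access clamav_scan allow all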

              • aGeekhere

                Maybe QoS3, "Secure Caching in HTTPS Based on Fine-Grained Trust Delegation", will help in the future: https://www.hindawi.com/journals/scn/2019/3107543/#conclusion-and-future-work


                • Gertjan @aGeekhere

                  @aGeekhere said in Squid's new SslBump Peek and Splice for https caching?:

                  maybe QoS3

                  If the server, some proxy device and the client (browser) all have to install the needed modules ....
                  It would take one heck of a standard before such a thing gets implemented.
                  Typically it would need three admins implementing software on their side, as end users often don't even know what a 'proxy' is.

                  @High_Voltage said in Squid's new SslBump Peek and Splice for https caching?:

                  to decrypt the SSL traffic so that it can be scanned with ClamAV, not just to cache it.

                  That would be my main reason to centralize (== cache?) downstream data. As far as I know, only mail is handled like this these days, and only if you run your own mail server (much like running your own proxy). That already takes care of a huge security issue.

                  Btw: you're lucky, you control all your devices.
                  The ones you don't control go into the non-trusted network. When those need access to local trusted resources like a NAS, it's a case-by-case consideration.

                  No "help me" PM's please. Use the forum, the community will thank you.
                  Edit : and where are the logs ??
