Netgate Discussion Forum

    How to limit filehosting websites.

    Traffic Shaping
    11 Posts 5 Posters 4.8k Views
    • briggs

      How can I limit downloads from filehosting sites?

      If they are already defined in Layer 7, which protocols should I use so I can apply a limiter to them in Layer 7?

      Thank you very much.

      • cmb

        They're HTTP, so they look no different from any other HTTP traffic. For scenarios like that, the easiest approach is to put a limiter across all hosts. Alternatively, if you have a proxy server or can set one up, you can set the TOS based on the URL and then shape on TOS.

        • dreamslacker

          For file sharing websites, you should use Squid to police the downloads instead, because the main page and the actual download servers live on different sub-domains.

          Set up an ACL URL regex for the domain names such as rapidshare.com, megaupload.com, etc.
          Then configure a delay pool to limit the download speed. Alternatively, you can block access entirely by using the ACL in a deny rule instead.

          Example for slowing down filesharing access:

          
          #Configure a delay pool of class 2 to allow an overall limit and a per-user limit
          delay_pools 1
          delay_class 1 2
          delay_parameters 1 8000/8000 4000/4000
          #Configure the filesharing regex ACL (dots escaped so they match literally)
          acl filesharing url_regex -i rapidshare\.com fileserve\.com rapidshare\.de megaupload\.com depositfiles\.com hotfile\.com zshare\.net uploading\.com sharingmatrix\.com filesonic\.com 2shared\.com 4shared\.com
          #Apply the delay pool to matching requests from the local network
          delay_access 1 allow localnet filesharing
          #Do not penalize all other URLs
          delay_access 1 deny all
          #Do not cache the traffic for the file sharing hosts
          cache deny filesharing
          
          

          The example above allows a total of 8000 bytes/s for all users combined and 4000 bytes/s for each individual user. Change these limits to suit your requirements.
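          For comparison, if you only need an aggregate cap with no per-user bucket, a simpler class 1 pool would do; this is a sketch, with the same illustrative numbers and ACL names as above:

          
          #Class 1: a single aggregate bucket shared by all matching clients
          delay_pools 1
          delay_class 1 1
          delay_parameters 1 8000/8000
          delay_access 1 allow localnet filesharing
          delay_access 1 deny all
          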

          To deny access to filesharing instead, use the following code:

          
          acl filesharing url_regex -i rapidshare\.com fileserve\.com rapidshare\.de megaupload\.com depositfiles\.com hotfile\.com zshare\.net uploading\.com sharingmatrix\.com filesonic\.com 2shared\.com 4shared\.com
          http_access deny filesharing
          
          
          • briggs

            Thank you very much for the response. Will try your suggestions.

            • briggs

              Hi dreamslacker, I have Squid and SquidGuard running on pfSense. Is it OK to add the code to the Squid config file?

              • dreamslacker

                You do not need to directly edit the squid config file.

                Just copy and paste the code into the Custom Options box.
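
                If Squid then fails to start, the generated configuration can be syntax-checked from a shell first; `-k parse` and `-k reconfigure` are standard Squid command-line switches (the exact binary path may differ on your pfSense install):

                
                squid -k parse
                squid -k reconfigure
                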

                • briggs

                  Hi dreamslacker, thank you for your help. Squid always stops after I paste the code into the Custom Options box and restart it. I found out that traffic management was enabled, with overall bandwidth throttling of 1000 KB and per-user throttling disabled. I disabled the bandwidth throttling in traffic management, added the code you provided, and restarted Squid. But when I tried to download a large file (MS Office from the Microsoft website, after adding the Microsoft site to the code) with the total bandwidth set to 100 KB and the per-user limit to 10 KB, the code did not seem to work: the download speed was almost 300 KB.

                  I then re-enabled traffic management, set 1000 KB for the total bandwidth throttling and 10 KB per user, and also enabled "Throttle only specific extensions". That works.

                  Thank you for all your responses; I really appreciate your help.

                  • dreamslacker

                    @briggs:

                    Hi dreamslacker, thank you for your help. Squid always stops after I paste the code into the Custom Options box and restart it. [...] When I tried to download a large file (MS Office from the Microsoft website, after adding the Microsoft site to the code) with the total bandwidth set to 100 KB and the per-user limit to 10 KB, the code did not seem to work: the download speed was almost 300 KB.

                    You have to verify that the downloads from Microsoft actually come from the same (sub-)domain you added. It is possible that Microsoft links out via an IP address rather than a domain name; in that case, the regex will not catch it. Furthermore, if the download rides on HTTPS, it won't work either.

                    • xarope

                      I tried to implement what dreamslacker outlined above, but then saw this error in /var/squid/log/cache.log:
                      parse_delay_pool_count: multiple delay_pools lines, aborting all previous delay_pools config

                      Looking into /usr/local/etc/squid/squid.conf, sure enough, the web GUI's Proxy Server: Traffic Mgmt tab already creates and uses a delay_pools line, hence you can't create one yourself in Custom Options.

                      Does anybody have ideas on how to get around this?

                      • briggs

                        @dreamslacker:


                        You have to verify that the downloads from Microsoft actually come from the same (sub-)domain you added. It is possible that Microsoft links out via an IP address rather than a domain name; in that case, the regex will not catch it. Furthermore, if the download rides on HTTPS, it won't work either.

                        I think it is a sub-domain of the Microsoft website: the URL is download.microsoft.com and I defined microsoft.com. I also tried downloading the same file from Microsoft with the traffic management setup above and changed the per-user bandwidth; during the download, I got a bandwidth close to what I had defined in traffic management. Anyway, thank you for your kindness.

                        • dhatz

                          @cmb:

                          Alternatively if you have a proxy server or can set one up, you can set TOS based on URL and then shape on TOS.

                          Speaking of setting the TOS in Squid: recent Squid versions include an interesting feature called ZPH (Zero Penalty Hit), which can be used to set the TOS of already-cached content (a cache "HIT") so that it can be delivered to local users at full speed, i.e. only un-cached traffic is shaped.

                          Is anyone using such a setup with pfSense?

                          I just started configuring it (added zph_local to squid.conf, checked with tcpdump that Squid cache HITs went out with the correct TOS set, etc.) and will probably complete the setup tomorrow.
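
                          For Squid 2.7, the Squid side of such a setup amounts to one directive; the TOS value 0x30 here is illustrative, and your shaper rules would need to match whatever value you pick (note that in Squid 3.1+ this directive family was replaced by qos_flows):

                          
                          #Mark responses served from the local cache with TOS 0x30,
                          #so firewall shaper rules can match and exempt them
                          zph_local 0x30
                          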

                          Copyright 2025 Rubicon Communications LLC (Netgate). All rights reserved.