• 0 Votes
    5 Posts
    768 Views
    B

    Bumping this as I am experiencing the exact same issue with the exact same behavior. I have even tried putting a transparent bypass for 127.0.0.1 as the source and destination, the hostname of the firewall, and the firewall's own public address as a source with no success.

  • 1 Votes
    10 Posts
    2k Views
    JonathanLeeJ

    @JonathanLee said in UNOFFICIAL GUIDE: Have Package Logs Record to a secondary SSD drive Snort Syslog Squid and or Squid cache system:

    ln -s -F /nvme/LOGS_Optane/snort /var/log/snort

    You can also do this with Suricata:

    rm -r /var/log/suricata
    mkdir /nvme/LOGS_Optane/suricata
    ln -s -F /nvme/LOGS_Optane/suricata /var/log/suricata
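The steps above can be demonstrated safely in a scratch directory (a sketch; on the firewall the real paths are /nvme/LOGS_Optane/suricata for the new home on the SSD and /var/log/suricata where the package expects to write):

```shell
# Demonstration of the symlink trick using stand-in paths under a
# temporary directory, so nothing on the real system is touched.
BASE=$(mktemp -d)
LOGDST="$BASE/LOGS_Optane/suricata"   # stands in for /nvme/LOGS_Optane/suricata
LOGSRC="$BASE/var/log/suricata"       # stands in for /var/log/suricata

mkdir -p "$LOGDST" "$(dirname "$LOGSRC")"  # create both directory trees
ln -s "$LOGDST" "$LOGSRC"                  # old path now points at the SSD
echo test > "$LOGSRC/test.log"             # writes through the symlink
ls -l "$LOGDST"                            # the file lands on the SSD side
```

On the real system, stop the Suricata service before removing /var/log/suricata and creating the link, then restart it.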
  • 0 Votes
    18 Posts
    2k Views
    JonathanLeeJ

    This is a better lighttpd config for serving the WPAD file:

    server.modules = (
        "mod_access",
        "mod_staticfile",
        "mod_expire",
        "mod_setenv"
    )

    server.document-root           = "/var/www/html"
    server.errorlog                = "/var/log/lighttpd/error.log"
    server.pid-file                = "/run/lighttpd.pid"
    server.username                = "www-data"
    server.groupname               = "www-data"
    server.port                    = 80
    server.bind                    = "192.168.1.6"
    server.tag                     = ""
    server.range-requests          = "disable"
    server.max-connections         = 10
    connect-timeout                = 2
    server.max-keep-alive-idle     = 2
    server.max-keep-alive-requests = 1
    server.max-read-idle           = 2
    server.max-write-idle          = 2
    dir-listing                    = "disable"

    $HTTP["request-method"] =~ "^(TRACE|TRACK)$" {
        url.access-deny = ( "" )
    }

    # Cache WPAD and proxy PAC files for 1 day (good practice)
    expire.url = (
        "/wpad.dat"  => "access plus 1 day",
        "/proxy.pac" => "access plus 1 day"
    )

    # Disable access logs to reduce SD card wear (optional)
    accesslog = ""

    $HTTP["url"] =~ "^/(wpad\.dat|proxy\.pac)$" {
        setenv.add-response-header = (
            "X-Content-Type-Options"            => "nosniff",
            "X-Frame-Options"                   => "DENY",
            "Content-Security-Policy"           => "default-src 'none';",
            "Cache-Control"                     => "public, max-age=86400",
            "Referrer-Policy"                   => "no-referrer",
            "X-Download-Options"                => "noopen",
            "X-Permitted-Cross-Domain-Policies" => "none"
        )

        # Allow only GET and HEAD methods
        $HTTP["request-method"] !~ "^(GET|HEAD)$" {
            url.access-deny = ( "" )
        }

        # Restrict access by IP subnets
        $HTTP["remoteip"] == "192.168.1.0/27" {
        } else $HTTP["remoteip"] == "2001:470:8052:a::/64" {
        } else {
            url.access-deny = ( "" )
        }
    }

    # Deny all other URL requests
    $HTTP["url"] !~ "^/(wpad\.dat|proxy\.pac)$" {
        url.access-deny = ( "" )
    }

    # Strict URL parsing for security and consistency
    server.http-parseopts = (
        "header-strict"            => "enable",
        "host-strict"              => "enable",
        "host-normalize"           => "enable",
        "url-normalize-unreserved" => "enable",
        "url-normalize-required"   => "enable",
        "url-ctrls-reject"         => "enable",
        "url-path-2f-decode"       => "disable",
        "url-path-2f-reject"       => "enable",
        "url-path-dotseg-remove"   => "disable",
        "url-path-dotseg-reject"   => "enable",
    )

    url.access-deny = ( "~", ".inc" )
    static-file.exclude-extensions = ( ".php", ".pl", ".fcgi" )

    # Add WPAD MIME type for correct browser handling
    mimetype.assign = (
        ".dat" => "application/x-ns-proxy-autoconfig",
        ".pac" => "application/x-ns-proxy-autoconfig"
    )
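The config above only serves the files; the wpad.dat / proxy.pac content itself is a JavaScript PAC file. A minimal sketch (the proxy address 192.168.1.6:3128 is an assumption, server.bind above plus Squid's default port; point it at your actual Squid listener):

```javascript
// Minimal wpad.dat / proxy.pac sketch. The proxy address
// 192.168.1.6:3128 is a placeholder for your actual Squid listener.
function FindProxyForURL(url, host) {
  // Keep loopback traffic direct; everything else goes via the proxy,
  // falling back to DIRECT if the proxy is unreachable.
  if (host === "localhost" || host === "127.0.0.1") {
    return "DIRECT";
  }
  return "PROXY 192.168.1.6:3128; DIRECT";
}
```

Both file names should serve the same content, since some clients request wpad.dat and others proxy.pac.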
  • Squid V6.10

    Cache/Proxy
    32
    1 Votes
    32 Posts
    5k Views
    B

    @michmoor
    Yes, it works for them, unfortunately only there :(

  • 0 Votes
    10 Posts
    2k Views
    johnpozJ

    @JonathanLee TLS 1.3 has been used for quite some time. Any time I bother to look at the connection to pretty much anything, it's TLS 1.3. This connection to the forums is using TLS 1.3.

    ESNI is dead, but long live ECH; that could be problematic, I would bet.

    But again, I don't do any sort of MITM; it's not good practice. I want my SSL/TLS to be end to end, as the internet gods intended it to be ;)

    I have no need or desire to run a proxy. If I want to block something, I filter on IP or DNS. And yes, I block the bane of filtering: DoH and DoT.

    I run a reverse proxy, but not as a filtering method or a way to do MITM. It's a way to offload the SSL connection, because the actual services have no SSL support at all, or are a pain to set up. These connections are TLS 1.3, and I don't even allow 1.2: if you're not using 1.3, you're not accessing it. I also use strict SNI, so if you don't send the valid SNI you're not being proxied in either. This keeps random port scanners from being able to actually get to the sites' interfaces.

    And I block most of the known scanners from talking to any of my forwards anyway, and only allow access into my forwards if you're coming from a US IP, etc.
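The poster doesn't name the reverse proxy; as one illustration, an nginx sketch that enforces TLS 1.3 only and drops clients that don't send a matching SNI (the hostname, backend address, and certificate paths are placeholders):

```nginx
# Default server: any handshake without a matching SNI name is refused,
# so random port scanners never reach a site interface.
server {
    listen 443 ssl default_server;
    ssl_reject_handshake on;
}

# Real service: only reachable with the correct SNI, TLS 1.3 only.
server {
    listen 443 ssl;
    server_name app.example.org;               # placeholder hostname
    ssl_protocols TLSv1.3;                     # no TLS 1.2 or older
    ssl_certificate     /etc/ssl/app.crt;      # placeholder paths
    ssl_certificate_key /etc/ssl/app.key;
    location / {
        proxy_pass http://127.0.0.1:8080;      # plain-HTTP backend
    }
}
```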

  • 0 Votes
    16 Posts
    1k Views
    JonathanLeeJ

    So generation 2 proxy technology can help if it's built right...

  • Squid.conf.documented mix up

    Cache/Proxy
    1
    0 Votes
    1 Posts
    177 Views
    No one has replied
  • Squid and IPv6

    Cache/Proxy
    1
    0 Votes
    1 Posts
    282 Views
    No one has replied
  • Hits Cache What is better?

    Cache/Proxy
    1
    0 Votes
    1 Posts
    278 Views
    No one has replied
  • New Squid 6.7 and Clamav 1.3.0

    Cache/Proxy
    11
    8 Votes
    11 Posts
    2k Views
    T

    @lg1980 said in New Squid 6.7 and Clamav 1.3.0:

    https://git.labexposed.com/lgcosta/gists/src/branch/main/squid-6x

    Hi

    I hope you are doing well.

    I have reinstalled pfSense and need to reconfigure the Squid proxy. I am unable to download the package from the link above. Can you share the new repo link?

  • Squid and ACLs

    Cache/Proxy
    19
    0 Votes
    19 Posts
    4k Views
    JonathanLeeJ

    @mcury I also had to disable some Ethernet rules that all of a sudden showed a lot of activity.

    Screenshot 2023-12-16 at 8.38.44 AM.png

  • 0 Votes
    3 Posts
    1k Views
    JonathanLeeJ

    Store ID program:

    I am using the built-in program attached here:
    /usr/local/libexec/squid/storeid_file_rewrite

    #!/usr/local/bin/perl

    use strict;
    use warnings;
    use Pod::Usage;

    =pod

    =head1 NAME

    storeid_file_rewrite - File based Store-ID helper for Squid

    =head1 SYNOPSIS

    storeid_file_rewrite filepath

    =head1 DESCRIPTION

    This program acts as a store_id helper program, rewriting URLs passed
    by Squid into storage-ids that can be used to achieve better caching
    for websites that use different URLs for the same content.

    It takes a text file with two tab separated columns.
    Column 1: Regular expression to match against the URL
    Column 2: Rewrite rule to generate a Store-ID

    Eg:
    ^http:\/\/[^\.]+\.dl\.sourceforge\.net\/(.*)	http://dl.sourceforge.net.squid.internal/$1

    Rewrite rules are matched in the same order as they appear in the rules
    file. So for best performance, sort it in order of frequency of occurrence.

    This program will automatically detect the existence of a concurrency
    channel-ID and adjust appropriately. It may be used with any value 0 or
    above for the store_id_children concurrency= parameter.

    =head1 OPTIONS

    The only command line parameter this helper takes is the regex rules
    file name.

    =head1 AUTHOR

    This program and documentation was written by
    I<Alan Mizrahi <alan@mizrahi.com.ve>>
    Based on prior work by
    I<Eliezer Croitoru <eliezer@ngtech.co.il>>

    =head1 COPYRIGHT

     * Copyright (C) 1996-2023 The Squid Software Foundation and contributors
     *
     * Squid software is distributed under GPLv2+ license and includes
     * contributions from numerous individuals and organizations.
     * Please see the COPYING and CONTRIBUTORS files for details.

    Copyright (C) 2013 Alan Mizrahi <alan@mizrahi.com.ve>
    Based on code from Eliezer Croitoru <eliezer@ngtech.co.il>

    This program is free software; you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation; either version 2 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License
    along with this program; if not, write to the Free Software
    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307, USA.

    =head1 QUESTIONS

    Questions on the usage of this program can be sent to the
    I<Squid Users mailing list <squid-users@lists.squid-cache.org>>

    =head1 REPORTING BUGS

    Bug reports need to be made in English.
    See http://wiki.squid-cache.org/SquidFaq/BugReporting for details of
    what you need to include with your bug report.

    Report bugs or bug fixes using http://bugs.squid-cache.org/

    Report serious security bugs to
    I<Squid Bugs <squid-bugs@lists.squid-cache.org>>

    Report ideas for new improvements to the
    I<Squid Developers mailing list <squid-dev@lists.squid-cache.org>>

    =head1 SEE ALSO

    squid (8), GPL (7),
    The Squid wiki http://wiki.squid-cache.org/Features/StoreID
    The Squid Configuration Manual http://www.squid-cache.org/Doc/config/

    =cut

    my @rules; # array of [regex, replacement string]

    die "Usage: $0 <rewrite-file>\n" unless $#ARGV == 0;

    # read config file
    open RULES, $ARGV[0] or die "Error opening $ARGV[0]: $!";
    while (<RULES>) {
        chomp;
        next if /^\s*#?$/;
        if (/^\s*([^\t]+?)\s*\t+\s*([^\t]+?)\s*$/) {
            push(@rules, [qr/$1/, $2]);
        } else {
            print STDERR "$0: Parse error in $ARGV[0] (line $.)\n";
        }
    }
    close RULES;

    $|=1;

    # read urls from squid and do the replacement
    URL: while (<STDIN>) {
        chomp;
        last if $_ eq 'quit';

        my $channel = "";
        if (s/^(\d+\s+)//o) {
            $channel = $1;
        }

        foreach my $rule (@rules) {
            if (my @match = /$rule->[0]/) {
                $_ = $rule->[1];
                for (my $i=1; $i<=scalar(@match); $i++) {
                    s/\$$i/$match[$i-1]/g;
                }
                print $channel, "OK store-id=$_\n";
                next URL;
            }
        }
        print $channel, "ERR\n";
    }
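The helper's core rewrite loop can be sketched in a few lines of Python (a hypothetical re-implementation for illustration only, not the shipped helper; the sourceforge rule is the example from the helper's own POD):

```python
import re

# Rules table: (URL regex, Store-ID template), matched in order,
# mirroring the tab-separated rules file the Perl helper reads.
RULES = [
    (r"^http://[^.]+\.dl\.sourceforge\.net/(.*)",
     r"http://dl.sourceforge.net.squid.internal/\1"),
]

def store_id(url: str) -> str:
    """Return the rewritten Store-ID, or 'ERR' when no rule matches."""
    for pattern, template in RULES:
        m = re.search(pattern, url)
        if m:
            # Substitute \1, \2, ... with the captured groups,
            # like the $1 substitution in the Perl helper.
            return m.expand(template)
    return "ERR"

print(store_id("http://foo.dl.sourceforge.net/project/x.tar.gz"))
# -> http://dl.sourceforge.net.squid.internal/project/x.tar.gz
```

This way every sourceforge mirror hostname maps to one internal Store-ID, so Squid caches the object once regardless of which mirror served it.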
  • 0 Votes
    27 Posts
    6k Views
    JonathanLeeJ

    Could it be the set flags (SYN/ACK)? And/or the state type, keep or sloppy?

  • Headers??

    Cache/Proxy
    4
    0 Votes
    4 Posts
    902 Views
    JonathanLeeJ

    @mcury Thanks for the information

  • 0 Votes
    1 Posts
    452 Views
    No one has replied
  • StoreID and Squid "helper program"

    Cache/Proxy
    16
    1 Votes
    16 Posts
    3k Views
    M

    @JonathanLee said in StoreID and Squid "helper program":

    Does anyone work with Store ID?

    Unfortunately no, I didn't. I splice all with the cache disabled.
    Squid/SquidGuard was just used to filter the SNI header.
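In squid.conf terms, a "splice all" setup is only a few lines (a sketch: peek at step 1 so Squid can read the SNI, then splice so the TLS stream passes through without MITM):

```
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump splice all
```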

  • 0 Votes
    9 Posts
    2k Views
    JonathanLeeJ

    @jimp thanks for looking into this. I will use that URL for future items. I did not know about that other URL until today.

  • Slow squid work after update to 2.7.0

    Cache/Proxy
    1
    0 Votes
    1 Posts
    581 Views
    No one has replied
  • 0 Votes
    2 Posts
    574 Views
    werterW

    Good day,
    @ao_kalachev
    Have you looked at the bug reports? There might be something there on this topic.

  • 0 Votes
    8 Posts
    5k Views
    P

    @JonathanLee Thanks Jonathan!