• HAProxy Query

    9
    0 Votes
    9 Posts
    614 Views
    A
    @viragomann Yeah, it was enough of a nightmare to get this VPN set up originally; anytime you involve TCS in something IT-related, you can add an extra hundred hours!
  • Can we use squid ?

    2
    0 Votes
    2 Posts
    383 Views
    JonathanLeeJ
    I use it. It has been updated, so the security issues are gone from it.
  • invalid characters

    Moved
    3
    0 Votes
    3 Posts
    334 Views
    B
    Thank you for your reply. Alas, there are no spaces. But I have more issues. I re-entered a new frontend, which did not throw any errors on Save. But on Apply Changes, I got:

    [ALERT] (56576) : config : parsing [/var/etc/haproxy_test/haproxy.cfg:15] : 'bind 192.168.42.1:443' in section 'frontend' : 'crt-list' : unable to load certificate from file '/var/etc/haproxy_test/MyProxyA.pem': no start line.
    [ALERT] (56576) : config : Error(s) found in configuration file : /var/etc/haproxy_test/haproxy.cfg
    [ALERT] (56576) : config : Fatal errors found in configuration.

    I have gotten this before, and when I navigated to /var/etc, the haproxy_test directory does not exist. I have no idea where that is being picked up from. I have uninstalled HAProxy, deleted the haproxy directory, and reinstalled HAProxy, and it still wants /var/etc/haproxy_test/haproxy.cfg, which does not exist. At this point I think my best bet is to restore a backup from before I started with HAProxy and start over.
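    "No start line" from OpenSSL generally means the .pem file is not valid PEM text at all — empty, truncated, or DER-encoded — so OpenSSL never finds a "-----BEGIN ..." marker. A minimal check one could run (the filename is taken from the error message above and is illustrative; the demo file it writes is just so the sketch is self-contained):

    ```shell
    #!/bin/sh
    # Stand-in file so this sketch runs on its own; in practice, point $pem
    # at the certificate file HAProxy complains about.
    pem="MyProxyA.pem"
    printf -- '-----BEGIN CERTIFICATE-----\nMIIBdemo\n-----END CERTIFICATE-----\n' > "$pem"

    # A PEM file must begin with a "-----BEGIN ..." line; "no start line"
    # means that marker was never found.
    if head -n 1 "$pem" | grep -q '^-----BEGIN '; then
        echo "PEM header present"
    else
        echo "not PEM text: re-export the certificate in PEM format"
    fi
    ```

    If the check fails on a real certificate, re-exporting it in PEM format (rather than DER/PKCS#12) is the usual fix.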
  • Squid problem after upgrade to 2.7.1

    12
    0 Votes
    12 Posts
    4k Views
    JonathanLeeJ
    https://github.com/pfsense/FreeBSD-ports/commit/476a7d0e3dca704b236839970f1d215912184f73 This is a known issue. I had a merge for a previous version, when you could still disable the older TLS, but that directive is no longer part of the latest Squid package.
  • HAProxy with an external modsecurity filter

    9
    4 Votes
    9 Posts
    6k Views
    M
    @lncc63 Hello, sorry for my ignorance, but can you explain how you dockerized the WAF from jcmoraisjr/modsecurity-spoa? Maybe this is not the right place to post this question, but on the jcmoraisjr git page I can't find any guide to pull the image. Can you provide any help? Thanks, Gianluca
  • Little help!

    1
    0 Votes
    1 Posts
    185 Views
    No one has replied
  • Help with getting second server working with haproxy

    3
    0 Votes
    3 Posts
    754 Views
    V
    @viragomann said in Help with getting second server working with haproxy:

    @vMAC said in Help with getting second server working with haproxy: Sometimes I get a 503 error, and other times I get a Redirected Too Many Times error.

    I'd consider these to be different issues. HAProxy gives a 503 if the backend state is offline or the backend does not respond as expected. So first ensure that HAProxy shows the backend as online in the stats. I'd switch over to a basic health check for testing. However, "redirected too many times" might come from the browser. It's best to use the browser's debugging mode to investigate what's going on here.

    Got it, so here is what I found. TrueNAS has an HTTP -> HTTPS redirect built into its settings. I had it checked; unchecking it has now stopped the too-many-redirects error, and it looks to have resolved my original issue. Thank you! I am now trying to set one up for my UniFi Cloud Controller, though, and it is giving me a TLS mismatch error as I am trying to redirect to port 8443: "Bad Request — This combination of host and port requires TLS."
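    That last error usually means HAProxy is forwarding plain HTTP to a port that only accepts TLS. A sketch of what the backend might need, assuming the UniFi controller listens with HTTPS on 8443 (the backend name, server name, address, and health-check URI here are all hypothetical):

    ```haproxy
    # Sketch: HAProxy re-encrypts toward the controller, since the
    # controller only speaks TLS on 8443.
    backend unifi_backend
        mode http
        # basic health check, as suggested above for testing
        option httpchk GET /status
        # "ssl" = connect to the server with TLS;
        # "verify none" skips validation of a self-signed certificate
        server unifi1 192.168.1.10:8443 ssl verify none check
    ```

    Without the "ssl" keyword on the server line, the backend sees cleartext HTTP on a TLS port and rejects it.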
  • 0 Votes
    3 Posts
    1k Views
    JonathanLeeJ
    Store ID program: I am using the built-in program attached here: /usr/local/libexec/squid/storeid_file_rewrite

    #!/usr/local/bin/perl
    use strict;
    use warnings;
    use Pod::Usage;

    =pod

    =head1 NAME

    storeid_file_rewrite - File based Store-ID helper for Squid

    =head1 SYNOPSIS

    storeid_file_rewrite filepath

    =head1 DESCRIPTION

    This program acts as a store_id helper program, rewriting URLs passed
    by Squid into storage-ids that can be used to achieve better caching
    for websites that use different URLs for the same content.

    It takes a text file with two tab separated columns.
    Column 1: Regular expression to match against the URL
    Column 2: Rewrite rule to generate a Store-ID
    Eg:
    ^http:\/\/[^\.]+\.dl\.sourceforge\.net\/(.*)	http://dl.sourceforge.net.squid.internal/$1

    Rewrite rules are matched in the same order as they appear in the rules file.
    So for best performance, sort it in order of frequency of occurrence.

    This program will automatically detect the existence of a concurrency
    channel-ID and adjust appropriately. It may be used with any value 0
    or above for the store_id_children concurrency= parameter.

    =head1 OPTIONS

    The only command line parameter this helper takes is the regex rules file name.

    =head1 AUTHOR

    This program and documentation was written by
    I<Alan Mizrahi <alan@mizrahi.com.ve>>
    Based on prior work by
    I<Eliezer Croitoru <eliezer@ngtech.co.il>>

    =head1 COPYRIGHT

     * Copyright (C) 1996-2023 The Squid Software Foundation and contributors
     *
     * Squid software is distributed under GPLv2+ license and includes
     * contributions from numerous individuals and organizations.
     * Please see the COPYING and CONTRIBUTORS files for details.

    Copyright (C) 2013 Alan Mizrahi <alan@mizrahi.com.ve>
    Based on code from Eliezer Croitoru <eliezer@ngtech.co.il>

    This program is free software; you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation; either version 2 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License
    along with this program; if not, write to the Free Software
    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307, USA.

    =head1 QUESTIONS

    Questions on the usage of this program can be sent to the
    I<Squid Users mailing list <squid-users@lists.squid-cache.org>>

    =head1 REPORTING BUGS

    Bug reports need to be made in English.
    See http://wiki.squid-cache.org/SquidFaq/BugReporting for details of
    what you need to include with your bug report.

    Report bugs or bug fixes using http://bugs.squid-cache.org/
    Report serious security bugs to I<Squid Bugs <squid-bugs@lists.squid-cache.org>>
    Report ideas for new improvements to the I<Squid Developers mailing list <squid-dev@lists.squid-cache.org>>

    =head1 SEE ALSO

    squid (8), GPL (7),
    The Squid wiki http://wiki.squid-cache.org/Features/StoreID
    The Squid Configuration Manual http://www.squid-cache.org/Doc/config/

    =cut

    my @rules; # array of [regex, replacement string]

    die "Usage: $0 <rewrite-file>\n" unless $#ARGV == 0;

    # read config file
    open RULES, $ARGV[0] or die "Error opening $ARGV[0]: $!";
    while (<RULES>) {
        chomp;
        next if /^\s*#?$/;
        if (/^\s*([^\t]+?)\s*\t+\s*([^\t]+?)\s*$/) {
            push(@rules, [qr/$1/, $2]);
        } else {
            print STDERR "$0: Parse error in $ARGV[0] (line $.)\n";
        }
    }
    close RULES;

    $|=1;

    # read urls from squid and do the replacement
    URL: while (<STDIN>) {
        chomp;
        last if $_ eq 'quit';
        my $channel = "";
        if (s/^(\d+\s+)//o) {
            $channel = $1;
        }
        foreach my $rule (@rules) {
            if (my @match = /$rule->[0]/) {
                $_ = $rule->[1];
                for (my $i=1; $i<=scalar(@match); $i++) {
                    s/\$$i/$match[$i-1]/g;
                }
                print $channel, "OK store-id=$_\n";
                next URL;
            }
        }
        print $channel, "ERR\n";
    }
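    For context, a helper like this is wired into Squid via the store_id_program directive. A sketch with example paths (the rules-file location is hypothetical; the directives themselves are from the Squid documentation):

    ```
    # squid.conf sketch -- file locations are examples only
    store_id_program /usr/local/libexec/squid/storeid_file_rewrite /usr/local/etc/squid/storeid.rules
    store_id_children 5 startup=1 idle=1 concurrency=0

    # storeid.rules: two TAB-separated columns, regex then Store-ID template, e.g.:
    # ^http:\/\/[^\.]+\.dl\.sourceforge\.net\/(.*)<TAB>http://dl.sourceforge.net.squid.internal/$1
    ```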
  • Dynamic Items within a Web Page?

    2
    0 Votes
    2 Posts
    499 Views
    JonathanLeeJ
    To quote Squid email support:

    "On 1/01/25 21:21, Robin Wood wrote: I'm going to massively oversimplify things here, but you can think of it like this. Files with html extensions are static web pages: you write them, put them on the server, and they are served as they are, no changes. ASP and the others are dynamic files; they are processed by an app on the server before they are sent to the client. This app may do nothing, so the page comes as it was, but usually it will add content. This content could be a CMS page built by pulling the page content from a database, it could be your shopping orders pulled from your account, or it could be your current bank statement. Caching should never be done on anything that is specific to a single user, so it's fine to cache public CMS content with an asp extension, but not your bank statement. There is more to it than that, but hopefully that gives you a general idea.

    That is mostly correct for simple HTTP/1.0-like behaviour. With HTTP/1.1 and later, things are a little different. The biggest change is that the URL no longer matters. The Content-Type replaces the "file extension" entirely, and Cache-Control headers take over the job of defining how and when something can be cached.

    For Squid, the refresh_pattern directive is what provides compatibility with HTTP/1.0 behaviour. It provides values for any Cache-Control settings the server omitted (e.g. for servers still acting like HTTP/1.0). The default "refresh_pattern -i (/cgi-bin/|\?) 0 0% 0" configuration line tells Squid the values which will perform HTTP/1.0 caching behaviour for any of the dynamic content coming out of broken or old cgi-bin services, or anything with a query-string ('?...') URL.

    Jonathan: if you have not changed the refresh_patterns, you do not have to care specifically about dynamic-vs-static content caching. Whether it is plain-text HTTP(S) or SSL-Bump'ed HTTPS, it should all cache properly for its server-claimed needs.

    Your "cache deny" policy in squid.conf is telling Squid never to cache any URL containing the ACL-matching strings, even if they could be cached safely.

    HTH, Amos"

    Basically, per the email support, you do not really need to add this weird rule: if you have not changed the refresh_patterns, you do not have to care specifically about dynamic-vs-static content caching. I do not know if anyone else wonders about this; I stumbled on that rule on several different websites. It is not built into the Squid package, but it seems to be something others have been adding.
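    For reference, the stock refresh_pattern block being described ships in the default squid.conf as:

    ```
    # Squid's default refresh_pattern lines: min (minutes), percent, max (minutes)
    refresh_pattern ^ftp:             1440    20%     10080
    refresh_pattern ^gopher:          1440    0%      1440
    refresh_pattern -i (/cgi-bin/|\?) 0       0%      0
    refresh_pattern .                 0       20%     4320
    ```

    If these are unchanged, dynamic content is already handled conservatively, which is the point Amos makes above.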
  • Haproxy Cloudflare restoring original ip

    5
    0 Votes
    5 Posts
    2k Views
    V
    @kennethg01 Did you notice that the real client's IP is only sent to the backend server as the value of the "X-Forwarded-For" header? You have to configure your web server to log this header, since this is not done by default.
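    As an illustration, assuming the backend runs Apache, logging the header is one LogFormat change (the log path and format name here are examples; nginx has an equivalent via $http_x_forwarded_for):

    ```
    # Apache sketch: use the X-Forwarded-For request header in place of
    # %h (the peer IP, which behind a proxy is always the proxy itself)
    LogFormat "%{X-Forwarded-For}i %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" proxied
    CustomLog /var/log/httpd-access.log proxied
    ```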
  • Problem accessing Spotify via web browser app through Squid

    12
    0 Votes
    12 Posts
    2k Views
    JonathanLeeJ
    @michmoor Did you ever configure Squid with HTTP tproxy? I tested this and it was amazing, but again, any reboot or enable/disable reset puts it back to the old way.
  • HAProxy not working for 1 site

    15
    0 Votes
    15 Posts
    1k Views
    V
    @CreationGuy What did you try? How did you access the server? From inside your network or from outside? Which URL? What exactly did you get?
  • Issue with HAProxy and Kubernetes Ingress Controller in Proxy Mode

    2
    0 Votes
    2 Posts
    686 Views
    M
    I managed to resolve the problem by removing the frontend name. After making this change, everything started working normally. Updated frontend configuration:

        mode tcp
        bind *:443
        timeout client 30s
        use_backend k8s-ssl-pass-thru

    By simplifying the configuration and removing the unnecessary frontend name, the setup became functional. If anyone else is facing similar issues, I recommend checking if any redundant configuration elements can be removed.
  • 0 Votes
    8 Posts
    857 Views
    J
    @JeGr Many thanks. I had performed the upgrade on an SG4680 and a 6100 and still got the ELF error (no CE on prod). I'll try the upgrade to 24.11 over the weekend and check whether I see the same library problem on these and the fallback machines.
  • SSL intercept https squid and ClamAV updates over cron external swap

    1
    0 Votes
    1 Posts
    148 Views
    No one has replied
  • How to update ClamAV

    14
    0 Votes
    14 Posts
    3k Views
    JonathanLeeJ
    I use SSL intercept and it does scan HTTPS traffic. With protocols like DoH (DNS over HTTPS), pfBlocker is just whack-a-mole. Squid is a pain to configure with SSL intercept, but it works great once it is configured. ClamAV is a pain when it updates; it hogs resources. So I use cron, and it updates in the early hours.
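    A sketch of the kind of cron entry meant here (the time and the freshclam path are examples; the binary's location varies by install):

    ```
    # /etc/crontab format (minute hour day month weekday user command):
    # run ClamAV signature updates quietly at 03:30, outside business hours
    30 3 * * * root /usr/local/bin/freshclam --quiet
    ```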
Copyright 2025 Rubicon Communications LLC (Netgate). All rights reserved.