Squid Reverse Proxy or HAProxy?
-
The 'show' is a link you can click ;) But is it working alright now?

-
ooOOOO wow, thank you, I did not see that
global
    maxconn 500
    stats socket /tmp/haproxy.socket level admin
    uid 80
    gid 80
    nbproc 1
    chroot /tmp/haproxy_chroot
    daemon

listen HAProxyLocalStats
    bind 127.0.0.1:2200 name localstats
    mode http
    stats enable
    stats refresh 5
    stats admin if TRUE
    stats uri /haproxy_stats.php?haproxystats=1
    timeout client 5000
    timeout connect 5000
    timeout server 5000

frontend website-merged
    bind 200.116.XX.XX:80 name 200.116.XX.XX:80
    mode http
    log global
    option http-keep-alive
    timeout client 30000
    acl aclusr_host_matches_123.com hdr(host) -i 123.com
    acl aclusr_host_matches_mail.321.com_2fsquirrelmail hdr(host) -i mail.321.com/squirrelmail
    use_backend Website_http_ipvANY if aclusr_host_matches_123.com
    use_backend webemail_http_ipvANY if aclusr_host_matches_mail.321.com_2fsquirrelmail

backend Website_http_ipvANY
    mode http
    timeout connect 30000
    timeout server 30000
    retries 3
    option httpchk OPTIONS /
    server webiste 192.168.3.130:80 check inter 1000

backend webemail_http_ipvANY
    mode http
    timeout connect 30000
    timeout server 30000
    retries 3
    option httpchk OPTIONS /
    server webemail 192.168.3.150:80 check inter 1000
Thank you
-
You haven't exactly specified what is and isn't working currently, so I'm going to make a few assumptions...
I think the website http://123.com/ is working, is that correct?
If not, check what the stats page says in the LastChk column. Something like L7OK? Or perhaps an error status?

But the second one, "hdr(host) -i mail.321.com/squirrelmail", cannot work like that, as the "/squirrelmail" part is not in the Host header. You will need to specify another acl for it and then maybe combine the host and path if that's required for your setup. Perhaps only use the "mail.321.com" domain name to match the Host header against and leave it at that.
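Something along these lines, for example (just a rough sketch, the acl names are only illustrative; the path acl is only needed if the /squirrelmail path really has to decide the backend):

    acl host_is_mail hdr(host) -i mail.321.com
    acl path_is_squirrelmail path_beg -i /squirrelmail
    use_backend webemail_http_ipvANY if host_is_mail path_is_squirrelmail

Or simply drop the path acl and use_backend on the host acl alone.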
-
Thank you for the reply,
So I fixed what you commented on. The issue at hand is that when I change the webmail port to 80 it ignores the website because of the order of the firewall rules, so when I type 123.com it goes to mail.321.com.

Thank you again, see pictures.
global
    maxconn 500
    stats socket /tmp/haproxy.socket level admin
    uid 80
    gid 80
    nbproc 1
    chroot /tmp/haproxy_chroot
    daemon

listen HAProxyLocalStats
    bind 127.0.0.1:2200 name localstats
    mode http
    stats enable
    stats refresh 5
    stats admin if TRUE
    stats uri /haproxy_stats.php?haproxystats=1
    timeout client 5000
    timeout connect 5000
    timeout server 5000

frontend website-merged
    bind 200.116.XX.XX:80 name 200.116.XX.XX:80
    mode http
    log global
    option http-keep-alive
    timeout client 30000
    acl aclusr_host_matches_123.com hdr(host) -i 123.com
    acl aclusr_host_matches_mail.321.com hdr(host) -i mail.321.com
    use_backend Website_http_ipvANY if aclusr_host_matches_123.com
    use_backend webemail_http_ipvANY if aclusr_host_matches_mail.321.com

backend Website_http_ipvANY
    mode http
    timeout connect 30000
    timeout server 30000
    retries 3
    option httpchk OPTIONS /
    server webiste 192.168.3.130:80 check inter 1000

backend webemail_http_ipvANY
    mode http
    timeout connect 30000
    timeout server 30000
    retries 3
    option httpchk OPTIONS /
    server webemail 192.168.3.150:80 check inter 1000
-
Delete the (relevant) NAT rules, and add a regular firewall rule to allow people from outside to access haproxy. You don't need to use pf to forward the traffic, as haproxy should do that.
Check on the haproxy stats that you are actually seeing connections counted as going through haproxy.
Then, if firewall rules are allowing access to haproxy, you 'should' receive a 503 error for the website backend when visiting that website with a browser, as the server is 'down' according to haproxy due to the 405 HTTP response to the health check.
To fix this, change the health check on that website backend to something other than OPTIONS, as the server apparently does not support that. Try HEAD or perhaps GET; that might make the server green on the stats page instead of red. Otherwise also try adding a Host header in the version field.
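Something like this on the website backend, for example (a sketch only; adjust the path and the hostname to whatever that web server actually answers on):

    backend Website_http_ipvANY
        mode http
        option httpchk HEAD / HTTP/1.1\r\nHost:\ 123.com
        server webiste 192.168.3.130:80 check inter 1000

-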
Hi,
Thank you so much, that did the trick. Only on the website backend at the bottom, instead of HTTP I used the Basic option, and now it works. But what's odd is when I type 123.com it loads, but if I type www.123.com it shows the 503 service unavailable. I tried adding the www.123.com but then that just takes down the site, so no luck there.

Thank you again :)
-
The
acl aclusr_host_matches_123.com hdr(host) -i 123.com
does not exactly 'match' www.123.com, so it is correct that that doesn't work.
Adding a second acl line with the same acl name should work though.
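For example, something like this (a sketch using your obfuscated domain; two acl lines sharing one name act as an OR):

    acl aclusr_host_matches_123.com hdr(host) -i 123.com
    acl aclusr_host_matches_123.com hdr(host) -i www.123.com
    use_backend Website_http_ipvANY if aclusr_host_matches_123.com

In the GUI that would presumably be a second host-matches entry with the same acl name, pointing at the same backend.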
-
Thanks for the reply,
When you mean adding a second acl line, something like this?
See picture. But I'm not sure what you mean?
Thank you
-
Does your hostname 'match www'? Or was it 'www.123.com'? If you're configuring it to 'match' then it must match exactly; perhaps you want it only to 'contain www', or 'start with www'? Though that would prevent you from forwarding multiple sites that contain www to different backends. Probably best to just write the whole domain name you want it to match.
Yes, what you've configured in that screenshot looks like what I asked. Wondering though if maybe the acl name should be different; things have changed a little in the acl/action parts. And reading that little text about 'acls with same name will be combined' makes me think you're running an old haproxy package version, I haven't used that one for a while.
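Just to illustrate the difference (a sketch only, with your obfuscated domain; hdr_beg does a 'starts with' match):

    # exact host match, the safe option
    acl host_site hdr(host) -i www.123.com
    # 'starts with www.' match, catches every www host you serve
    acl host_any_www hdr_beg(host) -i www.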
-
Thank you so much. It was something with the acl, I fixed it, see picture. Yeah, it's HAProxy version 1.5, I'm still using pfSense 2.2.4.
Thank you