Squid 3.3.4 package for pfSense with ssl filtering
-
For what it's worth, I'm using latest Squid3-dev (3.3.10 pkg 2.2.2) with ssl intercept and am able to browse to that moto360 site with no error.
Can you give me a bit more detail on how you resolved this? My SSL intercept seems to be working for most sites, but a few, such as eBay payments or the moto360 site (https://moto360.motorola.com/), fail with the same error about not recognising the CA.
Many thanks.
-
That's very odd, as I'm on exactly the same build and using ssl intercept/transparent. Are there any specific configuration parameters you've changed or added in your squid setup?
Also, do you have any problems posting messages to https sites (e.g. this forum) when going through the proxy? I keep getting an error when posting, and so have to use a machine that doesn't go through the proxy.
-
Ah, I actually had "do not verify remote cert" set and did not realize it. When I unset that, I get the same error at moto360.
Perhaps that particular root is not in the store that the squid package is using; if that's the case, this should not be difficult to correct.
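One way to check which root the site's chain actually ends at (plain OpenSSL, nothing package-specific — run it from any box with openssl installed):

  # Dump the chain the server sends; the final "i:" line names the root CA
  # that would need to be present in the trust store squid uses.
  openssl s_client -connect moto360.motorola.com:443 -showcerts </dev/null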
-
Thanks for spotting that, and yes, selecting that option allows me to visit such sites. With regards to the certificate squid is complaining about, it is from Thawte, so one would have assumed it would be included. However, I did notice that the certificate's CN was "Thawte Inc", but none of the Thawte certificates in /usr/local/share/certs had exactly the same CN (the ones there are Thawte Premium Server, Thawte Primary Root, etc.).
Also, are you able to post to https sites, such as this forum, when going through squid? I am still getting the following message unless I use a PC that bypasses the proxy:
Error 403
We're sorry, but we could not fulfill your request for /index.php?action=post2;start=270;board=15 on this server.
You do not have permission to access this server. Before trying again, close your browser, run anti-virus and anti-spyware software and remove any viruses and spyware from your computer.
Your technical support key is: 5688-9cbf-b40c-8ddc
You can use this key to fix this problem yourself.
If you are unable to fix the problem yourself, please contact the WEBMA5TER and be sure to provide the technical support key shown above.
UPDATE: Actually the above posting error seems to be an issue with my Chrome setup, as posting is working fine from IE.
-
I do not have any issue posting from Firefox.
Should be pretty simple to just drop that root CA cert in the folder /usr/pbi/squid-amd64/share/certs. I'm just not sure about the naming convention used; it seems to be based on a hash. Likely squid uses the filename to locate the correct cert. Just guessing here.
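If it follows the usual OpenSSL hashed-directory convention (an assumption on my part, not something the package documents), the filename would be the certificate's subject hash plus a .0 suffix, which you can generate like this:

  # Sketch: install a root CA the way OpenSSL's c_rehash would name it
  # (thawte_root.pem is a hypothetical filename for the missing root).
  cp thawte_root.pem /usr/pbi/squid-amd64/share/certs/
  cd /usr/pbi/squid-amd64/share/certs
  ln -s thawte_root.pem "$(openssl x509 -noout -subject_hash -in thawte_root.pem).0"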
-
Has anyone figured out how to do load balancing of squid traffic on 2.1.x or 2.2 alpha?
-
I'm already running squid3 version 3.3.4 and it's working normally; it can cache http/https with one interface (used as a proxy box). How can I get squid3 working with MikroTik?
-
The new RDS protocol 8.0 doesn't work any more; any ideas?
-
Hi,
can anyone help me figure out why I get "proxy refuses connection"?
Squid3-dev package
Squidguard3
System Patch -> squidguard fix

Configs:
# This file is automatically generated by pfSense
# Do not edit manually !
http_port 10.66.106.65:8081
icp_port 7
dns_v4_first on
pid_filename /var/run/squid.pid
cache_effective_user proxy
cache_effective_group proxy
error_default_language el
icon_directory /usr/pbi/squid-amd64/etc/squid/icons
visible_hostname Proxy-
cache_mgr it@skata.gr
access_log /var/squid/logs/access.log
cache_log /var/squid/logs/cache.log
cache_store_log none
netdb_filename /var/squid/logs/netdb.state
pinger_enable on
pinger_program /usr/pbi/squid-amd64/libexec/squid/pinger
logfile_rotate 0
debug_options rotate=0
shutdown_lifetime 3 seconds

# Allow local network(s) on interface(s)
acl localnet src 10.66.106.64/26
uri_whitespace strip

acl dynamic urlpath_regex cgi-bin ?
cache deny dynamic

cache_mem 200 MB
maximum_object_size_in_memory 1024 KB
memory_replacement_policy heap GDSF
cache_replacement_policy heap LFUDA
cache_dir ufs /var/squid/cache 4096 16 256
minimum_object_size 10 KB
maximum_object_size 2048 KB
offline_mode off
cache_swap_low 85
cache_swap_high 95
cache allow all

# No redirector configured
# Remote proxies

# Setup some default acls
# From 3.2 further configuration cleanups have been done to make things easier and safer.
# The manager, localhost, and to_localhost ACL definitions are now built-in.
# acl localhost src 127.0.0.1/32
acl allsrc src all
acl safeports port 21 70 80 210 280 443 488 563 591 631 777 901 3128 3127 1025-65535
acl sslports port 443 563
# From 3.2 further configuration cleanups have been done to make things easier and safer.
# The manager, localhost, and to_localhost ACL definitions are now built-in.
#acl manager proto cache_object
acl purge method PURGE
acl connect method CONNECT

# Define protocols used for redirects
acl HTTP proto HTTP
acl HTTPS proto HTTPS

acl allowed_subnets src 10.66.106.64/26

http_access allow manager localhost
# Allow external cache managers
acl ext_manager src 10.66.106.65
http_access allow manager ext_manager
http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !safeports
http_access deny CONNECT !sslports

# Always allow localhost connections
# From 3.2 further configuration cleanups have been done to make things easier and safer.
# The manager, localhost, and to_localhost ACL definitions are now built-in.
# http_access allow localhost

request_body_max_size 0 KB
delay_pools 1
delay_class 1 2
delay_parameters 1 -1/-1 -1/-1
delay_initial_bucket_level 100
delay_access 1 allow allsrc

# Reverse Proxy settings

# Package Integration
url_rewrite_program /usr/pbi/squidguard-squid3-amd64/bin/squidGuard -c /usr/pbi/squidguard-squid3-amd64/etc/squidGuard/squidGuard.conf
url_rewrite_bypass off
url_rewrite_children 16 startup=8 idle=4 concurrency=0

# Custom options before auth
acl sglog url_regex -i sgr=ACCESSDENIED
http_access deny sglog

# Setup allowed acls
# Allow local network(s) on interface(s)
http_access allow allowed_subnets
http_access allow localnet
# Default block all to be sure
http_access deny allsrc

icap_enable on
icap_send_client_ip on
icap_send_client_username on
icap_client_username_encode off
icap_client_username_header X-Authenticated-User
icap_preview_enable on
icap_preview_size 1024
icap_service service_req reqmod_precache bypass=0 icap://127.0.0.1:1344/squidclamav
icap_service service_resp respmod_precache bypass=0 icap://127.0.0.1:1344/squidclamav
adaptation_access service_req allow all
adaptation_access service_resp allow all
# ============================================================
# SquidGuard configuration file
# This file generated automaticly with SquidGuard configurator
# (C)2006 Serg Dvoriancev
# email: dv_serg@mail.ru
# ============================================================
logdir /var/squidGuard/log
dbhome /var/db/squidGuard

dest blk_BL_adv { domainlist blk_BL_adv/domains urllist blk_BL_adv/urls log block.log }
dest blk_BL_aggressive { domainlist blk_BL_aggressive/domains urllist blk_BL_aggressive/urls log block.log }
dest blk_BL_alcohol { domainlist blk_BL_alcohol/domains urllist blk_BL_alcohol/urls log block.log }
dest blk_BL_anonvpn { domainlist blk_BL_anonvpn/domains urllist blk_BL_anonvpn/urls log block.log }
dest blk_BL_automobile_bikes { domainlist blk_BL_automobile_bikes/domains urllist blk_BL_automobile_bikes/urls log block.log }
dest blk_BL_automobile_boats { domainlist blk_BL_automobile_boats/domains urllist blk_BL_automobile_boats/urls log block.log }
dest blk_BL_automobile_cars { domainlist blk_BL_automobile_cars/domains urllist blk_BL_automobile_cars/urls log block.log }
dest blk_BL_automobile_planes { domainlist blk_BL_automobile_planes/domains urllist blk_BL_automobile_planes/urls log block.log }
dest blk_BL_chat { domainlist blk_BL_chat/domains urllist blk_BL_chat/urls log block.log }
dest blk_BL_costtraps { domainlist blk_BL_costtraps/domains urllist blk_BL_costtraps/urls log block.log }
dest blk_BL_dating { domainlist blk_BL_dating/domains urllist blk_BL_dating/urls log block.log }
dest blk_BL_downloads { domainlist blk_BL_downloads/domains urllist blk_BL_downloads/urls log block.log }
dest blk_BL_drugs { domainlist blk_BL_drugs/domains urllist blk_BL_drugs/urls log block.log }
dest blk_BL_dynamic { domainlist blk_BL_dynamic/domains urllist blk_BL_dynamic/urls log block.log }
dest blk_BL_education_schools { domainlist blk_BL_education_schools/domains urllist blk_BL_education_schools/urls log block.log }
dest blk_BL_finance_banking { domainlist blk_BL_finance_banking/domains urllist blk_BL_finance_banking/urls log block.log }
dest blk_BL_finance_insurance { domainlist blk_BL_finance_insurance/domains urllist blk_BL_finance_insurance/urls log block.log }
dest blk_BL_finance_moneylending { domainlist blk_BL_finance_moneylending/domains urllist blk_BL_finance_moneylending/urls log block.log }
dest blk_BL_finance_other { domainlist blk_BL_finance_other/domains urllist blk_BL_finance_other/urls log block.log }
dest blk_BL_finance_realestate { domainlist blk_BL_finance_realestate/domains urllist blk_BL_finance_realestate/urls log block.log }
dest blk_BL_finance_trading { domainlist blk_BL_finance_trading/domains urllist blk_BL_finance_trading/urls log block.log }
dest blk_BL_fortunetelling { domainlist blk_BL_fortunetelling/domains urllist blk_BL_fortunetelling/urls log block.log }
dest blk_BL_forum { domainlist blk_BL_forum/domains urllist blk_BL_forum/urls log block.log }
dest blk_BL_gamble { domainlist blk_BL_gamble/domains urllist blk_BL_gamble/urls log block.log }
dest blk_BL_government { domainlist blk_BL_government/domains urllist blk_BL_government/urls log block.log }
dest blk_BL_hacking { domainlist blk_BL_hacking/domains urllist blk_BL_hacking/urls log block.log }
dest blk_BL_hobby_cooking { domainlist blk_BL_hobby_cooking/domains urllist blk_BL_hobby_cooking/urls log block.log }
dest blk_BL_hobby_games-misc { domainlist blk_BL_hobby_games-misc/domains urllist blk_BL_hobby_games-misc/urls log block.log }
dest blk_BL_hobby_games-online { domainlist blk_BL_hobby_games-online/domains urllist blk_BL_hobby_games-online/urls log block.log }
dest blk_BL_hobby_gardening { domainlist blk_BL_hobby_gardening/domains urllist blk_BL_hobby_gardening/urls log block.log }
dest blk_BL_hobby_pets { domainlist blk_BL_hobby_pets/domains urllist blk_BL_hobby_pets/urls log block.log }
dest blk_BL_homestyle { domainlist blk_BL_homestyle/domains urllist blk_BL_homestyle/urls log block.log }
dest blk_BL_hospitals { domainlist blk_BL_hospitals/domains urllist blk_BL_hospitals/urls log block.log }
dest blk_BL_imagehosting { domainlist blk_BL_imagehosting/domains urllist blk_BL_imagehosting/urls log block.log }
dest blk_BL_isp { domainlist blk_BL_isp/domains urllist blk_BL_isp/urls log block.log }
dest blk_BL_jobsearch { domainlist blk_BL_jobsearch/domains urllist blk_BL_jobsearch/urls log block.log }
dest blk_BL_library { domainlist blk_BL_library/domains urllist blk_BL_library/urls log block.log }
dest blk_BL_military { domainlist blk_BL_military/domains urllist blk_BL_military/urls log block.log }
dest blk_BL_models { domainlist blk_BL_models/domains urllist blk_BL_models/urls log block.log }
dest blk_BL_movies { domainlist blk_BL_movies/domains urllist blk_BL_movies/urls log block.log }
dest blk_BL_music { domainlist blk_BL_music/domains urllist blk_BL_music/urls log block.log }
dest blk_BL_news { domainlist blk_BL_news/domains urllist blk_BL_news/urls log block.log }
dest blk_BL_podcasts { domainlist blk_BL_podcasts/domains urllist blk_BL_podcasts/urls log block.log }
dest blk_BL_politics { domainlist blk_BL_politics/domains urllist blk_BL_politics/urls log block.log }
dest blk_BL_porn { domainlist blk_BL_porn/domains urllist blk_BL_porn/urls log block.log }
dest blk_BL_radiotv { domainlist blk_BL_radiotv/domains urllist blk_BL_radiotv/urls log block.log }
dest blk_BL_recreation_humor { domainlist blk_BL_recreation_humor/domains urllist blk_BL_recreation_humor/urls log block.log }
dest blk_BL_recreation_martialarts { domainlist blk_BL_recreation_martialarts/domains urllist blk_BL_recreation_martialarts/urls log block.log }
dest blk_BL_recreation_restaurants { domainlist blk_BL_recreation_restaurants/domains urllist blk_BL_recreation_restaurants/urls log block.log }
dest blk_BL_recreation_sports { domainlist blk_BL_recreation_sports/domains urllist blk_BL_recreation_sports/urls log block.log }
dest blk_BL_recreation_travel { domainlist blk_BL_recreation_travel/domains urllist blk_BL_recreation_travel/urls log block.log }
dest blk_BL_recreation_wellness { domainlist blk_BL_recreation_wellness/domains urllist blk_BL_recreation_wellness/urls log block.log }
dest blk_BL_redirector { domainlist blk_BL_redirector/domains urllist blk_BL_redirector/urls log block.log }
dest blk_BL_religion { domainlist blk_BL_religion/domains urllist blk_BL_religion/urls log block.log }
dest blk_BL_remotecontrol { domainlist blk_BL_remotecontrol/domains urllist blk_BL_remotecontrol/urls log block.log }
dest blk_BL_ringtones { domainlist blk_BL_ringtones/domains urllist blk_BL_ringtones/urls log block.log }
dest blk_BL_science_astronomy { domainlist blk_BL_science_astronomy/domains urllist blk_BL_science_astronomy/urls log block.log }
dest blk_BL_science_chemistry { domainlist blk_BL_science_chemistry/domains urllist blk_BL_science_chemistry/urls log block.log }
dest blk_BL_searchengines { domainlist blk_BL_searchengines/domains urllist blk_BL_searchengines/urls log block.log }
dest blk_BL_sex_education { domainlist blk_BL_sex_education/domains urllist blk_BL_sex_education/urls log block.log }
dest blk_BL_sex_lingerie { domainlist blk_BL_sex_lingerie/domains urllist blk_BL_sex_lingerie/urls log block.log }
dest blk_BL_shopping { domainlist blk_BL_shopping/domains urllist blk_BL_shopping/urls log block.log }
dest blk_BL_socialnet { domainlist blk_BL_socialnet/domains urllist blk_BL_socialnet/urls log block.log }
dest blk_BL_spyware { domainlist blk_BL_spyware/domains urllist blk_BL_spyware/urls log block.log }
dest blk_BL_tracker { domainlist blk_BL_tracker/domains urllist blk_BL_tracker/urls log block.log }
dest blk_BL_updatesites { domainlist blk_BL_updatesites/domains urllist blk_BL_updatesites/urls log block.log }
dest blk_BL_urlshortener { domainlist blk_BL_urlshortener/domains urllist blk_BL_urlshortener/urls log block.log }
dest blk_BL_violence { domainlist blk_BL_violence/domains urllist blk_BL_violence/urls log block.log }
dest blk_BL_warez { domainlist blk_BL_warez/domains urllist blk_BL_warez/urls log block.log }
dest blk_BL_weapons { domainlist blk_BL_weapons/domains urllist blk_BL_weapons/urls log block.log }
dest blk_BL_webmail { domainlist blk_BL_webmail/domains urllist blk_BL_webmail/urls log block.log }
dest blk_BL_webphone { domainlist blk_BL_webphone/domains urllist blk_BL_webphone/urls log block.log }
dest blk_BL_webradio { domainlist blk_BL_webradio/domains urllist blk_BL_webradio/urls log block.log }
dest blk_BL_webtv { domainlist blk_BL_webtv/domains urllist blk_BL_webtv/urls log block.log }
dest Cat-Custom { domainlist Cat-Custom/domains redirect http://10.64.132.104/error.php?a=%a&n=%n&i=%i&s=%s&t=%t&u=%u log block.log }
dest Cat-Facebook { domainlist Cat-Facebook/domains redirect http://10.64.132.104/error.php?a=%a&n=%n&i=%i&s=%s&t=%t&u=%u log block.log }

rew safesearch {
    s@(google..*/search?.*q=.*)@&safe=active@i
    s@(google..*/images.*q=.*)@&safe=active@i
    s@(google..*/groups.*q=.*)@&safe=active@i
    s@(google..*/news.*q=.*)@&safe=active@i
    s@(yandex..*/yandsearch?.*text=.*)@&fyandex=1@i
    s@(search.yahoo..*/search.*p=.*)@&vm=r&v=1@i
    s@(search.live..*/.*q=.*)@&adlt=strict@i
    s@(search.msn..*/.*q=.*)@&adlt=strict@i
    s@(.bing..*/.*q=.*)@&adlt=strict@i
    log block.log
}

acl {
    default {
        pass !Cat-Custom !Cat-Facebook !blk_BL_drugs !blk_BL_gamble !blk_BL_hobby_games-misc !blk_BL_hobby_games-online !blk_BL_porn !blk_BL_sex_education !blk_BL_sex_lingerie !blk_BL_warez all
        redirect http://10.64.132.104/error.php?a=%a&n=%n&i=%i&s=%s&t=%t&u=%u
        log block.log
    }
}
-
Hi all, I've read through the whole post.
I have transparent ssl filtering working; however, I am having issues with update servers like Windows Update, Adobe Creative Cloud, Google Update, and others.
I tried to create an alias with some of the whitelisted links for Windows Update, but they are not working. Is there a better way to whitelist these update servers? Even if I get aliases working, I have to manually find each server, find its URL, and add it to the aliases. There could be hundreds of update servers I would need to do this for. Thanks
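For what it's worth, one approach with squid 3.x is to skip bumping for update hosts entirely so their TLS passes through untouched. This is only a sketch — it assumes the pfSense package lets you add custom squid options, and the domain list here is illustrative, not complete:

  # Don't intercept TLS for update services; tunnel their traffic instead.
  acl update_hosts dstdomain .windowsupdate.com .update.microsoft.com .adobe.com
  ssl_bump none update_hosts
  ssl_bump server-first all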
-
Hi xtrego,
set
  icap_service service_req reqmod_precache bypass=0 icap://127.0.0.1:1344/squidclamav
to
  icap_service service_req reqmod_precache bypass=1 icap://127.0.0.1:1344/squidclamav
and do the same for the second (respmod) line.
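With both services changed, the ICAP lines from the squid.conf posted above would read:

  icap_service service_req reqmod_precache bypass=1 icap://127.0.0.1:1344/squidclamav
  icap_service service_resp respmod_precache bypass=1 icap://127.0.0.1:1344/squidclamav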
-
Bumping this issue; perhaps it should be broken off to a new thread, but I'm not sure how.
I just updated to the latest squid package, 2.2.8, and the issue remains.
The word "round-robin" appears in the cache_peer lines inside squid.conf. This creates a situation where requests are alternately sent to the wrong peer when multiple web servers are added that are not serving the same site… i.e. they are not load balancing. I believe "round-robin" should either be removed or exposed as a checkbox in the GUI under the Web Servers tab.
The original report: I think there is a mistake in the reverse proxy config. I was having trouble, so I read the squid.conf in pbi/…/etc and found the directive
round-robin
even though I don't want that, since my servers are independent of each other. I suggest either adding a checkbox for it or removing the directive. Thanks!
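For independent backends, each server arguably needs its own cache_peer plus a cache_peer_access rule and no round-robin option. A hand-written sketch with hypothetical names and addresses, not what the package currently generates:

  # Each origin server is its own peer; ACLs route each site to its peer.
  cache_peer 192.168.1.10 parent 80 0 no-query originserver name=site_a
  cache_peer 192.168.1.20 parent 80 0 no-query originserver name=site_b
  acl host_a dstdomain www.site-a.example
  acl host_b dstdomain www.site-b.example
  cache_peer_access site_a allow host_a
  cache_peer_access site_b allow host_b
-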
Squid is at 3.4.9; 3.5 is around the corner…
-
Everything works fine… but the caching is not good at all :(
-
Please add an option in the GUI to disable SSLv2 and SSLv3 (POODLE vulnerability). Thanks!
-
Please add an option in the GUI to disable SSLv2 and SSLv3 (POODLE vulnerability). Thanks!
Are you talking about the reverse HTTPS proxy?
You can already disable SSLv2 and SSLv3 with a trick. Anything you put on the line for 'Reverse HTTPS default site' will be copied to the relevant spot in the squid.conf file. So instead of just www.example.com you can put:
www.example.com options=NO_SSLv2,NO_SSLv3 cipher=ALL:!aNULL:!eNULL:!LOW:!EXP:!ADH:!RC4+RSA:+HIGH:+MEDIUM:!SSLv2
This way you get a grade B at https://www.ssllabs.com/ssltest/. I have found no way yet to enable TLS 1.2 in squid.
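Since the defaultsite text ends up appended to the generated https_port line, the result in squid.conf should look roughly like this (the address and cert path are placeholders, not what the package writes):

  https_port 203.0.113.1:443 accel cert=/path/to/server.crt defaultsite=www.example.com options=NO_SSLv2,NO_SSLv3 cipher=ALL:!aNULL:!eNULL:!LOW:!EXP:!ADH:!RC4+RSA:+HIGH:+MEDIUM:!SSLv2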
-
I found no way yet to enable TLS 1.2 in squid.
For the record: with the latest pfSense 2.2 I do get TLSv1.2 as well. Probably because OpenSSL 1.0.1k is now included?
-
Everything works fine… but the caching is not good at all
In my testing, I found that Squid only managed a hit rate of about 5-7% at my company. The dynamic nature of today's web makes it very challenging for caches. Plus, with high-speed links and tons of bandwidth, Squid seems to get more use here as the base for SquidGuard filtering than it does for caching content.
-
Yup, I use Squid for the very same reason. I keep the cache set to "null" for no local caching now, since I have 110/20 Mbps speeds and don't need a local cache. Now with ICAP and Clamd I am getting a bit more functionality out of it.
If there were a way to remove the dependency of Clamd and dans/e2guardian on squid, I would have installed those packages separately and never looked at Squid.
-
Good Evening,
Can anyone provide instructions on how to configure squidclamav to update definitions twice per day, at noon and midnight?
I have installed the Cron package and can SSH in, but have been unable to determine the next steps and the appropriate script to make it auto-update.
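A minimal sketch of the crontab entry — freshclam is ClamAV's definition updater, and the path below is an assumption based on where the PBI packages in this thread live, so check the real location first (e.g. find / -name freshclam):

  # /etc/crontab format: minute 0 of hours 0 and 12 = midnight and noon
  0 0,12 * * * root /usr/pbi/squidclamav-amd64/bin/freshclam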
Best-
Darren