@NOCling said in Separate file system (and pool) to isolate the logs, to not compromise the operating system !:
If you want to analyse firewall logs in a large environment, you need an external syslog server.
You do not want to dump hundreds of GB of logs on a firewall and start parsing through them there.
I more than agree with you strategically! We are using an external syslog node, of course, thank you!
But I am asking particularly about how to isolate the logs on the pfSense node itself.
Let me drop my five cents about log aggregation and analysis:
You will typically run Graylog or some other Elasticsearch-driven stack to dig into them.
Searching for a single line in a 40 GB daily log is no fun.
And of course, if you are using Elasticsearch, even 10 GB/day needs a lot of resources on a separate node.
I recommend the “ClickHouse + Vector.dev + Redash” stack instead of “Elasticsearch + Fluentd + Kibana” (or Elasticsearch + anything): right from the start this gives you far better (7-10x) data compression and write optimization, on roughly 1/3-1/4 of the hardware resources. Aggregations like COUNT, SUM, and AVG over billions of rows run 10-100x faster in ClickHouse than in Elasticsearch.
And of course, better horizontal scaling with good consistency.
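As a rough sketch of the ingest side of that stack (the listen port, endpoint hostname, and database/table names below are placeholders I made up, not a tested config): Vector.dev can receive the firewall's remote syslog stream and write it straight into ClickHouse over the HTTP interface, something like:

```toml
# Receive remote syslog sent from pfSense
# (address/port are examples -- match them to your pfSense remote-logging settings)
[sources.pfsense_syslog]
type    = "syslog"
mode    = "udp"
address = "0.0.0.0:5514"

# Ship the parsed events into ClickHouse via its HTTP interface
# (endpoint, database, and table are hypothetical names)
[sinks.clickhouse_out]
type     = "clickhouse"
inputs   = ["pfsense_syslog"]
endpoint = "http://clickhouse.internal:8123"
database = "logs"
table    = "firewall_events"
```

On the ClickHouse side, a MergeTree table partitioned by day is what keeps those COUNT/SUM/AVG scans over billions of rows fast, and lets you expire old partitions cheaply instead of deleting rows.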