Sarg package for pfsense
-
Yes. Every 60 days.
-
-
hi guys, i've been reading this thread. I just installed Sarg today.
I'm using Squid Transparent with SquidGuard. Just standard config, with log turned on (log rotate also).
I can view Realtime just fine, but I can't seem to generate a report when I try forcing a sched with the following args:
-d `date +%d/%m/%Y`-`date +%d/%m/%Y`
I just get this:
Error: Could not find report index file.
Check and save sarg settings and try to force sarg schedule.
Should I do any special config to make it work?
-
Should I do any special config to make it work?
Yes, check all sarg config options, pick the reports to generate, and create a schedule to run.
Default sarg options have (yes) after their description. Select all to create a default config.
-
Should I do any special config to make it work?
Yes, check all sarg config options, pick the reports to generate, and create a schedule to run.
Default sarg options have (yes) after their description. Select all to create a default config.
Thank you. I just had to select (ctrl+click to highlight) the config options and then click save. I got confused because I thought they were already enabled since they already have a (yes) on them.
-
I have Sarg running on multiple pfsense boxes. One of my boxes has about 100 users behind it and the report will only work for about the first 4 hours after I wipe out the squid logs. After that I am guessing the squid log gets too big and the sarg report will no longer work.
I am using the -d arguments and I have tried limiting the number of users.
Any suggestions on how I can get sarg to accept a larger log file?
-
Any suggestions on how I can get sarg to accept a larger log file?
I have large files working fine.
Try running sarg on the console to check what it returns.
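For reference, a manual run from the console looks something like this. This is only an illustration (-x gives debug output); the paths follow the defaults that show up elsewhere in this thread, so adjust them to your install:

```sh
# illustrative manual sarg run for today's date range (not from the thread)
/usr/local/bin/sarg -x \
    -d `date +%d/%m/%Y`-`date +%d/%m/%Y` \
    -l /var/squid/logs/access.log \
    -o /usr/local/sarg-reports
```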
-
Any suggestions on how I can get sarg to accept a larger log file?
I have large files working fine.
Try running sarg on the console to check what it returns.
Seems to be working fine now. I just need to figure out my schedule because, like others, my report is pretty empty at 00:00. I need to figure out Cron now.
I have highlighted what I am questioning. Is this rotating my squid logs even after I have set them not to rotate?
![cron sarg.PNG](/public/imported_attachments/1/cron sarg.PNG)
-
Check your squid config, because it's not created by sarg.
-
Check your squid config, because it's not created by sarg.
This is my squid config. Rotation should be disabled.
-
I think it is working now. Thanks for all your help Marcelloc
-
Marcelloc, I am not sure if this is a bug or if I am doing something / missing something.
I would like to provide access to the Sarg reports to a few users. When I give them permissions via the user manager to the Sarg reports, it does not work fully.
The real time logs work, but when you try to view reports it just flickers non-stop. It looks like it is trying to load the sarg reports frame inside the sarg reports frame.
Attached are the permissions I am giving the user. Is there an easier way, or is this a bug?
-
Looks like it is trying to load the sarg reports frame inside the sarg reports frame.
Reinstall the sarg package; I fixed it last week.
-
awesome! Thanks!!
edit: works like a charm!
-
Using nano 2.0.1 and SARG 2.3.2 pkg v.0.6.1.
No matter what I do (I have tried everything I found in this forum),
I always get:
Error: Could not find report index file.
Check and save sarg settings and try to force sarg schedule.
Running sarg -x results in:
SARG: sarg version: 2.3.2 Nov-23-2011
SARG: Reading access log file: /var/squid/logs/access.log
SARG: Records in file: 11460, reading: 100.00%
SARG: Records read: 11460, written: 11459, excluded: 0
SARG: Squid log format
SARG: Period: 22 Oct 2012
SARG: pre-sorting files
SARG: File /usr/local/sarg-reports/22Oct2012-22Oct2012 already exists, moved to /usr/local/sarg-reports/22Oct2012-22Oct2012.4
SARG: Cannot delete /usr/local/sarg-reports/22Oct2012-22Oct2012/d192_168_7_11.html - No such file or directory
Saved, re-saved, re-re-re-saved the config with (yes) options.
Deleted and recreated report directories, gave them 777. Created a schedule with every possible combination of parameters, ran it manually, scheduled, …
Each time, the no index error.
Running a schedule results in:
php: /pkg_edit.php: The command '/usr/local/bin/sarg ' returned exit code '1', the output was 'SARG: Records in file: 11647, reading: 0.00%^MSARG: Records in file: 5000, reading: 42.93%^MSARG: Records in file: 10000, reading: 85.86%^MSARG: Cannot delete /usr/local/sarg-reports/22Oct2012-22Oct2012/d192_168_7_11.html - No such file or directory SARG: Records in file: 11647, reading: 100.00%'
If something is written in these forums, I tried it. :(
Realtime works correctly, but what I need is history data.
Any other test/debug I can try? -
-
I had the same problem as LoZio. To get mine to work I did the following -
- de-selected all of the options on the general tab and saved it
- forced an update on the schedule tab
- re-selected the options on the general tab and saved it
- forced an update on the schedule tab
This caused the index.html file to be generated in my /usr/local/sarg-reports folder. Up until this point everything else was working except for the index.html file.
-
Hi all,
I've just published a sarg package for pfsense with squid, squidguard and dansguardian log analysis, as well as a real time report tab.
Squidguard functions are still under development, but squid and dansguardian (as far as I tested) are working.
After almost everything was done, I found an old sarg package published on the forum by joaohf and merged some function calls from that old thread.
Another good point is that sarg is able to forward logs via email, so I'm planning to include it for nanobsd installs.
have fun and feedback! :)
att,
Marcello Coutinho
Thanks a lot!
-
Hi,
I would like to use sarg package to get a better overview of the blocked sites from squidguard.
I do not have logging enabled on squid - just on squidguard, to watch the blocked sites.
In my company it is not allowed to log accessed sites. The log view of squidguard is not the best, I think, and so I would like to use sarg instead.
On the sarg "general" tab I selected "squidguard" and some options on the multiple-choice lists. When saving the settings I got an error in the top right corner that the squid/access.log was not found.
I took a look at the sarg.inc and I think the problem could be somewhere on line 230. But I am not sure. I added a "break;" but without luck.
So my questions are:
Is it possible to use sarg to just "analyse" the blocked.log file of squidguard but no other log files?
Any help would be appreciated :-)
-
So my questions are:
Is it possible to use sarg to just "analyse" the blocked.log file of squidguard but no other log files?
Hi Nachtfalke,
I've enabled the squidguard config options in the gui, but I do not use squidguard. Take a look at the sarg config options and check manually how it should be configured to work with squidguard. I'll push a fix if you find a way to get it working only with squidguard reports.
The missing break was intentional as it requires squid to work.
att,
Marcello Coutinho -
I changed the following code on sarg.inc starting on line 227:
From:
case 'squidguard':
        $squidguard_conf='squidguard_conf '.$sarg_proxy['squidguard_config'];
        $redirector_log_format='redirector_log_format #year#-#mon#-#day# #hour# #tmp#/#list#/#tmp#/#tmp#/#url#/#tmp# #ip#/#tmp# #user# #end#';
        #Leve this case without break to include squid log file on squidguard option
To:
case 'squidguard':
        $access_log= $sarg_proxy['squidguard_block_log'];
        $squidguard_conf='squidguard_conf '.$sarg_proxy['squidguard_config'];
        $redirector_log_format='redirector_log_format #year#-#mon#-#day# #hour# #tmp#/#list#/#tmp#/#tmp#/#url#/#tmp# #ip#/#tmp# #user# #end#';
        #Leve this case without break to include squid log file on squidguard option
        break;
Now I got this error on system log:
Nov 30 21:53:47 squid[41070]: Squid Parent: child process 41365 started
Nov 30 21:53:46 squid[30925]: Squid Parent: child process 28838 exited with status 0
Nov 30 21:53:42 php: /pkg_edit.php: The command '/usr/local/bin/sarg ' returned exit code '1', the output was 'SARG: Records in file: 30911, reading: 0.00%^MSARG: Maybe you have a broken amount of data in your /var/squidGuard/log/block.log file SARG: getword loop detected after 255 bytes. SARG: Line="2012-11-12 17:40:37 [49110] Request(Einge_Internet/none/-) http://tools.google.com/service/update2?w=6:Ihy13C0hp8xIICE3I3l36cwhjObjYjH-7ezo0Kwjmqdp2WQIYaHezKLduIFlOC07QuSuqJStljIF_EJvqlNqH0mGJEvVnkreJQ2qbW71ZWEQEq24CssCY5d9Ij2SpjptLVmxkQea7O1ZlFABARa472hYaKBlD-inQ1Tv_mhFcwGtSnWPlcze4nm8kf-U3F9frIL5ODG5pU6wvGJhMf50_KfRnn_LxvTASxdUPr_pmKRUeElE6XcQz4FfZJtJxQFcuscJFDwxRAKgT4V4rztyV7DbVScLMNy5y_OfKwesqun5J5bg093aLt-twEi8bFZNxjQnPQSUqYuNivTmpnyQFw 172.17.183.27/- - POST REDIRECT" SARG: Record="http://tools.google.com/service/update2?w=6:Ihy13C0hp8xIICE3I3l36cwhjObjYjH-7ezo0Kwjmqdp2WQIYaHezKLduIFlOC07QuSuqJStljIF_EJvqlNqH0mGJEvVnkreJQ2qbW71ZWEQEq24CssCY5d9Ij2SpjptLVmxkQea7O1ZlFABARa472hYaKBlD-inQ1Tv_mhFcwGtSnWPlcze4n
Nov 30 21:53:42 php: /pkg_edit.php: Sarg: force refresh now with args, compress() and restart action after sarg finish.
Nov 30 21:53:32 php: /pkg_edit.php: [sarg] sarg_xmlrpc_sync.php is starting.
Not sure what that means ?
PS: Why is xmlrpc sync starting but I did not enable that !?
-
Not sure what that means ?
Maybe the line is too long.
PS: Why is xmlrpc sync starting but I did not enable that !?
Maybe a print message before the if :)
move
log_error("[sarg] sarg_xmlrpc_sync.php is starting.");
from line 441 to 445 after
if(!$synconchanges) return;
-
2.0.1 Release x86 w/ latest Sarg (which is working pretty well)
Was a solution found for the LDAP issue? I've read the thread a few times and didn't see anything definitive.
I've tried every GUI config possible, forcing updates over and over, tweaking the conf file, reinstalled Sarg, restarted pfSense. etc.
I ran the packet sniffer on the LAN adapter for hours and ran another one on the AD LDAP server.
No port 389 traffic from the pfSense box at all.
From what I see, LDAP is dead.
I'll keep trying but I'm not sure where to look next.
-
Not sure what that means ?
Maybe the line is too long.
Tried again with a blank block.log file from squidguard with a short entry.
SARG does not generate any reports from that file.
The access.log from squid is working fine - but as I said, I do not want that, or rather, I am not allowed to do that ;-)
So my conclusion is:
The sarg.inc file needs modification to find the block.log file from squidguard. In sarg.inc the squidguard_block_log variable is created, but it is not used in further code.
BUT it seems that SARG does not know how to interpret the squidguard log files - even though it has some additional options for that. Google couldn't help me so far. Will do further searches.
-
Was a solution found for the LDAP issue? I've read the thread a few times and didn't see anything definitive.
Not yet. It looks like a missing LDAP dependency in the compile args. :(
-
BUT it seems that SARG does not know how to interpret the squidguard log files - even if it has some additional options for that. Google couldn't help me until now. Will do further searches.
I agree. But it's hard to test without using squidguard. I did not find a working setup on Google either.
-
BUT it seems that SARG does not know how to interpret the squidguard log files - even if it has some additional options for that. Google couldn't help me until now. Will do further searches.
I agree. But it's hard to test without using squidguard. I did not find a working setup on Google either.
Just for information - I posted on the squidguard mailing list:
http://www.shalla.de/mailman/private/squidguard/2012-December/002369.html -
-
Just for information - I posted on the squidguard mailing list:
The list is private :)
This is the answer to my question. I will ask him to show me his config for sarg and the versions of squidguard and sarg he is using.
Hi Nachtfalke
> 1.) Is it possible to analyze/read squidguard's blocked websites log with SARG ?
Yes, it definitly is. They will be shown as blocked sites in SARG. I am using this exact setup and its working fine. If you like, I can send you my config file, alongside with the version numbers of the programms.
Greetings B. Brandt
-
Was a solution found for the LDAP issue? I've read the thread a few times and didn't see anything definitive.
Not yet. It looks like a missing LDAP dependency in the compile args. :(
OK. Thank you. If time and attention-span allows I'll poke around a bit.
For now I'll try the Users Association option for manual IP/Name mapping.
I have a related idea:
I'm fantasizing about a pfSense WINS-like feature that would store+associate Username/Machine Name/IP+MAC Addy
In theory it'd pull info from LDAP, pfSense DHCP & DNS, or possibly local LAN DHCP & DNS.
The idea is it'd be a single database that packages could use to pull user info.
Another option -> pushing data from this db into whatever table a package is using to store its LDAP/user info.
Is this worth posting as a forum suggestion? I can't tell.
-
I got this as answer from the mailing list - not sure if this will help me. Need some time to check what he said and the corresponding .conf files.
Hi
Attached you will find my configs, they are from an ubuntu 10.04 system, running squid 2.7Stable7, squidguard 1.4 and sarg 2.3. Several pitfalls I remember:
- pay special attention to the HTMLOUT of sarg-reports.conf
- pay special attention to the stopped.log directives in squidguard.conf
- triple check that the squid and squidguard log files are readable and the HTMLOUT is writable by sarg
- there is a known bug in squidguard concerning some escape chars in urls that cause the squidguard log file to become malformatted. Sarg dies when this happens. Therefore I am using a self patched version of squidguard: http://51762846.de.strato-hosting.eu/bene/public/squidguard/
Try running sarg-reports as root from the console. It should start with something like:
/usr/sbin/sarg-reports daily
SARG: Init
SARG: Loading configuration from /etc/sarg/sarg.conf
SARG: Loading exclude host file from: /etc/sarg/exclude_hosts
SARG: Loading exclude file from: /etc/sarg/exclude_users
SARG: Parameters:
SARG: Hostname or IP address (-a) =
SARG: Useragent log (-b) =
SARG: Exclude file (-c) = /etc/sarg/exclude_hosts
SARG: Date from-until (-d) = 04/12/2012-04/12/2012
SARG: Email address to send reports (-e) =
SARG: Config file (-f) = /etc/sarg/sarg.conf
SARG: Date format (-g) = Europe (dd/mm/yyyy)
SARG: IP report (-i) = No
SARG: Input log (-l) = /var/log/squid/access.log
SARG: Redirector log (-L) = /var/log/squid/stopped.log
SARG: Resolve IP Address (-n) = No
SARG: Output dir (-o) = /var/www/squid-reports/Daily/
SARG: Use Ip Address instead of userid (-p) = No
SARG: Accessed site (-s) =
SARG: Time (-t) =
SARG: User (-u) =
SARG: Temporary dir (-w) = /tmp
SARG: Debug messages (-x) = Yes
SARG: Process messages (-z) = No
If something goes wrong and you don't know what to make of the error message, just post it here. Hope this helps
Greetings B. Brandt
-
Ok, I did some further tests. the sarg.inc is - as far as I tested it - correct.
But for squidguard it means:
If logging in squid is disabled, then SARG cannot display just the blocked URLs that squidguard reported.
So in my situation I cannot use SARG because I am not allowed to have the squid access.log file. :( -
I am still working on recovering from a disaster caused by this package. I figured I'd drop a note here as a possible warning for anyone that is using this package. It may be possible that this was a user issue, rather than the fault of the package.
I sadly can't provide many details at this point; if I come across anything I will follow back up. Either way:
I just lost my pfsense box due to massive corruption caused (indirectly?) by Sarg. I had Sarg running for ~2 months, maybe a bit more. A few days ago I noticed issues with networking and started digging into it. I found pfsense to be unresponsive. I rebooted it and started getting a lot of wonderful errors..
Either way, somehow Sarg had created enough files to run me out of inodes. It was somewhere near 60GB of data, and 9.7M (yes, MILLION) inodes in use.
(This is for a 3 user network.)
I believe I was using the stock out of the box configuration. Sadly, it was such a pain to get set up in the first place that once it did start working I never went back and looked at it again.
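(Not from the thread, but for anyone wanting to catch this kind of runaway report growth early, a quick console check of inode usage and report directory size might look like this, assuming the default report path:)

```sh
# show used/free inodes per filesystem (-i flag on FreeBSD/pfSense)
df -i /
# size of the sarg report tree (default path assumed)
du -sh /usr/local/sarg-reports
```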
-
Either way, somehow Sarg had created enough files to run me out of inodes. It was somewhere near 60GB of data, and 9.7M (yes, MILLION) inodes in use.
(This is for a 3 user network.) I believe I was using the stock out of the box configuration. Sadly, it was such a pain to get set up in the first place that once it did start working I never went back and looked at it again.
The current sarg version can compress report files and remove reports older than x days.
Sarg reports use a lot of inodes.
On my setup, I've installed a second disc with zfs just for report files. On the zfs disc, I get 30 million inodes. -
Ok, I did some further tests. the sarg.inc is - as far as I tested it - correct.
But for squidguard it means:
If logging in squid is disabled, then SARG cannot display just the blocked URLs that squidguard reported.
So in my situation I cannot use SARG because I am not allowed to have the squid access.log file. :(
What changes did you make to get squidguard working? Can you push it to github?
Try to point sarg to an access.empty.log file in the squid config at sarg.inc. This may solve your problem.
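A rough sketch of that workaround from the console (just an illustration, not a tested fix; -l is sarg's input log and -L its redirector log, and the paths are taken from earlier posts):

```sh
# hypothetical: feed sarg an empty squid log so only squidguard's block log contributes
touch /var/squid/logs/access.empty.log
/usr/local/bin/sarg -x \
    -l /var/squid/logs/access.empty.log \
    -L /var/squidGuard/log/block.log \
    -o /usr/local/sarg-reports
```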
-
Ok, I did some further tests. the sarg.inc is - as far as I tested it - correct.
But for squidguard it means:
If logging in squid is disabled, then SARG cannot display just the blocked URLs that squidguard reported.
So in my situation I cannot use SARG because I am not allowed to have the squid access.log file. :(
What changes did you make to get squidguard working? Can you push it to github?
Try to point sarg to an access.empty.log file in the squid config at sarg.inc. This may solve your problem.
I tried that with an access.log file which just contains some entries but this didn't help me on the SARG reports. It doesn't show me blocked entries newer than the access.log file entries.
So there isn't anything I could push on github ;-)
In general it is working with your config with squidguard, but you need the access.log from squid. If this file isn't present and up to date, you cannot generate reports.
Is dansguardian doing that without the squid access.log file?
-
I have also run into the error that others are seeing:
Error: Could not find report index file.
Check and save sarg settings and try to force sarg schedule.
Here's what I've done.
- Totally uninstalled Sarg pkg.
- Used "find" command to locate and remove every directory or file referencing sarg in the name.
- Upgraded to absolute latest (2nd release from today) pfsense package.
- Rebooted.
- Reinstalled Sarg.
- Selected all report options and report types on the Sarg page in pfsense.
- Hit Save.
- Set up a 1h schedule and saved it.
- Hit "force update" under the schedule.
ls -al /usr/local/sarg-reports/
total 4
drwxr-xr-x 2 root wheel 512 Dec 10 21:19 .
drwxr-xr-x 19 root wheel 512 Dec 10 21:19 ..
No index file(s) of any kind appear there.
This is a drag. What does it take to get a simple package to just install and work the first time?
Does anyone have a solution on how to fix this manually?
Thanks in advance for any help you can offer.
ps - I did find this in system.log:
Dec 10 21:20:24 gw php: /pkg_edit.php: [sarg] sarg_xmlrpc_sync.php is starting.
Dec 10 21:20:32 gw php: /pkg_edit.php: Sarg: force refresh now with args, compress() and none action after sarg finish.
Dec 10 21:20:32 gw php: /pkg_edit.php: The command '/usr/pbi/sarg-i386/bin/sarg ' returned exit code '1', the output was 'SARG: Cannot set the locale LC_ALL to the environment variable'
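(One thing that may be worth trying from the console is running the same binary with an explicit locale, to see whether the LC_ALL complaint is the only thing blocking report generation. This is just a guess, not a confirmed fix:)

```sh
# hypothetical check: force a known-good locale for a single debug run
env LC_ALL=C LANG=C /usr/pbi/sarg-i386/bin/sarg -x
```
-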
Caldwell, there is no bug in the sarg package for squid and dansguardian logs.
Just take a look on the forum for a working config that I'm using, and check your squid access log config.
-
Nachtfalke,
Maybe a grep on the squid log file for denied entries?
This way there will be only denied accesses to report.
Did you try to select only denied sites in the reports to generate?
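A rough sketch of that idea (untested; the file names are placeholders, and TCP_DENIED is the status squid logs for blocked requests):

```sh
# keep only the denied lines in a separate log and point sarg's input log (-l) at it
grep TCP_DENIED /var/squid/logs/access.log > /var/squid/logs/access.denied.log
/usr/local/bin/sarg -x -l /var/squid/logs/access.denied.log -o /usr/local/sarg-reports
```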
-
Nachtfalke,
Maybe a grep on the squid log file for denied entries?
This way there will be only denied accesses to report.
You are thinking of a script that could grep the access.log, save just the denied entries in a new file and delete the original one?
Didn't try that, but it could be a possibility.
Did you try to select only denied sites in the reports to generate?
Not sure if I did that. But I saw all sites so I suppose that I didn't try that. Perhaps I can try this if I find some spare time. I uninstalled SARG some days ago.