• In the past we have tracked by MAC address, but given how easy and well known MAC spoofing is, that hasn't been a reliable solution for us lately.  We need to know which users are doing what.  Right now we are using a combo of the squid access log, and then looking up who had that IP at that time in the captive portal log.  My two questions are:

    1.  Does anyone know of a good way to archive the captive portal logs so they aren't lost each time the box is rebooted?  The best idea I've had so far is a crontab entry that runs a script to copy the file, with a timestamp, to another location.
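The crontab idea above could look something like this minimal sketch. The log path and archive directory are assumptions; check where your captive portal actually writes its log:

```shell
#!/bin/sh
# Archive the captive portal log with a timestamp before a reboot wipes it.
# Usage: archive_portal_log <logfile> <archive_dir>
archive_portal_log() {
    src=$1
    dest=$2
    stamp=$(date +%Y%m%d-%H%M%S)
    mkdir -p "$dest" || return 1
    # Copy rather than move, so the portal keeps appending to the live log.
    cp "$src" "$dest/$(basename "$src").$stamp" || return 1
}

# Example cron entry (both paths are assumptions, adjust to your setup):
#   0 * * * * /root/archive_portal_log.sh /var/log/portalauth.log /var/log/portal-archive
```

Running it hourly from cron gives you timestamped copies you can grep later, at the cost of some duplicated lines between snapshots.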

    2.  Is there a better way to track individual users that I'm not aware of?  I've thought about using static ARP, but each location has 250-500 users.

    Any suggestions would be appreciated.

  • Send the captive portal log to a syslog server, pipe it into a SQL database, and then build an interface to search it, or use an existing one like phpsyslog-ng.
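As a rough illustration of that pipeline, a syslog-ng config on the collector could filter portal entries into a SQL table. Everything here is an assumption: the `logportalauth` program tag, the credentials, and the table schema all need adjusting to your environment, and the `sql()` destination requires syslog-ng's libdbi support:

```
# Receive remote syslog and route captive portal entries into MySQL.
source s_net { udp(ip(0.0.0.0) port(514)); };
filter f_portal { program("logportalauth"); };   # assumed program tag
destination d_sql {
    sql(type(mysql)
        host("localhost") username("syslog") password("secret")
        database("logs") table("portal_${YEAR}${MONTH}")
        columns("datetime", "host", "message")
        values("${R_DATE}", "${HOST}", "${MESSAGE}"));
};
log { source(s_net); filter(f_portal); destination(d_sql); };
```

With the entries in SQL, a search interface is just a query over the message column, which is essentially what phpsyslog-ng gives you out of the box.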

  • If the squid package supports auth, then require people to authenticate to the proxy.  That'll make it fairly trivial to track who visited where (assuming you enable logging of the authenticated username).

    This may break some things that perform web updates, so you may have to spend some time adding ACL rules that'll bypass the auth for certain destinations.
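A squid.conf fragment for that approach might look like the sketch below. The helper path and the update domains are assumptions (the ncsa_auth location varies by install, and you'd list whichever destinations actually break for you); squid's default access log format already includes the authenticated username field:

```
# Require proxy auth, but let known update hosts through without it.
auth_param basic program /usr/local/libexec/squid/ncsa_auth /usr/local/etc/squid/passwd
auth_param basic realm Proxy

acl authed proxy_auth REQUIRED
acl updates dstdomain .windowsupdate.com .update.microsoft.com

http_access allow updates
http_access allow authed
http_access deny all
```

Order matters: the `updates` rule has to come before the auth rule, otherwise squid challenges the update clients anyway.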
