logging URLs to separate server?

RRands

Occasional Visitor
Hi - I would like to log all URLs from my Asus Merlin router to a logging server (or to the JFFS partition at a minimum). I did a quick search but didn't find anything - is this a common or unusual request? Anyone know how I could get this done? Essentially, I am looking for a more permanent / longer log like what you can see in the Merlin Web UI under "Adaptive QoS > Web History" (http://192.168.1.1/AdaptiveQoS_WebHistory.asp on my router - your IP may vary).

Any help or pointers are greatly appreciated!


-randy
 
Martineau
You could create a cron schedule that runs a basic record extraction:
Bash:
sqlite3 /jffs/.sys/WebHistory/WebHistory.db "SELECT datetime(timestamp, 'unixepoch', 'localtime') AS time, mac, url FROM History;"
or enhance the information in the extraction, and pipe the output directly to a flash drive attached to the router, or create a wrapper script to exploit rsync etc.
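For example, a wrapper along these lines would do the export and the off-router copy in one step (the script path, mount point and remote host here are just placeholders of my own, not anything the firmware provides, so adjust to suit):

Bash:
#!/bin/sh
# /jffs/scripts/export_webhistory.sh - dump the Web History table and copy it off-router
DB=/jffs/.sys/WebHistory/WebHistory.db
EXPORT=/tmp/webhistory_$(date +%Y%m%d%H%M).csv

# Export the table as CSV so it is easy to merge or inspect later
sqlite3 -csv "$DB" "SELECT datetime(timestamp,'unixepoch','localtime') AS time, mac, url FROM History;" > "$EXPORT"

# Copy to an attached USB drive (mount point is an example)...
cp "$EXPORT" /mnt/usbdrive/webhistory/

# ...or push it to a logging server instead (host and path are placeholders)
# rsync -az "$EXPORT" user@logserver:/var/log/router/

On Merlin you would normally schedule it with cru, e.g. cru a WebHistoryExport "0 * * * * /jffs/scripts/export_webhistory.sh" to run it hourly (added from /jffs/scripts/services-start so it survives a reboot).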

Whilst cron is old-skool, you may be able to extract the new records as they appear if it is deemed critical.
 
Thanks, @Martineau - QQ for you - wouldn't this possibly lose some of the data (depending on how frequently I run it), or create multiple entries of the same URL if I run it too frequently?
 
Indeed, either is possible.

I wrote a script which reports on the current contents of the History SQL database, so clearly if the database has been trimmed of old records between invocations of the script, then data could be deemed lost.

However, if the data is exported, it would be prudent to merge the exported records with the previously exported data, or alternatively keep track of the time of the last export and alter the SQL extraction statement to only retrieve records that have been inserted into the database since then.
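A rough sketch of the second approach, using a small state file (the file names here are my own choices, not anything the firmware provides):

Bash:
#!/bin/sh
# Incremental export: only pull History rows newer than the previous run
DB=/jffs/.sys/WebHistory/WebHistory.db
STATE=/jffs/webhistory.last
OUT=/tmp/webhistory_incremental.csv

LAST=$(cat "$STATE" 2>/dev/null)
LAST=${LAST:-0}
NOW=$(date +%s)

# Append only the records inserted since the last export, then remember where we got to
sqlite3 -csv "$DB" "SELECT datetime(timestamp,'unixepoch','localtime') AS time, mac, url FROM History WHERE timestamp > $LAST AND timestamp <= $NOW;" >> "$OUT"
echo "$NOW" > "$STATE"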

To eliminate URL duplicates, you can pipe the sorted exported records through the uniq utility, or if you export in .csv format to, say, an Excel spreadsheet, it should be trivial to remove duplicates before creating/viewing the report.
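For example, working on the CSV exports above (file names are just the ones used in the earlier sketches):

Bash:
# Drop exact duplicate lines from an export
sort /tmp/webhistory_incremental.csv | uniq > /tmp/webhistory_dedup.csv

# Or, if the router's sort supports keyed fields, keep one row per mac+url pair and ignore the timestamp column
sort -t, -k2,3 -u /tmp/webhistory_incremental.csv > /tmp/webhistory_per_client.csv

Alternatively, SELECT DISTINCT mac, url in the extraction itself will do the de-duplication inside SQLite before anything is exported.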
 
