
Can I export report from web history on 384.11 - Asus RT-88U


Sir, those links also aren't working:

So here is my shoddy TrafficAnalyzer_Report.sh script to allow queries to be made on the Traffic Analyzer database
So here is my shoddy AiProtectionMonitor_Report.sh script to allow queries to be made on the AiProtection Monitor database
As stated previously, the Pastebin links now show the appropriate URL to the script hosted on GitHub.
And I didn't understand this point:
View attachment 18845
I have firmware 384.11; should I install anything?
You will need to download v1.11 of the script as I have now updated the script to use the included firmware version v384.11 of the SQL utility (if it exists) rather than expect to use the Entware version.
 
Well, if SQL is better, how can I add it to the firmware? I already have v384.11; am I supposed to have SQL? If not, why not?
upload_2019-8-2_15-2-26.png
 
Well, I have used v1.11 and it worked. Now, can I filter by MAC address? What is the command?
And can I export it to an Excel file? Because when I copy and paste, all of the results end up in one column.

Also, is there an Entware version for the AiProtection and TrafficAnalyzer reports?
 
Now, what is important for me is: can I filter by MAC address? Let's say I need to see a report about a specific MAC address; am I able to do that?
 
Well I have used the v1.11 and it worked now can I filter by Mac Address?
Humans usually prefer to use hostnames in preference to IP addresses or even MAC addresses, so the reports assume that MAC addresses are assigned reserved IPs/hostnames.
You can specify the 'ip=' filter directive with either a group name, hostname, or explicit IP address.

See the script help
Code:
./WebHistory_Report.sh   -h
for a list of commandline options and examples.
Can I export it to excel file? because when i copy and paste all of the results inserted into 1 column.
Whilst the SQL database does indeed only use the MAC address as a unique 'key', filtering by MAC is currently not supported.

Export direct to '.csv' is also currently not supported, although I suspect that if you copy'n'paste the output into a text file, you should be able to reformat the data into a format suitable for Excel or Google Docs etc.
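In the meantime, the copied screen output can be reshaped by hand. Here is a minimal sketch (the sample row, file names, and MAC are illustrative, not real data) that collapses the whitespace-aligned report columns into commas so a spreadsheet splits them instead of pasting everything into one cell:

```shell
# Hypothetical one-line sample of the script's on-screen output
# (whitespace-aligned columns, as in the report display)
printf '%s\n' \
  '2019/05/02    10:00:26    AC:11:22:33:44:55    HP-Envy14    10.88.8.114    http://example.com' \
  > webhistory.txt

# Rebuild each record with commas as the separator: assigning $1 to itself
# forces awk to re-join the fields using OFS
awk -v OFS=',' '{ $1 = $1; print }' webhistory.txt > webhistory.csv

cat webhistory.csv
# -> 2019/05/02,10:00:26,AC:11:22:33:44:55,HP-Envy14,10.88.8.114,http://example.com
```

Note this simple approach assumes none of the fields themselves contain spaces or commas.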

If I find time I will add the features to v1.12.

Also is there Entware Version for AiProtection and TrafficAnalyzer_Report
No idea what you mean o_O
 
Alright, thank you so much.
About the last point, just forget about it ^_^
I have a problem with the IP results: it returns n/a, and also for the hostname.

Also, how do I get the web history of the previous month?
 
I have problem with the IP Results it return n/a also for host name
Ahhh OK :oops: well, unless you can assign reserved IP(s) to the MAC address(es), you will either need to modify the script yourself, or wait for v1.12.

Also how to get Webhistory of the previous month?
As per the script's help, use the 'date=yyyy/mm' commandline argument.
 
I have done that, but it's still not reflecting any results, whereas when I type 2019-08 I get the records. So it only gives me records from the first of the current month. Let's say we have entered the 9th month; can I still get the records of the 8th month?

Also, sometimes the console shows me that there are 16 records, for example, but those records aren't displayed. And what is the correct syntax for the command?

- date=yyyy/mm/dd
- 'date=yyyy/mm/dd'
- 'date='yyyy/mm/dd

When I am trying to take a backup I get this message:
Code:
mkdir: can't create directory '/opt/': No such file or directory
cp: can't create '/opt/var/WebHistory/WebHistory.db-Backup-20190805-093140': No such file or directory

(WebHistory_Report.sh): 12404 ***ERROR '/jffs/.sys/WebHistory/WebHistory.db' backup FAILED!


Martineau, man, you are awesome; you are trying to help people and not ignoring them. I really appreciate your effort.
 
I have done that but still not reflecting any results while when I type 2019-08 I got the records so it just give me from the first of this month let's say we have entered the 9th month can I still get the records of 8th month?

also sometimes the console shows me that there are 16 records for example but those records aren't displayed and what is the correct syntax for the command?
Code:
./WebHistory_Report.sh   date=2019/05

Processing '/jffs/.sys/WebHistory/WebHistory.db' database....please wait!
(WebHistory_Report.sh): 24268 v1.12 Web History starting.....

 NOTE: Columns in white are eligible for filters; red text indicates a match on the filters requested; (URLs are Xshell5/MobaXterm hyperlinks)

 Filter by Date, AND by current hour ==> '2019-05|10:'

   YYYY/MM/DD    HH:MM:SS    MAC address           Host Name           IP address        URL            
   2019/05/02    10:00:26    AC:XX:XX:XX:XX:XX    HP-Envy14           10.88.8.114       http://tps613.doubleverify.com
   2019/05/02    10:00:28    AC:XX:XX:XX:XX:XX    HP-Envy14           10.88.8.114       http://i.simpli.fi
<snip>


When I am trying to take backup I got this message:
Code:
mkdir: can't create directory '/opt/': No such file or directory
cp: can't create '/opt/var/WebHistory/WebHistory.db-Backup-20190805-093140': No such file or directory

(WebHistory_Report.sh): 12404 ***ERROR '/jffs/.sys/WebHistory/WebHistory.db' backup FAILED!
As previously stated, prior to v384.11 the original pre-requisite for the script was the Entware 'sqlite3' utility. Consequently users had Entware installed, and since Entware requires a mounted USB disk, custom scripts could safely use the Entware filesystem for permanent storage (of potentially large files).

However, I have now added the option for you to override the 'missing' Entware filesystem and specify a directory of your choice, with of course some limitations, i.e. '/tmp' is currently not allowed.
e.g.
Code:
./WebHistory_Report.sh    backup=/mnt/RT-AC68U

(WebHistory_Report.sh): 18044 '/jffs/.sys/WebHistory/WebHistory.db' backup completed successfully
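The override behaves roughly as sketched below. This is an illustrative reconstruction only, not the script's actual code (the function name, messages, and throwaway paths are made up): refuse '/tmp', create the target directory, and copy the database with a timestamp suffix.

```shell
# Illustrative sketch of a 'backup=' style override; not the real script internals
backup_db() {
    db=$1 dir=$2
    # Disallow '/tmp' (and anything under it), mirroring the script's restriction
    case "$dir" in
        /tmp|/tmp/*) echo "***ERROR backup to '/tmp' is not allowed" >&2; return 1 ;;
    esac
    mkdir -p "$dir" || return 1
    cp "$db" "$dir/$(basename "$db")-Backup-$(date +%Y%m%d-%H%M%S)" || return 1
    echo "'$db' backup completed successfully"
}

# Demo with throwaway files in the current directory
echo 'dummy' > ./WebHistory.db
backup_db ./WebHistory.db ./demo_backup                          # succeeds
backup_db ./WebHistory.db /tmp/backups || echo "refused, as expected"
```

The '/tmp' restriction makes sense because '/tmp' lives in RAM on the router, so a "permanent" backup there would vanish on reboot.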

Also, I have added the ability to filter by MAC, and optionally create a .csv file.
Code:
./WebHistory_Report.sh   date=2019/05   report=web.csv   mac=xx:xx:xx:xx:xx

Processing '/jffs/.sys/WebHistory/WebHistory.db' database....please wait!
(WebHistory_Report.sh): 32752 v1.12 Web History starting.....
 NOTE: Columns in white are eligible for filters; red text indicates a match on the filters requested; (URLs are Xshell5/MobaXterm hyperlinks)

 Filter by Date, AND by MAC, AND by current hour ==> '2019-05|xx:xx:xx:xx:xx|11:'

 Report file (.csv format): 'web.csv'

   YYYY/MM/DD    HH:MM:SS    MAC address           Host Name           IP address        URL               
   2019/05/08    11:00:19    xx:xx:xx:xx:xx     HP-Envy14           10.88.8.114       http://20.client-channel.google.com
   2019/05/08    11:02:36    xx:xx:xx:xx:xx     HP-Envy14           10.88.8.114       http://15.client-channel.google.com
   2019/05/08    11:49:02    xx:xx:xx:xx:xx     HP-Envy14           10.88.8.114       http://webapps.stackexchange.com

NOTE: If there are many records in the database and you wish to create a .csv file, then one of two preferred fast methods should be used.

i.e. having created the .csv, using grep/awk/sed etc. will allow filtering at greater speed than displaying the records on the screen.

e.g. SQL can write 18,458 raw records to the .csv file in less than 1 second, and if the script is used (without displaying the records on screen) to format the timestamp field, it still takes less than 2 seconds!
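As a concrete illustration of post-processing the .csv with awk instead of the on-screen filters, this sketch counts hits per host for a given month. The sample rows are made up, and the pipe-delimited layout ('date time|epoch|MAC|host') is assumed from the raw dump format shown later in the thread:

```shell
# Made-up sample rows in the assumed raw pipe-delimited dump layout
cat > web.csv <<'EOF'
2019-05-02 10:00:26|1556787626|AC:11:22:33:44:55|tps613.doubleverify.com
2019-05-02 10:00:28|1556787628|AC:11:22:33:44:55|i.simpli.fi
2019-06-21 22:48:29|1561157309|48:45:20:D7:A6:22|accounts.google.com
EOF

# Per-host hit counts for May 2019, computed straight from the file
awk -F'|' '$1 ~ /^2019-05/ { hits[$4]++ } END { for (h in hits) print hits[h], h }' web.csv
```

Because awk streams the file once and never touches the terminal, this scales to tens of thousands of records in well under a second, consistent with the timings above.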

Method 1 simply dumps the database records as-is, with the corresponding column headers:
Code:
./WebHistory_Report.sh report=web.csv   noscript   nofilter

(WebHistory_Report.sh): 11682 v1.12 Web History starting.....
 NOTE: Columns in white are eligible for filters; red text indicates a match on the filters requested; (URLs are Xshell5/MobaXterm hyperlinks)

 Filter ALL i.e. no filter ==> ''

 Report file (.csv format): 'web.csv'

Total Records = 18458

real 0m 0.89s
user 0m 0.41s
sys 0m 0.49s
Code:
wc -l web.csv

18458 web.csv
Method 2 will convert the timestamp into a more human-friendly format (without column headers):
Code:
./WebHistory_Report.sh report=web.csv   nodisplay   nofilter

(WebHistory_Report.sh): 4531 v1.12 Web History starting.....
 NOTE: Columns in white are eligible for filters; red text indicates a match on the filters requested; (URLs are Xshell5/MobaXterm hyperlinks)

 Filter ALL i.e. no filter ==> ''

 Report file (.csv format): 'web.csv'

   YYYY/MM/DD    HH:MM:SS    MAC address           Host Name           IP address        URL
       
  ***No Display of records on screen requested***

Summary: Result count = 18458

real 0m 1.53s
user 0m 1.03s
sys 0m 0.50s
Code:
wc -l web.csv

18458 web.csv
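Method 2's extra second goes on reformatting each row. The same transformation can be sketched in plain awk; this is only an approximation of what the script does, and it assumes the raw rows are pipe-delimited as 'YYYY-MM-DD HH:MM:SS|epoch|MAC|host' (the sample row is made up):

```shell
# One made-up raw row in the assumed dump layout
cat > raw.txt <<'EOF'
2019-05-08 11:00:19|1557309619|AC:11:22:33:44:55|20.client-channel.google.com
EOF

# Split the timestamp, swap '-' for '/', and print report-style columns
awk -F'|' '{
    split($1, dt, " ")        # dt[1] = date, dt[2] = time
    gsub("-", "/", dt[1])     # 2019-05-08 -> 2019/05/08
    printf "%s    %s    %s    http://%s\n", dt[1], dt[2], $3, $4
}' raw.txt
# -> 2019/05/08    11:00:19    AC:11:22:33:44:55    http://20.client-channel.google.com
```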

Please download v1.12
 

Thanks for your amazing effort.

About exporting the .csv file: I have typed the command, but I can't find the .csv file?
 

Please, I hope you do this also for the AiProtection script and the Traffic Analyzer script; it would be awesome. Currently I am testing out the new features, and it looks like everything works like a charm!

Regards.
 
Please I hope you do this also on AiProtection Script and Traffic Analyzer Script it would be awesome currently I am testing out the new features and it looks like everything work like a charm..!

The WebHistory_Report.sh script's features have now been ported to both AiProtection_Report.sh and TrafficMonitor_Report.sh.

All three scripts have been updated to v1.13
 
Alright, I am going to test them out.
Whoops :oops:, I forgot to port the 'backup=' modification to both AiProtection_Report.sh and TrafficMonitor_Report.sh.

v1.14 now also displays the backup location/name on completion of a successful backup.
 
One thing I didn't understand until now; maybe I didn't notice it...
Can I find the records of previous months? Because, as I said, it looks like it only reflects the records of the current month.

Any guidelines?

Regards.
 
One thing I didn't understand until now, maybe I didn't noted it...
Can I find the records of previous months? because as I said looks like it just reflect the records of current month.
As shown in post #29 I extracted the database records for 'May 2019'.

I suggest you dump all the database records to a .csv file using Method 2; then you can use grep/awk etc. to see if there are any records for the previous month, which may then indicate a problem with the script.
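That check takes two commands. Here is a sketch with hypothetical sample rows (the pipe-delimited layout is assumed from the raw dump format shown elsewhere in the thread):

```shell
# Hypothetical dump contents in the assumed 'date time|epoch|MAC|host' layout
cat > web.csv <<'EOF'
2019-07-30 09:15:02|1564478102|AC:11:22:33:44:55|www.example.com
2019-08-02 10:00:26|1564736426|AC:11:22:33:44:55|tps613.doubleverify.com
2019-08-05 11:49:02|1565002142|AC:11:22:33:44:55|webapps.stackexchange.com
EOF

# Count the July 2019 records, then list them
grep -c '^2019-07' web.csv    # -> 1
grep '^2019-07' web.csv       # -> the www.example.com row
```

If the count is zero, the database itself holds no records for that month and the script is not at fault.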

If the records are not being returned, then you should provide a screen print of the script output/.csv file showing the actual filters used.
 
I have tried to filter for the previous month, which is July, but there are no results, as shown in the screenshots:
Capture.jpg


I thought the problem was that I hadn't set 'time=' so as to get results at any time of day, and this is the result:
Capture2.jpg


I didn't understand the point of 'grep/awk'; I don't even know how to use it or what it is. Can you explain, please?

Note: even in the GUI, when I navigate to Web History, I can only see the records of the current month, not the previous one.

Actually, there's another problem I have noted:
When I type ./WebHistory_Report.sh nofilter noscript I see the results coming in, some of them on 2/8/2019 and so on, but after everything finishes I can only see 7/8/2019 & 6/8/2019, while I am sure I saw 5/8/2019 while the list was filling up.

Regards.
 
I didn't understand the point of "grep/awk"; I don't even know how to use it or what it is. Can you explain, please?
Although not all utilities are available on the router, I suggest you review Unix commands by category to see for which purpose you would use awk/sed and grep etc.
NOTE: Weirdly, grep is missing from the above, but it does appear in this more comprehensive List of Unix commands.

So, for example, rather than using the script to perform the filtering, you can use the Unix utilities directly.

e.g. use 'grep' to filter out any Web History records that contain the string 'google', and use the 'head' utility to print only the first 10 matching results:
Code:
./WebHistory_Report.sh nofilter noscript | grep google | head -n 10

(WebHistory_Report.sh): 14364 v1.14 Web History starting.....

2019-01-31 13:43:11|1548942191|58:C5:CB:05:5A:4B|android.clients.google.com
2019-01-31 16:44:49|1548953089|58:C5:CB:05:5A:4B|inbox.google.com
2019-01-31 16:31:37|1548952297|58:C5:CB:05:5A:4B|www.google.com
2019-01-31 16:43:07|1548952987|58:C5:CB:05:5A:4B|www.googleapis.com
2019-01-31 16:45:41|1548953141|58:C5:CB:05:5A:4B|clients4.google.com
2019-01-31 16:44:48|1548953088|58:C5:CB:05:5A:4B|play.googleapis.com
2019-08-07 08:27:20|1565166440|54:60:09:0A:17:F4|tools.google.com
2019-01-31 16:37:39|1548952659|58:C5:CB:05:5A:4B|safebrowsing.googleapis.com
2019-06-21 22:48:29|1561157309|48:45:20:D7:A6:22|clientservices.googleapis.com
2019-06-21 22:48:29|1561157309|48:45:20:D7:A6:22|accounts.google.com
This example adds an additional filter to the above matching records and eliminates any 'google' records that contain 'android':
Code:
./WebHistory_Report.sh nofilter noscript | grep google | grep -v android | head -n 10

(WebHistory_Report.sh): 25585 v1.14 Web History starting.....

2019-01-31 16:44:49|1548953089|58:C5:CB:05:5A:4B|inbox.google.com
2019-01-31 16:31:37|1548952297|58:C5:CB:05:5A:4B|www.google.com
2019-01-31 16:43:07|1548952987|58:C5:CB:05:5A:4B|www.googleapis.com
2019-01-31 16:45:41|1548953141|58:C5:CB:05:5A:4B|clients4.google.com
2019-01-31 16:44:48|1548953088|58:C5:CB:05:5A:4B|play.googleapis.com
2019-08-07 08:27:20|1565166440|54:60:09:0A:17:F4|tools.google.com
2019-01-31 16:37:39|1548952659|58:C5:CB:05:5A:4B|safebrowsing.googleapis.com
2019-06-21 22:48:29|1561157309|48:45:20:D7:A6:22|clientservices.googleapis.com
2019-06-21 22:48:29|1561157309|48:45:20:D7:A6:22|accounts.google.com
2019-06-22 07:31:55|1561188715|48:45:20:D7:A6:22|www.googleapis.com
This example adds a further filter that eliminates any of the above matching records that occurred on the 31st day of any month in 2019:
Code:
./WebHistory_Report.sh nofilter noscript | grep google | grep -v android | grep -vE "2019-..-31" | head -n 10

(WebHistory_Report.sh): 22111 v1.14 Web History starting.....

2019-08-07 08:27:20|1565166440|54:60:09:0A:17:F4|tools.google.com
2019-06-21 22:48:29|1561157309|48:45:20:D7:A6:22|clientservices.googleapis.com
2019-06-21 22:48:29|1561157309|48:45:20:D7:A6:22|accounts.google.com
2019-06-22 07:31:55|1561188715|48:45:20:D7:A6:22|www.googleapis.com
2019-06-19 09:36:24|1560936984|48:45:20:D7:A6:22|www.google.co.uk
2019-06-21 22:48:39|1561157319|48:45:20:D7:A6:22|clients2.google.com
2019-06-21 23:04:34|1561158274|48:45:20:D7:A6:22|safebrowsing.googleapis.com
2019-06-22 07:32:30|1561188750|48:45:20:D7:A6:22|update.googleapis.com
2019-05-10 08:52:03|1557478323|48:45:20:D7:A6:22|storage.googleapis.com
2019-06-22 07:52:53|1561189973|48:45:20:D7:A6:22|pagead2.googlesyndication.com
Hopefully you can see that the use of these utilities (in this case 'grep') can provide very complex filtering.

NOTE: The dates are in 'YYYY-MM-DD' format.

I suggest you run either of the commands and provide the output.
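The chained greps above can also be collapsed into a single awk pass, which avoids spawning several processes on the router. A sketch over made-up sample records in the same pipe-delimited layout as the examples above:

```shell
# Made-up sample records: 'date time|epoch|MAC|host'
cat > hist.txt <<'EOF'
2019-06-21 22:48:29|1561157309|48:45:20:D7:A6:22|accounts.google.com
2019-01-31 16:44:49|1548953089|58:C5:CB:05:5A:4B|inbox.google.com
2019-06-22 07:52:53|1561189973|48:45:20:D7:A6:22|android.clients.google.com
2019-06-22 07:31:55|1561188715|48:45:20:D7:A6:22|www.googleapis.com
EOF

# Keep 'google' rows, drop 'android' rows, drop anything dated the 31st
awk -F'|' '/google/ && !/android/ && $1 !~ /^....-..-31/' hist.txt
```

Each condition maps directly to one of the grep stages: a plain /regex/ is an include filter, !/regex/ is grep -v, and the $1 match anchors the date test to the first field.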
Actually there's another problem I have noted:
When I type ./WebHistory_Report.sh nofilter noscript I see the results coming in some of them in 2/8/2019 and so on but after everything finish I just can see 7/8/2019 & 6/8/2019 while I am sure I have saw 5/8/2019 while the list was filling up.
As I am in the UK, the dates you have quoted read to me as the 2nd, 6th, 7th and 5th of August, not the 8th of February, July, August and May.
 