RT-N66U (fw: 378.50) breaks downloads of large files with keyword filter enabled


vnenov

New Around Here
I enabled keyword filtering for 7-8 words on my new RT-N66U with Merlin 378.50, and after that I could no longer download files larger than a few megabytes from the Internet.

Google Chrome would give me a "Failed - Network error" message, and IE would also stop the download with an error. I could replicate the problem consistently on both wireless and wired clients (both Windows 7 and Ubuntu 14.04).

The only way to download a large file was to ssh to the router itself and use wget.

At first I did not realize what the cause was and reset to defaults, which fixed the issue. Then I redid the configuration step by step, testing between steps, and the issue reappeared right after I enabled keyword filtering. Disabling keyword filtering fixed it again.

Is anyone else experiencing the same issue? I almost thought I had bought a lemon and was planning to return it to the store first thing today.
 
https://github.com/RMerl/asuswrt-merlin/blob/master/release/src/router/rc/firewall.c#L3139

I brought this up in #netfilter:

fprintf(fp, "-I FORWARD -p tcp --sport 80 %s -m string --string \"%s\" --algo bm -j REJECT --reject-with tcp-reset\n"

pekster: wtf is "fprint" ?
pekster: And yes, that's probably going to cause problems if I'm reading that right. Binary downloads have arbitrary data, probably including the byte sequences 0x25 and 0x73 in succession
pekster: That looks like a horrible thing to do
pekster: This said, the "rule" (read: the segment that resembles a snippet of a complete ruleset that's being fed through some random API you found) is highly suspect
pekster: You do realize that there's a 1 in 256^2 chance of any 2 random bytes matching that string, yes?
sinshiva: i'm really not a dev, just trying to get people the right information :p i'm not surprised about this, though, i'll pass it on
pekster: That rule is idiotic, is the point I'm making
sinshiva: yeah, i get it
pekster: Let's take a random binary file, which just happens to be the source code for sqlite right now:
pekster: grep -c '%s' ~/src/sqlite-autoconf-3080704.tar.gz
sinshiva: i wouldn't have recognized that error myself, but i understand you
pekster: 41
pekster: There are 41 occurrences of the string "%s" in that download. Because it's a gzip'd tar
pekster: Yea
pekster: Basically, don't do web filtering with the 'string' match
pekster: It's very problematic, and unlikely to work right for a wide variety of situations, such as smart people trying to trick it by splitting the request into >1 packet
pekster: I'm curious though, what feature is this supposed to enact?
sinshiva: keyword filter
sinshiva: is what it's officially called
pekster: Well then %s would probably be fed through sprintf(3) at some point
pekster: Still, that's probably also a really horrible idea since any arbitrary string is more and more likely to occur randomly in a large binary file
pekster: So yea, that "feature" should be in a web proxy that knows what the MIME content of the arriving page is

it sounds like this feature actually needs to be killed off

in unrelated news, i need to file a bug report with the dev of IRC Explorer for not allowing me to copy large swathes of text...
 
The %s is a placeholder for a string substitution (and the rest of the statement was contained on the next line)

fprintf(fp, "-I FORWARD -p tcp --sport 80 %s -m string --string \"%s\" --algo bm -j REJECT --reject-with tcp-reset\n", timef, filterstr);

the first %s is replaced with the value of timef
the second %s is replaced with the value of filterstr

so this is just part of a loop that generates a rule for each value in the 'keyword_rulelist' variable (as can be seen in the code above that line)
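
To make that concrete, here is a minimal sketch - not the actual firewall.c code; the keyword array, the empty timef value and writing to stdout are just stand-ins - of what that loop effectively does for each entry in keyword_rulelist:

#include <stdio.h>

int main(void)
{
    /* hypothetical sample values; the real ones come from the
       keyword_rulelist nvram variable and the time settings */
    const char *keywords[] = { "yu-gi-oh", "pokemon", "sex" };
    const char *timef = "";     /* optional "-m time ..." clause, empty here */
    FILE *fp = stdout;          /* firewall.c writes to a rules file instead */

    for (size_t i = 0; i < sizeof keywords / sizeof keywords[0]; i++) {
        /* every TCP packet coming back from source port 80 is scanned for the
           raw byte sequence; a hit rejects that packet with a TCP reset */
        fprintf(fp, "-I FORWARD -p tcp --sport 80 %s -m string --string \"%s\""
                    " --algo bm -j REJECT --reject-with tcp-reset\n",
                timef, keywords[i]);
    }
    return 0;
}

Because the resulting rules string-match raw packet payloads, they apply to binary downloads just as much as to web pages.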

What words are you filtering on? Keyword filtering can also be pretty CPU intensive, so you may want to try and scale back the number of words.
 
I used the following keywords:
yu-gi-oh
yugioh
pokemon
naruto
porn
sex

The above keywords broke the download of almost every zip file larger than 2-3 megabytes that I tried.

I had used the same keywords on RT-N16 with Tomato 1.28 beta in a feature called Access Restriction -> HTTP Request. There were no issues back then.

Thanks.
 
I think the problem is that words like 'sex' and 'porn' have too few characters, making them far more likely to occur randomly in a binary stream.

Since the iptables string match has no way to tell a binary stream from a web page, this method of filtering is unreliable.
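
Some rough numbers back that up. If the bytes of a compressed download are treated as roughly uniformly random (a fair approximation for zip/gzip data), the expected number of accidental hits for a k-byte keyword in an n-byte stream is about (n - k + 1) / 256^k. A back-of-the-envelope sketch (the 20 MB size and the keyword sample are just illustrative):

#include <math.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    const double n = 20e6;                        /* a ~20 MB download */
    const char *keywords[] = { "sex", "porn", "naruto" };

    for (int i = 0; i < 3; i++) {
        double k = (double)strlen(keywords[i]);
        double expected = (n - k + 1) / pow(256.0, k);   /* expected chance hits */
        double p_hit = 1.0 - exp(-expected);             /* Poisson approximation */
        printf("%-8s expected hits: %.4f   P(at least one): %.1f%%\n",
               keywords[i], expected, 100.0 * p_hit);
    }
    return 0;
}

On those assumptions a single 20 MB compressed file has roughly a 70% chance of at least one accidental "sex" hit, while the longer keywords essentially never match by chance - so the 3-letter entry alone is enough to explain the broken downloads.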
 
Have you tried it on anything other than 378.50?

I just set your keywords and filter on for my fork running on an AC68R. Downloaded the zip of Merlin's github repository (1.1GB) without any problems.
 
I have not tried on anything other than 378.50. Has anyone else replicated my findings on RT-N66U with this firmware?

For now I will keep the keyword filtering disabled. I understand that this method of filtering has many shortcomings.
 
When I tried 378.50, adding anything to the URL filter or keyword filter broke my IPv6. I could ping and traceroute IPv6 addresses from the router (RT-N66U), but not from a PC. With the filter(s) disabled, IPv6 worked fine. No problems with 376.48_3 or the official firmware.
 
I just encountered the same problem with 378.55 (I never used 378.50).
Any large download fails. I previously used the URL filter with the stock firmware and this did not happen. I'm trying that instead to see what happens.
 
I have the same issue. It took me a while to discover it since I had made other changes in my network in the last few weeks. Web surfing seems unaffected, but larger files or long-running downloads were impossible to deal with. I had 378.52 and now 378.55 firmware with the same problem. It looks like disabling the keyword filter solves the problem.
Large files are not the only problem. Downloading Apple App Store files (even small apps), streaming video, and even downloading the firmware (30 MB) from MediaFire resulted in failed downloads.
After disabling the keyword filter I could successfully download the latest Merlin firmware from MediaFire and App Store files onto my network devices. I am still testing the results...
 
The URL filter works the way it's supposed to and causes no such issues. Even my App Store downloads were failing with the keyword filter.
 
Failed downloads are more likely to be either an ISP or a modem issue. Packet losses can cause stalls or dropped TCP connections.
 
In this case the ISP or modem was not the problem. To test, I switched to my old Linksys WRT54GS router and had no download problems.
 
I also had this problem (firmware 378.55), and solved it as well by disabling the keyword filter. Unfortunately I only found this thread after fixing the problem :)

Here are the symptoms I encountered and the steps I took (my provider is U-verse 6 Mbps):

1) At some point in the past I had added the keyword 'sex' to the filter to help block my 12-year-old.
I did not correlate this change with my downloads stopping, unfortunately.
Certain updates in the App Store on the iPad stopped working; so did large downloads - e.g. the Windows 10 ISO download did not work.

2) At first I thought it was a CDN issue between AT&T and the CDN routing, because if I was connected to my work VPN, or another VPN, it would work.

3) Ran Wireshark and traced a download - it would always stop in the same place, with resets being sent.

4) Took the router out of the loop and connected one machine to the modem directly. It worked.

5) I then put the router back in the mix, swapped my two ASUS routers out, and got the same failure symptoms.

6) I then ordered a 2210-02 modem off of eBay to replace the NVG510 AT&T sent me - same symptoms.

7) Finally started tinkering with all of the settings in the Firewall section of the router firmware, turning them all off - the downloads would then work. Then went back, turning on each individual setting with a retest, and narrowed it down to the keyword filter.

So it does seem that the keyword filter causes problems whenever the downloaded binary happens to contain the filtered sequence - which matches the resets seen in the Wireshark trace, since the generated rule rejects matching packets with a TCP reset.
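
For anyone who wants to double-check that correlation: fetch the same file over a path that works (a VPN, or a machine plugged straight into the modem) and look up where the filtered keyword first appears in it. If that offset roughly lines up with where the Wireshark capture shows the reset, the keyword filter is the culprit. A quick illustrative sketch (the file and keyword are whatever you pass on the command line; this is not anything from the firmware):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define CHUNK (1 << 20)   /* scan 1 MiB at a time */

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s <file> <keyword>\n", argv[0]);
        return 1;
    }
    const char *kw = argv[2];
    size_t klen = strlen(kw);

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    /* keep klen-1 bytes of overlap so a match straddling a chunk
       boundary is not missed */
    unsigned char *buf = malloc(CHUNK + klen);
    if (!buf) { fclose(f); return 1; }

    size_t have = 0;        /* bytes carried over from the previous chunk */
    long long base = 0;     /* file offset of buf[0] */

    for (;;) {
        size_t got = fread(buf + have, 1, CHUNK, f);
        size_t total = have + got;
        if (total < klen)
            break;

        for (size_t i = 0; i + klen <= total; i++) {
            if (memcmp(buf + i, kw, klen) == 0) {
                printf("first \"%s\" at byte offset %lld\n",
                       kw, base + (long long)i);
                free(buf);
                fclose(f);
                return 0;
            }
        }
        if (got == 0)
            break;

        /* slide the window forward, keeping the last klen-1 bytes */
        memmove(buf, buf + total - (klen - 1), klen - 1);
        base += (long long)(total - (klen - 1));
        have = klen - 1;
    }

    printf("\"%s\" not found\n", kw);
    free(buf);
    fclose(f);
    return 0;
}

Compile it with gcc, run it against the copy you fetched over the working path, and compare the reported offset with the point in the capture where the connection gets reset.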

I hope this helps someone out.

If someone comes up with a fix, the Win10 ISO download (64-bit) with the keyword filter "sex" is a good test case - it breaks quickly after the download starts; I forget the exact point, but it's around 10-20 MB in.

-DoubleAx
 
I'm running build 380.62_1 on my RT-AC68U and experienced the same issue. I've had the router for years but used the keyword filtering for the first time the other day. I blocked the words "sex" and "porn". After that, folks in my house couldn't download or update apps on their iPhones (it would get about halfway and then stop), and Netflix on various devices started getting the "Can't Play Title... Try again later" error. I was pulling my hair out trying to figure out what was going on, thinking it was an ISP issue, until I found this thread and it jogged my memory about the change I had made. Undid the keyword filtering and voila, all good again.

Seems crazy that this is still an issue after so long. In any case, I think a much better way to filter content is to use a DNS-based filtering service like OpenDNS. I just added their DNS IPs to the router and it filters out the bad stuff.
 
