Is Your Router's Transmit Power Juiced?

Let's break this down and decode the weasel words...

All devices (AP or client) operating in any U-NII band must be secured to prevent unauthorized software modification and to ensure they operate as approved, preventing harmful interference.

If a client or AP operates in the U-NII-2 band - channels 52 through 64 - then the device must not allow third-party firmware/software changes.

(It's not clear if this includes the U-NII-2 Extended channels, 100 through 140, but my guess here is yes.)

It does not change anything about U-NII-1 (channels 36 through 48) or the U-NII-3/ISM band (149 through 165) with regard to the item mentioned for U-NII-2.
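
As a rough reference for the channel ranges above, here's the mapping in Python - treat the grouping of the U-NII-2 Extended channels as my assumption, per the guess above:

```python
# Rough U-NII band/channel map for 5 GHz Wi-Fi, using the ranges above.
# Whether UNII-2e falls under the same lockdown rule is a guess (see above).
UNII_BANDS = {
    "UNII-1":  range(36, 49, 4),    # channels 36, 40, 44, 48
    "UNII-2":  range(52, 65, 4),    # channels 52-64 (DFS required)
    "UNII-2e": range(100, 141, 4),  # channels 100-140 (DFS required)
    "UNII-3":  range(149, 166, 4),  # channels 149-165
}

def band_for_channel(ch: int) -> str:
    """Return the U-NII band a 5 GHz channel number falls into."""
    for band, channels in UNII_BANDS.items():
        if ch in channels:
            return band
    return "unknown"

print(band_for_channel(60))   # UNII-2 - the lockdown item applies
print(band_for_channel(36))   # UNII-1 - unaffected by that item
```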

The exact methods used to secure the software are left to the manufacturer, but must be documented in their application for equipment authorization to the FCC. The FCC is not setting specific technical security requirements since they are likely to change over time, but rather defining the capabilities that should be implemented by manufacturers.

The OEM must state that the software may not be altered by third parties, and state how this is done - and they leave it open for the OEMs to sort out.

This is interesting - there are a number of ways to do this (Secure Boot, signed code, etc.), and much of this also conflicts with certain provisions in GPLv2/GPLv3. For 802.11 chipsets that support the bands in question and run a split MAC, this is more of a challenge from a practical perspective; the alternative approach would be to implement the MAC and baseband code, along with the RF, in hardware, bypassing the GPL restrictions for devices and code that link to and utilize GPLv2/GPLv3 libraries.

GPLv3 is more of a problem, BTW...
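
To make the "signed code" option concrete, here's a minimal sketch of detached-signature firmware verification, assuming an Ed25519 key pair and Python's `cryptography` package - this is just the shape of the technique, not any vendor's actual implementation:

```python
# Minimal sketch of signed-firmware verification (the "signed code" path).
# Assumes the `cryptography` package; key handling is purely illustrative
# and does not reflect any particular vendor's implementation.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: sign the firmware image with a private key kept offline.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()   # this part gets baked into the bootloader
firmware_image = b"...firmware blob..."
signature = private_key.sign(firmware_image)

# Device side: the bootloader refuses to flash anything that fails to verify.
def flash_allowed(image: bytes, sig: bytes) -> bool:
    try:
        public_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False

print(flash_allowed(firmware_image, signature))           # True
print(flash_allowed(firmware_image + b"mod", signature))  # False
```

This is exactly the kind of scheme that runs into GPLv3's anti-tivoization provisions mentioned above.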

They do make note that more detailed security requirements may be necessary later as software-defined radio technology develops.

In other words, from an extreme perspective, there be dragons here - but being practical about things, "may" means "perhaps", and here, likely not...

They also declined to implement rules that would force manufacturers to render a device inoperable if unauthorized modifications were made, citing additional complexity and costs with questionable benefits beyond the software security being mandated.

Yep... because the FCC can define rules, but it cannot define implementation.

So it sounds big and scary - but perhaps not as much as one would suspect.

sfx
 
Here's the discussion from the R&O:

Because the current and future use of the 5 GHz U-NII bands is heavily reliant on the successful implementation of the Commission's technical rules, the Commission proposed to require that manufacturers implement security features in any digitally modulated device capable of operating in any of the U-NII bands, so that third parties are not able to reprogram the device to operate outside the parameters for which the device was certified.

Because 5 GHz U-NII devices are able to operate across such a wide swath of spectrum, any device could potentially be reprogrammed to operate outside of its certified frequency range. Accordingly, the Commission adopted its proposal in the NPRM that manufacturers must take steps to prevent unauthorized software changes to their equipment in all of the U-NII bands. It leaves the precise methods of ensuring the integrity of the software in a radio to the manufacturer, but requires the manufacturer to document those methods in its application for equipment authorization and declines to set specific security protocol or authentication requirements at this time, so as not to hinder the development of the technology used to provide such security, or to be unduly burdensome on manufacturers.

The Commission acknowledges that it may have to specify more detailed security requirements at a later date as software-defined radio technology develops. The Commission directed OET to provide guidance, through the Knowledge Database (KDB), on what types of security measures work effectively and what types do not, as well as on the level of detail the FCC will typically need to evaluate the authorization request.

The Commission reiterated its observation in the NPRM that some radios are designed so that they can communicate directly with each other, rather than through a control point, and thus they could function as either a “master” that initiates a network or as a “client” device within the network. The Commission also believes that it is important to ensure that client devices cannot be unlawfully reprogrammed to perform the functions of an access point. Thus, the Commission concludes that all devices that operate under the U-NII rules must be subject to the device security requirements.

The Commission believes the enhanced security measures will be effective, and concludes that there is no need for a reactive scheme such as disabling devices that are modified or tampered with, as urged by some commenters. The Commission intends to enforce its security protocol requirement carefully and vigorously.

Transmitter ID. The Commission declines to require U-NII devices to transmit identifying information. While the Commission's experience in the field has indicated that a transmitter ID requirement would help to more quickly identify and locate devices that cause harmful interference, the Commission is not persuaded that the benefits accrued from such a requirement would outweigh the costs to implement it at this time. One of its primary goals throughout this proceeding is to prioritize eliminating the occurrence of harmful interference in the first instance. The Commission's adoption of enhanced security requirements directly addresses this priority, whereas a transmitter identification requirement does not. However, if harmful interference continues to be a problem, the Commission will reevaluate the costs and benefits associated with a transmitter ID requirement, recognizing that it may be necessary to implement more costly solutions if devices operating in the band continue to cause harmful interference.
 
You would know better than I. What does ASUS do?

At this point, very little. The firmware code is open-source (except for the usual proprietary bits). Currently, when a router is first configured, the regional settings are copied from the bootloader into nvram. When the firmware code initializes the wireless radio, it reads the regional information from nvram, and configures the wireless interface (through the closed-source wireless driver) based on the region set in nvram.

So right now, nothing prevents someone from forcing a US router to use a different region that allows additional channels (such as the Japan region for the 2.4 GHz band) or different power limits. Those are enforced by the wireless driver based on the selected region. Unlike, for example, a DVD player, you can change your device's region.
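
A toy model of that flow, to make it concrete - the nvram variable name and the regulatory numbers here are invented for illustration; the real enforcement lives in the closed-source driver:

```python
# Toy model of the boot flow described above: the region is seeded into
# nvram, and the driver applies per-region limits when the radio comes up.
# Variable names and limit values are invented for illustration.
REG_LIMITS = {
    "US": {"channels_24ghz": range(1, 12), "max_dbm": 30},  # ch 1-11
    "JP": {"channels_24ghz": range(1, 14), "max_dbm": 20},  # ch 1-13
}

nvram = {"region": "US"}  # copied from the bootloader at first configuration

def init_radio(nvram: dict) -> dict:
    limits = REG_LIMITS[nvram["region"]]   # nothing stops nvram being edited
    return {"region": nvram["region"],
            "channels": list(limits["channels_24ghz"]),
            "tx_power_dbm": limits["max_dbm"]}

print(init_radio(nvram))       # US limits
nvram["region"] = "JP"         # the region change described above
print(init_radio(nvram))       # extra channels now show up
```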

So, if the FCC requirements were to force manufacturers into ensuring that a router couldn't operate in an illegal region (thus bypassing US restrictions), that would imply a few possible scenarios:

1) Prevent any kind of third-party firmware from being flashed. This means a signed bootloader, and the end of open-source firmware replacements - at least for devices sold in the US.

2) Just ensure that the region cannot be modified. That could be done by having the closed-source wireless driver read its region setting from a read-only location (for example, the bootloader) rather than from nvram. As long as the driver remains closed-source, even third-party firmware authors would be limited by this (except those who own an actual SDK, like the DD-WRT developers) - see the sketch after this list.

3) Or put the burden strictly on the chip manufacturer, which means making the radio's own baseband/firmware code impossible to replace. Or perhaps have different SKUs for US-targeted products, where the wireless controllers would be hardcoded to match US requirements.
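
Here's what scenario 2 might look like, continuing the toy model above - all names invented:

```python
# Scenario 2 sketch: the driver takes its region from a read-only source
# (standing in for a factory-burned bootloader value), so editing nvram
# no longer changes anything. FACTORY_REGION is an invented name.
FACTORY_REGION = "US"  # imagine this lives in write-protected flash

def init_radio_locked(nvram: dict) -> str:
    # nvram["region"] is deliberately ignored here
    return f"radio configured for region {FACTORY_REGION}"

nvram = {"region": "JP"}          # tampering with nvram...
print(init_radio_locked(nvram))   # ...has no effect on the region
```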

At this point, it's all very vague, and maybe there's really nothing to worry about. But there's enough precedent to make me worried about what might happen down the road.
 
Thanks for the comments, gentlemen.

RMHC: I have always wondered how manufacturers get away with exposing region controls to users. Can you provide any insight?
 
Related: There's not much to stop 2.4GHz product users from changing the region code and using that upper channel that is too close to FAA radars to be used.
 
Even if a radar frequency is used, would it really cause an issue, given that wifi range is so short? It seems more likely that if those frequencies are being actively used by the government, the background noise on those channels will be high enough to degrade wifi performance. Think of it like using an FM transmitter: it broadcasts on licensed spectrum, but you're unlikely to see someone try to broadcast on the same frequency as the high-powered transmissions from major radio stations - you won't overpower them unless you're touching their antenna, and you'll simply degrade your own audio quality.
 
Even if a radar frequency is used, would it really cause an issue, given that wifi range is so short? <snip>

The issue with weather radar is that it's not just transmitting - Doppler radar receives its signal back, and it's the overpowered wifi signal that interferes with the Doppler radar's reception. So increasing the power on those channels would cause more issues for the radar, not the local wifi.
 
Even if a radar frequency is used, would it really cause an issue, given that wifi range is so short? <snip>

It's not OK to ignore the regulations. The WiFi gear is the offending transmitter if used on prohibited frequencies or at higher than permitted power.
The odds may be slim, but there's an important safety reason for all of this.
You wouldn't shine a laser at an airliner, would you? Those are only 2mW.
 
Will the radar really pick up a wifi access point even if the signal is at a minuscule level like 500-1000mW? (Not that someone would want to use a channel where some radar or other high-powered transmitter is spewing out a ton of noise.)
 
It depends. Everyone raises the noise floor for the radar, even when operating far from it. You also have the issue of distance. It could be a 10kW weather radar, but it might be trying to pick up a signal reflected from rain 50km away, whereas your 40-200mW offender might be located 1km from the weather radar and would likely be much stronger than the return signal the radar is looking for.

Or you could be 30km away and be well below the existing noise floor.

That is why the 5GHz U-NII range has a DFS requirement. You might be operating just fine with your gear running channel 14 all day long blasting out 1W of Tx power... or even with an illegal amp pumping out 4-10W, and it might impact NOTHING. Or you might be blinding or corrupting a radar even at 25mW of Tx power.
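
To put rough numbers on that distance argument: one-way free-space path loss is FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44. Ignoring antenna gains and the radar's two-way echo path (so these numbers are illustrative only):

```python
import math

def fspl_db(d_km: float, f_mhz: float) -> float:
    """One-way free-space path loss in dB."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

def rx_dbm(tx_dbm: float, d_km: float, f_mhz: float) -> float:
    """Received power, ignoring antenna gains on both ends."""
    return tx_dbm - fspl_db(d_km, f_mhz)

F_MHZ = 5600.0  # a DFS channel in the neighborhood of C-band weather radar

# The 200 mW (23 dBm) offender from the example above, at two distances:
print(f"Wi-Fi at  1 km: {rx_dbm(23, 1, F_MHZ):7.1f} dBm at the radar")
print(f"Wi-Fi at 30 km: {rx_dbm(23, 30, F_MHZ):7.1f} dBm at the radar")
```

Roughly -84 dBm versus -114 dBm: the weather return the radar is hunting for can sit around or below those levels, so the nearby offender can swamp it while the distant one vanishes into the noise floor.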

It would be nice to see a DFS accommodation for the higher 2.4GHz channels. It would allow two non-overlapping 40MHz channels with DFS, or four non-overlapping 20MHz channels. I suspect most transmitters COULD safely operate on channel 14 without causing an issue, but most is not all.

That, or find a way to accommodate spectrum below channel 1. More 5GHz is nice and I think the way to go, but I think there is much to recommend adding even the small amount of spectrum needed for 80MHz of frequency in the roughly 2.4GHz range... even if it is non-contiguous and it's really some stuff down around 2300MHz (though 80MHz contiguous would be better than 80MHz non-contiguous). I am also intrigued by the 3.6GHz range in terms of how much there is, what is around there, and just how well it penetrates. I assume not as well as 2.4GHz, but better than 5GHz.

Yeah, sure, it's just another radio to pack into something, but seriously, cellular modems these days are up in the pentaband range (higher now, maybe?) - what's three bands? Especially if it adds more bandwidth, and possibly better-penetrating bandwidth than 5GHz.
 
That also reminds me: if some wifi adapters can detect those signals and automatically switch channels, why don't those companies ever make the wifi adapters send raw data to the system? That would make an awesome 2.4 and 5GHz software-defined radio.

e.g., an RTL-SDR for 22MHz - 2.2GHz, and then a wifi adapter for the wifi spectrum.

With the way wifi is going, especially with upcoming routers attempting to add an additional 5GHz radio, it just seems like they are going to have to come up with a way to add more wifi channels that are not crippled.
 
True-ish. One of the issues with SDRs, outside of a certain range of frequencies, is that antenna gain is dependent upon the frequency.

So it is hard/impossible to design a single antenna to rule them all.

So you COULD make an antenna that works reasonably well at 2.4GHz, would work not as well but probably okay at 5GHz, and would be crap at 900MHz.

Part of why one should really ask: that 7dBi dual-band antenna... which band is the gain at, 2.4 or 5GHz? Because the gain will not be the same at the non-optimized frequency.

Just look at cell antennas that work across a very large frequency range, from 700MHz to 2.4GHz. Often, if you look at the gain by frequency, they have NEGATIVE gain at some frequencies.

I think, if you wanted a decent setup, you would be getting into multiple antennas per radio, so that it can switch between antenna paths based on the frequency range the SDR is tuned to.

This may also be why 3.6GHz isn't used for wifi at all (WISP only, AFAIK) and why you also don't see 900MHz lumped in with wifi (again, WISPs and some other custom uses, but it isn't a standard). AFAIK, one of the advantages of 2.4GHz plus the 5GHz range (which is actually some of 5.2-5.9GHz) is that the wavelengths are close enough to a doubling going from 5.2-5.9GHz down to 2.4GHz that you can design a dual-band antenna with good gain in one band and okay gain in the other.

Utilizing something like 900MHz or 3.6GHz would require separate antennas to get reasonable gain.
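
A quick way to see that near-2:1 relationship is to compare idealized half-wave dipole lengths (free-space numbers, no end-effect trimming):

```python
# Idealized half-wave element lengths; real antennas are trimmed shorter,
# but the ratios are what matter for the dual-band argument above.
C = 299_792_458  # speed of light, m/s

def half_wave_cm(f_ghz: float) -> float:
    return (C / (f_ghz * 1e9)) / 2 * 100

for f_ghz in (0.9, 2.4, 3.6, 5.5):
    print(f"{f_ghz:>4} GHz: half-wave element ~ {half_wave_cm(f_ghz):5.2f} cm")
```

An element cut for 2.4 GHz (~6.2 cm) is roughly a full wavelength at 5.5 GHz, a harmonic that can still be made to radiate decently; the same element is electrically short at 900 MHz and badly off at 3.6 GHz, which fits the point about those bands wanting their own antennas.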

It's a total stab in the dark. It might have nothing to do with that, or I might just be flat-out wrong. My antenna theory on things like QWL/HWL, etc., is soft at best.
 
There are a zillion doable things. But without standards like IEEE 802.11, consumers would be stuck with costly proprietary products. Enterprise customers who need functions beyond those in 802.11 (like fast secure roaming and controller-managed WiFi) get roped into suppliers like Cisco and Aruba.

For consumers, for the sake of interoperability of products, the least common denominator advances slowly.

More spectrum: Yes, there's a move on 3.6GHz and others. But the US FCC (and others) lusts for the $$$ from auctioning spectrum to carriers, who then charge us, the customers, for those auction costs ($Billions and $Billions). Think about it: the US Government is claiming to own, and thus selling, the ether waves that nature gave us.
 
Well, some of it, like 3.6GHz and 900MHz, is usable right now. Heck, 900MHz is ISM just like the 2.4GHz that we use now. I don't believe the requirements are the same as for the 2.4GHz chunk (though maybe they are), but it is all unlicensed and could be used for wifi. It also often is used for unlicensed point-to-point links. I know the big issues with 900MHz are both limited spectrum (I think only something like 26MHz is open there, but I am too lazy to look it up) and noisy spectrum, because there are so many 900MHz devices and 900MHz penetrates obscenely well.

3.6GHz I know has some encumbrances on it, but it is available for unlicensed use and is sometimes used by WISPs.

I realize things take time, but I am curious if there will be a push at some point to actually start using some of the other unlicensed spectrum out there.

White space looks like it is going to be opened up for unlicensed usage, though I suspect that'll definitely be WISP and P2P-link territory.

Even if using lower frequencies as part of a standard chipset in consumer devices isn't ever really a thing, I would be curious whether something like 3.6GHz might be viable to "add" to wifi standards. No idea on penetration/absorption or anything else like that; I assume it is roughly between 2.4 and 5.2-5.9GHz. Something better than 5GHz but worse than 2.4GHz could be a lot of benefit in some ways: 2.4GHz sometimes penetrates too well (communal living, i.e., apartment complexes and townhouses) and 5GHz often doesn't penetrate well enough (for people who don't live in one-room studio apartments)... so something in the middle might be nice.

Of course, bandwidth would be the pressing question: how much, and contiguous or not. From what I see, channels 131-138 are available, so 40MHz of bandwidth... which isn't much, but offered as 20MHz-only you'd have two non-overlapping wifi networks possible - it's an option (every little bit helps and all that) - or even allow 40MHz; with worse penetration than 2.4GHz, that might not be as big a deal.
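
Assuming the generic 802.11 numbering (center = 3000 MHz + 5 MHz × channel) applies to these channels - worth double-checking against the actual 802.11y channel plan, which may offset some centers - they land like this:

```python
# 3.65 GHz channel centers, assuming the generic 802.11 numbering of
# center = 3000 MHz + 5 MHz * channel. The actual 802.11y plan should be
# checked against the standard; treat these values as approximate.
def center_mhz(ch: int, base_mhz: int = 3000) -> int:
    return base_mhz + 5 * ch

for ch in range(131, 139):
    print(f"channel {ch}: ~{center_mhz(ch)} MHz")
# Spans roughly 3655-3690 MHz inside the 3650-3700 MHz allocation.
```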

http://en.wikipedia.org/wiki/IEEE_802.11y

Those are pretty steep transmit powers... even if it had to be low power, the FCC could open it to unlicensed operation that must accept interference from licensed operators, keeping unlicensed use low power - like 250mW or something (maybe even indoor use only? *cough* *cough*).

Anyway, it's something I hope the FCC thinks about, as it doesn't look like there is much more spectrum out there without a big incumbent player on it, beyond what has already been opened at 900MHz, 2.4GHz and 5.2-5.9GHz (until you start talking about the tens-of-GHz bands). Some spectrum in the 600MHz range should be opening in the next few years from the UHF auctions. The FCC might also give ~90-100MHz in the 5.2-5.9GHz range, depending on the decision on "car networking", and if DFS is ever truly implemented, some of those DFS 5GHz channels could be more widely used, since most gear can't/won't use them now.

That is about it though.
 
900MHz FCC regulations do not permit operating with 20MHz per channel as in 2.4GHz. Indeed, the US/Canada 900MHz ISM band is 902-928MHz. Not enough room for two 802.11-like channels. And the upper few MHz are not usable due to old high power paging systems at about 930MHz.

The major use of the 902-928MHz band is for countless SCADA (telemetry) systems for utilities like water/power. They don't transmit often but it's important data.

Hams can legally use high power in part of that band, but it's not popular.
 
900MHz FCC regulations do not permit operating with 20MHz per channel as in 2.4GHz. <snip>

There's been work on 802.11ah, which brings 802.11 down into the 900MHz band using 1MHz/2MHz channels for Internet of Things initiatives - 11ah also supports 4, 8, and 16MHz channels for countries that have more bandwidth allocated in that band (in the US, it's 26MHz).

The link budget is different from what we see in the A/B/G/N/AC arena, and the MAC is optimized for power efficiency and link utilization. The other interesting thing, compared to existing 802.11 deployments, is that the MAC is scheduled, which allows it to scale compared to where we are presently.

Draft 2.0 should be out soon (i.e. mid-2014).
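
As a quick tally of how those 11ah channel widths fit in the US allocation (pure division - this ignores the draft's actual channel plan and any guard spacing):

```python
# Upper bound on non-overlapping 802.11ah channels per width in the
# 26 MHz US 902-928 MHz allocation. Ignores the draft's real channel
# plan and guard requirements, so these are best-case counts.
BAND_MHZ = 928 - 902  # 26 MHz

for width_mhz in (1, 2, 4, 8, 16):
    print(f"{width_mhz:>2} MHz channels: {BAND_MHZ // width_mhz}")
```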
 
(I suppose we should stop this chat as it's gone off-topic - or start a new thread.)

I've read about 11ah for 900MHz. I did a project years back with Alvarion's BreezeNet 900. It is/was a 2MHz channel width, and had to be frequency-hopping to be FCC type-certified in that band. At ideal SNR, it got about 1.25Mbps net yield. At slower speeds (it was rate-adaptive), it got about 250Kbps at 6 miles, from 100 ft. up in a lighthouse ashore to a police boat. Ashore was a long yagi, maybe 5 ft. long, about 11dBi as I recall. On the boat was about a 5dBi omni. The yagi was fixed at a mooring location. Elsewhere, 900MHz "access points" were placed on high spots like tall roofs, bridges, etc., to provide wide-area coverage - like a whole harbor.

Those antenna sizes are due to the low frequency. At 2.4GHz, same-sized antennas would have had double the gain, more than offsetting the higher propagation loss.
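
That antenna-size observation falls straight out of the Friis relation: a fixed physical aperture has gain proportional to f², and free-space path loss also grows as f², so with same-sized antennas on both ends the link actually improves as frequency goes up. A hedged, idealized sketch (aperture efficiency and real-world propagation ignored):

```python
import math

def f_squared_delta_db(f1_ghz: float, f2_ghz: float) -> float:
    """dB change of an f^2-proportional quantity going from f1 to f2."""
    return 20 * math.log10(f2_ghz / f1_ghz)

f1, f2 = 0.9, 2.4  # the 900 MHz link above, rebuilt at 2.4 GHz

gain_per_antenna = f_squared_delta_db(f1, f2)  # fixed aperture: G ~ f^2
extra_path_loss = f_squared_delta_db(f1, f2)   # free-space loss ~ f^2

net_db = 2 * gain_per_antenna - extra_path_loss
print(f"gain change per same-size antenna: +{gain_per_antenna:.1f} dB")
print(f"extra free-space path loss:        +{extra_path_loss:.1f} dB")
print(f"net link budget change:            +{net_db:.1f} dB")
```

With same-size antennas on both ends, the two +8.5 dB gain increases outrun the +8.5 dB of extra path loss, matching the observation above.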

So there will be some expectation-setting needed for 11ah in 900MHz hopping. But it is a very good thing to finally have an IEEE/ISO standard for inter-vendor interoperability in that band - there never has been one.
 
I really don't see the reason behind the 17dBm limit on certain 5GHz channels.

Wait. What? Some channels have higher transmit power limits than others? Is this true for 2.4GHz as well? Is this documented somewhere?

I live on 4 acres. I can "see" my neighbor's wifi - sometimes. Channel overlap is not an issue for me. So if some channels allow/provide more transmit power, I want to be on THOSE.

Is this documented somewhere or did I read this wrong?
 
You're quoting old posts in which the information has changed, AFAIK.
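
For context on both the question and the "information has changed" point, here's the rough before/after of the 5 GHz power limits as I understand the 2014 R&O - hedged numbers, worth verifying against the actual Part 15 rules:

```python
# Rough, hedged picture of FCC 5 GHz maximum conducted power for APs,
# before and after the 2014 R&O. Verify against the current Part 15
# rules before relying on any of these values.
POWER_LIMITS_DBM = {
    #  band      (before, after)
    "UNII-1": (17, 30),  # 50 mW -> 1 W; the old 17 dBm limit asked about
    "UNII-2": (24, 24),  # 250 mW, DFS required
    "UNII-3": (30, 30),  # 1 W
}

for band, (old_dbm, new_dbm) in POWER_LIMITS_DBM.items():
    print(f"{band}: {old_dbm} dBm before, {new_dbm} dBm after the R&O")
```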
 
