Added a third WiFi antenna on 2x2 card


Dr Strangelove

Occasional Visitor
I have a TomatoUSB Linksys e4200v1 router.
It has 450Mbit 5GHz WiFi.

I have a laptop with a 2x2 Intel Wireless-AC 7260 WiFi card.
It's wired to the two WiFi antennas of the original (old) WiFi card and connects at 300Mbit.

It gets real-world throughput of ~120Mbit when positioned approx. 3 metres from the router.
The Ethernet in the laptop is only 10/100Mbit (no Gigabit).

In my laptop I have two 3G antennas which are unused, as I have no 3G modem in the laptop.

Is it possible to use one of the 3G antennas and create a third WiFi antenna for my 2x2 WiFi card in my Laptop?

Is this even possible considering the WiFi card is spec'd as a 2x2 card?

Thank you for any info/help.
 
You can't use a 3G/4G antenna. It is designed for a different frequency.

Where would you attach it on a 2x2 card?
 
Just doubling up on one of the existing connection points on the PCIe half card was my initial 'uninformed' thought. True, it only has two coloured 'chicken wires' attached.

Interesting. I'd read about antennas being 'tuned' to given frequencies, but had no idea this related to the 'wire' running around the edge of my notebook monitor/screen.

I have had my notebook screen apart before, and it just looked like a couple of wires in 'cardboard' running around the edge of the screen.

They are tuned in some way?

I might be learning something I never knew. :D
 
Yes. Antennas are "tuned" for different frequency bands. You also can't just stack antennas as you propose.

For 3x3 operation, you need a 3x3 card and three dual-band 2.4/5GHz antennas.
 
Doing more reading now, I note that the 2.4GHz band corresponds to a wavelength of approx. 12.5cm.

I have two cables attached to the PCIe WiFi half card, which does both 2.4GHz and 5GHz with a connection speed of 300Mbit using 40MHz channels.

Does this mean the other cable is for 5GHz, which would have to be a different length again, or do they just 'average' the cable length for both?

Just wondering, as I don't quite understand how 2x2 (two streams) and two frequencies could all get along so well if 5GHz and 2.4GHz require different cable lengths, and yet I'm still able to get a 300Mbit connection speed on 2.4GHz or 5GHz with 40MHz channels.

Sorry, I'm one of those people that 'need' to know things, else I go nuts :D
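For the curious, the 12.5cm figure can be double-checked with a quick back-of-the-envelope calculation. This is just a sketch in plain Python; the band-centre frequencies used are my assumption:

```python
# Free-space wavelengths for the two WiFi bands, to illustrate why a
# single antenna element can be made to resonate in both.
C = 299_792_458  # speed of light in m/s

def wavelength_cm(freq_hz):
    """Full free-space wavelength in centimetres."""
    return C / freq_hz * 100

wl_24 = wavelength_cm(2.4e9)  # ~12.5 cm, matching the figure above
wl_55 = wavelength_cm(5.5e9)  # ~5.5 cm, roughly half the 2.4GHz value
print(f"2.4 GHz: {wl_24:.1f} cm, 5.5 GHz: {wl_55:.1f} cm")
```

The 5GHz wavelength coming out at roughly half the 2.4GHz one is what makes dual-band elements practical.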
 
Sorry, but last reply. Cables are different from antennas. Your notebook has two dual-band antennas; they operate in both the 2.4 and 5 GHz bands.

To learn how, Google or hit Wikipedia.
 
Yerp. Already reading "Radio Frequency and Antenna Fundamentals".

Sometimes you just have to find the right question before you can start finding answers. :)

Thank you, I should be able to work it all out now.
 

The 3G/4G antennas are tuned for 850/900/1900/2100 MHz - a totally different purpose.

In any event, inside the laptop you already have what you need: two antennas covering 2.4GHz and 5GHz.

sfx
 
Two antennas wired to the same source greatly change the impedance and matching. This causes a poor VSWR (voltage standing wave ratio) - essentially a measure of the power reflected back from a mismatched antenna (pair). Likewise, an antenna intended for a different frequency band will have a poor VSWR. In both cases, the end result is weaker signal on both TX and RX.

It's kind of like connecting a skinny pipe to a fat pipe.
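The relationship between VSWR and reflected power can be sketched numerically. A minimal illustration (plain Python, standard textbook formula):

```python
# Fraction of power reflected back for a given VSWR.
# Reflection coefficient magnitude: |gamma| = (VSWR - 1) / (VSWR + 1);
# the reflected power fraction is |gamma| squared.
def reflected_fraction(vswr):
    gamma = (vswr - 1) / (vswr + 1)
    return gamma ** 2

print(reflected_fraction(1.0))  # perfect 1:1 match: 0.0, nothing reflected
print(reflected_fraction(3.0))  # 3:1 mismatch: 0.25, a quarter bounces back
```

So a badly mismatched antenna can throw away a sizable chunk of transmit power before it ever radiates.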
 
Yup.

As for dual-band antennas for WiFi, a perk is that the two bands in common use (~2.4-2.49GHz and ~5.2-5.9GHz) are reasonably close to a factor of two apart in frequency, so a wavelength in the 5GHz band is roughly half a wavelength at 2.4GHz. This makes it easier to build a single antenna, based on a half/quarter-wavelength dipole design, that has good gain in one band and okay gain in the other. The thing is, it will NOT work equally well in both bands. Generally it is tuned for 2.4GHz and you get somewhat lower (~2-3dBi less) gain in the 5GHz band. Still, it'll work decently in both, just a bit better in one than the other.

A cell antenna is designed for completely different frequencies, and it actually does NOT work equally well in all of them. This is one of the add-on conundrums of cell technology and the huge spread of frequencies that telcos have to work with: an antenna that has 2-3dBi gain at 850MHz might have -2dBi - as in NEGATIVE gain - at 2600MHz. On top of that, the higher frequencies that some carriers, like Sprint, have to work with (2600MHz) penetrate just as poorly as 2.4GHz WiFi.

Of course, the difference there is much higher-gain antennas on the cell towers: each antenna covers a sector instead of having to be omnidirectional, the tower can have a separate antenna for each band (tuned for exactly that frequency), and the towers (and cell phones) typically run much higher transmit power than WiFi. Your cell phone isn't going to have a 1W transmitter, but it is likely higher power than your WiFi radio by a factor of two or more, and the transmitter on the cell tower might be 100x more powerful than your phone's - which does mean very lopsided TX/RX abilities over cellular, as the cell tower is listening to a very faint whisper from your phone and responding with a bullhorn.

Anyway, if you want faster speeds, either get an 11ac-class router to go with your 11ac-class WiFi card, or try different drivers. (I have had a lot of problems with the Intel 7.x.x.x series drivers for my 7260ac card; I had to use the earlier 6.x.x.x series drivers with my router, otherwise it was VERY slow.)

By comparison with my TP-Link WDR3600 at 3m away with my Intel 7260ac and Windows 8.1 I get ~200Mbps on 5GHz and ~180Mbps on 2.4GHz, both at a 300Mbps link rate.
 
2 or 3 or even 5 dBi of antenna gain is a nit. The path loss in home WiFi is commonly 60dB or more, so antenna gain as a fraction of the path loss is almost negligible.

A 12dBi omni antenna (4 ft. long) is significant. But its vertical beamwidth is just 7 degrees or so. It's best for flat terrain with no tall buildings. Don't put it on the roof.

Horizontally directional point-to-point antennas (like Yagis and patch antennas) are, IMO, the only types that are really useful in WiFi.
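To put that 60dB figure in context, here is a minimal free-space path loss sketch (plain Python; real homes with walls and furniture lose considerably more than free space):

```python
import math

# Free-space path loss (FSPL) in dB: 20*log10(4*pi*d / lambda),
# with distance d in metres.
C = 299_792_458  # speed of light in m/s

def fspl_db(distance_m, freq_hz):
    wavelength = C / freq_hz
    return 20 * math.log10(4 * math.pi * distance_m / wavelength)

# At 2.4 GHz, 10 m of pure free space already costs ~60 dB, so a
# 2-3 dBi antenna tweak is a small slice of the overall budget.
print(f"{fspl_db(10, 2.4e9):.1f} dB")
```

Note the 20*log10(d) term: each doubling of distance adds ~6dB of loss, which is where the range-doubling rule of thumb later in the thread comes from.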
 
I'd disagree to the extent that in medium-high to high attenuation situations, often just a few dBi of extra gain can be the difference between linking at, say, 54Mbps and 24Mbps. Within a home/office it's probably unlikely that a 2-4dB difference in signal strength will decide whether you can establish a connection at all, but it could make a reasonable difference in actual throughput, depending on a lot of other factors.

From personal experience, I think it matters most with outdoor APs and/or your base station. With a client device the orientation can vary so much that you want a fairly low-gain, highly omnidirectional antenna, like a regular half-wavelength (i.e. ~2.3dBi) dipole. That way, within reason, most orientations give roughly the same gain/loss.

On a base station, though, I feel you always want the highest-gain antennas suitable for the setup - so long as you don't mind the visual impact big antennas can have. A 14dBi omni inside a house is obviously ridiculous and wouldn't work well. However, 7-9dBi is probably optimal if it is solely for use on a single floor of a house, and 3-5dBi if you want to try to cover more than one floor - which of course is already not a good idea, but if you are going to try.

For outdoor APs I say go as big as you can, especially if, as you pointed out, it is flat coverage. With my setup I could probably justify 9dBi omnis. I don't think I could go higher gain, because my property isn't overly flat, and near the AP - since it is hanging down from the eaves of my 1st-story roof - the gain would probably be really crummy even up close.

With an outdoor AP it's generally line of sight, so every 6dB increase in antenna gain would net a rough doubling in range (assuming you are not stepping down transmit power).
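The 6dB-per-doubling rule follows directly from free-space loss growing as 20*log10(distance). A one-liner check (plain Python):

```python
# Under line-of-sight propagation, path loss grows as 20*log10(d), so an
# extra margin of X dB stretches usable range by a factor of 10**(X/20).
def range_multiplier(extra_db):
    return 10 ** (extra_db / 20)

print(f"{range_multiplier(6):.2f}x")  # ~1.99x, i.e. roughly double
```

The same formula says 3dB of extra gain buys only about a 1.4x range increase, which is why a few dBi matters less than it sounds.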
 

If 3dB more antenna gain makes it work - great. But that says there's not much more than 3dB of link "margin" to accommodate fades and orientation changes. RF engineering goes for much larger margins.
 
Sure, but more signal means more speed in most cases. It's less margin for orientation changes and fades, but you'd still be better off than if you had 3dB less signal strength to start with.
 