It's been a while since I posted.
Last year we had a discussion here in So AZ about channel settings. We decided at that time to use CH -2 with a channel width of 10 MHz. 20 MHz was brought up by someone to which we responded that to do so on CH -2 would put us operating out of band. Since the ham spectrum for 13 CM is 2.3 GHz – 2.45 GHz, the issue must be with the upper limit fudging into part 15 space, which starts at 2.4 GHz if my data is correct. So, really, it's an out of the ham only spectrum issue, not an out of band issue, but still legal if we stay within the part 15 power/EIRP limits. Correct me if I'm wrong.
So another discussion came up recently which caused me to investigate 2.4 GHz wi-fi channel width. I read that each 2.4 GHz channel is 22 MHz with the CH 1 center freq at 2.412 and the lower edge at 2.401. This got me to thinking about CH -2, and -1. It looks like using CH -2 at 10 MHz would still put the upper edge into the part 15 space by 2 MHz. If that's the case, from what I'm reading, even at 10 MHz we'd have to abide by part 15 and the only way to get around the part 15 power limitations would be to set the channel width to 5 MHz. However, that upper 2 MHz would be quite a bit lower in peak power compared to the center frequency. How does that play into the law?
I'd like confirmation or correction on these suppositions.
--
Bob, W7REJ
The USA 2.4 GHz ham band is divided into two parts, an upper and a lower; the upper segment (which AREDN uses) is 2390 MHz to 2450 MHz.
It isn't a Part 15 conflict, just a Part 97 band edge issue.
It's the lower bound of this upper segment you are contending with, and yes, the actual occupied width is 22/11/5.5 MHz, plus you want to leave a guard space from the edge.
That's right, the bottom edge is 2390, so now I remember the out-of-band issue. But that doesn't address the fact that even on CH -2 at 10 MHz the upper edge moves 2 MHz into Part 15 territory. Amateur radio is primary from 2390 - 2400 MHz, secondary from 2400 - 2402 MHz, and primary again from 2402 - 2417 MHz. A tiny 2 MHz is screwing things up, or am I wrong?
Since we're secondary from 2400 - 2402, doesn't the "thou shalt not cause interference" rule apply, meaning we must limit our power/EIRP to Part 15 rules? It seems this is still an issue for operating CH -2 at anything above a 5 MHz channel width. Am I wrong? Am I picking nits?
--
Bob, W7REJ
-2 @ 10 MHz is 2392 - 2402. My personal concern is that 2 MHz is not enough guard band for the rolloff to be far enough down at 2390 MHz to avoid interfering out of band (for the same reason you don't run SSB 3 kHz from the band edge; 2 MHz of guard for an 11 MHz signal seems small to me).
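A quick sketch of that edge arithmetic, assuming channel -2 is centered at 2397 MHz (as discussed later in the thread) and, simplistically, that the occupied width equals the configured width; real rolloff skirts extend further than this:

```python
# Where the edges of AREDN channel -2 (assumed center 2397 MHz) land
# for each channel width, versus the 2390 MHz lower edge of the upper
# 13 cm segment and the 2400 MHz Part 15 boundary.
# Simplification: occupied width == configured width.

CENTER_MHZ = 2397
BAND_LOWER_MHZ = 2390   # lower edge of the upper ham segment
PART15_MHZ = 2400       # unlicensed/ISM territory starts here

for width in (5, 10, 20):
    lo = CENTER_MHZ - width / 2
    hi = CENTER_MHZ + width / 2
    guard = lo - BAND_LOWER_MHZ          # negative => out of band
    above = max(0.0, hi - PART15_MHZ)    # spill past 2400 MHz
    print(f"{width:2d} MHz: {lo}-{hi}, guard {guard} MHz, "
          f"above 2400: {above} MHz")
```

Even with this optimistic simplification, 10 MHz spills 2 MHz past 2400, and 20 MHz goes 3 MHz below the 2390 band edge outright.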
But yeah, ignoring Part 15, I'm not aware of any known interference problems on 2.4 GHz. They could always exist with the primary occupants, but that would be a transmitter-by-transmitter concern.
Part 15 (i.e., unlicensed) users have to yield to licensed users (hams) in the 2400 - 2450 segment. But I have yet to read about an AREDN operator banging on his neighbor's door and telling them to move up the band or he'll call the FCC.
One thing that rarely gets mentioned is that there is an amateur satellite segment around 2401 MHz. Oscar 51 had operation there (before it failed). I am not aware of any other birds in that range, although some are in planning stages; for example, this one is built and now scheduled for a 2018 launch:
http://space.skyrocket.de/doc_sdat/eshail-2.htm
one of the transponders on this bird is a linear translator that would repeat (anything), including narrowband stuff like CW and SSB. CH -1/-2 operation would really trash such communications - but it is geosynchronous over Africa, so it's not going to pick up my signals ...
https://www.law.cornell.edu/cfr/text/47/97.303
The section you reference gives the responsibilities of Part-97 users (i.e., us hams).
I am talking about the requirements/responsibilities of Part-15 users.
The ARRL summarizes the dilemma I was talking about here (http://www.arrl.org/part-15-radio-frequency-devices#Myths) :
"Harmful Interference
"The FCC rules require the equipment manufacturer or importer to design and test his products to ensure that they do not exceed the absolute maximum limits. In addition, the FCC requires that Part 15 devices be operated in such a way that they not cause harmful interference. The operator of the Part 15 device is responsible for correcting the interference or to stop using the device if so ordered by the FCC. This can create a very difficult situation. Imagine that the neighbor of a ham goes to a local retail store and buys a Part 15 device. If the device causes harmful interference, the rules place the responsibility of proper operation and correction of the interference on the user. This can put a ham into the unenviable position of having to explain to a neighbor that the device he or she just bought at a local store is being used in violation of federal law! The resultant disagreement is not unexpected.
"
The word "yield" is my understanding of this requirement in Part 15
" ...interference must be accepted that may be caused by the operation of an authorized radio station ..."
Me either; it's very, very unlikely to happen in most neighborhoods just because of how many devices there are.
I have, however, considered bringing the point up to companies that plan Wi-Fi networks, so that as they deploy commercial setups in the future they take the local ham population into account; but the last time I pondered it, I didn't think we had enough mass to warrant the push. There are also the local noise offenders (those with a massive number of transmitters *cough* Disneyland *cough*) or those with numerous deployments (hotels) who could make good targets at the corporate level for pushing down a policy requiring them to migrate frequencies when contacted by a local ham (it doesn't hurt that Marriott had an issue with the FCC over jamming in 2015, so there is recent reason for hotels to fear interfering).
That said, the addition of -2 @ 5 MHz and similar on 5.8 GHz has greatly reduced (for me) the need for these two avenues, but it was certainly an option at the time I was looking into it, and it remains an option to reduce some noise should my network layout needs change in the future.
Interesting point on the uplink; I might have to sit down and do some calculations. That said, with mesh tracking of balloon launches we have seen the signal get into the -80 dBm to -95 dBm range by 80k feet. I don't know offhand what the rest of the math would be for signal loss out to geostationary orbit (does standard FSPL apply?), but it could be an interesting point. As we all want to be good ham neighbors, we wouldn't want to wreck the fun of other projects.
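For the FSPL question: yes, the standard free-space formula applies on a line-of-sight path, so the extra loss out to geostationary altitude can be ballparked. A sketch, where the 2400 MHz frequency and the straight-up slant range are my simplifying assumptions:

```python
import math

def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB (standard formula, km/MHz units)."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

F_MHZ = 2400.0                          # roughly the satellite segment
balloon_km = 80_000 * 0.3048 / 1000     # 80k feet is about 24.4 km
geo_km = 35_786                         # geostationary altitude

print(f"balloon: {fspl_db(balloon_km, F_MHZ):.1f} dB")   # ~127.8 dB
print(f"GEO    : {fspl_db(geo_km, F_MHZ):.1f} dB")       # ~191.1 dB
print(f"extra  : {fspl_db(geo_km, F_MHZ) - fspl_db(balloon_km, F_MHZ):.1f} dB")
```

So a GEO bird sits roughly 63 dB further down than the 80k-foot balloon figures quoted above, before accounting for antenna gains on either end.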
Jim W8ERW
wider channel widths = generally increased speeds and throughput
Although, factoring in all the QRM and the changing conditions of the environment, I wonder how big a factor this +3 dB is. In the field, I've tested several 10+ mile links on 2, 3, and 5 GHz. I've consistently found that 10 MHz measures the highest throughput.
Bear with me on this point... What happens in the DSP is that a particular clock is cut in half to achieve the 10 MHz bandwidth (and quartered for 5 MHz). Then the symbol length (the time spent transmitting a symbol) is doubled (hence the link bit rate is cut in half). There are still 64 carrier waves in 802.11n squeezed into whatever the bandwidth is, so we still have the same power per bit and still retain an 'orthogonal' signal (no interference between these very closely spaced carrier waves - the "O" in OFDM).
Because the symbol lengths are so long (compared to a single carrier wave filling the same bandwidth, whose symbols would be proportionally very short), microwave multipath fading is much better tolerated. All the different wave paths have time to arrive at the receiver, so it can capture the power within the symbol timeframe for a given carrier wave. Multiple differently oriented antennas can receive the energy, then combine to mitigate phase cancellation. This longer symbol time when cutting the bandwidth in half might also translate to less bit loss from multipath fading = higher throughput if the environment is particularly bad.
I don't know the timing tolerances, but I suspect the 802.11n standard 20 MHz timing and carrier wave counts are optimized for relatively short distances. At 10 miles, doubling the symbol length might make a significant difference in challenging environments (lots of buildings, hills, water surfaces, etc.).
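The symbol stretching described above can be sketched numerically. This assumes the standard 802.11 20 MHz numerology (64-point FFT, 312.5 kHz subcarrier spacing, 3.2 us symbol plus 0.8 us guard interval) and that the 10/5 MHz modes simply halve/quarter the sample clock, as the post describes:

```python
# How halving the channel clock stretches the OFDM symbol.
# Assumes standard 802.11 20 MHz numerology and that narrower
# widths just divide the sample clock by 2 or 4.

C = 299_792_458  # speed of light, m/s

for width_mhz, divisor in ((20, 1), (10, 2), (5, 4)):
    spacing_khz = 312.5 / divisor   # subcarrier spacing shrinks
    symbol_us = 3.2 * divisor       # symbol time stretches
    guard_us = 0.8 * divisor        # and so does the guard interval
    # Longest extra path a multipath echo can take and still land
    # inside the guard interval:
    echo_m = guard_us * 1e-6 * C
    print(f"{width_mhz:2d} MHz: spacing {spacing_khz:7.3f} kHz, "
          f"symbol {symbol_us:4.1f} us + GI {guard_us:3.1f} us, "
          f"echo tolerance ~{echo_m:.0f} m")
```

Under these assumptions, the guard interval grows from ~240 m of echo tolerance at 20 MHz to roughly 960 m at 5 MHz, which is consistent with the intuition that narrower widths tolerate more multipath on long shots.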
Joe AE6XE
If you need 20MHz you picked the wrong band.
I understand perfectly, thus I'm always looking for used 5 GHz gear, and I'm starting to see more on eBay now that many WISPs are upgrading to AC radios. Recently I bought a pair of 5 GHz Rockets for $50 shipped. Sometimes these sellers have lots of them and will take offers.
You said " It looks like using CH -2 at 10 MHz would still put the upper edge into the part 15 space by 2 MHz."
Channel 1's center frequency is 2412 MHz and it is 22 MHz wide. How would -2 at 10 MHz reach that upper edge?
2412 - 11 = 2401 MHz, channel 1's lower bound.
Channel -2 is centered at 2397 MHz.
A 10 MHz channel (well, actually it's 11 MHz wide): 11 / 2 = 5.5.
2397 + 5.5 = 2402.5 MHz upper bound.
2402.5 - 2401 = 1.5 MHz of overlap (and that is not counting any rolloff edges that are not part of the main signal).
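The same arithmetic in a few lines, with the 11 MHz figure for the half-clocked DSSS beacons taken from the posts above:

```python
# Upper edge of AREDN channel -2 when carrying 11 MHz-wide
# half-clocked 802.11b (DSSS) beacons, versus Wi-Fi channel 1's
# lower edge and the 2400 MHz point where amateur goes secondary.

CH_MINUS_2 = 2397        # MHz, channel -2 center
dsss_width = 11          # MHz, half-clocked 802.11b beacons

upper_edge = CH_MINUS_2 + dsss_width / 2
overlap = upper_edge - 2401          # vs. Wi-Fi channel 1 lower edge
in_secondary = upper_edge > 2400     # past 2400, amateur is secondary

print(f"upper edge = {upper_edge} MHz")              # 2402.5
print(f"overlap with channel 1 = {overlap} MHz")     # 1.5
print(f"edge in secondary segment: {in_secondary}")  # True
```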
See protocol section bandwidth column here: https://en.wikipedia.org/wiki/IEEE_802.11
The lowest common link rate is used when beacon packets and broadcast packets (OLSR) go out, intended for all possible neighbor nodes to receive; thus this 802.11b DSSS mode is used. The 22 MHz/11 MHz/5.5 MHz beacon packets go out 2 per second in 3.16.1.0 and later (or was that 3.16.1.1 and later?). OLSR packets go out something like 1 every ~2 seconds. Unless you have a terrible link, the node will also be using the higher 802.11n rates, and the data packets sent directly to a given neighbor only consume 20 MHz/10 MHz/5 MHz.
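To get a rough feel for how much airtime those lowest-rate beacons cost, here's a back-of-the-envelope sketch. The 150-byte beacon size and the assumption that half/quarter clocking scales airtime linearly are mine, not from the thread:

```python
# Back-of-envelope airtime for 802.11b DSSS beacons at the 1 Mbit/s
# lowest rate (full clock).  BEACON_BYTES is an assumed typical size.

PREAMBLE_US = 192        # long PLCP preamble + header at full clock
BEACON_BYTES = 150       # assumed beacon frame size
RATE_BPS = 1_000_000     # lowest 802.11b rate at full clock

for width_mhz, divisor in ((20, 1), (10, 2), (5, 4)):
    frame_us = (PREAMBLE_US + BEACON_BYTES * 8 / RATE_BPS * 1e6) * divisor
    duty_pct = 2 * frame_us / 1e6 * 100   # 2 beacons per second
    print(f"{width_mhz:2d} MHz: beacon ~{frame_us:.0f} us, "
          f"~{duty_pct:.2f}% of airtime for beacons alone")
```

Even quarter-clocked at 5 MHz this stays around a percent of airtime, so the beacons themselves are cheap; the real cost of keeping 802.11b is that every broadcast rides at these slow, wide DSSS rates.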
Maybe there is a way to disable the 802.11b option and force the lowest rate to MCS0 802.11n. This would only consume the 20, 10, and 5 MHz bandwidths.
802.11n is an OFDM signal (a very different kind of signal from DSSS, much as FM is different from AM). One of the reasons spread spectrum technology was dropped is that OFDM handles multipath fading considerably better. If we could drop 802.11b as the lowest rate used on 2.4 GHz, the LQ and NLQ values would more accurately reflect the link quality when sending data packets. Right now it's like (analog analogy) we're measuring link quality with AM to determine the route paths, then sending the messages using FM (as on 2 GHz and 900 MHz).
Joe AE6XE