  #6  
Old March 24th 07, 12:18 PM posted to rec.aviation.homebuilt
Robert Bonomi
theoretical radio range....

Andy wrote:
> I have a radio rated at 10 W output and one rated at 5 W. My
> 5-watter isn't a handheld, but that's typical output for the genre.
>
> Assuming they use the same antenna, what is the theoretical range
> difference between the two, and what is the practical range
> difference?


I know a ham who used to routinely work moon-bounce on VHF, with a rig
powered by a single 9v transistor radio battery. I think he had something
like 60 _milliwatts_ on transmit.

Good antennas (and proper installations) make a bigger difference than RF
power. Years ago, I had a base-station installation that outperformed
virtually every other installation in the territory -- nearly all of which
were running 2-4.5x the power I was.

Now, "all else being equal", and for the same received RF signal level,
range will change in proportion to the square root of the change in
power level.

Caveat: 'all else' is *rarely* equal. *wry grin*

That said, the 10-watt rig would be expected to have an approximately 40%
greater working range than the 5-watter (sqrt(2) is about 1.41).
*Assuming*, of course, that the transmitter on the _far_end_ has
sufficient power to reach _you_ at that distance.
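That square-root rule is easy to sanity-check. A short Python sketch (the function name and the free-space 1/r^2 assumption are mine, not from the post):

```python
import math

def range_gain(p_new_watts, p_old_watts):
    """Relative range change for a transmit-power change, all else equal.

    In free space, received power falls off as 1/r^2, so for the same
    received signal level, range scales with the square root of the
    transmit-power ratio.
    """
    return math.sqrt(p_new_watts / p_old_watts)

gain = range_gain(10, 5)
print(f"10 W vs 5 W: {gain:.2f}x range, ~{(gain - 1) * 100:.0f}% farther")
# → 10 W vs 5 W: 1.41x range, ~41% farther
```

Note how little the power buys you: even quadrupling power (a 6 dB increase) only doubles the free-space range, which is why the antenna and installation matter more.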

> It seems the price difference is 2x to 3x. Is the price
> difference justified?


Depends on 'how badly' you need the extra range, doesn't it? *grin*

Only -you- can evaluate your needs/requirements.

> I guess I'm asking "should I eBay the 10 W unit and find a better use
> for the remainder?"