An antenna has a gain of 10 dB and is used to transmit a signal at a frequency of 1 GHz. What is the power density of the signal at a distance of 100 m from the antenna?
Solution: The wavelength is λ = c / f = (3 x 10^8 m/s) / (1 x 10^9 Hz) = 0.3 m,
where λ is the wavelength, c is the speed of light (approximately 3 x 10^8 m/s), and f is the frequency. (The wavelength is not actually needed for the power-density calculation itself.)
Assuming a transmitted power of 1 W and an antenna gain of 10 dB (which is equivalent to a linear gain of 10), the power density at distance d is
S = P_t G / (4π d^2) = (1 W x 10) / (4π x (100 m)^2) ≈ 7.96 x 10^-5 W/m^2.
Note that this is just a sample solution manual and may not be comprehensive or accurate. For a complete and accurate solution manual, please consult a reliable source.
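The arithmetic above can be sketched as a small Python helper (the function name and structure are illustrative, not from the original text):

```python
import math

def power_density(p_tx_w, gain_db, distance_m):
    """Far-field power density S = P_t * G / (4 * pi * d^2) in W/m^2."""
    gain_linear = 10 ** (gain_db / 10)  # 10 dB corresponds to a factor of 10
    return p_tx_w * gain_linear / (4 * math.pi * distance_m ** 2)

# Values from the problem: 1 W transmit power, 10 dB gain, 100 m distance.
s = power_density(1.0, 10.0, 100.0)
print(f"{s:.3e} W/m^2")  # ≈ 7.958e-05 W/m^2
```

Note that the dB-to-linear conversion uses 10^(dB/10) because antenna gain is a power ratio.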