Coaxial cables, antennas, and amplifier inputs and outputs typically have either a 50 ohm or 75 ohm characteristic impedance. Why?

It comes down to two competing optima for air-dielectric coaxial line: the characteristic impedance for maximum power-handling capacity is about 30 ohms, while the characteristic impedance for theoretical minimum attenuation (loss) is about 77.5 ohms. 50 ohms sits more or less in the middle of these two values (the geometric mean is about 48 ohms, the arithmetic mean about 54 ohms), so 50 ohms was settled upon as a standard characteristic impedance. It really is as simple as that.
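As a quick sanity check on the "in the middle" claim, here is a short Python sketch. The 30 ohm and 77.5 ohm figures come from the text above; the coax impedance formula is the standard one for a round coaxial line, included only for reference.

```python
import math

# Characteristic impedance of coax (standard formula):
#   Z0 = (138 / sqrt(eps_r)) * log10(D / d)
# D = shield inner diameter, d = center-conductor diameter,
# eps_r = relative permittivity of the dielectric.
def z0_coax(D, d, eps_r=1.0):
    return 138.0 / math.sqrt(eps_r) * math.log10(D / d)

z_power = 30.0   # ohms: maximum power-handling optimum (from the text)
z_loss  = 77.5   # ohms: minimum-attenuation optimum (from the text)

print(math.sqrt(z_power * z_loss))   # geometric mean, ~48.2 ohms
print((z_power + z_loss) / 2)        # arithmetic mean, 53.75 ohms
```

Either way you average the two optima, you land in the neighborhood of 50 ohms, which is the point of the compromise.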
When minimal loss matters more than power handling, as in long coaxial cable runs for cable or satellite TV, the higher-impedance 75 ohm cable and connectors are used instead.
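Where does the roughly 77 ohm minimum-loss figure come from? For air-dielectric coax, conductor (skin-effect) loss is proportional to (1/d + 1/D) / ln(D/d). A minimal numeric sketch, sweeping the diameter ratio with D held fixed, recovers the optimum; this is a simplified model that ignores dielectric loss, which is why it lands slightly below the 77.5 ohm figure quoted above.

```python
import math

# With the shield diameter D fixed, conductor loss of air coax scales as
#   (1/d + 1/D) / ln(D/d)  ∝  (x + 1) / ln(x),   where x = D/d.
def relative_loss(x):
    return (x + 1) / math.log(x)

# Brute-force sweep of the diameter ratio x from 1.1 to 10.
best_x = min((x / 1000 for x in range(1100, 10000)), key=relative_loss)

print(best_x)                 # ~3.59: minimum-loss diameter ratio
print(60 * math.log(best_x))  # Z0 = 60 * ln(D/d) for air, ~76.7 ohms
```

So the minimum-loss geometry for air dielectric gives a characteristic impedance near 77 ohms; with a solid polyethylene dielectric the practical optimum shifts lower, which is part of why real 75 ohm cable is a convenient round number rather than an exact optimum.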

Frequently, amateur radio operators get into arguments over the use of an antenna tuner located near the transmitter. Some argue that the antenna is not actually tuned by the tuner, but that the transmitter is simply tricked into seeing a 50-ohm impedance at the transmitter output. They argue that the tuner must be placed at the antenna in order to actually tune the antenna. Figures 1 through 4 [in the linked article] prove that this is not the case. Some argue that the antenna is not tuned unless it has a pure resistance of 50 ohms. Perhaps the terms resonant and tuned are being confused. (from http://urgentcomm.com/test-amp-measurement-mag/maximum-power-transfer)

AD5GG works in the real world primarily as a board-level RF designer in the UHF (300 MHz - 6 GHz) range. Occasionally, he posts articles on this very site. Sometimes they're even worth reading.
