How much output voltage, current and power can RF amplifiers provide? This question is often asked by novice test engineers as well as seasoned RF professionals. Depending on the application, there is often an underlying desire to maximize one of the three parameters: power, voltage or current. One might think a simple application of Ohm’s law is all that is required, but this only holds under ideal conditions, such as when an RF amplifier with a typical 50 Ω output resistance drives a 50 Ω load. In this rare case, where the load impedance perfectly matches the amplifier output impedance, the power delivered to the load is simply the rated power of the amplifier. There is no reflected power, and thus there is no need to limit or control the gain of the amplifier to protect it from excessive reflected power.
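As a worked illustration of this matched case, the power relationships P = V²/R = I²·R can be applied directly. The short Python sketch below is a minimal example under that assumption; the 100 W rating used in it is purely illustrative and not a value taken from this note.

```python
import math

def matched_load_v_i(rated_power_w: float, load_ohms: float = 50.0):
    """RMS voltage and current into a load that matches the amplifier output.

    With a perfect match, all of the rated (forward) power is delivered,
    so Ohm's law applies directly: P = V**2 / R = I**2 * R.
    """
    v_rms = math.sqrt(rated_power_w * load_ohms)   # V = sqrt(P * R)
    i_rms = math.sqrt(rated_power_w / load_ohms)   # I = sqrt(P / R)
    return v_rms, i_rms

# Example: a hypothetical 100 W amplifier driving a matched 50 ohm load
v, i = matched_load_v_i(100.0)
print(f"V_rms = {v:.1f} V, I_rms = {i:.2f} A")     # ~70.7 V and ~1.41 A
```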

Unfortunately, such ideal conditions rarely apply in actual “real world” applications. Real amplifiers are required to drive varying load impedances. The mismatch between these “real” loads and the amplifier’s output impedance results in a percentage of the forward power being reflected back to the amplifier. In some cases, excessive reflected power can damage an amplifier, and precautions that may reduce forward power are required. Given these realities, how does one go about determining output voltage, current and power? Again, Ohm’s law comes to the rescue, but with the caveat that the actual power delivered to the load (net forward power after the application of any VSWR protection, less reflected power) must be determined before applying Ohm’s law. This application note highlights some of the major RF amplifier characteristics that affect forward power as well as net power, allowing Ohm’s law to be used even when conditions are far from ideal.
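To make the mismatched case concrete, the sketch below (again a Python illustration; the 50 Ω system impedance, the complex load value and the forward power level are assumptions for the example, not values from this note) computes the reflection coefficient and VSWR for a given load, subtracts the reflected power from the forward power, and then applies Ohm’s law to the net power actually delivered to obtain the load voltage and current.

```python
import math

Z0 = 50.0  # assumed amplifier output / system impedance in ohms

def load_v_i_from_forward_power(p_forward_w: float, z_load: complex):
    """Net power, then Ohm's law, for a mismatched load.

    Gamma = (Z_L - Z0) / (Z_L + Z0) is the reflection coefficient.
    Reflected power is |Gamma|**2 * P_forward, so the net power delivered
    to the load is P_net = P_forward * (1 - |Gamma|**2).
    Ohm's law then gives the load current from P_net = I**2 * Re(Z_L).
    """
    gamma = (z_load - Z0) / (z_load + Z0)
    vswr = (1 + abs(gamma)) / (1 - abs(gamma))
    p_net = p_forward_w * (1 - abs(gamma) ** 2)   # power actually absorbed
    i_rms = math.sqrt(p_net / z_load.real)        # P_net = I^2 * Re(Z_L)
    v_rms = i_rms * abs(z_load)                   # V = I * |Z_L|
    return vswr, p_net, v_rms, i_rms

# Example: 100 W of forward power into a hypothetical 25 + j25 ohm load
vswr, p_net, v, i = load_v_i_from_forward_power(100.0, complex(25, 25))
print(f"VSWR = {vswr:.2f}, P_net = {p_net:.1f} W, "
      f"V_rms = {v:.1f} V, I_rms = {i:.2f} A")
```

Note that this sketch takes the forward power as given; if the amplifier’s VSWR protection folds back the gain, the reduced forward power is what must be used in the calculation, as discussed in the sections that follow.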