Selecting the correct input impedance
Next, let’s look at configuring the channel input impedance. Some oscilloscopes let you choose between a 50 ohm and a 1 megaohm input impedance. Matching the input impedance to the impedance of the signal source or the probing setup is called “termination,” and it is set on a per-channel basis through the scope interface. The “standard” oscilloscope input impedance is 1 megaohm, which is the appropriate choice when working with passive probes.
However, when active probes or a direct BNC cable connection are involved, the optional 50 ohm termination becomes relevant. Many test and measurement instruments, as well as RF devices, use 50 ohms as their standard termination. Selecting the correct input impedance matters because an incorrect setting changes the measured signal amplitude: with a 50 ohm source, setting the termination to 1 megaohm instead of 50 ohms will show roughly double the expected voltage.
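The doubling effect follows from simple voltage-divider arithmetic: the source impedance and the scope’s input impedance form a divider, so the voltage at the scope input depends on both. A minimal sketch, assuming a 50 ohm source with a 2 V open-circuit voltage (the values here are illustrative, not from any particular instrument):

```python
def measured_voltage(v_open_circuit, z_source, z_input):
    """Voltage at the scope input, modeled as a resistive divider
    between the source impedance and the scope's input impedance."""
    return v_open_circuit * z_input / (z_source + z_input)

V_OC = 2.0  # open-circuit source voltage in volts (illustrative)

# 50 ohm source into a 50 ohm termination: the divider halves the voltage.
print(measured_voltage(V_OC, 50, 50))    # 1.0 V

# Same source into 1 megaohm: nearly the full open-circuit voltage appears,
# i.e. about twice what the matched 50 ohm reading shows.
print(measured_voltage(V_OC, 50, 1e6))   # ~1.9999 V
```

This is why a signal specified as 1 V into 50 ohms reads as roughly 2 V when the scope is mistakenly left at 1 megaohm.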
As a final note, keep in mind that the maximum safe input voltage can differ significantly between the two terminations: the 50 ohm setting usually imposes a much lower limit than 1 megaohm. Some oscilloscopes lack native support for a 50 ohm termination; in such cases, a 50 ohm feedthrough terminator can be attached to the input to provide the required termination.