6

I understand the difference between passive and active probes, and between x1, x10, x100, ...

I also understand that at high frequencies the input capacitance of the oscilloscope has to be considered, so a 1 MOhm input with a x1 or x10 probe cannot be used at, for example, 300 MHz: a 9 pF input capacitance presents an impedance of only about 59 ohms at that frequency, so a 50 Ohm scope input would be much better.
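(For reference, that 59 ohm figure is just the capacitive reactance at 300 MHz:)

\$\$ |Z_C| = \frac{1}{2\pi f C} = \frac{1}{2\pi \cdot 300\,\mathrm{MHz} \cdot 9\,\mathrm{pF}} \approx 59\ \Omega \$\$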

But I have also seen that sometimes a coaxial cable is connected directly between the DUT (device under test) and the oscilloscope. When is it better to use a coaxial cable? Why would I use a coaxial cable instead of a probe?

  • Intense dI/dt, and the magnetic fields it creates, as when probing inside switching power supplies, require the probing return path to be coaxial with the signal. Otherwise the return wire (even at the probe tip) forms a loop, and large error voltages are imposed atop the signal you wish to examine. Commented Nov 20, 2018 at 18:11
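    (A formula for the mechanism that comment describes, via Faraday's law: the error voltage induced in the probe's ground loop grows with the loop area and with how fast the enclosed field changes, which is why a coaxial return path, with its tiny loop, helps:)

    \$\$ V_{\text{error}} = \frac{d\Phi}{dt} \approx A_{\text{loop}} \, \frac{dB}{dt} \$\$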

3 Answers

7

This is not about which option is "better". It is a question of whether the signal source has a low enough impedance.

In many cases the oscilloscope probing is meant to be "non-invasive" and not alter the signals. So if the signal has a relatively high impedance, even just 50-65 Ohms (as many modern signals on PCBs do), the scope probe must have an input impedance much higher than that. In some cases even a 500 Ohm probe tip can be considered a "high-impedance probe", and there are passive 10:1 probes that look like a simple coaxial cable; see, for example, the Tektronix P6150 "probe". This probe is essentially a high-quality 50 Ohm coaxial cable with a tip that has a 450 Ohm series resistor, which forms a 10:1 divider into the 50 Ohm cable:

[Image: Tektronix P6150 probe tip, a 450 Ohm series resistor feeding the 50 Ohm cable]

The resulting input impedance is just 500 Ohms, but for a 50 Ohm signal source it changes the signal by only about 10%, which is frequently good enough. The major advantage of this kind of probe is that it is limited only by the quality of the cable (and the raw input bandwidth of the scope); the P6150 works up to 9 GHz.
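(To spell out the arithmetic behind those two numbers, assuming the cable is terminated in the scope's 50 Ohm input and the 50 Ohm source is not otherwise terminated: the 450 Ohm tip resistor and the 50 Ohm termination set the division ratio, and the 500 Ohm total is what loads the source:)

\$\$ \frac{V_{\text{scope}}}{V_{\text{tip}}} = \frac{50}{450 + 50} = \frac{1}{10}, \qquad \frac{V_{\text{loaded}}}{V_{\text{unloaded}}} = \frac{500}{50 + 500} \approx 0.91 \$\$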

If your signals have source impedances of 2-3 kOhms or more, you must use more sophisticated active probes (if the frequency range is 500 MHz and up), or passive probes with megaohm-range input impedance if your signal frequencies are DC to the low MHz.

However, in some devices the signals are straight 50 Ohm outputs, and they are supposed to be loaded with 50 Ohm loads in normal operation. In that case a direct cable connection to the scope's 50 Ohm coaxial input can and will be used. An example would be testing signal quality (amplitudes, jitter, eye diagrams) on USB 3.0 channels (from USB.org):

[Image: USB 3.0 signal-quality test setup from USB.org, with the DUT cabled directly to the instrument's 50 Ohm inputs]

1

The cable for an oscilloscope probe (connected to a high impedance oscilloscope input) is (generally) not a standard coax cable. If it were, say, a standard 50 \$\Omega\$ coax cable, then when a sharp-edged or high frequency signal reached the high impedance input of the oscilloscope (typically 1 M\$\Omega\$), the signal would be reflected back down the cable. This can not only result in undesired artifacts in the displayed waveform, but might also affect the behavior of the circuit under test.
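(A quick way to quantify that reflection: the reflection coefficient at the scope end of a 50 \$\Omega\$ cable feeding a 1 M\$\Omega\$ input is essentially +1, i.e. nearly the entire incident edge bounces back toward the circuit:)

\$\$ \Gamma = \frac{Z_L - Z_0}{Z_L + Z_0} = \frac{1\,\mathrm{M}\Omega - 50\,\Omega}{1\,\mathrm{M}\Omega + 50\,\Omega} \approx +1 \$\$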

[Note: Many oscilloscopes also have a 50 \$\Omega\$ input. A standard 50 \$\Omega\$ coax can be used with this input, but the circuit under test must be of the sort that supports such a low impedance load (for example, a LISN).]

To avoid the undesirable effects of the impedance mismatch between an oscilloscope probe cable and the 1 M\$\Omega\$ input impedance of an oscilloscope, oscilloscope probe cables are (often) made with a high-resistance center conductor. The resistance of this conductor is not so high that the overall impedance of the oscilloscope plus cable rises substantially above the standard 1 M\$\Omega\$, but it is high enough to damp reflections in the cable.

An article by Doug Smith, "The Secret World of Oscilloscope Probes", discusses the construction of oscilloscope probe cables.

But I have also seen that sometimes a coaxial cable is connected directly between the DUT (device under test) and the oscilloscope. When is it better to use a coaxial cable? Why would I use a coaxial cable instead of a probe?

Use a standard 50 \$\Omega\$ coaxial cable if your oscilloscope has a 50 \$\Omega\$ input connector and either the circuit under test is designed to drive a 50 \$\Omega\$ load, or you have a special "low impedance" probe designed to be connected to the 50 \$\Omega\$ input of the oscilloscope and your circuit supports probing with that "low impedance" probe. (Typically, the total impedance of such a probe plus its cable is 500 \$\Omega\$.)

You might also be able to use a 50 \$\Omega\$ coaxial cable between a circuit and the 1 M\$\Omega\$ input of an oscilloscope if the frequency is low and the edges of signals are not too sharp. (Otherwise you might get ringing).
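(A rough rule of thumb, not from the answer above but standard transmission-line practice: reflections stop being visible once the signal's rise time is much longer than the cable's round-trip delay. Typical 50 \$\Omega\$ coax has a propagation delay of roughly 5 ns per metre, so for a 1 m cable:)

\$\$ t_{\text{round trip}} \approx 2 \times 1\,\mathrm{m} \times 5\,\mathrm{ns/m} = 10\,\mathrm{ns} \$\$

Edges much slower than that (tens of nanoseconds or more) generally will not show noticeable ringing into a 1 M\$\Omega\$ input; faster edges will.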

Use a standard probe when the circuit under test does not support having a low impedance connected to it, or the signal frequency is high, or signal edges are sharp.

You may use a dedicated coax connector as a test point in a circuit if you have a coaxial cable that mimics the behavior of a standard probe and cable. (I will not describe how to mimic a 10X probe). The cable in this case should NOT be a standard 50 \$\Omega\$ coax, but should have a resistive center conductor, as described above, and in Doug Smith's article. (I have no idea where to obtain such a cable).

0

Basically, when it is possible to use coaxial, use it. Coaxial connectors are more reliable than a probe. With a probe, you need to hook it to a test pin or even sometimes hold it in place, so you have a greater risk of slipping off your test point or getting a dodgy connection. With a probe you also get ground from 'somewhere', but not necessarily at the right place. With a coax, you get ground from the connector, where it should be. (This can matter when propagation times become relevant.)

Also, test points are often exposed to more dust, dirt, oxidation, etc. than protected coax connectors. This can add a small resistance/impedance to the link and result in measurement inaccuracies.

Finally, although the cable is (mostly) the same in a probe and a coax, on a coax the connector is still shielded, whereas on a probe the end point is not and can therefore pick up more RF noise. If you have a coax adapter for your probe and stick the probe into the coax connector, then only this last point applies.

  • You say "when it is possible to use coaxial, use it", but when is that possible? For example, suppose I'm using a pulse generator, then a DUT, and then an oscilloscope. At the input of the DUT I can put a T to connect two cables/probes, so one would be for the output of the pulse generator and the other one for the input of the oscilloscope. Would a probe or a coaxial cable be better there?
    – Dylan
    Commented Nov 20, 2018 at 17:44
  • 1
    \$\begingroup\$ @Dylan I'm not sure I understand your comment... but I would for sure choose a T and two coax over one coax and a probe. \$\endgroup\$
    – Vince
    Commented Nov 20, 2018 at 18:00
  • Perfect, but why?
    – Dylan
    Commented Nov 20, 2018 at 18:01
  • Because there is a good quality, reliable connector available, which provides low resistance/impedance and RF shielding. I would use that over a probe, whose tip is basically an antenna and relies on the conductive properties of whatever test pin you stick it on.
    – Vince
    Commented Nov 20, 2018 at 18:09
  • Ohhh, perfect! So if you're probing a circuit (for example, the DUT connected to a parallel resistance), you would use a probe because there is no good quality, reliable connector available, or for some other reason?
    – Dylan
    Commented Nov 20, 2018 at 18:17
