\$\begingroup\$

I'm trying to measure some leakage currents of both a diode (expected around 5 nA) and a MOSFET (20 nA).

My issue is that my meter's measurement accuracy is only around 10 µA, and I need an accurate reading.

Is it possible to reliably amplify this current to a measurable value, or perhaps convert it to a voltage using a potential divider, while still getting an accurate measurement?

\$\endgroup\$
  • \$\begingroup\$ You say accurate but say nothing about the applied voltages. Even if you got a diode leakage measurement of 4 nA, for example, does it matter to you if that was with a reverse voltage of 1 V or 10 V? Also, accuracy is something that needs to be calibrated and traceable to standards. Precision is a different thing. And just 1 V between two pins of a very very clean epoxy package can yield 1 pA, dead-bug and not soldered into a board. It all gets worse from there. Consider your construction and cleaning carefully. And why do you need calibrated accuracy? \$\endgroup\$ Commented Apr 17 at 20:45
  • \$\begingroup\$ @periblepsis I assume they mean "an accurate measurement" in the colloquial sense, i.e. a meaningful, useful data point. For such small leakages, room temperature is going to have enough of an influence that anything beyond 3 sig figs is meaningless anyway. Valid question about voltage bias condition though \$\endgroup\$ Commented Apr 18 at 0:25
  • \$\begingroup\$ Any chance you can connect several of those diodes/FETs in parallel to increase the leakage current? 50 nA for 10 diodes in parallel is easier to measure than 5 nA for a single one. Assuming an average is good enough for you. \$\endgroup\$ – Michael Commented Apr 18 at 9:12

3 Answers

\$\begingroup\$

My issue is my meter's measurement accuracy is only around 10 µA,

Perhaps on your lowest current range. Have you looked carefully at what your lowest voltage range will do?

I have a cheapo 3.5 digit DMM, but it has a 200 mV range with a 10 MΩ input resistance. That's a full-scale current of 200 mV / 10 MΩ = 20 nA, with a resolution of 10 pA.
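To make that arithmetic explicit, here's a minimal sketch; the figures are just the example values above (a 3.5 digit display, i.e. 0.1 mV steps on the 200 mV range, and the nominal 10 MΩ input resistance), not properties of any particular meter:

```python
# Using a DMM voltage range as a nanoammeter: the meter's own input
# resistance acts as the shunt resistor.
v_full_scale = 0.200   # 200 mV range, in volts
v_step = 0.0001        # 0.1 mV per display count (3.5 digit meter)
r_input = 10e6         # 10 MOhm input resistance

i_full_scale = v_full_scale / r_input   # 20 nA full scale
i_resolution = v_step / r_input         # 10 pA per display count

print(f"full scale: {i_full_scale*1e9:.0f} nA, resolution: {i_resolution*1e12:.0f} pA")
```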

I have a meter that's only slightly more expensive, where the equivalent lowest voltage range is a very high input impedance, presumably a CMOS buffer with lower leakage current than I can be bothered to measure properly. I often use it with a 100 MΩ external shunt resistor for 1 pA resolution.

The ultimate way to measure a low current without resorting to buying fA amplifiers is to measure the change of voltage across a high quality capacitor, ideally one with a PTFE or polystyrene dielectric, though any type of plastic film will usually give good results. Choose one large enough that you can read its voltage stably within the few seconds it takes a DMM to settle, but small enough that the leakage current will change its voltage in a reasonable time; you don't really want to wait hours. Measure the capacitor voltage, disconnect the meter and connect the unknown leakage current, wait a number of minutes, then remeasure the voltage. Leakage current I = dQ/dt = C·dV/dt.
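A minimal sketch of the charge-transfer bookkeeping, assuming a 1 µF film capacitor; the start/end voltages and timing below are made-up illustrative numbers, not measurements:

```python
# Leakage measurement by capacitor charge transfer: I = C * dV/dt.
# The leakage current charges (or discharges) the capacitor depending on
# how it is wired, so only the magnitude of the voltage change matters here.
C = 1e-6          # 1 uF film capacitor (PTFE/polystyrene/film preferred)
v_start = 2.000   # capacitor voltage before connecting the DUT, volts
v_end = 1.400     # capacitor voltage after the timed interval, volts
t = 120.0         # elapsed time, seconds

i_leak = C * abs(v_end - v_start) / t
print(f"leakage ~ {i_leak*1e9:.2f} nA")   # ~5.00 nA for these numbers
```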

Of course any attempt to measure sub pA will be beset by surface leakage currents (fingerprint contamination) and changes in charge distribution around the measuring setup (just moving while wearing synthetic clothing).

\$\endgroup\$
  • \$\begingroup\$ As a variation, if one places the capacitor in series with the supply (start with the capacitor shorted), the capacitor voltage will start at zero and increase with the amount of charge pushed through it. Accurately measuring e.g. 50.7 mV across a cap will be much easier and more accurate than trying to determine the difference between e.g. an initial supply voltage of 8.87 V and a final voltage of 8.36 V. \$\endgroup\$
    – supercat
    Commented Apr 18 at 21:03
\$\begingroup\$

Is it possible to reliably amplify this current to a measurable value, or perhaps convert it to a voltage using a potential divider, while still getting an accurate measurement?

Yes, this should be possible with a transimpedance amplifier; some such circuits can measure picoamps. Choose an op-amp with a very low input bias current; I'd probably go lower than 10 pA. You may have to calibrate the circuit with known resistances (e.g. a 100 MΩ and a 10 MΩ resistor) to calibrate out the voltage offsets of the amplifier. After that, measure the output voltage and use the gain resistor to find the current: I = Vout/Rgain.

(Transimpedance amplifier reference circuit, from TI application note SBOA268A: https://www.ti.com/lit/an/sboa268a/sboa268a.pdf?ts=1713330630956)
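A minimal sketch of the readout and calibration arithmetic; the gain resistor, offset, and output voltages below are placeholder examples (not values from the app note), just to show how the calibration folds into the final current:

```python
# Transimpedance amplifier readout: the output voltage is proportional to the
# input current through the feedback (gain) resistor. Calibrate first with a
# known current from a known resistor at a known voltage, then reuse the
# effective gain for the DUT measurement.

# Calibration with a known current, e.g. a 100 MOhm resistor across 1 V:
r_cal = 100e6
v_applied = 1.0
i_cal = v_applied / r_cal            # 10 nA known current
v_out_cal = 1.02                     # TIA output with the calibration current (example)
v_out_zero = 0.003                   # TIA output with no input current (offset, example)

r_gain_effective = (v_out_cal - v_out_zero) / i_cal   # effective gain resistance

# Measurement of the DUT leakage:
v_out_dut = 0.51                     # TIA output with the DUT connected (example)
i_dut = (v_out_dut - v_out_zero) / r_gain_effective
print(f"leakage ~ {i_dut*1e9:.2f} nA")
```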

\$\endgroup\$
\$\begingroup\$

I have designed ICs with a quiescent current in the 300 nA range. The tricky part is that this isn't a constant current (long periods of low current with very brief high-current spikes every 100 ms or so), so to get an accurate measurement we would RC-filter it and measure the voltage across the R (chosen so the drop was about 100 mV) with a DMM in high-Z (>1 GΩ) mode.

You could use a similar tactic here: choose a resistor so the leakage current generates a voltage in the 10-100 mV range. Then measure that voltage (again, with a high input impedance, not 1 MΩ), and also measure the actual resistance of the resistor. I = V/R.

Edit: as an example, assuming you can accurately measure 10 mV, just put a 2 MΩ resistor in series with the DUT. You measure the resistance of the resistor, and it comes out to R = 2.14 MΩ. Let's say you care about the leakage at 5 V reverse bias, so you set your voltage source to 5.00 V across the diode and resistor combination, with the diode reverse biased. You measure the voltage across the resistor and find that it's 15.58 mV. If you want to be very accurate, you could raise your supply voltage by 16 mV to make the true voltage across the diode 5.000 V and measure again, but it's close enough to first order.

Now, we divide 15.58 mV by 2.14 MΩ, and we find the leakage to be 7.28 nA.
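For reference, the same arithmetic as a few lines of Python, using the numbers from the example above:

```python
# Series-resistor leakage measurement: I = V_resistor / R_measured.
r_measured = 2.14e6      # measured resistance of the nominal 2 MOhm resistor, ohms
v_resistor = 15.58e-3    # voltage measured across it with a high-Z meter, volts

i_leak = v_resistor / r_measured
print(f"leakage ~ {i_leak*1e9:.2f} nA")   # ~7.28 nA

# Optional correction from the example: the true reverse bias on the diode is
# the supply minus the resistor drop, so nudge the supply up by ~16 mV if you
# want exactly 5.000 V across the DUT.
```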

\$\endgroup\$
