I’ve been using the Otii Arc for a while now and am quite pleased with the level of detail I get when power profiling my electronics.
There is a problem though: at times I need to measure a device with very low consumption in sleep and a (relatively) high draw when transmitting, the difference being a couple of microamps versus a couple of hundred milliamps. The odd thing is, when I attached my external 9 V supply I noticed a lower power draw than would practically be possible according to the datasheets.
This piqued my interest, so I grabbed a 1 MΩ resistor and used the Otii to measure the current through it. With 3.3 V I expected 3.3 µA, but the higher the source voltage, the further below the expected value the measured current fell. Refer to the image below for the results.
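For reference, the expected current is straight Ohm’s law; a quick sanity check with the values from my test:

```python
# Ohm's law sanity check for the 1 Mohm test load.
V = 3.3          # source voltage in volts
R = 1_000_000    # load resistance in ohms
I = V / R        # expected current through the resistor
print(f"Expected current: {I * 1e6:.1f} uA")  # -> 3.3 uA
```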
Have you calibrated between each measurement?
There is a voltage dependence (on the supply voltage, that is), so you need to re-calibrate whenever you change the supply voltage.
I hope this helps you!
It appears that calibrating the Otii does indeed work!
Think I must have read past that part…
I think it would be a good idea to calibrate automatically when the input voltage changes.
Hi Bjorn, why does the Otii have good banana sockets on the front of the device but a DC barrel jack on the back? Barrel jacks have significant contact resistance. I suggest using banana sockets on the back too.
The DC jack may have significant resistance, but keep in mind that this is before regulation. The barrel jack makes it easy for the typical developer to simply plug in an off-the-shelf DC adapter, even though it’s not an optimal connector from an electrical-performance perspective.
Starting from firmware 1.0.1, the Arc keeps two sets of zero-offset calibrations: one for the USB supply and one for the DC jack supply. If the DC jack senses a valid supply present, the box offset-calibrates using both the USB supply and the DC jack supply; otherwise only the USB supply is calibrated.
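If you drive the Arc from a script, the same zero-offset calibration can also be triggered programmatically. A minimal sketch, assuming the otii_tcp_client Python package and the default TCP server host/port; exact method names may differ between client versions:

```python
from otii_tcp_client import otii_connection, otii

# Connect to the Otii TCP server (assumed default host and port).
connection = otii_connection.OtiiConnection('127.0.0.1', 1905)
connection.connect_to_server()
otii_app = otii.Otii(connection)

# Pick the first attached Arc and run a zero-offset calibration.
# With a valid DC jack supply present, firmware 1.0.1 and later
# calibrates both the USB and DC jack offsets in this one call.
arc = otii_app.get_devices()[0]
arc.calibrate()
```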
The units are factory-calibrated using a 9 V DC power supply.
If the no-load voltage on the DC input has changed, a new calibration should be performed. Doing this automatically when the input voltage changes could be problematic, as the equipment might not have had time to warm up (all calibration should preferably be done after a warm-up period). An automatic calibration kicking in is also problematic because the output has to be disabled while the calibration is running. Automatic calibration is something we may look into in the future, though.
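In the meantime, a scripted recalibration after a supply change is a possible workaround. A rough sketch building on the client example above; the warm-up time is just a placeholder, and set_main is assumed to be available in your client version:

```python
import time

def recalibrate_after_supply_change(arc, warmup_s=600):
    """Re-run the zero-offset calibration after the DC input changes.

    warmup_s is a placeholder; pick a warm-up period that suits your
    accuracy requirements.
    """
    time.sleep(warmup_s)   # let the unit reach thermal equilibrium first
    arc.set_main(False)    # output must be off while calibration runs
    arc.calibrate()
    arc.set_main(True)     # resume powering the device under test
```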