Long-term data logging with Otii

Hi there

I’m currently looking into using the Otii to measure energy use over longer periods of time… here’s what we’re thinking:

  • Using a headless Linux setup on an x86 machine, with the Otii attached to a standard DC bench supply.

  • Otii with a premium license, to both source and sink current (if charging).

  • Timescales would include tests running for up to one week.

When we have run a cycle of tests and there are anomalies in the power consumption, we would like to be able to pull off the logs and examine them. This could be in a native format or as a CSV file; either is fine. I guess it would be useful to be able to open the native format so we can examine it with the Otii software on another machine.

My questions are:

1/ Does the above sound reasonable? Is there anything to watch out for, or do you have suggestions on how best to achieve this?

2/ Data logging - what would the data requirements be? Has it ever been tested over this length of time?

3/ What are the hardware requirements for running the Linux client over a week-long period (RAM / storage / CPU)?

Thank you in advance

Hello Mr/Mrs maybe Otii

  1. We don’t fully support this yet, as it’s currently not possible to reserve a premium license from the command line. If you can work with X-forwarding over SSH, you can launch the GUI, reserve the license, and then run it headless. Currently a license is reserved for 3 days, but the reservation is refreshed as long as otii or otiicli is running, so it is only lost if you don’t use it. In the next software release this will be changed so you can reserve for much longer durations if needed. Adding support for reserving a license from the CLI is a high priority for us.
  2. Running projects for up to a week is definitely possible. You can either extract the data in your script and save it in whatever format suits you, or save the data as an Otii project from the script and open it later in the GUI on another machine. Something to note is that a week’s worth of recording is quite a lot of data, and the UI will become sluggish, especially if you try to view a lot of it at the same time.
    • RAM
      The amount of RAM used depends a lot on your application. If you are only making one very long recording, you should easily get by with 100 MB. If you are instead making a lot of shorter recordings, the RAM usage will go up. I don’t think it will impact the CLI very much, though; most RAM is used to hold the data currently shown on screen in the GUI application. To be on the safe side, I would recommend at least 512 MB free after Linux has taken its share.

    • Storage
      It depends a bit on which channels you enable in your recording. If you, for example, only record the main current and nothing else, it will consume about 48 kB of storage per second of recording, which becomes around 4 GB per day (the arithmetic is sketched right after this list). This is just for storing the project while running, so you need additional storage when saving the project; that varies from about 2-4 GB per day. The data is compressed, so if you have very similar data over long periods (e.g. idle current) it may end up smaller than if the data varies a lot.

    • CPU
      CPU usage is not an issue when running the client from the command line. I would recommend an i3 or better, or an equivalent CPU from AMD.
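
As a quick sanity check on the storage figures above, here is a minimal sketch of the arithmetic (Python), using the ~48 kB/s figure for a main-current-only recording:

```python
# Back-of-the-envelope storage estimate, based on the ~48 kB/s figure above
# for a recording of the main current only.
KB_PER_SECOND = 48
SECONDS_PER_DAY = 24 * 60 * 60

def storage_gb(days):
    """Rough on-disk size in GB while the recording is running."""
    return days * SECONDS_PER_DAY * KB_PER_SECOND / 1e6  # kB -> GB (decimal)

for days in (1, 7):
    print(f"{days} day(s): ~{storage_gb(days):.1f} GB while recording")
# 1 day(s): ~4.1 GB, 7 day(s): ~29.0 GB -- plus roughly 2-4 GB per day extra
# when the project is saved, depending on how well the data compresses.
```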

If you could describe further what you mean by anomalies, and how you plan to detect them, I think we could advise you better in this area.

Something which may be of interest is that we are currently working on a module for our scripting engine which will make it easy to set up Otii in a continuous integration environment. In it you can specify a number of tests; each test checks that the energy consumed between a start and a stop time is between a minimum and a maximum value. It also generates a junit.xml file which you can consume with e.g. Jenkins.
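
Purely to illustrate the idea (this is not the upcoming module's API, just a standalone Python sketch of the same concept): each test is a start/stop window with a minimum and maximum allowed energy, and the results are written as JUnit-style XML that e.g. Jenkins can consume.

```python
# Illustrative sketch only -- not the actual Otii scripting module.
# Each test is a time window plus min/max bounds on the energy consumed in it.
import xml.etree.ElementTree as ET

def run_energy_tests(energy_between, tests, out_path="junit.xml"):
    """energy_between(start_s, stop_s) -> energy in Wh (supplied by your measurement code);
    tests is a list of dicts with keys: name, start, stop, min_wh, max_wh."""
    suite = ET.Element("testsuite", name="energy-tests", tests=str(len(tests)))
    failures = 0
    for t in tests:
        consumed = energy_between(t["start"], t["stop"])
        case = ET.SubElement(suite, "testcase", name=t["name"], classname="otii.energy")
        if not t["min_wh"] <= consumed <= t["max_wh"]:
            failures += 1
            ET.SubElement(case, "failure",
                          message=f"{consumed:.4f} Wh outside [{t['min_wh']}, {t['max_wh']}] Wh")
    suite.set("failures", str(failures))
    ET.ElementTree(suite).write(out_path, encoding="utf-8", xml_declaration=True)
    return failures == 0
```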

I hope this helps. If you have further questions, don’t hesitate to ask!

Thanks for the extremely quick response… my replies below!

1/ Yes… I agree it is a high priority. It would be nice to have a fully capable piece of hardware up front, even if it meant a much higher entry price, to avoid these licensing issues.

2/ Is the likely bottleneck when viewing large amounts of data the processor or disk access? It seems that being able to quickly scan through a lot of data, visually at least, would be a useful feature when looking for excessive power consumption.

By anomalies, I mean cases where a test has completed but has used too much power due to a firmware bug, and we want to find out when and where this happened. Truthfully, we will probably end up running a query on the logged data using a Python script (e.g. looking for more than X Wh of energy within X seconds of time).
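
As a minimal sketch of what I mean, assuming a CSV export with a timestamp column in seconds and a main-current column in amps (the column names and the constant supply voltage are assumptions, not the exact export format):

```python
# Rough sketch of the check we have in mind -- the column names ("timestamp",
# "current") and the constant supply voltage are assumptions, not the exact
# Otii export format.
import pandas as pd

SUPPLY_VOLTAGE = 3.7      # assumed constant DUT supply voltage [V]
WINDOW_SECONDS = 60       # "X seconds"
ENERGY_LIMIT_WH = 0.05    # "X Wh"

def find_energy_anomalies(csv_path):
    df = pd.read_csv(csv_path)
    df["dt"] = df["timestamp"].diff().fillna(0)                   # seconds per sample
    df["energy_wh"] = df["current"] * SUPPLY_VOLTAGE * df["dt"] / 3600.0
    df.index = pd.to_datetime(df["timestamp"], unit="s")          # time-based windows
    rolling_wh = df["energy_wh"].rolling(f"{WINDOW_SECONDS}s").sum()
    return df.loc[rolling_wh > ENERGY_LIMIT_WH, "timestamp"]      # offending timestamps

if __name__ == "__main__":
    offenders = find_energy_anomalies("export.csv")
    if not offenders.empty:
        print(f"limit first exceeded around t = {offenders.iloc[0]:.1f} s "
              f"({len(offenders)} samples over the limit)")
```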

For now we have one Otii and have found it very useful. We are thinking of buying a second for our test processes (longer-term firmware testing).

Thanks again

  1. Afraid I can’t help you with changes to the business model. Any suggestions or discussions are welcome via the sales contact form, though.

  2. It’s a combination. It needs to read a lot of data from disk, but it also needs to process that data to determine what to display; this processing is what keeps a peak from being lost when you zoom out a lot. A workaround might be to split the data into several projects of a day, or half a day, each.

Regarding anomaly detection, you can run it while the recording is ongoing or, if you prefer, after it has completed. The advantage of doing it directly with an Otii script is that you can crop the project before you save it. For example, if the anomaly occurred during the third day, you can crop so that you only keep a few hours (or less) of data around the anomaly instead of the entire 7 days.

I’m glad that you find our product useful!

I’ve used the Otii to record for several days at a time on a few occasions.
The first recording ran for a week and then the machine crashed. Chris pointed me to the temp file and I could recover the data. However, the recovered data exported to CSV was a ~40 GB file, which very few programs will open; I had to script something in Matlab specifically to open it (a chunked reader like the sketch below would also work).
The subsequent attempts have had other issues: Windows updates causing a restart, power config turning off USB ports… etc.
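
For a file that size, streaming it in chunks works better than loading it whole. A rough sketch in Python (the column names are assumptions about the export layout, not the exact format):

```python
# Rough sketch: scan a huge exported CSV in manageable chunks instead of
# loading the whole file. Column names ("timestamp", "current") are assumptions.
import pandas as pd

def peak_current_per_chunk(csv_path, chunk_rows=1_000_000):
    """Yield (chunk start time, peak current) so a ~40 GB file never sits in RAM."""
    for chunk in pd.read_csv(csv_path, chunksize=chunk_rows):
        yield chunk["timestamp"].iloc[0], chunk["current"].max()

if __name__ == "__main__":
    for start, peak in peak_current_per_chunk("recovered_export.csv"):
        print(f"from t = {start:.1f} s: peak current {peak:.6f} A")
```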

Since then I’ve been very careful to avoid doing these super long tests, unless I absolutely need to.

I’d think about what you want from this test and pare it down. Do you really need a week’s worth of samples, or do you just want the outliers? Can you trigger an output you can use, or output to UART (both are synced to the waveforms)? Can you do something like Chris has suggested and only keep the data for the sections where something interesting has occurred?

The Otii guys were pretty good about sending me beta versions of things: bug-fixed firmware, app builds with specific fixes, and trial scripts… (thanks again, guys).

Food for thought anyway…

Hi,

I am just curious about the same topic: has there been an update to the feature for measuring over long periods of time, say weeks to months?

Best Regards,
Sujith

Hi Sujith,

Welcome to the forum.
Yes, there have been several improvements since this was posted.
You can downsample the data before exporting it, reducing the number of data points.
With our new Otii 3 software together with our new Otii Ace Pro hardware, it is also possible to set a lower sample rate to start with.
These are just some of the improvements that have been implemented.
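
If you prefer to post-process outside the Otii software, the same kind of downsampling can be done on an exported CSV. A minimal sketch in Python, with assumed column names:

```python
# Minimal sketch: downsample an exported recording outside the Otii software.
# Column names ("timestamp" in seconds, "current" in A) are assumptions.
import pandas as pd

def downsample_csv(csv_path, out_path, bucket="1s"):
    """Average the samples into fixed time buckets to shrink the data set."""
    df = pd.read_csv(csv_path)
    df.index = pd.to_datetime(df["timestamp"], unit="s")
    df["current"].resample(bucket).mean().to_csv(out_path)

downsample_csv("export.csv", "export_1s.csv")   # e.g. 4000 samples/s -> 1 average/s
```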

Best regards,
Björn

Hi,

Is it possible to set a lower sample rate at the start of a recording with an Otii Arc Pro too? Or is it possible only with an Ace?

Thank you very much,
Best regards.
Giovanni

Hi Giovanni,

Welcome to the forum!
Setting the sample rate before the recording starts is only possible with Otii Ace.
However, both Otii Arc and Otii Ace recordings can be downsampled when the recording is finished.

Best regards,
Björn

Thank you very much Björn