APPNOTE: A Description of the Adaptive Storage Technique

Article ID: 011119lb
Last Reviewed: July 6, 2023

The information in this article applies to:

  • All Ranger Data Loggers.

Note that the example uses a data rate of one sample per cycle; this is applicable to the Power Master series of Loggers such as the PM7000, PM6000 and PM3000.

Target Audience

This article is aimed at:

  • All users of Ranger Data Loggers.

Summary

The patented adaptive storage technique has proven to be an exceptionally good performer in the field. The main attribute of this technique is its ability to reproduce trend data accurately while, at the same time, sampling at a fast enough rate over long periods of time to faithfully reproduce anomalies and deviations from the trend. This ability is not present in any other storage technique. It is a very demanding requirement for conventional recording techniques, and it becomes more demanding as the recording time increases. The following discussion explains the functionality of the adaptive storage technique.

Introduction

The patented Adaptive Store used in all Ranger Data Loggers is designed to make the best use of the available store by reconciling two conflicting requirements:

  • Provide long term trend data, observing worst extremes of maximum and minimum values seen, and
  • Provide detail where new activity occurs, i.e. detect and capture “transients”.

If “transients” are slow moving, and the “long term” is relatively short, then the above requirements can both be met with a conventional Data Logger operating a sample and store process at a fixed sample rate. In technical terms, if the sample rate chosen can give a long enough recording period given the number of channels in use and the amount of store available, AND if the maximum frequency of the input signal can be defined to be less than half the sample rate, then a regular sample and store process does provide an adequate record of the input signal, from which the actual input signal can be reconstructed.
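As an illustration, the two conditions can be written as a simple check. This Python sketch uses assumed parameter names, and the two-byte sample size is borrowed from the worked example later in this note:

    def fixed_rate_adequate(store_bytes, n_channels, duration_s,
                            sample_hz, max_signal_hz, bytes_per_sample=2):
        """True when a conventional fixed-rate logger is adequate."""
        # Condition 1: the store can hold every sample for the full period.
        store_needed = sample_hz * duration_s * n_channels * bytes_per_sample
        # Condition 2 (Nyquist): the signal bandwidth is below half the
        # sample rate, so the input can be reconstructed from the samples.
        return store_needed <= store_bytes and max_signal_hz < sample_hz / 2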

If the above conditions cannot be met at the same time, something else must be done.

In the Ranger PM-6000, there is a way of improving on the regular sample and store process: the “Adaptive Store”.

Adaptive Store

Adaptive Store does not require any prior knowledge of signal conditions. The only parameter it takes is the total time of the recording.

It then applies a storage rate of one value per cycle for the whole recording period. The PM-6000 samples 128 times per cycle and calculates the true RMS value over the cycle period for the adaptive store process.
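The RMS calculation itself is the standard one. As a sketch (assuming the 128 instantaneous samples for one cycle arrive as a list; this is not the device firmware):

    import math

    SAMPLES_PER_CYCLE = 128   # the PM-6000's per-cycle sampling, as above

    def cycle_rms(samples):
        """True RMS over one cycle: the square root of the mean of the
        squared instantaneous samples."""
        assert len(samples) == SAMPLES_PER_CYCLE
        return math.sqrt(sum(s * s for s in samples) / SAMPLES_PER_CYCLE)

    # Sanity check: a unit-amplitude sine should give RMS of 1/sqrt(2).
    wave = [math.sin(2 * math.pi * n / SAMPLES_PER_CYCLE)
            for n in range(SAMPLES_PER_CYCLE)]
    print(round(cycle_rms(wave), 4))   # 0.7071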

This sample rate is applied regardless of the number of channels. Thus, for 4 channels on a 60 Hz supply recording for 1 week, a total of 60 × 86400 × 7 × 4 = 145.152 million samples are taken. At two bytes per sample, the classical sample and store method would need a store of nearly 300 Mbytes.
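The arithmetic behind these figures (the factor of 60 is the number of cycles per second on a 60 Hz supply):

    cycles_per_second = 60      # one stored value per cycle at 60 Hz
    seconds = 86400 * 7         # one week
    channels = 4

    samples = cycles_per_second * seconds * channels
    print(samples)              # 145152000, i.e. 145.152 million
    print(samples * 2 / 1e6)    # 290.304 Mbytes, i.e. nearly 300 Mbytes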

The Ranger’s adaptive store process reduces this number by reducing the number of times it records anything. If a value can be predicted from past history, the new value is not recorded. As long as a set of values CAN be predicted, it is sufficient to define them on the basis of the past history, how that history is being used, the length of time for which the prediction is valid, and the prediction tolerance (or, better still, the worst-case deviations from the prediction). In Ranger language, we call such a combination of data a “record”.
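A “record” can therefore be pictured as a structure along the following lines. The field names are illustrative only; the actual on-device layout is not documented in this note:

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Record:
        """Everything needed to reproduce a span of samples from a
        prediction instead of storing every value individually."""
        predict: Callable[[float], float]  # prediction built from past history
        start_time: float                  # when this prediction took over
        valid_for: float                   # how long the prediction held (s)
        max_dev: float                     # worst-case deviation above it
        min_dev: float                     # worst-case deviation below it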

Clearly there are a number of factors to be considered:

  • Recording time requested     )
  • Number of channels in use    ) Basic statistics
  • Amount of store available    )

These items set the frequency with which recording of some sort can occur. That frequency also depends on how much store is used each time a “record” is placed in memory.
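A sketch of that relationship, with illustrative numbers (the 1 Mbyte store and 16-byte record below are assumptions, not device figures):

    def worst_case_record_interval(store_bytes, bytes_per_record,
                                   n_channels, recording_s):
        """Average time each record must cover, per channel, if the
        requested recording period is to fit in the available store."""
        records_per_channel = (store_bytes // bytes_per_record) // n_channels
        return recording_s / records_per_channel

    # e.g. 1 Mbyte store, 16-byte records, 4 channels, one week:
    print(worst_case_record_interval(1_000_000, 16, 4, 86400 * 7))  # ~38.7 s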

If some kind of prediction tolerance is to be applied in order to distinguish “more useful” values from “less useful” ones, then we also have to include in our list of factors:

  • Typical noise on the signal        ) Data
  • Dynamics of any apparent trends    ) Dynamics

Clearly the process has to perform equally well with large signals and large signal activity as with small ones. It should be able to distinguish transients whether they are simple steps from an otherwise static signal, or sit on top of a trend that is already buried in noise.

The Ranger operates by initially:

  • Assessing the Basic Statistics to give the typical worst-case “record” time.
  • Dividing the total available store in two, and allocating one half to “transients”.

During recording, Ranger assesses the “normal signal activity” within a “worst case record time”, and attempts to define a prediction tolerance based on that activity. To begin with, the tolerance is set tight, so that predictions fail frequently. When the signal can no longer be predicted within the “tolerance”, a “record” is taken, and the statistics revised.

The control loop is designed to set the tolerance at the level which will NOT cause normal activity to fail, yet will respond the moment the signal falls outside the “normal activity” envelope. Thus, once the system has established the right tolerance, and that tolerance is confirmed after each record, any departure from the normal activity envelope is sufficient to be recognised as a transient.

When a transient is seen, loop parameters are modified again, taking into account:

  • Whether this is a new transient, and
  • Its size.

If this is not a new transient, the system desensitises itself by increasing the tolerance level. If it is a new one, it actually INCREASES sensitivity to allow detail on this transient to be captured.
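The behaviour described above can be pictured with a single-predictor sketch. The real instrument runs 13 predictive mechanisms and keeps fuller statistics; the hold-last-value predictor, the factor-of-two tolerance moves and the “settle” threshold below are all stand-ins:

    def adaptive_store(samples, tolerance, settle=10):
        """Single-predictor sketch of the adaptive control loop."""
        prediction = samples[0]      # tolerance starts tight, so predictions
        records = [(0, prediction)]  # fail frequently at first
        quiet = 0                    # samples predicted OK since last record
        for i, s in enumerate(samples[1:], start=1):
            if abs(s - prediction) <= tolerance:
                quiet += 1           # predictable: nothing is stored
                continue
            records.append((i, s))   # prediction failed: take a record
            if quiet >= settle:
                tolerance /= 2       # a NEW transient: increase sensitivity
            else:                    # so its detail is captured
                tolerance *= 2       # ongoing activity: desensitise
            prediction = s           # revise the prediction from new history
            quiet = 0
        return records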

Over and under detection of transients

If signal activity continues to increase over a long period of time, an excess of transients may be detected and stored. In this case the system becomes insensitive to rapid pulse-type signals (though the worst case envelope still reflects them), while remaining responsive to step-type excursions. At the other extreme, when there is so much “normal signal activity” that small step changes are undetectable, the store allocated to transients remains unused. If the unused store builds up, it is reallocated to normal recording.

Worst Case envelope and multiple predictions

At all times, the extremes of the signal seen by the sampling process are included in the “worst case envelope”. This envelope comprises the maximum and minimum deviations from the best prediction that the Ranger has been able to use to describe the signal activity in the relevant record. Thus the max/min plot from the Pronto software shows the extremes of signal excursion, and all samples taken lie within that envelope (i.e. all > 30 million samples taken over the whole week in the example above).

The uncertainty as to the actual signal value at any one time depends upon the difference between the maximum and minimum lines, i.e. the height of the envelope. For a given length of time allocated to each record (which is the ultimate constraint imposed by finite store), the height of the envelope is controlled by the quality of the prediction.
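As a sketch, assuming the envelope is held as a pair of worst-case deviations from the prediction:

    def update_envelope(envelope, sample, predicted):
        """Fold one sample's deviation from the current best prediction
        into the worst-case (min, max) envelope.  Every sample passes
        through this step, which is why the max/min plot is guaranteed
        to bound all of them."""
        lo, hi = envelope
        dev = sample - predicted
        return (min(lo, dev), max(hi, dev))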

Ranger caters for the UNPREDICTABILITY of future signal activity by employing its 13 predictive mechanisms, against which all samples are tested as they are received. Though they are all loosely based on past history, some of those predictions are worse than others, and the poor ones are discarded.
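The 13 mechanisms themselves are not documented in this note, but the test-and-discard idea can be sketched with two stand-in predictors:

    def hold(last_value):
        """Predict that the signal stays at its last recorded value."""
        return lambda t: last_value

    def trend(v0, t0, v1, t1):
        """Predict that the signal continues along a straight-line trend."""
        slope = (v1 - v0) / (t1 - t0)
        return lambda t: v1 + slope * (t - t1)

    def prune(predictions, sample, t, tolerance):
        """Test the incoming sample against every live prediction and
        discard the poor ones; survivors carry forward to the next sample."""
        alive = [p for p in predictions if abs(p(t) - sample) <= tolerance]
        return alive if alive else predictions  # never discard them all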

This method of ANTICIPATING the possible signal path and testing each sample for conformity:

  • Spreads the computational load out uniformly over time
  • Allows for immediate reaction to transients
  • Works with extremely long recording periods

Figure 1 below shows actual results of the adaptive and point store processes on real data when both used the same amount of store. Comparing the two graphs shows that the adaptive process is much better at catching anomalies.

[Figure 1: adaptive store and point store results on the same data]

The Ranger Adaptive Store System has shown itself to be the most powerful automatic data compression system seen in any of today’s Data Loggers.