**3. Complementary approach – the multi-scale entropy measure**

An alternative approach to measuring temporal complexity applies the multi-scale entropy metric suggested by Costa (2005). This differs from the context-switching metric just described in that it measures the complexity of a temporal behaviour or signal over a wide range of fundamental periods. Whereas the single-scale metric works best over a single decade of the frequency spectrum, a multi-scale metric offers the possibility of examining complexity at a variety of time scales, potentially ranging over orders of magnitude. This is useful, but it cannot be boiled down to a single complexity number; instead, we need to depict it graphically over what amounts to a logarithmic frequency scale.

The basis for the multi-scale entropy metric is that many real-world behaviours occur over time scales of widely varying dynamic range. Costa originally applied the metric to a biomedical problem, extracting the temporal complexity of cardiac-driven circulatory systems. Such a pulsed cardiac signal is multi-scale in that it is composed of a fundamental pulse plus various arrhythmias, leading to a complicated spectrum of events. The signal of interest appears buried amongst competing behavioural periodicities at different time scales, and so the information becomes that much harder to extract.

At first glance, we might imagine that a Fourier transform would work well to extract the periods, but a typical FFT algorithm actually works best over a limited dynamic range. By expanding the scope to a multi-scale level, Costa showed that this complexity measure has use in a real-world application, and we contend that it may also prove useful for cyber-physical applications such as a complex event-driven system. In this case, the time scales can range from fast interrupt processing, to human-scale interactivity, to even more sporadic environmental influences.


The multi-scale aspect senses the different temporal frequencies in the underlying signal, comparable to what the context-switching metric does, but instead pairs or groups the data points to measure a coarse-graining effect. This works out fairly straightforwardly in terms of an autocorrelation. The essential algorithm groups adjacent time samples together in a window of length *Scale* as the coarse-graining or moving-average measure. It then counts the number of times, *n*, that the amplitude changes from one coarse-grained time step to the next.

If the amplitudes do not change for a given coarse grain, then the behaviour is predictable and the entropy will be low. To calculate the sample entropy, they calculate

$$\mathit{complexity}(\mathit{Scale}) = -\ln\!\left(\frac{n_{(S+1)}}{n_{(S)}}\right) \tag{6}$$

over each of the scale factors *Scale* = 1 .. *maxScaleFactor*.
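To make the procedure concrete, below is a minimal sketch in Python of how the coarse-graining and change-counting steps might be implemented. The function names, the `tolerance` parameter for deciding when an amplitude has "changed", and the use of NumPy are our assumptions; the chapter does not prescribe an implementation.

```python
import numpy as np

def coarse_grain(signal, scale):
    """Average adjacent samples in non-overlapping windows of length `scale`."""
    n = len(signal) // scale
    return signal[:n * scale].reshape(n, scale).mean(axis=1)

def count_changes(signal, scale, tolerance):
    """Count n, the number of times the coarse-grained amplitude changes
    by more than `tolerance` from one time step to the next."""
    grained = coarse_grain(signal, scale)
    return np.count_nonzero(np.abs(np.diff(grained)) > tolerance)

def complexity(signal, scale, tolerance):
    """Eq. (6): negative log of the ratio of change counts at
    successive scale factors S+1 and S."""
    n_s = count_changes(signal, scale, tolerance)
    n_s1 = count_changes(signal, scale + 1, tolerance)
    return -np.log(n_s1 / n_s)

def multiscale_entropy(signal, max_scale_factor, tolerance=0.1):
    """Evaluate the metric over each scale factor Scale = 1 .. maxScaleFactor."""
    return [complexity(signal, s, tolerance)
            for s in range(1, max_scale_factor + 1)]
```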

A graph of the multi-scale entropy will appear flat if the underlying behaviour is "1/f" (van der Ziel, 1950), or so-called pink, noise. Pink noise shows a predictable constant change of amplitude density per scale factor; in other words, it has constant energy per frequency doubling, while white noise shows constant energy per frequency interval (Montroll & Shlesinger, 2002).
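To see why in spectral terms: for pink noise the power spectral density goes as $S(\nu) = c/\nu$, so the energy in any octave is the same regardless of where the octave sits, whereas for white noise ($S(\nu) = c$) the energy is constant per fixed-width interval:

$$\int_f^{2f} \frac{c}{\nu}\, d\nu = c \ln 2, \qquad \int_f^{f+\Delta f} c \, d\nu = c\, \Delta f ,$$

both independent of the base frequency $f$.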

In comparison to this structure-less noise, if structure does exist in the signal, observable changes will appear in the entropy from one scale factor to the next. For example, a superimposed sine wave would show a downward spike in sample entropy when it crossed a harmonic in the scale factor.
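As an illustrative experiment using the sketch above (the sine period, amplitude, and tolerance below are arbitrary choices of ours), one can superimpose a sine wave on a white-noise baseline and scan the scale factors for the dip:

```python
rng = np.random.default_rng(seed=1)
t = np.arange(20_000)

white = rng.standard_normal(t.size)
with_sine = white + 2.0 * np.sin(2 * np.pi * t / 64)  # period of 64 samples

for label, sig in [("white noise", white), ("white + sine", with_sine)]:
    curve = multiscale_entropy(sig, max_scale_factor=40, tolerance=0.5)
    print(f"{label:>14}:", " ".join(f"{c:.2f}" for c in curve))
```

The entropy curve for the composite signal should deviate from the noise-only baseline near the scale factors commensurate with the 64-sample period.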

A simple interpretation suggests that we scale the measured results relative to the *1/f* noise part of the signal. *1/f* noise includes the greatest variety of frequencies of any known behaviour and therefore the highest entropy (Milotti, 2002). So, by providing a good visualization or graph that plots the *1/f* asymptotic value, we can immediately gauge the complexity of a signal. Costa *et al.* discuss the difficulty of distinguishing between randomness and increasing complexity, which has importance in the realm of event-driven systems:

*"In fact, entropy-based metrics are maximized for random sequences, although it is generally accepted that both perfectly ordered and maximally disordered systems possess no complex structures. A meaningful physiologic complexity measure, therefore, should vanish for these two extreme states."*

This is a key insight, and one echoed by researchers in complexity theory (Gell-Mann, 1994): the most interesting and challenging complexity measures occupy the middle range of the complexity scale. In other words, the most ordered signals can be described by a few harmonic periods, while at the other extreme the complexity reduces to simple stochastic measures akin to statistical mechanics (Reif, 1965). In between these extremes, we require a different level of sophistication.


**3.1 Usage domains** 

