**2.3 Example of use**

356 Applications of Digital Signal Processing

Fig. 4. Pseudo-code algorithm for calculating the context-switching entropy using a library package FFT routine

We evaluated two hypothetical systems, each with 15 concurrent tasks. One system was driven by asynchronous events; the other had its tasks cyclically scheduled at different periods. The asynchronous system always showed the higher switching entropy. The context-switching metric therefore distinguishes architectures that are complicated but synchronized from those with genuinely complex temporal behaviour.
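Fig. 4 itself is not reproduced here, but the metric it describes can be sketched briefly. The following Python sketch is an illustration under stated assumptions, not the authors' exact algorithm: it takes the Shannon entropy of the normalized amplitude spectrum of a context-switch event trace, with a periodic trace and a random trace standing in for the two hypothetical systems; all names and the trace construction are illustrative.

```python
import numpy as np

def switching_entropy(events):
    """Shannon entropy of the normalized amplitude spectrum of a
    context-switch event signal (illustrative sketch of the metric)."""
    spectrum = np.abs(np.fft.rfft(events))   # amplitude spectrum via a library FFT routine
    p = spectrum / spectrum.sum()            # normalize into a probability distribution
    p = p[p > 0]                             # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

# Cyclically scheduled switching vs. asynchronous (event-driven) switching
rng = np.random.default_rng(0)
periodic = np.zeros(1024)
periodic[::16] = 1.0                                     # switches on a fixed period
asynchronous = (rng.random(1024) < 1 / 16).astype(float) # random event arrivals

# The asynchronous trace spreads energy across the spectrum,
# so its spectral entropy is higher than the periodic trace's.
assert switching_entropy(asynchronous) > switching_entropy(periodic)
```

The periodic trace concentrates its spectral energy on a few harmonics, so few bins carry probability mass and the entropy is low; the asynchronous trace has a flat spectrum and correspondingly high entropy.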

For this metric, computing the entropy of the phase spectrum in addition to that of the amplitude spectrum may help discriminate between complexity and noise. This would be straightforward to add, since at present we discard the phase information. Any stationary signal that exhibits asymmetry must contain non-trivial phase relationships. Including both the amplitude and phase spectra in the final metric should therefore make it easier to discount pure randomness in favour of more complex phase structure.
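One plausible way to include the phase information, sketched below as an assumption rather than the text's prescription, is to histogram the phase angles of the spectrum and compute their entropy alongside the amplitude entropy; the bin count and the decision to return the two entropies separately are illustrative choices.

```python
import numpy as np

def combined_entropy(signal, phase_bins=32):
    """Return (amplitude entropy, phase entropy) of a signal's spectrum.
    A hypothetical sketch of the extension suggested in the text."""
    X = np.fft.rfft(signal)

    # Amplitude spectrum entropy, as in the base metric
    amp = np.abs(X)
    p_amp = amp / amp.sum()
    p_amp = p_amp[p_amp > 0]
    h_amp = -np.sum(p_amp * np.log2(p_amp))

    # Phase spectrum entropy: histogram the angles over (-pi, pi]
    phase = np.angle(X)
    hist, _ = np.histogram(phase, bins=phase_bins, range=(-np.pi, np.pi))
    p_ph = hist / hist.sum()
    p_ph = p_ph[p_ph > 0]
    h_ph = -np.sum(p_ph * np.log2(p_ph))

    return h_amp, h_ph

h_amp, h_ph = combined_entropy(np.random.default_rng(1).standard_normal(1024))
```

For noise-like input the phase angles are close to uniform, so the phase entropy approaches its maximum of log2(phase_bins); a signal with structured phase relationships would score lower, which is exactly the discrimination the text proposes.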

Another simple alternative that works on the timing alone is the **gzip** program, which in effect measures the distributional complexity of the times and yields an entropy-like metric. Its use is trivial: take the output file, as a sequence of discrete values, run it through **gzip**, and record the compressed size (Benedetto et al., 2003). When two sequences of identical length are compared, the one that zips to the smaller size corresponds to the less complex program.
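The comparison above can be reproduced directly with Python's standard `gzip` module; this sketch assumes the timing traces are encoded as one byte per discrete value, which is one reasonable encoding rather than the one the text specifies.

```python
import gzip
import random

def gzip_size(sequence):
    """Compressed size of a discrete sequence, used as a cheap
    complexity estimate in the spirit of Benedetto et al. (2003)."""
    data = bytes(int(v) & 0xFF for v in sequence)  # one byte per value (assumed encoding)
    return len(gzip.compress(data))

random.seed(0)
N = 4096
# Two sequences of identical length: cyclic switching vs. random switching
periodic = [1 if i % 16 == 0 else 0 for i in range(N)]
random_seq = [1 if random.random() < 1 / 16 else 0 for _ in range(N)]

# The periodic trace is highly redundant, so it compresses far better.
assert gzip_size(periodic) < gzip_size(random_seq)
```

Because both sequences have the same length, the comparison of compressed sizes needs no normalization: the smaller archive indicates the less complex timing behaviour.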
