III. Time domain methods

Time domain statistical procedures can be used to describe periodicity. The autocorrelation function is extremely useful in detecting periodicities when the time series is characterized by a relatively pure sinusoid uncontaminated by other random influences. The autocorrelogram for the 0.2 Hz sine wave illustrated in Figure 1b reflects the periodicity of the wave form: the autocorrelations oscillate between +1.0 and -1.0 every 10 lags. The term lag denotes the displacement of the time series in units of sequentially sampled data points. Thus, the 0.2 Hz sine wave is correlated 1.0 with a time-shifted version of itself whenever the time lag equals the period of the sine wave (i.e., 10 data points or 5 seconds). The magnitude of the autocorrelation remains 1.0 every 10 lags and does not attenuate even when the time series is correlated with a version of itself displaced by 50 data points. This is a characteristic of a deterministic time series and is not representative of physiological and behavioral periodic processes.
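This behavior is easy to reproduce. The following Python sketch is our own illustration (not taken from the original analyses; the sampling parameters simply mirror the example): a 0.2 Hz sine wave sampled at 2.0 Hz, whose lagged correlation with itself returns to 1.0 at every multiple of the 10-lag period.

    import numpy as np

    fs = 2.0                         # sampling rate in Hz (one point every 500 msec)
    t = np.arange(0, 60, 1 / fs)     # 60 seconds of data
    x = np.sin(2 * np.pi * 0.2 * t)  # 0.2 Hz sine wave (5-second period, 10 samples)

    def lag_corr(x, k):
        # Pearson correlation between the series and itself shifted by k samples
        if k == 0:
            return 1.0
        return np.corrcoef(x[:-k], x[k:])[0, 1]

    print(lag_corr(x, 10))   # 1.0: shift of one full period
    print(lag_corr(x, 5))    # -1.0: shift of half a period
    print(lag_corr(x, 50))   # still 1.0: no attenuation for a deterministic series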

Similarly, the cross-correlation is the correlation of one time series with a time-lagged version of a second time series. The cross-correlation function provides information regarding the statistical dependence of one series on another. If the second series is simply a time-shifted copy of the first, the peak value of the cross-correlation function will be unity at the lag that realigns the two series and less than unity at all other lags. In most cases, since the second series is not simply a time-shifted version of the first, the peak value of the cross-correlation will be less than unity. Cross-correlation techniques lose their effectiveness and sensitivity in assessing the communality between two series when the difference between the series is more than a temporal displacement.
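As a simple illustration (the series and the seven-sample delay below are our own choices, not data from the studies discussed), the following Python sketch cross-correlates a series with a delayed copy of itself; the function peaks at exactly the lag that realigns the two series.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(500)   # first series
    y = np.roll(x, 7)              # second series: the first delayed by 7 samples

    def cross_corr(x, y, k):
        # Pearson correlation between x and y after shifting y back by k samples
        return np.corrcoef(x[:len(x) - k], y[k:])[0, 1]

    r = [cross_corr(x, y, k) for k in range(21)]
    print(int(np.argmax(r)))       # 7: the lag that makes the two series identical
    print(max(r))                  # ~1.0 at that lag; smaller at all other lags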

The limitations of the autocorrelation method in detecting periodicities become apparent when we inspect the autocorrelation functions of various time series. For a perfect sine wave, the method clearly provides an accurate description of the periodicity, although it is merely an alternative way of describing an obvious periodicity. However, when the method is applied to psychophysiological variables such as respiration and heart rate, its utility is more dubious.

Figure 2a illustrates the chest circumference changes associated with respiration. The amplitude of chest circumference was sampled at 2.0 Hz (every 500 msec). Visual inspection of the time series indicates a relatively stable breathing pattern of approximately one breath every six or seven seconds. As illustrated in Figure 2b, the autocorrelation function supports this observation, with the largest correlations at 12 lags and at multiples of 12. Visual inspection of Figures 1b and 2b illustrates the differences between the autocorrelation function of a deterministic sine wave and that of the stochastic process of respiration. Recall that if the past history of a signal totally determines its future behavior, the signal is said to be deterministic. Since the wave form in Figure 1a depicts a pure sine wave, the process is totally predictable and, therefore, deterministic. In contrast, physiological signals are neither simple sine waves nor totally determined by their past history. The respiration signal described in Figure 2 is periodic, although its past behavior does not totally predict the future values of the amplitude, period, and phase of the signal. As the time shift gets longer, the autocorrelations of stochastic periodic processes become smaller. In the respiration example in Figure 2b, the peaks of the autocorrelation function decrease from approximately .6 when the series is time-shifted by one respiratory cycle to approximately .3 when it is time-shifted by five cycles.
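The attenuation of the autocorrelation for a stochastic periodic process can also be illustrated with simulated data. The Python sketch below is our own construction, not the respiration record of Figure 2: a roughly 6-second rhythm sampled at 2.0 Hz whose phase and amplitude are perturbed by random noise, so that the peak correlations fall off as the lag spans more cycles.

    import numpy as np

    rng = np.random.default_rng(1)
    fs = 2.0
    t = np.arange(0, 300, 1 / fs)                  # five minutes of simulated data
    drift = np.cumsum(rng.normal(0, 0.1, t.size))  # slowly wandering phase
    x = np.sin(2 * np.pi * t / 6 + drift) + rng.normal(0, 0.3, t.size)

    def lag_corr(x, k):
        # Pearson correlation between the series and itself shifted by k samples
        return np.corrcoef(x[:-k], x[k:])[0, 1]

    print(lag_corr(x, 12))   # one rhythm cycle later: already well below 1.0
    print(lag_corr(x, 60))   # five cycles later: typically smaller still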

Figure 2

The above examples demonstrate the effectiveness of the autocorrelation method for quantifying the periodicity and stochastic nature of a process. However, these examples are limited to processes with a clearly observable rhythmic component. Most physiological processes depart from these examples: they are complex and often composed of multiple components, some of which are not sinusoidal. The autocorrelogram of such processes is, therefore, difficult to interpret. A time series of sequential heart period values provides an example. Unlike respiration data, heart period reflects a number of periodic influences, including respiratory (i.e., respiratory sinus arrhythmia) and blood pressure (i.e., Traube-Hering-Mayer wave) rhythms, as well as aperiodic influences such as metabolic demands (see Kitney & Rompelman, 1980).

Figure 3a illustrates the heart period of a subject with high heart period variability, including a prominent rhythmicity associated with respiration (i.e., respiratory sinus arrhythmia). In contrast, Figure 4a illustrates a subject with low heart period variability. Note that the periodicity in the autocorrelogram in Figure 3b is not as prominent as in Figure 2b, and that in Figure 4b there is no apparent periodicity at lags that would be associated with either the Traube-Hering-Mayer wave (i.e., approximately 10 to 15 seconds, or 20 to 30 lags) or respiratory sinus arrhythmia (i.e., approximately 2.5 to 8.0 seconds, or 5 to 16 lags). From these two examples it becomes clear that as the signal becomes more complex, time domain methods of assessing and describing rhythmicity become increasingly difficult to apply and interpret.

Figure 3

Figure 4

In spite of the obvious complexity of physiological time series, some periodicities may become obvious during visual inspection when the data are plotted over a long period of time. For example, in the study of circadian rhythms, descriptive time domain methods have been useful. One rhythmometric method, cosinor analysis, has been frequently applied. This method is based on two assumptions: 1) the circadian rhythm accounts for a major source of the variance of the process being studied, and 2) the periodicity is relatively symmetrical and can be approximated by a cosine wave. Cosinor (group-mean cosine-vector) analysis was developed by Halberg (e.g., Halberg et al., 1972) to describe the time series with a number of operationally defined parameters. In this analysis a cosine curve is fitted to the data by least squares and a number of parameters are extracted. The rhythm-adjusted mean, called the mean level or mesor, is defined as the value midway between the peak and trough of the fitted cosinusoidal wave. The amplitude is quantified as half the difference between the maximal and minimal values of the fitted cosinusoidal wave. The acrophase is a measure of peak time relative to some reference time, expressed as a phase angle.
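A single-subject version of this fit can be sketched in a few lines. The Python code below is our own illustration of the least-squares idea, not Halberg's procedure verbatim; the 24-hour period, the temperature-like values, and the variable names are assumptions made for the example. A cosine of fixed period is fitted by linear least squares, and the mesor, amplitude, and acrophase are recovered from the fitted coefficients.

    import numpy as np

    rng = np.random.default_rng(2)
    period = 24.0                              # assumed circadian period, in hours
    t = np.arange(0, 72, 0.5)                  # three days sampled every 30 minutes
    # simulated rhythm: mesor 37.0, amplitude 0.4, peak at hour 16, plus noise
    y = 37.0 + 0.4 * np.cos(2 * np.pi * (t - 16) / period) + rng.normal(0, 0.1, t.size)

    # Fit y = mesor + beta*cos(omega*t) + gamma*sin(omega*t) by least squares.
    omega = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
    mesor, beta, gamma = np.linalg.lstsq(X, y, rcond=None)[0]

    amplitude = np.hypot(beta, gamma)          # half the peak-to-trough difference
    acrophase = np.arctan2(-gamma, beta)       # phase angle of the fitted peak
    peak_time = (-acrophase / omega) % period  # hours after the reference time

    print(round(mesor, 2), round(amplitude, 2), round(peak_time, 1))  # about 37.0, 0.4, 16.0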
