Figure 7.7 Weighted regression example. The size of the rolling window is still 32 observations. However, here the observations have been weighted using a linear function of time and a decay rate of 3%.

Not surprisingly, when W = I, i.e. when all observations have equal weight, equations (7.8) to (7.10) condense into the OLS equation. The most popular weighting functions, both provided by the tool pack, are linear and exponential, but other functions, such as sigmoids, can be considered. A decay rate determines the speed at which the weight of an observation decreases with time.11
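The closed-form WLS estimator and the two weighting schemes can be sketched in a few lines. This is a minimal illustration, not the tool pack's implementation; the function names and the convention that the newest observation receives weight 1 are assumptions.

```python
import numpy as np

def wls_beta(X, y, w):
    """Weighted least squares: beta = (X' W X)^{-1} X' W y, with W = diag(w).
    A sketch of the generic estimator, not the tool pack's code."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

def linear_weights(n, decay=0.03):
    """Linearly decaying weights: the newest observation gets weight 1,
    each step back in time loses `decay`, floored at zero."""
    return np.maximum(1.0 - decay * np.arange(n)[::-1], 0.0)

def exponential_weights(n, decay=0.03):
    """Exponentially decaying weights: (1 - decay) ** age of the observation."""
    return (1.0 - decay) ** np.arange(n)[::-1]

# Illustrative data: 32 observations, two factors.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 2))
y = X @ np.array([0.5, -1.2]) + 0.1 * rng.normal(size=32)

# With W = I (equal weights), WLS reduces to the OLS solution.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
beta_wls = wls_beta(X, y, np.ones(32))
assert np.allclose(beta_ols, beta_wls)
```

Swapping `np.ones(32)` for `linear_weights(32)` or `exponential_weights(32)` yields the weighted estimates discussed above.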

In order to use the sensitivity estimation tool pack, select one of the weighting options in the rolling regression menu (Figure 7.3). For instance, select a linear weighting and enter a weight decay of 3%. Figure 7.7 shows the result of the estimation.
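The setting of Figure 7.7 (a 32-observation rolling window with linearly decaying weights and a 3% decay rate) can be reproduced in code. The sketch below assumes a single-factor return series and is only an illustration of the mechanics, not the tool pack itself.

```python
import numpy as np

def rolling_wls(X, y, window=32, decay=0.03):
    """Rolling weighted regression: at each date t, fit WLS on the last
    `window` observations with linearly decaying weights (newest = 1),
    mimicking the setting of Figure 7.7. Returns one coefficient vector
    per date; dates with insufficient history are NaN."""
    n, k = X.shape
    w = np.maximum(1.0 - decay * np.arange(window)[::-1], 0.0)
    W = np.diag(w)
    betas = np.full((n, k), np.nan)
    for t in range(window - 1, n):
        Xw = X[t - window + 1 : t + 1]
        yw = y[t - window + 1 : t + 1]
        betas[t] = np.linalg.solve(Xw.T @ W @ Xw, Xw.T @ W @ yw)
    return betas

# Illustrative data: intercept plus one factor with true loading 0.8.
rng = np.random.default_rng(1)
x = rng.normal(size=(250, 1))
X = np.hstack([np.ones((250, 1)), x])
y = 0.2 + 0.8 * x[:, 0] + 0.05 * rng.normal(size=250)

betas = rolling_wls(X, y)  # betas[t] = [alpha_t, beta_t]
```

Plotting the second column of `betas` against time would produce a path analogous to the one shown in Figure 7.7.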

Weighted least squares estimation induces less autocorrelation in the estimates than ordinary least squares estimation. Depending on the decay rate, shadow effects, lag and persistence problems are considerably reduced.

However, WLS does not provide a way to model the time series of sensitivities. It still measures past (weighted) average sensitivities rather than predicting future ones. In addition, all sensitivity coefficients in the regression equation are identically affected by the weighting, regardless of their rate of change, as the weights depend only on the position of the observation in the time series. Consequently, stable sensitivities suffer from large decay rates, which discount away informative observations, while highly variable sensitivities suffer from small decay rates, which retain stale ones. When the coefficients' variances differ, no single decay rate is adapted to all factor sensitivities and some trade-off has to be made.
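This trade-off can be quantified with two standard summary statistics of a weight profile: the (Kish) effective sample size, which governs estimator variance for a stable coefficient, and the average age of the weighted sample, which governs tracking lag for a variable one. The sketch below, an illustration under assumed exponential weights rather than anything from the tool pack, shows that raising the decay rate lowers both, so one cannot reduce variance and lag simultaneously.

```python
import numpy as np

def eff_sample_size(w):
    """Kish effective sample size: (sum w)^2 / sum w^2.
    Smaller values mean noisier estimates of a stable coefficient."""
    return w.sum() ** 2 / (w ** 2).sum()

def weight_lag(w):
    """Average age of the weighted sample: sum(w * age) / sum(w).
    Larger values mean more lag when tracking a variable coefficient."""
    age = np.arange(len(w))[::-1]
    return (w * age).sum() / w.sum()

window = 32
for decay in (0.01, 0.03, 0.10):
    w = (1.0 - decay) ** np.arange(window)[::-1]
    print(f"decay={decay:.2f}  n_eff={eff_sample_size(w):5.1f}  "
          f"lag={weight_lag(w):5.1f}")
```

A small decay rate keeps the effective sample size near the full window but with a long lag; a large decay rate shortens the lag at the cost of a much smaller effective sample, which is exactly the conflict described above.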

