
Interference and old data

Neural networks can suffer from a problem called interference. As mentioned earlier, a net is trained by repeatedly presenting data to it, and it adjusts its parameters slightly each time to better represent the new data. The problem is that when new data from one region of the input space is repeatedly presented, the net can forget the mapping it learned in other regions of the input space. The forgetting behavior can be useful if the function being modeled changes over time, because the new data will eventually erase the effects of the outdated information.
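The following is a minimal sketch of interference, not Vizier code: a small neural net (here scikit-learn's MLPRegressor, an assumption of this illustration) is trained incrementally on one input region, then updated only with data from another region, and its fit on the first region degrades. The function f and the region boundaries are arbitrary choices for the example.

  import numpy as np
  from sklearn.neural_network import MLPRegressor

  rng = np.random.default_rng(0)
  f = lambda x: np.sin(3 * x)                  # illustrative target function

  # Region A: inputs in [0, 1];  Region B: inputs in [2, 3]
  xa = rng.uniform(0.0, 1.0, size=(200, 1))
  xb = rng.uniform(2.0, 3.0, size=(200, 1))
  ya, yb = f(xa).ravel(), f(xb).ravel()

  net = MLPRegressor(hidden_layer_sizes=(20,), random_state=0)

  # Phase 1: repeatedly present region-A data.
  for _ in range(500):
      net.partial_fit(xa, ya)
  err_a_before = np.mean((net.predict(xa) - ya) ** 2)

  # Phase 2: repeatedly present only region-B data.
  for _ in range(500):
      net.partial_fit(xb, yb)
  err_a_after = np.mean((net.predict(xa) - ya) ** 2)

  print("region-A error before:", err_a_before,
        "after training on B only:", err_a_after)
  # Typically the region-A error grows: the net has "forgotten" region A.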

Memory based learning does not suffer from interference. Since there is no training phase, there is no ``loss of importance'' of older data; it is all in memory. If the underlying function is changing through time, the confidence intervals provided with memory based learning can be used to decide explicitly when data should be discarded. Vizier does not do this automatically, but its confidence intervals can be used for this purpose (see the section on confidence intervals later).
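Below is a minimal sketch of the memory based idea, not Vizier itself: every data point is simply stored, and a query is answered by a distance-weighted average of the stored points, so new data in one region cannot disturb the fit in another. The kernel width h and the test function are assumptions made for the illustration.

  import numpy as np

  class MemoryBasedLearner:
      def __init__(self, h=0.2):
          self.h = h                      # kernel width (illustrative value)
          self.X = np.empty((0, 1))
          self.y = np.empty(0)

      def add(self, X, y):
          # "Training" is just remembering the data; nothing is forgotten.
          self.X = np.vstack([self.X, X])
          self.y = np.concatenate([self.y, y])

      def predict(self, xq):
          # Gaussian-weighted average of stored outputs near the query.
          d2 = np.sum((self.X - xq) ** 2, axis=1)
          w = np.exp(-d2 / (2 * self.h ** 2))
          return np.sum(w * self.y) / np.sum(w)

  rng = np.random.default_rng(0)
  f = lambda x: np.sin(3 * x)
  model = MemoryBasedLearner()

  xa = rng.uniform(0.0, 1.0, size=(200, 1))
  model.add(xa, f(xa).ravel())
  before = model.predict(np.array([0.5]))

  # New data in a distant region leaves the old region's predictions intact.
  xb = rng.uniform(2.0, 3.0, size=(200, 1))
  model.add(xb, f(xb).ravel())
  after = model.predict(np.array([0.5]))
  print("prediction at 0.5 before:", before, "after new distant data:", after)

Discarding out-of-date points, as described above, is not shown here; in practice one would use the confidence intervals discussed later to decide which stored points no longer agree with recent data.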

Table 1: Comparison between memory based learning and neural networks


