- Rule 1: Always! We believe it is a good idea to approach every function
approximation and data modeling problem first with locally weighted
learning. It is a fast, efficient way to select features, validate
models, estimate noise, and gain insight into the relationships between
variables. (A minimal locally weighted fit is sketched after this list.)
- Rule 2: What if a global parametric model can be obtained? Use the
model locally and see Rule 1! Even if you have a global parametric model
for your system, it often helps to fit it locally. Doing so can overcome
inaccuracies in the global model caused by assumptions that hold only
over a limited range of the state space.
- Rule 3: What if the data set is extremely large? Buy more memory and
processing power and see Rule 1! The tree-based algorithms used by
Vizier keep memory-based learning efficient even on large data sets (the
kd-tree sketch after this list illustrates the idea). If the data set
really is too large to keep around and process, it may be necessary to
fall back on incremental learning algorithms that discard the data after
training.
- Rule 4: What if predictions must be very fast? Use caching methods and
see Rule 1! When locally weighted learning produces accurate predictions
but greater speed is required, it is often useful to cache the
predictions in a fast lookup scheme (see the caching sketch after this
list). The alternative is to switch to a function approximator that
specializes in fast prediction, such as a neural network, which can
reach sub-millisecond prediction times.
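To make Rule 1 (and the local fitting suggested in Rule 2) concrete, here
is a minimal sketch of locally weighted linear regression in Python. It is
not Vizier's implementation: the Gaussian kernel, the bandwidth parameter,
and the function name lwr_predict are assumptions made for the example.

    import numpy as np

    def lwr_predict(X, y, x_query, bandwidth=1.0):
        """Predict y at x_query by a locally weighted linear fit."""
        # Gaussian kernel weights: points near the query dominate the fit.
        dists = np.linalg.norm(X - x_query, axis=1)
        w = np.exp(-0.5 * (dists / bandwidth) ** 2)
        sw = np.sqrt(w)                       # sqrt-weights for least squares
        # Augment with a bias column so the local model is affine.
        A = np.hstack([X, np.ones((X.shape[0], 1))])
        # Weighted least squares: minimizes sum_i w_i * (A_i . beta - y_i)^2.
        beta, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
        return np.append(x_query, 1.0) @ beta

    # Usage: noisy samples of y = sin(x); predict at x = 3.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 6.0, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
    print(lwr_predict(X, y, np.array([3.0]), bandwidth=0.5))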
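For Rule 3, one standard way to keep memory-based prediction cheap on a
large data set is to index the stored points in a tree and fit only to the
nearest neighbors of each query. The sketch below uses scipy's cKDTree as
a stand-in; it is not Vizier's own tree structure, and k, bandwidth, and
the name lwr_predict_knn are assumptions of the example.

    import numpy as np
    from scipy.spatial import cKDTree

    def lwr_predict_knn(tree, X, y, x_query, k=50, bandwidth=1.0):
        """Locally weighted prediction using only the k nearest stored points."""
        dists, idx = tree.query(x_query, k=k)         # fast neighbor retrieval
        w = np.exp(-0.5 * (dists / bandwidth) ** 2)   # Gaussian kernel weights
        sw = np.sqrt(w)
        A = np.hstack([X[idx], np.ones((k, 1))])      # local affine model
        beta, *_ = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)
        return np.append(x_query, 1.0) @ beta

    # Usage on a larger sample drawn as in the previous sketch.
    rng = np.random.default_rng(1)
    X = rng.uniform(0.0, 6.0, size=(200_000, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(len(X))
    tree = cKDTree(X)
    print(lwr_predict_knn(tree, X, y, np.array([3.0]), k=100, bandwidth=0.5))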
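For Rule 4, a simple caching scheme is to quantize each query to a grid
cell and memoize the prediction for that cell, so repeated nearby queries
become hash-table lookups. This is an illustrative sketch, not Vizier's
caching mechanism: the resolution parameter and the PredictionCache class
are invented for the example, and the usage lines assume tree, X, y, and
lwr_predict_knn from the previous sketch.

    import numpy as np

    class PredictionCache:
        def __init__(self, predict_fn, resolution=0.05):
            self.predict_fn = predict_fn   # accurate but slower predictor
            self.resolution = resolution   # grid spacing of the lookup keys
            self.cache = {}

        def __call__(self, x_query):
            # Queries falling in the same grid cell share a cached prediction.
            key = tuple(np.round(np.asarray(x_query) / self.resolution).astype(int))
            if key not in self.cache:
                self.cache[key] = self.predict_fn(x_query)   # miss: slow path
            return self.cache[key]

    # Usage: wrap the k-nearest-neighbor predictor from the previous sketch.
    fast_predict = PredictionCache(
        lambda q: lwr_predict_knn(tree, X, y, q, k=100, bandwidth=0.5))
    print(fast_predict(np.array([3.0])))    # first call pays the full cost
    print(fast_predict(np.array([3.01])))   # same cell: dictionary lookup only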