Properties of Backprop
The error function defines a surface over the weight space, and the weights are modified by descending the gradient of that surface.
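To make the update rule concrete, here is a minimal sketch of gradient descent on a toy quadratic error surface E(w) = ||w||^2; the function names, the surface, and the learning rate eta are illustrative assumptions, not part of the notes.

```python
import numpy as np

def gradient_descent_step(w, grad_E, eta=0.1):
    """Move the weights one small step down the error surface."""
    return w - eta * grad_E(w)

# Toy error surface E(w) = ||w||^2, whose gradient is 2w (an assumed example).
grad_E = lambda w: 2 * w

w = np.array([1.0, -2.0])
for _ in range(50):
    w = gradient_descent_step(w, grad_E)
print(w)  # approaches the minimum at the origin
```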
Local minima may exist in the error surface: this means that there is no convergence theorem for backprop (in practice, though, the weight space is high-dimensional enough that this rarely causes trouble).
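A small sketch of the local-minimum problem, using an invented one-dimensional error surface E(w) = w^4 - 3w^2 + w with two minima: depending on the starting weight, plain gradient descent settles into different minima, which is why no convergence-to-global-minimum guarantee exists.

```python
def descend(w, eta=0.01, steps=2000):
    """Gradient descent on E(w) = w^4 - 3w^2 + w (an assumed toy surface)."""
    for _ in range(steps):
        grad = 4 * w**3 - 6 * w + 1   # dE/dw
        w -= eta * grad
    return w

print(descend(2.0))   # ~  1.13: stuck in the shallower local minimum
print(descend(-2.0))  # ~ -1.30: reaches the deeper (global) minimum
```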
Nets take a lot of training time and a lot of training examples. Moreover, while nets do generalize, overtraining (overfitting the training set) can be a real problem.
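A standard guard against overtraining is early stopping: keep the weights that minimize error on a held-out validation set and stop once that error keeps rising. The sketch below assumes hypothetical train_epoch and validation_error callables; it is one common remedy, not a method from these notes.

```python
import copy

def train_with_early_stopping(net, train_epoch, validation_error,
                              max_epochs=1000, patience=10):
    """Return the weights with the lowest validation error seen so far."""
    best_err, best_net, bad_epochs = float("inf"), copy.deepcopy(net), 0
    for _ in range(max_epochs):
        train_epoch(net)                 # one pass over the training data (assumed callable)
        err = validation_error(net)      # error on held-out data (assumed callable)
        if err < best_err:
            best_err, best_net, bad_epochs = err, copy.deepcopy(net), 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:   # validation error keeps rising:
                break                    # overtraining has begun, stop here
    return best_net
```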