In particular, Nate Silver at 538 spent a lot of time pointing out that polling errors of several points are not uncommon, and that polls can all be off in the same direction, which seems to be what happened.
The 538 website prediction (which is still visible) gave Trump over 25% chance of winning the day of the election, and had never given him less than a 10% chance all year, as you can see in the graph.
Lumping 538 in with other organizations that did over-estimate Hillary's chances is especially ironic. The week before the election, Nate was getting beaten up by other poll aggregators because they thought Hillary was almost certain to win. They were accusing him of pretending Trump had a chance in order to keep viewers watching. And Nate was vigorously arguing in return that the other folks were over-estimating Hillary's chances, and that Trump had a narrow but non-trivial path to winning.
I think the basic problem is that people are terrible at understanding probabilities. They seem to think that if something happens, the "real" probability was 100%, and if it doesn't happen, the "real" probability was 0%. But 25% probability events happen; in fact, they happen about 1/4 of the time. 10% probability events happen about 1/10 of the time. If a 25% probability event happens, that doesn't mean the probability was wrong. But that's how people tend to interpret it: whatever has probability over 50% will definitely happen, and whatever is under 50% definitely won't.
If you think about how Trump won, he won by really small margins in each of several crucial states that weren't predicted to go his way, but were expected to be close. That sounds to me like a series of low-probability events that ended up all happening; he ran the table, to use the phrase Nate Silver was using.
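This is exactly where correlated polling errors matter. Here's a toy simulation (all the numbers are made up for illustration; this is not 538's actual model): three close states where the polls show a roughly 1-point lead for the favorite. If each state's polling error is independent, sweeping all three is rare; if the errors share a common component, because all the polls miss in the same direction, sweeps happen far more often.

```python
import random

random.seed(1)
# Hypothetical setup: three close states, favorite up ~1 point in each.
expected_margins = [1.0, 1.0, 1.0]  # made-up poll leads, in points
trials = 20_000

def sweep_probability(correlated: bool) -> float:
    """Fraction of simulations where the underdog wins all three states.

    The per-state error has the same total spread (~2.83 points) either way;
    only the correlation structure differs.
    """
    sweeps = 0
    for _ in range(trials):
        shared = random.gauss(0, 2.0) if correlated else 0.0  # common polling miss
        state_sd = 2.0 if correlated else 2.83                # state-specific noise
        underdog_sweeps = all(
            margin + shared + random.gauss(0, state_sd) < 0
            for margin in expected_margins
        )
        sweeps += underdog_sweeps
    return sweeps / trials

p_independent = sweep_probability(correlated=False)
p_correlated = sweep_probability(correlated=True)
print(p_independent, p_correlated)  # correlated errors make the sweep much likelier
```

The underdog's chance in any single state is the same in both scenarios; what the shared error changes is the chance of running the table, which is the scenario Silver kept warning about.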
So, Nate Silver was actually right.