complexity

how to measure complexity? given a particular behavior or phenomenon or message, how do you measure its information content? how objective can this definition be made?

Shannon used probability/thermodynamics to define content as the entropy -\sum_i p_i \lg p_i of the message. but what predicts the probabilities? the message "101010101010..." has little content but high entropy. If we `changed coordinate systems', replacing "10" with "2", then our message "222222..." comes out as small as it should. One could define the content by the size of the output from gzip, or any other data compression algorithm. This is a step in the right direction, but which algorithm is Right? What is the content of the first 100000 digits of \pi (3.1415...)?
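
a rough sketch of both measures, in python, assuming empirical symbol frequencies for the probabilities and gzip as the stand-in compressor:

```python
import gzip
from collections import Counter
from math import log2

def shannon_entropy(msg):
    """per-symbol entropy -sum_i p_i lg p_i, with p_i taken from symbol counts."""
    n = len(msg)
    # written as p * lg(1/p) so the result comes out non-negative
    return sum((c / n) * log2(n / c) for c in Counter(msg).values())

def gzip_size(msg):
    """bytes of gzip output, as a crude stand-in for content."""
    return len(gzip.compress(msg.encode()))

msg = "10" * 5000                 # "101010..." -- little content
print(shannon_entropy(msg))       # 1.0 bit/symbol: maximal for a binary alphabet
print(gzip_size(msg))             # a few dozen bytes: gzip sees the repetition

recoded = "2" * 5000              # the same message after the `coordinate change'
print(shannon_entropy(recoded))   # 0.0 bits/symbol: now it agrees with the content
print(gzip_size(recoded))
```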

Kolmogorov solved this by counting the size of the algorithm: the complexity of a message is the size of the smallest program that can create that message. so the digits of \pi can be compressed into a small program that uses a mathematical series. but which language do we use?
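
for concreteness, assuming we settle on python as the language: the generator below is a couple hundred bytes of source yet yields as many digits of \pi as asked (it is Gibbons' unbounded spigot), so under that choice the first 100000 digits have only a tiny complexity.

```python
from itertools import islice

def pi_digits():
    # unbounded spigot for the decimal digits of pi (Gibbons, 2006)
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n                                   # next digit is now certain
            q, r, n = (10 * q, 10 * (r - n * t),
                       (10 * (3 * q + r)) // t - 10 * n)
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

print(list(islice(pi_digits(), 8)))   # [3, 1, 4, 1, 5, 9, 2, 6]
```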

Metasystem Transition Theory (as outlined in the PCP) claims to have objective definitions of complexity, and even of progress, but so far i haven't found them.

but all of these definitions share the same problem: noise appears to be complex. how can meaning be made objective? how can we write a program to find something if we don't know what it will be?
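
the problem is easy to demonstrate with the same gzip yardstick, using os.urandom as a stand-in for noise:

```python
import gzip, os

noise = os.urandom(10000)        # 10 kB from the OS entropy pool
zeros = bytes(10000)             # 10 kB of zeros, for contrast

print(len(gzip.compress(noise)))   # ~10 kB: no shorter description is found
print(len(gzip.compress(zeros)))   # a few dozen bytes
```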

what is the difference between `different' and `random'? it is meaning, and meaning is subjective. how can you show that noise is not a message from aliens?

a noise-generating program might be very small, but it won't generate the `right' output, only something that looks similar. what is the meaning of a photograph of sand on the beach? does it include fragments of the histories of the rocks and waves? what about something `truly random', like radiation? what if i encrypt the picture, but you don't have the key? is that noise?
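
a sketch of that gap, again in python: a seeded generator is only a handful of lines, and by the measures above its output is just as `complex' as real noise, but it will almost surely never reproduce the particular noise you already have:

```python
import gzip, os, random

target = os.urandom(10000)          # the `right' noise we actually observed

random.seed(0)                      # the whole `noise program': a seed and a generator
generated = random.getrandbits(8 * 10000).to_bytes(10000, "big")

print(len(gzip.compress(generated)))   # ~10 kB: scores as complex as the target
print(generated == target)             # False, with overwhelming probability
```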

my bookmarks on the subject are here.