From:	IN%"peter@Nexus.YorkU.CA"  "Peter Roosen-Runge"  3-NOV-1989 17:33:04.70
To:	cs100006@YUSol
CC:	
Subj:	

Date: Fri, 3 Nov 89 17:29:08 EST
From: Peter Roosen-Runge <peter@Nexus.YorkU.CA>
To: cs100006@YUSol
Message-Id: <89Nov3.172913est.6164@nexus.yorku.ca>

Path: yunexus!utzoo!utgpu!jarvis.csri.toronto.edu!mailrus!uwm.edu!cs.utexas.edu!
   uunet!munnari.oz.au!bruce!lloyd
From: lloyd@bruce.OZ (lloyd allison)
Newsgroups: comp.ai.neural-nets
Subject: Re: Kolmogorov Complexity references (was Data Complexity)
Keywords: randomness, inductive inference, minimum encoding
Message-ID: <1632@bruce.OZ>
Date: 30 Oct 89 01:28:15 GMT
Article-I.D.: bruce.1632
Posted: Sun Oct 29 20:28:15 1989
References: <7305@sdcsvax.UCSD.Edu>
Organization: Monash Uni. Computer Science, Australia
Lines: 16
 
try also

J. Rissanen, "Stochastic Complexity",
Journal of the Royal Statistical Society (Series B), Vol. 49, No. 3, 1987,
pages 223-239 and 252-265

and

C. S. Wallace and P. R. Freeman, "Estimation and Inference by Compact Coding",
same journal and issue, pages 240-265

also

C. S. Wallace and D. M. Boulton, "An Information Measure for Classification",
Computer Journal 11(2), 185-194, 1968
 
zzzzz

From:	IN%"peter@Nexus.YorkU.CA"  "Peter Roosen-Runge"  3-NOV-1989 17:33:25.85
To:	cs100006@YUSol
CC:	
Subj:	

Date: Fri, 3 Nov 89 17:28:13 EST
From: Peter Roosen-Runge <peter@Nexus.YorkU.CA>
To: cs100006@YUSol
Message-Id: <89Nov3.172827est.6159@nexus.yorku.ca>

Path: yunexus!utzoo!attcan!utgpu!jarvis.csri.toronto.edu!mailrus!shadooby!
   samsung!brutus.cs.uiuc.edu!apple!sun-barr!decwrl!hplabs!hp-sdd!ucsdhub!
   sdcsvax!beowulf!demers
From: demers@beowulf.ucsd.edu (David E Demers)
Newsgroups: comp.ai.neural-nets
Subject: Re: Kolmogorov Complexity references (was Data Complexity)
Keywords: randomness, inductive inference, minimum encoding
Message-ID: <7305@sdcsvax.UCSD.Edu>
Date: 28 Oct 89 17:20:47 GMT
Article-I.D.: sdcsvax.7305
Posted: Sat Oct 28 13:20:47 1989
Sender: nobody@sdcsvax.UCSD.Edu
Reply-To: demers@beowulf.UCSD.EDU (David E Demers)
Organization: EE/CS Dept. U.C. San Diego
Lines: 98
 
 
I recently promised some follow-up references on
Kolmogorov Complexity, after a thread discussing how
one might measure the "complexity" of a model such as a
neural net.
 
For Kolmogorov Complexity in general, the best place to
start is an introduction to the notion and its application
to a number of different problems, which can be found in:
 
Li, M. and P. Vitanyi, "Kolmogorov Complexity and Its
Applications", in Handbook of Theoretical Computer Science
(J. van Leeuwen, Ed.), North-Holland, 1989.
 
A preliminary version of the same work appears in the Proceedings
of the 3rd IEEE Structure in Complexity Theory Conference, 1988.
 
The same authors have a book out by Addison-Wesley,
An Introduction to Kolmogorov Complexity and its Applications (1989).
 
For original work, see the references in the bibliography of
the above book.  Let me point out the following, which are
probably the most important:
 
Kolmogorov, A.N., "Three approaches to the quantitative definition
of information", Problems of Information Transmission 1, 1-7, 1965.
 
Shannon & Weaver, The Mathematical Theory of Communication, U. of
Illinois Press, 1949 (for basics of information theory)
 
Kolmogorov, A.N. "Logical Basis for information theory and
probability theory", IEEE Trans. on Info. Theory ?? 1968 (sorry)
 
Solomonoff, R.J., "A formal theory of inductive inference",
Information and Control 7, 1-22 and 224-254 (1964).
 
Chaitin, G.J., "Randomness and Mathematical Proof"
Scientific American, 232 (May 1975), pp47-52
 
Chaitin, G.J., "Algorithmic Information Theory", IBM J. Res.
Dev. 21, 350-359 (1977)
 
Chaitin, G.J.,  Information, Randomness and Incompleteness.
(World Scientific Pub, 1987)
 
Other surveys are:
Zvonkin, A.K. and L.A. Levin, "The complexity of finite
objects and the development of the concepts of information
and randomness by means of the theory of algorithms",
Russian Math. Surveys 25, 83-124 (1970)
 
Schnorr, C.P., "A survey of the theory of random sequences",
in Basic Problems in Methodology and Linguistics (Butts, R.E. Ed.)
1977, pp. 193-210.
 
 
Applications of Kolmogorov Complexity, or similar ideas, can
be seen in the work of Valiant, Gold, Rissanen and others
(maximum likelihood, maximum entropy, ...).  I might mention

Rissanen, J., "Modeling by the shortest data description",
Automatica 14, 465-471 (1978)

for a practical suggestion for statistical modeling - minimum
description length - in which the description has two components,
one essentially giving the size of the model and the other the size
of the data when encoded with the help of the model.
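
As a toy illustration of the two-part idea (my own sketch, not
Rissanen's formulation), here is how one might pick the degree of a
polynomial fit by minimizing model bits plus data bits.  The use of
numpy's polyfit, the 32 bits charged per coefficient, the Gaussian
code for the residuals and the quantization of the data are all
assumptions made for the example:

import math
import numpy as np

BITS_PER_PARAM = 32      # assumed fixed precision per coefficient
DATA_PRECISION = 1e-3    # assumed quantization of the observed values

def two_part_length(x, y, degree):
    """Rough two-part code length, in bits, for a polynomial model."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    n = len(y)
    sigma2 = max(float(np.mean(residuals ** 2)), 1e-12)
    # Part 1: the model -- one fixed-precision number per coefficient.
    model_bits = BITS_PER_PARAM * (degree + 1)
    # Part 2: the data encoded with the model's help -- negative
    # log2-likelihood of the residuals under N(0, sigma2), plus the
    # cost of stating each residual to precision DATA_PRECISION.
    data_bits = (0.5 * n * math.log2(2 * math.pi * math.e * sigma2)
                 - n * math.log2(DATA_PRECISION))
    return model_bits, data_bits

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
y = 3 * x ** 2 - x + rng.normal(scale=0.1, size=x.size)  # true degree is 2
for d in range(6):
    m, c = two_part_length(x, y, d)
    print(f"degree {d}: {m:6.1f} model + {c:8.1f} data = {m + c:8.1f} bits")

Low degrees pay few model bits but many data bits, high degrees the
reverse; the degree minimizing the total is the MDL choice (here the
degree-2 model should come out cheapest).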
 
APPLICATIONS:

One can state and prove Goedel's incompleteness theorem using
Kolmogorov Complexity;

one can derive a prime-number theorem: the number of primes less
than n is on the order of n / (log n (log log n)^2);

one can state and prove the pumping lemma for regular languages
in a couple of lines;

and so on...
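
All of these rest on the same counting fact, stated here in my own
notation rather than quoted from any of the references above: there
are fewer than 2^(n-k) binary descriptions shorter than n-k bits, so

\[
  \#\{\, x \in \{0,1\}^n : K(x) < n-k \,\}
    \;\le\; \sum_{i=0}^{n-k-1} 2^i \;=\; 2^{n-k} - 1 ,
\]

and hence at least a fraction 1 - 2^(-k) of the n-bit strings satisfy
K(x) >= n-k.  Any property that would yield a shorter description can
therefore hold for only a small minority of strings, which is what
drives the Goedel, prime-number and pumping-lemma arguments.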
 
In the neural net world, Tenorio & Lee used a variant of
Rissanen's minimum description length, along with a
stochastic growth method, for determining when to add
new nodes to a network (TR-EE 89-30, June 1989, Purdue University
School of Electrical Engineering).
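
Just to give the flavour of such a criterion - this is my own guess at
the general shape, not Tenorio & Lee's algorithm, for which see the TR
above - a growth step might be accepted only when the bits saved on the
data outweigh the bits spent describing the new weights:

BITS_PER_WEIGHT = 16   # assumed precision for storing each new weight

def keep_candidate_node(data_bits_before, data_bits_after, n_new_weights):
    """Accept a candidate node only if the total two-part description
    length (extra model bits + data bits) goes down."""
    extra_model_bits = BITS_PER_WEIGHT * n_new_weights
    return data_bits_after + extra_model_bits < data_bits_before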
 
Hope this is of interest!
 
Dave DeMers
Dept. of Computer Science & Engineering C-014
and Institute of Non-Linear Science
UCSD
La Jolla, CA  92093
demers@cs.ucsd.edu
demers@inls1.ucsd.edu