H.H. Bauer's knowledge-filter model of scientific progress makes
clear where I think science can be
improved with the use of computational discovery programs: not so much
in the application of textbook knowledge, but rather at the science
frontiers where contributions get made to the primary (journal)
literature. This literature is filled with often unreliable
inferences from experimental evidence to models, patterns, empirical
descriptions, and the like. At this stage, both knowledge-driven and
data-driven programs can do much to make a variety of inferences in
frontier science more reliable, as well as faster and perhaps even
cheaper.
This filter model resolves a seeming paradox: although much science
is highly reliable (it puts Man on the moon, it cures many diseases),
that reliable body of knowledge is textbook science, which largely
consists of knowledge of the form "if we do X, Y will happen." Frontier science -
tomorrow's textbook science - may put Man on Mars,
but only after much filtering because of its inherent unreliability.
This is where a scientist/computer collaboration can accelerate
progress in frontier science.
Reference:
Henry H. Bauer, Scientific Literacy and the Myth of the Scientific
Method, University of Illinois Press, Urbana, IL, 1994.