From salzberg@osprey.cs.jhu.edu Fri Oct  7 14:43:06 EDT 1994
Article: 24511 of comp.ai
Path: cantaloupe.srv.cs.cmu.edu!nntp.club.cc.cmu.edu!godot.cc.duq.edu!news.duke.edu!news-feed-1.peachnet.edu!darwin.sura.net!jhunix1.hcf.jhu.edu!blaze.cs.jhu.edu!osprey.cs.jhu.edu!not-for-mail
From: salzberg@osprey.cs.jhu.edu (Steven Salzberg)
Newsgroups: comp.ai
Subject: new release of PEBLS system available
Date: 4 Oct 1994 08:15:31 -0400
Organization: The Johns Hopkins University CS Department
Lines: 82
Message-ID: <36rh13$628@osprey.cs.jhu.edu>
NNTP-Posting-Host: osprey.cs.jhu.edu

----------------------------------------------------------
			ANNOUNCEMENT

	A new release of the PEBLS system, PEBLS 3.0,
        is now available via anonymous FTP.
----------------------------------------------------------

     PEBLS is a nearest-neighbor learning system designed for
applications where the instances have symbolic feature values.  PEBLS
has been applied to the prediction of protein secondary structure and
to the identification of DNA promoter sequences.  A technical
description appears in the article by Cost and Salzberg, Machine
Learning journal 10:1 (1993).

     PEBLS 3.0 is written entirely in ANSI C and thus runs on a wide
range of platforms.  Version 3.0 adds a number of features to version
2.1 (released in 1993) and to the original PEBLS described in the
paper:

     S. Cost and S. Salzberg.  A Weighted Nearest Neighbor 
     Algorithm for Learning with Symbolic Features,
     Machine Learning, 10:1, 57-78 (1993).

     PEBLS 3.0 makes it easier to compare nearest-neighbor and
probabilistic approaches to machine learning by tracking the
statistics needed for Bayesian inference.  The system can thus show
specifically where nearest-neighbor and Bayesian methods differ.  It
can also run tests using simple distance metrics (overlap, Euclidean,
Manhattan) as baselines for comparison.  Research along these lines
was described in the following paper:

     J. Rachlin, S. Kasif, S. Salzberg, and D. Aha.  Towards a Better
     Understanding of Memory-Based and Bayesian Classifiers.
     Proceedings of the Eleventh International Conference on Machine
     Learning (pp. 242-250).  New Brunswick, NJ, July 1994, Morgan
     Kaufmann Publishers.

TO OBTAIN PEBLS BY ANONYMOUS FTP
--------------------------------

     The latest version of PEBLS is available free of charge and may
be obtained via anonymous FTP from the Johns Hopkins University
Computer Science Department.

     To obtain a copy of PEBLS, type the following commands:

     UNIX_prompt>  ftp blaze.cs.jhu.edu
[Note: the Internet address of blaze.cs.jhu.edu is 128.220.13.50]
     Name: anonymous
     Password: [enter your email address]

     ftp>  bin
     ftp>  cd pub/pebls
     ftp>  get pebls.tar.Z
     ftp>  bye

[Place the file pebls.tar.Z in a convenient subdirectory.]

     UNIX_prompt> uncompress pebls.tar.Z
     UNIX_prompt> tar -xf pebls.tar

[Read the files "README" and "pebls_3.doc"]


For further information, contact:

               Prof. Steven Salzberg
               Department of Computer Science
               Johns Hopkins University
               Baltimore, Maryland 21218
               Email:  salzberg@cs.jhu.edu

PEBLS 3.0 IS INTENDED FOR RESEARCH AND EDUCATIONAL PURPOSES ONLY.
PEBLS 3.0 may be used, copied, and modified freely for this purpose.
Any commercial or for-profit use of PEBLS 3.0 is strictly prohibited
without the express written consent of Prof. Steven Salzberg,
Department of Computer Science, The Johns Hopkins University.
-- 
Steven Salzberg, Assistant Professor       Johns Hopkins University
Department of Computer Science             Baltimore, MD 21218
salzberg@cs.jhu.edu