
   
The ADAPTIVE PLACE ADVISOR

In this section, we first present an overview of the ADAPTIVE PLACE ADVISOR's functioning, then follow with details about its components. The system carries out a number of tasks in support of personalized interaction with the user, and the responsibilities for these tasks are distributed among the various modules of the system, as shown in Figure 1. The Dialogue Manager generates, interprets, and processes conversations; it also updates the expanded query after each user interaction. The Retrieval Engine is a case-based reasoning system [1] that uses the expanded query to retrieve items from the database and to measure their similarity to the user's preferences. The User Modeling System generates the initial (probabilistic) query and updates the long-term user model based on the conversation history. The Speech Recognizer and the Speech Generator handle the user's input and control the system's output, respectively.
  
Figure 1: Components of the ADAPTIVE PLACE ADVISOR and their interactions.
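The division of responsibilities shown in Figure 1 can be summarized in a short skeleton. The following Python sketch is illustrative only: the paper does not define a programming interface, so all class and method names here are assumptions.

```python
class UserModelingSystem:
    """Maintains the long-term user model and produces the expanded query."""

    def initial_query(self, user_id):
        # Return a probabilistic query seeded from the long-term user model.
        ...

    def update_model(self, user_id, conversation_history):
        # Adjust long-term preferences from the attributes and items the user
        # accepted or rejected during the conversation.
        ...


class RetrievalEngine:
    """Case-based retrieval over the item database."""

    def retrieve(self, expanded_query, database):
        # Return items matching the explicit constraints, scored by similarity
        # to the preferences encoded in the expanded query.
        ...


class DialogueManager:
    """Generates, interprets, and processes the conversation."""

    def __init__(self, retrieval, user_model, recognizer, generator):
        self.retrieval = retrieval
        self.user_model = user_model
        self.recognizer = recognizer   # speech input
        self.generator = generator     # speech output

    def handle_turn(self, expanded_query, user_utterance):
        # Interpret the utterance, update the expanded query, and decide
        # whether to constrain, relax, or present items next.
        ...
```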

To find items to recommend to the user, the PLACE ADVISOR carries out an augmented interactive constraint-satisfaction search. The goal of the entire conversation is to present an item that will be acceptable to the user, and during the constraint-satisfaction portion the system carries out a conversation to find a small set of such items. Two situations determine the system's search operators, and thus its questions, during this phase. First, an under-constrained specification means that many items match the constraints, so the system must obtain more information from the user. Second, if no items match, the system must relax a constraint, allowing items to take any domain value for the relaxed attribute. The search phase ends when only a small number of items match the constraints and are highly similar (based on a similarity threshold) to the user's preferences. At that point item presentation begins: the similarity computation ranks the items that satisfy the constraints, and the system offers them in decreasing similarity order.

The search and item presentation process is also influenced by the User Modeling System and is thus personalized. The main mechanism for personalization is the expanded query, a probabilistic representation of the user's preferences, both long-term (over many conversations) and short-term (within a conversation). We will often refer to this simply as the "query," but it always includes constraints that are both explicitly and implicitly specified by the user: the query is "expanded" beyond the explicit (short-term) constraints using the (long-term) constraints implicit in the user model. In a sense, the initial query represents the constraints the system thinks the user will "probably want." The system incrementally refines this query in the course of the conversation, setting explicit, firm constraints as the user confirms or disconfirms its assumptions. Over the long term, the User Modeling System updates the user model based on the user's responses to the attributes and items offered during a conversation.

The Retrieval Engine searches the database for items that match the explicit constraints in the query, then computes the similarity of the retrieved items to the user's preferences as reflected in the expanded part of the query. Depending on the number of highly similar results, it also determines which attribute should be constrained or relaxed.

In sum, the system directs the conversation in a manner similar to a frame-based system, retrieves and ranks items using a case-based reasoning paradigm, and adapts the weights in its similarity calculation based on past conversations with a user, thereby personalizing future retrievals and conversations. The remainder of this section presents the details of the ADAPTIVE PLACE ADVISOR's architecture: after describing the user model, we elaborate on the Retrieval Engine and then the Dialogue Manager, and finally discuss how the system updates the user model as the user interacts with it.
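The control flow just described can be sketched as a simple loop. The helper names (ask_value, choose_attribute_to_constrain, choose_attribute_to_relax) and the threshold values below are assumptions for illustration; the actual system's operator-selection logic may differ in detail.

```python
MAX_CANDIDATES = 5          # assumed size of a "small set" of items
SIMILARITY_THRESHOLD = 0.8  # assumed cutoff for "highly similar" items

def search_phase(query, retrieval_engine, dialogue_manager, database):
    """Run the search phase until a small, highly similar candidate set remains."""
    while True:
        # Retrieve items matching the explicit constraints and score each one
        # against the preferences encoded in the expanded query.
        scored = retrieval_engine.retrieve(query, database)  # [(item, similarity), ...]
        candidates = [(item, sim) for item, sim in scored
                      if sim >= SIMILARITY_THRESHOLD]

        if not scored:
            # Over-constrained: relax an attribute so items may take any value
            # for it, then retry the retrieval.
            attribute = retrieval_engine.choose_attribute_to_relax(query)
            query.relax(attribute)
        elif len(candidates) > MAX_CANDIDATES:
            # Under-constrained: ask the user about another attribute and add
            # the answer as an explicit, firm constraint.
            attribute = retrieval_engine.choose_attribute_to_constrain(query)
            value = dialogue_manager.ask_value(attribute)
            query.constrain(attribute, value)
        else:
            # A small number of highly similar items remain: end the search
            # phase and present them in decreasing similarity order.
            return sorted(candidates, key=lambda pair: pair[1], reverse=True)
```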

 