Hillsdale, NJ: Lawrence Erlbaum (1992)
Abstract
Psychology and philosophy have long studied the nature and role of
explanation. More recently, artificial intelligence research has
developed promising theories of how explanation facilitates learning
and generalization. By using explanations to guide generalization,
explanation-based methods enable reliable learning of new concepts in
complex situations.
This volume addresses fundamental issues in generating and judging
explanations: When to explain, what constitutes an explanation, how to
build explanations, and how to evaluate candidate explanations. It
argues that standard models, which are neutral to context and
experience, are inadequate to deal with the problems that arise in
everyday explanation. It presents an alternative theory in which
context---involving the explainer's beliefs, goals, and experience---is
crucial in generating and judging explanations. It examines the role
of anomalies in motivating explanation, describes a model of
pattern-based anomaly detection, and shows how an analysis of the
content of anomalies, explanations, and explanation purposes can be
used to guide generation and evaluation of explanations by a
case-based explanation system. The theory is implemented in ACCEPTER,
a computer system that understands stories, detects anomalous events,
guides retrieval and adaptation of stored explanations, and evaluates
candidate explanations.
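The cycle the abstract describes --- detecting an anomaly as a deviation from an expectation pattern, retrieving a stored explanation indexed by the anomaly's characterization, and evaluating candidates against the explainer's purpose --- can be illustrated with a minimal sketch. This is a hypothetical toy, not ACCEPTER's actual representations or code; all names and data structures here are illustrative assumptions.

```python
# Toy sketch of anomaly-driven, case-based explanation (illustrative only;
# the structures below are hypothetical, not ACCEPTER's).
from dataclasses import dataclass

@dataclass
class Explanation:
    anomaly_type: str        # index: the anomaly characterization it accounts for
    account: str             # the causal story offered
    serves_purposes: set     # explainer purposes this account can serve

# Expectation patterns: feature -> expected value.
EXPECTATIONS = {"engine": "running", "fuel_level": "nonempty"}

# Stored case library, indexed by anomaly characterization.
CASE_LIBRARY = [
    Explanation("engine:stalled", "fuel line blockage", {"repair"}),
    Explanation("engine:stalled", "driver turned key off", {"prediction"}),
]

def detect_anomalies(observation):
    """Pattern-based detection: flag features whose observed value
    violates the stored expectation pattern."""
    return [f"{feat}:{observation[feat]}"
            for feat, expected in EXPECTATIONS.items()
            if observation.get(feat) not in (expected, None)]

def explain(anomaly, purpose):
    """Case-based explanation: retrieve candidates by anomaly index,
    then evaluate each against the explainer's current purpose."""
    candidates = [e for e in CASE_LIBRARY if e.anomaly_type == anomaly]
    return [e for e in candidates if purpose in e.serves_purposes]
```

For example, an observation `{"engine": "stalled", "fuel_level": "nonempty"}` yields the anomaly `engine:stalled`, and asking for an explanation with the purpose `repair` selects only the candidate useful for that goal --- mirroring the book's point that context and purpose, not just content, determine which explanation is judged adequate.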