W. Averell Harriman School for Management and Policy
State University of New York at Stony Brook
Stony Brook, New York 11794-3775
The typical user of this system may well be a domain expert. However, while this user is well versed in the domain itself, and perhaps in his or her own problem-solving techniques, he or she is not an expert in the quantitative methods used by the problem-solving system. For example, a factory-floor scheduler might have demonstrated and respected expertise in his or her daily tasks; however, he or she is unlikely to be acquainted with the mathematical intricacies of scheduling software.
The research question is how to automate explanation when the recipient of the explanation does not share all of the concepts that support the reasoning of the problem-solving system. An important assumption of this line of work is that explanation does not have to involve imparting knowledge about reasoning mechanisms per se; it can also be ``translating'' from one problem-solving system to another. This assumption raises a number of questions.
The long-term project will investigate these questions, and embody the answers in a prototype system. The development of the prototype will build on a previous explanation facility, QEX, that was created to provide justifications for the decisions of a mathematical scheduling system. The project will investigate the formulation of domain-independent knowledge structures, and postulate a way to generate the domain-specific instantiations of these structures in a principled manner. In addition, an investigation will be conducted into the feasibility of a general approach to perform the two major tasks of this type of explanation: determining the facts that are the basis of the explanations, and ``translating'' these facts (which are derived from the representation of the quantitative problem-solving system) into concepts that are familiar to the non-technical user.
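The two tasks named above can be pictured concretely. The following is an illustrative sketch only (the concept names, fact format, and translation table are hypothetical assumptions, not drawn from QEX or any particular scheduling system): facts are first extracted in the solver's own vocabulary, then rendered in terms a non-technical user would recognize.

```python
# Task 1 (assumed fact format): facts stated in the solver's own terms,
# e.g., a binding constraint reported by a mathematical scheduling model.
solver_facts = [
    ("binding_constraint", "machine_capacity_M1"),
    ("reduced_cost_positive", "job_J7"),
]

# Task 2: a hypothetical translation layer mapping solver concepts to
# phrases a factory-floor scheduler would recognize.
TRANSLATIONS = {
    "binding_constraint": "there is no spare time left on {}",
    "reduced_cost_positive": "moving {} earlier would raise total cost",
}

DOMAIN_NAMES = {
    "machine_capacity_M1": "machine M1",
    "job_J7": "job J7",
}

def explain(facts):
    """Render each (concept, object) fact as a user-facing sentence."""
    return [TRANSLATIONS[c].format(DOMAIN_NAMES.get(o, o)) for c, o in facts]

for line in explain(solver_facts):
    print(line)
```

The point of the sketch is the separation of concerns: the fact-extraction step depends on the quantitative method, while the translation table is where domain-specific instantiation of general knowledge structures would occur.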
To identify possible practical applications of this type of explanation, information will be gathered about quantitative systems that are currently in use in business and manufacturing environments. Several such systems will be chosen for the development of explanation facilities in the context of the future grant proposal. It is important to find problem-solving systems that are different enough (in terms of methodology and domain) so that the generality of the explanation approach will be tested.
Slotnick, S. A. and J. D. Moore. ``Explaining Quantitative Systems to Uninitiated Users.'' Expert Systems with Applications, Vol. 8, No. 4, pp. 475-490. 1995.
The importance of explanation in decision-support systems has been recognized from the early days of expert system development. As pointed out by Swartout and Moore (1993), ``second-generation'' explanation facilities go beyond their predecessors in the areas of knowledge representation (explicitly representing knowledge needed for explanation) and explanation generation (more sophisticated use of discourse structures to use different perspectives, tailoring explanations to the user, answering follow-up questions). Using techniques from natural-language generation, Moore and Paris (1993) treat explanation generation as a problem-solving activity in its own right. They developed a planning mechanism to generate explanations, using plan operators to represent knowledge about communicative goals. Moore (1994) developed an explanation system that can interpret and respond to follow-up questions from the user, employing heuristics that enable the system to carry on a dialogue with the user.
In order to explain the ``reasoning'' of a quantitative problem-solving system, it is necessary to determine which numerical values are ``significant'' for explanation. Kosy and Wise (1984) developed a technique for explaining a business planning system based on a financial model. The main idea behind this system is based on looking at the difference between two variables (changes over time, differences between reality and expectations, etc.). Roth, Mattis and Mesnard (1991) use this method in an explanation facility for a project management system, combining textual and graphical presentations.
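The core idea of difference-based significance can be sketched in a few lines. This is a minimal illustration, not Kosy and Wise's actual implementation; the variable names and the 10% threshold are assumptions made for the example.

```python
def significant_differences(expected, actual, threshold=0.10):
    """Return variables whose relative change exceeds the threshold.

    `expected` and `actual` map variable names to numeric values; the
    10% threshold is an illustrative assumption, not from the literature.
    """
    findings = []
    for name, exp_val in expected.items():
        act_val = actual.get(name)
        if act_val is None or exp_val == 0:
            continue
        rel_change = (act_val - exp_val) / abs(exp_val)
        if abs(rel_change) >= threshold:
            findings.append((name, rel_change))
    # Largest deviations first: these are the facts worth explaining.
    return sorted(findings, key=lambda f: -abs(f[1]))

# Hypothetical plan-vs-actual figures for a small financial model.
plan = {"revenue": 1000, "labor_cost": 400, "materials": 300}
actual = {"revenue": 980, "labor_cost": 520, "materials": 305}

for name, change in significant_differences(plan, actual):
    print(f"{name} deviated from plan by {change:+.0%}")
```

In this toy run only labor cost crosses the threshold, so it becomes the anchor of the explanation; the small revenue and materials deviations are filtered out as insignificant.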
Moore, J. D. Participating in Explanatory Dialogues: Interpreting and Responding to Questions in Context. Cambridge, MA: MIT Press. 1994.
Moore, J. D. and C. L. Paris. ``Planning Text for Advisory Dialogues: Capturing Intentional and Rhetorical Information.'' Computational Linguistics, Vol. 19, No. 4, pp. 651-695. 1993.
Roth, S. F., J. Mattis and X. Mesnard. ``Graphics and Natural Language as Components of Automatic Explanation.'' In J. Sullivan and S. Tyler, editors, Intelligent User Interfaces, pp. 207-237. Reading, MA: Addison-Wesley. 1991.
Swartout, W. R. and J. D. Moore. ``Explanation in Second Generation Expert Systems.'' In J.-M. David, J.-P. Krivine and R. Simmons, editors, Second Generation Expert Systems. Berlin: Springer-Verlag, 1993.