EXPLANATION OF QUANTITATIVE SYSTEMS: A GENERAL APPROACH TO PROVIDING JUSTIFICATIONS TO NON-TECHNICAL USERS

Susan A. Slotnick

W. Averell Harriman School for Management and Policy
State University of New York at Stony Brook
Stony Brook, New York 11794-3775

CONTACT INFORMATION

Email: slotnick@pegasus.har.sunysb.edu
Voice: (516) 632-6599
Fax: (516) 632-9813

PROGRAM AREA

Usability and User-Centered Design.

KEYWORDS

Explanation, expert systems, interactive systems.

PROJECT SUMMARY

The project supported by this grant is the planning of a full-scale grant proposal to support the development of a general approach to explaining advice-giving systems that use mathematical or other quantitative problem-solving methods. This project is of practical as well as research interest. The practical interest arises from the existence of quantitative problem-solving systems for real-world applications that are intended for users who are not well versed in the underlying quantitative methods. A common concern about these systems is that an uninitiated user, who is familiar with the domain but not with the technical subtleties of the quantitative system, will be reluctant to use a decision-support aid that he or she does not understand. If the reasoning of the problem-solving system can be explained in terms with which the user is familiar, then the tool is more likely to be used. It is even plausible that more quantitative systems would reach real-world applications if they could be easily explained.

The typical user of such a system may well be a domain expert. While this user is well versed in the domain itself, and perhaps in his or her own problem-solving techniques, he or she is not an expert in the quantitative methods used by the problem-solving system. For example, a factory-floor scheduler might have demonstrated and respected expertise in his or her daily tasks, yet be unacquainted with the mathematical intricacies of scheduling software.

The research question is how to automate explanation when the recipient of the explanation does not share all of the concepts that support the reasoning of the problem-solving system. An important assumption of this line of work is that explanation does not have to involve imparting knowledge about reasoning mechanisms per se; it can also be ``translating'' from one problem-solving system to another. This assumption leads to a number of questions: Which facts, drawn from the internal representation of the quantitative system, should serve as the basis of an explanation? How can those facts be translated into concepts the user already holds? And how far can such techniques generalize across problem-solving methods and domains?

The long-term project will investigate these questions and embody the answers in a prototype system. The development of the prototype will build on a previous explanation facility, QEX, which was created to provide justifications for the decisions of a mathematical scheduling system. The project will investigate the formulation of domain-independent knowledge structures, and propose a principled way to generate domain-specific instantiations of those structures. In addition, it will investigate the feasibility of a general approach to the two major tasks of this type of explanation: determining the facts that are the basis of the explanations, and ``translating'' these facts (which are derived from the representation of the quantitative problem-solving system) into concepts that are familiar to the non-technical user.
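
As a purely hypothetical illustration of these two tasks, the following Python sketch uses a toy scheduling example: solver-level quantities are first selected, then restated in the scheduler's own terms. All names, weights and phrasings are invented, and the simple priority rule stands in for whatever mathematics a real system would use.

    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        weight: float      # internal penalty weight (a solver-level concept)
        proc_time: float   # processing time in hours

    def select_facts(a, b):
        # Task 1: recover the solver-level quantities that determined
        # why job a was sequenced before job b (here, a weighted
        # shortest-processing-time priority rule, purely for illustration).
        return {"priority_a": a.weight / a.proc_time,
                "priority_b": b.weight / b.proc_time}

    def translate(facts, a, b):
        # Task 2: restate those quantities in the scheduler's own terms,
        # without mentioning weights or priority ratios.
        if facts["priority_a"] > facts["priority_b"]:
            return ("%s runs first: delaying it is costlier, relative to "
                    "the machine time it needs, than delaying %s."
                    % (a.name, b.name))
        return "%s runs first for the symmetric reason." % b.name

    a = Job("Order-17", weight=4.0, proc_time=2.0)
    b = Job("Order-23", weight=3.0, proc_time=3.0)
    print(translate(select_facts(a, b), a, b))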

To identify possible practical applications of this type of explanation, information will be gathered about quantitative systems that are currently in use in business and manufacturing environments. Several such systems will be chosen for the development of explanation facilities in the context of the future grant proposal. It is important to find problem-solving systems that are sufficiently different (in methodology and domain) to genuinely test the generality of the explanation approach.

PROJECT REFERENCES

Slotnick, S. A. Heuristic Scheduling and Explanation of Quantitative Systems. Ph.D. Thesis, Carnegie Mellon University. 1994.

Slotnick, S. A. and J. D. Moore. ``Explaining Quantitative Systems to Uninitiated Users.'' Expert Systems with Applications, Vol. 8, No. 4, pp. 475-490. 1995.

AREA BACKGROUND

The current project draws from two main sources. The interactive explanation generation is based on work by Moore (1994) and Moore and Paris (1993). The approach to explaining quantitative models is based on that of Kosy and Wise (1984) and Roth et al. (1991). This project applies the theory and techniques of these two approaches to explanation, and adds the capability of mediating between different problem-solving approaches in order to explain the reasoning of the advice-giving system to a user who is not familiar with the mechanics of that system.

The importance of explanation in decision-support systems has been recognized from the early days of expert system development. As pointed out by Swartout and Moore (1993), ``second-generation'' explanation facilities go beyond their predecessors in knowledge representation (explicitly representing the knowledge needed for explanation) and explanation generation (more sophisticated use of discourse structures, presenting different perspectives, tailoring explanations to the user, and answering follow-up questions). Using techniques from natural-language generation, Moore and Paris (1993) treat explanation generation as a problem-solving activity in its own right. They developed a planning mechanism to generate explanations, using plan operators to represent knowledge about communicative goals. Moore (1994) developed an explanation system that can interpret and respond to follow-up questions from the user, employing heuristics that enable the system to carry on a dialogue with the user.
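
The following sketch suggests the flavor of such a plan operator; the notation and field names are invented for illustration and do not reproduce Moore and Paris's actual formalism.

    from dataclasses import dataclass, field

    @dataclass
    class PlanOperator:
        effect: str                 # communicative goal the operator achieves
        constraints: list = field(default_factory=list)  # must hold in the KB
        nucleus: str = ""           # the main communicative act
        satellites: list = field(default_factory=list)   # supporting acts

    # A hypothetical operator for persuading the user that a recommendation
    # is sound: state the recommendation, then motivate it with a fact
    # that supports it.
    persuade = PlanOperator(
        effect="(BELIEVES user (GOOD-CHOICE ?recommendation))",
        constraints=["(SUPPORTS ?fact ?recommendation)"],
        nucleus="(INFORM system user (RECOMMEND ?recommendation))",
        satellites=["(MOTIVATE system user ?fact)"],
    )
    print(persuade.effect)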

In order to explain the ``reasoning'' of a quantitative problem-solving system, it is necessary to determine which numerical values are ``significant'' for explanation. Kosy and Wise (1984) developed a technique for explaining a business planning system based on a financial model. The main idea is to examine the differences between pairs of values (changes over time, discrepancies between actual and expected results, etc.) and to single out those large enough to matter. Roth, Mattis and Mesnard (1991) use this method in an explanation facility for a project management system, combining textual and graphical presentations.
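
A minimal sketch of this difference-based idea follows; the function, variable names and threshold are illustrative, not taken from Kosy and Wise's implementation.

    def significant_differences(expected, actual, threshold=0.10):
        # Flag every variable whose actual value deviates from the
        # expected value by more than the given relative threshold.
        findings = []
        for name, exp in expected.items():
            act = actual[name]
            if exp == 0:
                continue  # zero baselines would need separate treatment
            change = (act - exp) / abs(exp)
            if abs(change) > threshold:
                findings.append((name, exp, act, change))
        return findings

    # Example: explain why profit fell short of plan. Only the profit
    # shortfall exceeds the threshold, so only it is deemed significant.
    expected = {"revenue": 1000.0, "costs": 700.0, "profit": 300.0}
    actual = {"revenue": 950.0, "costs": 760.0, "profit": 190.0}
    for name, exp, act, change in significant_differences(expected, actual):
        print("%s: expected %.0f, actual %.0f (%+.0f%%)"
              % (name, exp, act, change * 100))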

AREA REFERENCES

Kosy, D. W. and B. P. Wise. ``Self-explanatory Financial Planning Models.'' In Proceedings of the Fourth National Conference on Artificial Intelligence, pp. 176-181. Menlo Park, CA: AAAI. 1984.

Moore, J. D. Participating in Explanatory Dialogues: Interpreting and Responding to Questions in Context. Cambridge, MA: MIT Press. 1994.

Moore, J. D. and C. L. Paris. ``Planning Text for Advisory Dialogues: Capturing Intentional and Rhetorical Information.'' Computational Linguistics, Vol. 19, No. 4, pp. 651-695. 1993.

Roth, S. F., J. Mattis and X. Mesnard. ``Graphics and Natural Language as Components of Automatic Explanation.'' In J. Sullivan and S. Tyler, editors, Intelligent User Interfaces, pp. 207-237. Reading, MA: Addison-Wesley. 1991.

Swartout, W. R. and J. D. Moore. ``Explanation in Second Generation Expert Systems.'' In J.-M. David, J.-P. Krivine and R. Simmons, editors, Second Generation Expert Systems. Berlin: Springer-Verlag. 1993.

RELATED PROGRAM AREAS

Adaptive Human Interfaces. Intelligent Interactive Systems for Persons with Disabilities.

POTENTIAL RELATED PROJECTS

Development of a sophisticated natural-language interface to complement the problem-solving aspects of this approach to quantitative explanation.