Department of Computer Science
University of Minnesota
200 Union Street SE -- Room 4-192
Minneapolis, MN 55455
The primary component of the project is the "embeddable application" architecture. Embeddable applications are multimedia widgets with both programmatic controls and user interfaces for those controls. For example, a video embeddable application has user interfaces to control playing, pausing, reverse and fast-forward play, and random access. It also has programmatic controls that allow other parts of the application to control the video display and that allow the embeddable application to be synchronized with other embeddable applications (through links to a common clock). The user interface for the controls is adjustable: the user can cycle through different interfaces when appropriate (a technique adapted from the SUIT system), and the application designer can specify that redundant interfaces be hidden (e.g., two tightly synchronized video streams can share a single control set).
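The combination described above, programmatic controls, a shared clock for synchronization, and hideable redundant interfaces, can be sketched roughly as follows. This is a minimal illustration, not the project's implementation; the class and attribute names (Clock, VideoApp, show_controls) are hypothetical.

```python
import time

class Clock:
    """A common clock that synchronized embeddable applications link to."""
    def __init__(self):
        self._origin = time.monotonic()

    def now(self):
        return time.monotonic() - self._origin

class VideoApp:
    """Hypothetical embeddable video application: programmatic controls,
    plus a flag the designer can clear to hide a redundant control set."""
    def __init__(self, clock, show_controls=True):
        self.clock = clock
        self.show_controls = show_controls
        self.playing = False
        self.rate = 1.0        # 1.0 = normal, -1.0 = reverse, 2.0 = fast-forward
        self.position = 0.0    # seconds into the stream

    # --- programmatic controls, callable by other parts of the application ---
    def play(self):
        self.playing = True

    def pause(self):
        self.playing = False

    def seek(self, seconds):
        """Random access to a point in the stream."""
        self.position = seconds

# Two tightly synchronized streams share one clock and one visible control set.
clock = Clock()
a = VideoApp(clock)
b = VideoApp(clock, show_controls=False)
for app in (a, b):
    app.play()
    app.seek(30.0)
```

Driving both players through the same programmatic calls keeps the pair in step while only one of them presents controls to the user.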
The goal of this part of the project is to design a model for multimedia interface components that can be assembled into a complete interface at media discovery time. An application browsing the world wide web, for example, should be able to negotiate with a media server to select a set of streams and their synchronizations. Then, the application can assemble embeddable applications into a semi-custom displayer for that set of streams. Other key issues include the various composition methods for embeddable applications and the possibility of storing embeddable applications in a dynamically loadable form at well-known servers, supporting greater browsing flexibility.
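The assembly step described above can be sketched as a lookup from stream descriptions to embeddable-application factories. Everything here is hypothetical (the FACTORIES registry, the stream-description format, the assemble_displayer function); it is meant only to show the shape of media-discovery-time composition, including the case where no local player exists and a dynamically loadable one might be fetched instead.

```python
# Hypothetical registry mapping media types to embeddable-application factories.
# Real factories would construct widgets; strings stand in for them here.
FACTORIES = {
    "video": lambda desc: f"VideoApp({desc['url']})",
    "audio": lambda desc: f"AudioApp({desc['url']})",
    "text":  lambda desc: f"CaptionApp({desc['url']})",
}

def assemble_displayer(stream_descriptions):
    """Assemble a semi-custom displayer from stream descriptions obtained
    by (hypothetical) negotiation with a media server."""
    players, unsupported = [], []
    for desc in stream_descriptions:
        factory = FACTORIES.get(desc["type"])
        if factory is None:
            # A dynamically loadable player could be fetched from a
            # well-known server here; this sketch just records the gap.
            unsupported.append(desc["type"])
        else:
            players.append(factory(desc))
    return players, unsupported

streams = [
    {"type": "video", "url": "rtp://server/movie"},
    {"type": "audio", "url": "rtp://server/track-en"},
    {"type": "vrml",  "url": "rtp://server/scene"},
]
players, unsupported = assemble_displayer(streams)
```

The point of the registry design is that the displayer is composed per negotiation result rather than built once per application.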
Finally, the project is also investigating event models for toolkits that support distributed multimedia applications to determine which models are most effective for providing high performance and support for collaboration.
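One family of event models under consideration can be illustrated with a minimal publish/subscribe dispatcher: interface components subscribe to named events, and dispatch fans each event out to every subscriber, which is one way replicated interfaces in a collaborative session can be kept consistent. This sketch is an assumption for illustration, not the toolkit's actual event model; EventBus and its methods are hypothetical names.

```python
class EventBus:
    """Minimal publish/subscribe event dispatcher."""
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, event, handler):
        self._subscribers.setdefault(event, []).append(handler)

    def dispatch(self, event, **data):
        # Fan the event out to every subscriber, local or remote replica.
        for handler in self._subscribers.get(event, []):
            handler(**data)

bus = EventBus()
log = []
bus.subscribe("seek", lambda position: log.append(("local", position)))
bus.subscribe("seek", lambda position: log.append(("remote", position)))
bus.dispatch("seek", position=30.0)
```

How such dispatch performs under many subscribers and network latency is exactly the kind of question the event-model comparison addresses.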
Konstan, J.A. "An Agent-Event Architecture for Graphical User Interface Toolkits," Ph.D. Dissertation, Computer Science Division, University of California, Berkeley, 1993.
Rowe, L.A., Konstan, J.A., et al., "The Picasso Application Framework," Proceedings of UIST '91, Hilton Head, SC, 1991.
In my opinion, the future lies in establishing media connections (rather than download-and-play interfaces), and success will depend heavily on how well multiple streams of information can be integrated. As a simple example, users should be able to request audio tracks for some image sets in different languages, or text captions, and still view a coherent presentation.
Substantial excitement exists over the idea that we may be able to define media rich enough to provide highly interactive and adaptive interfaces. These interfaces may even contact other sites on the network to request implementations of media players that are not locally available. Both the HotJava work and some of my own work on commands as media are exploring this frontier.
Rowe, L.A., and Smith, B.C., "A continuous media player," Proc. 3rd Int. Workshop on Network and Operating System Support for Digital Audio and Video, San Diego, CA, November 1992.
Blakowski, G., et al., "Tool support for the synchronization and presentation of distributed multimedia," Computer Communications, 15(10) (December 1992), 611-618.
Schnepf, J., et al., "Doing FLIPS: FLexible Interactive Presentation Synchronization," IEEE Journal on Selected Areas in Communications, to appear.
Sun Microsystems, Inc. "The HotJava Browser: A White Paper." http://java.sun.com/1.0alpha3/doc/overview/hotjava/index.html.
Virtual Environments
This work is being extended into a wider range of media, including immersive and non-immersive virtual environments.