Change in Human-Computer Interfaces on the Space Station: Why it Needs to Happen and How to Plan for It
Pages 151-175



From page 151...
... the modalities according to task, user, and the space station environment. An appropriate matching of interface modalities, task, and user is essential to maximizing the potential of on-board computer systems in their primary goal of supporting and amplifying human abilities.
From page 152...
... APPROPRIATE INTERFACE MODALITIES The need for change in human-computer interfaces on the space station, and the consequent need for planning, arises out of the rapid development that has occurred and continues to occur in interface modalities (typing, graphics, pointing, speech, etc.) and the interaction techniques used with them.
From page 153...
... Analogue/continuous interactions require different kinds of interaction modalities and techniques from those used in more traditional computer command languages. Varied groups of users: Although the most mission-critical systems will continue to be operated by highly trained personnel, the sheer number of systems likely to be available in the space station suggests that this will not be true for all systems.
From page 154...
... An obvious example is extra-vehicular activity, but more frequent examples might arise when it was important to avoid the induced motion problems mentioned above (in the weightlessness bullet) or when it was useful to have an additional I/O channel in the context of a complex hands-on analogue activity such as remote manipulation.
From page 155...
... Interaction will have more of the flavor of direct manipulation if the user can perform an operation by moving an icon, for instance, as in the file deletion example above, than by selecting the name of the operation from a list in a menu. To the extent that they can be maintained, the metaphors implicit in direct manipulation interfaces make the interfaces more easily learnable, and reduce the need for help systems.
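To make the contrast concrete, here is a minimal sketch in Python (names and details are invented for illustration, not taken from the chapter) of the same file-deletion operation exposed both as a direct-manipulation gesture and as a named menu command:

```python
# Minimal sketch (hypothetical names): the same "delete file" operation
# exposed as a direct-manipulation gesture and as a menu command.

class Desktop:
    def __init__(self, files):
        self.files = set(files)

    # Direct manipulation: the user drags a file icon onto the trash icon.
    # The object acted on is the one being moved, so no name is typed.
    def drop_on_trash(self, file_icon):
        self.files.discard(file_icon)
        print(f"deleted {file_icon} (dragged to trash)")

    # Menu/command style: the operation is chosen by name from a list,
    # and the argument is supplied separately.
    def run_menu_command(self, command, argument):
        if command == "Delete":
            self.files.discard(argument)
            print(f"deleted {argument} (menu command)")
        else:
            raise ValueError(f"unknown command: {command}")


desk = Desktop(["report.txt", "telemetry.log"])
desk.drop_on_trash("report.txt")                   # direct manipulation
desk.run_menu_command("Delete", "telemetry.log")   # command selection
```

In the direct-manipulation case the metaphor (dragging something to the trash) carries much of the explanation, which is the learnability advantage the chapter describes.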
From page 156...
... Graphically-oriented or direct manipulation interfaces are in many ways superior to character-oriented interfaces for the space station.
From page 157...
... However, this is very time-consuming and expensive at the time the interface is developed, since it involves detailed observations of many users interacting with the system and repeated extensions of the natural language coverage until all the commonly occurring syntax, semantics, and pragmatics are handled. Perhaps the most important reason for not using natural language interaction is that most interaction can be handled more easily by direct manipulation or other graphically-oriented means.
From page 158...
... There are, however, some circumstances in which natural language or command language interaction is preferable to graphical interaction, including:
- when there is a large range of options to choose between, especially when the options can be composed in a combinatorially explosive way (see the sketch following this list);
- when there is no convenient way to distribute the information in a two-dimensional space;
- when a suitable spatial distribution exists, but the resulting space of information is so large that only a small fraction of it can be presented to the user at any one time;
- when the user is looking for information that is distributed across several spatially-distinct items, so that retrieval of the information by direct manipulation would require iterative examination of each of the relevant interface components.
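The first case above, combinatorial composition of options, can be illustrated with a small sketch (the command syntax and data are invented): enumerating every combination of filters as separate menu entries would be impractical, while a typed command composes them directly.

```python
# Minimal sketch (hypothetical command syntax): a typed query whose
# options compose combinatorially, e.g.
#   "list logs subsystem=thermal severity>=2 since=2024-01-01"
# A menu would need an entry per combination; a small parser handles
# the composition directly.

import shlex

LOGS = [
    {"subsystem": "thermal", "severity": 3, "date": "2024-02-01"},
    {"subsystem": "power",   "severity": 1, "date": "2024-01-15"},
    {"subsystem": "thermal", "severity": 1, "date": "2023-12-30"},
]

def run_query(command: str):
    tokens = shlex.split(command)
    assert tokens[:2] == ["list", "logs"], "only 'list logs' is sketched here"
    results = LOGS
    for option in tokens[2:]:
        if option.startswith("subsystem="):
            want = option.split("=", 1)[1]
            results = [r for r in results if r["subsystem"] == want]
        elif option.startswith("severity>="):
            level = int(option.split(">=", 1)[1])
            results = [r for r in results if r["severity"] >= level]
        elif option.startswith("since="):
            date = option.split("=", 1)[1]
            results = [r for r in results if r["date"] >= date]
        else:
            raise ValueError(f"unrecognized option: {option}")
    return results

print(run_query("list logs subsystem=thermal severity>=2"))
```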
From page 159...
... The specific disadvantage of the space station environment is the relatively high level of ambient noise that can be expected inside it, at least if the experience of the Shuttle is a guide. Ambient noise is problematic for speech recognition.
From page 160...
... Many of the same research issues arise in integrating speech with other modalities as were described in the section titled "Natural Language Interaction Via Keyboard" for the integration of typed natural language and graphical interaction. These issues include resolution of deictic phrases ("this one", "that")
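One plausible way to resolve such deictic phrases, sketched below with invented names (this is an illustration, not the chapter's design), is to combine the spoken or typed reference with the object most recently indicated by a pointing gesture on the display:

```python
# Minimal sketch of deictic resolution (hypothetical names): "delete that"
# is resolved against the object most recently pointed at on the display.

import time

class DeicticContext:
    def __init__(self):
        self.last_pointed = None   # object id most recently pointed at
        self.pointed_time = None

    def record_pointing(self, object_id):
        self.last_pointed = object_id
        self.pointed_time = time.monotonic()

    def resolve(self, phrase, max_age_seconds=5.0):
        """Map 'this one' / 'that' to a concrete object if a recent pointing
        gesture exists; otherwise return None so the dialogue component can
        ask a clarifying question."""
        if phrase.lower() in {"this", "this one", "that", "that one"}:
            fresh = (self.last_pointed is not None and
                     time.monotonic() - self.pointed_time <= max_age_seconds)
            return self.last_pointed if fresh else None
        return phrase  # an explicit name needs no resolution


ctx = DeicticContext()
ctx.record_pointing("valve_icon_7")   # user points at an icon
print(ctx.resolve("that one"))        # -> "valve_icon_7"
print(ctx.resolve("pump_icon_2"))     # explicit reference passes through
```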
From page 161...
... Work is needed here to find better ways of developing a reasonably habitable subset of a natural language for a restricted domain, to develop ways for the system to encourage the user to stay within the bounds of the restricted language through appropriate output of its own, to devise methods for partial understanding when a user strays outside the bounds of the restricted language, and to develop interaction methods for steering the user back on track when he does stray, as he inevitably will. Novel I/O Modalities The interaction modalities discussed so far are conventional in the sense that they have already been widely used (this is not true of speech)
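A very reduced sketch of the "steering" idea follows; the vocabulary and messages are invented for illustration. The system detects when an utterance falls outside the restricted language, acknowledges what it did understand, and answers with prompts that exhibit phrasing it covers:

```python
# Minimal sketch (invented vocabulary): keep the user inside a restricted
# command subset by detecting out-of-coverage wording and responding with
# examples of phrasing the system does understand.

COVERED_VERBS = {"show", "open", "close", "list"}
COVERED_OBJECTS = {"valve", "log", "schedule"}

def respond(utterance: str) -> str:
    words = utterance.lower().split()
    verb = next((w for w in words if w in COVERED_VERBS), None)
    obj = next((w for w in words if w in COVERED_OBJECTS), None)
    if verb and obj:
        return f"OK: {verb} {obj}"
    # Partial understanding: acknowledge what was recognized and steer the
    # user back toward the restricted language with concrete examples.
    hint = f"I understood '{verb or obj}'. " if (verb or obj) else ""
    return (hint + "Try commands like 'show schedule', 'open valve', "
                   "or 'list logs'.")

print(respond("open the valve"))                  # fully inside the subset
print(respond("could you pull up the schedule"))  # steered back on track
```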
From page 162...
... Other kinds of novel output modalities on which further research could bring useful results include force or tactile feedback on joystick-type direct manipulation or analogue tasks and acceptably unobtrusive speech output. Force and tactile feedback has been used regularly in flying and remote manipulation tasks, but has been little explored for use in human-computer interaction for more abstract tasks, such as manipulating a set of computer files.
From page 163...
... Conversational systems also do not fit well with the direct manipulation or command language styles of interface, but fit much better with natural language or speech interfaces which naturally lend themselves to a dialogue style. Interfaces to intelligent, autonomous application systems can also make good use of a conversational style of interaction.
From page 164...
... The key to the development of conversational/agent interfaces lies in the development of detailed models of the task and the user. To produce intelligent agent behavior, it is necessary to use Artificial Intelligence techniques to model what tasks the user can accomplish through the interface, how he can achieve his goals, and what his current goals and state of knowledge are.
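One simple way to represent such a task and user model is as goals decomposed into step sequences, with the agent inferring which goal is in progress from the actions observed so far. The following toy sketch uses hypothetical tasks and names and is only illustrative of the modeling idea:

```python
# Illustrative sketch (hypothetical tasks): a toy task model from which an
# agent infers which goal the user is pursuing and what step comes next.

TASK_MODEL = {
    "replace_filter": ["shut_valve", "vent_line", "swap_filter", "open_valve"],
    "run_diagnostic": ["shut_valve", "attach_sensor", "start_test"],
}

def infer_goals(observed_actions):
    """Return the goals whose step sequence matches the actions so far."""
    return [goal for goal, steps in TASK_MODEL.items()
            if steps[:len(observed_actions)] == observed_actions]

def suggest_next(observed_actions):
    goals = infer_goals(observed_actions)
    if len(goals) == 1:
        remaining = TASK_MODEL[goals[0]][len(observed_actions):]
        return f"Looks like '{goals[0]}'; next step: {remaining[0]}"
    return f"Consistent goals: {goals or 'none'} (need more evidence)"

print(suggest_next(["shut_valve"]))               # still ambiguous
print(suggest_next(["shut_valve", "vent_line"]))  # infers replace_filter
```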
From page 165...
... Adding a conversational/agent component to a tool/machine-like direct manipulation interface for performing such tasks allows the basic task to be performed in the most efficient manner, but also allows components of that task that could benefit from a conversational approach to do so. Examples of conversational interaction in such a situation include:
- the user requesting information that would require multiple actions to retrieve through the direct manipulation interface (the first of these is sketched below);
- the user asking questions about how to use the direct manipulation interface component;
- the system volunteering information about more efficient ways to use the direct manipulation component;
- the user requesting the system to achieve a higher-level goal that would require extensive interaction with the direct manipulation component.
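A rough sketch of the first example in the list above, with hypothetical names and data: the agent answers a single higher-level request by driving the same operations the direct-manipulation component exposes, operations the user could also perform by hand one at a time.

```python
# Rough sketch (hypothetical names): a conversational agent layered on a
# direct-manipulation file interface. The user could open each folder icon
# by hand; the agent performs the same operations on their behalf to
# answer one higher-level request.

class FileBrowser:
    """Stands in for the direct-manipulation component: one folder opened,
    one listing visible, at a time."""
    def __init__(self, folders):
        self.folders = folders          # folder name -> list of (file, size_kb)

    def open_folder(self, name):        # the same operation a click performs
        return self.folders[name]

class Agent:
    def __init__(self, browser):
        self.browser = browser

    def handle(self, request):
        # "which folder uses the most space?" needs every folder opened in
        # turn -- tedious by direct manipulation, one sentence to an agent.
        if "most space" in request:
            usage = {name: sum(size for _, size in self.browser.open_folder(name))
                     for name in self.browser.folders}
            biggest = max(usage, key=usage.get)
            return f"'{biggest}' uses the most space ({usage[biggest]} KB)"
        return "I don't know how to help with that yet."


browser = FileBrowser({
    "telemetry": [("t1.dat", 900), ("t2.dat", 450)],
    "reports":   [("r1.txt", 120)],
})
print(Agent(browser).handle("which folder uses the most space?"))
```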
From page 166...
... (Tanner and Buxton, 1983; Hayes and Szekely, 1983; Hayes et al., 1985; Wasserman and Shewmake, 1984; Jacob, 1984; Yunten and Hartson, 1984). The systems developed to achieve this kind of separation are known as user interface management systems (UIMSs)
From page 167...
... FIGURE 1 Model of communication in a UIMS application. (Figure not reproduced; recoverable labels: user; user interface management system with an interface specification database; application with an application specification database; requests from the UIMS to the application for a check on the semantic validity of a proposed parameter for an application operation, and the application's replies to such requests.)
From page 168...
... It would govern, for instance, whether commands were selected from menus, from an array of icons, through a command language line, etc., or whether a particular parameter to a specific command would be selected from a menu, from a row of "radio buttons", or typed into a field on a form, etc. The UIMS provides a basic set of facilities to perform these various kinds of interaction, and the interface developer chooses the desired kind of interaction out of this cookbook by an appropriate interface specification.
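A compressed sketch of this division of labor follows; the specification format and names are invented, not the chapter's. A declarative specification names the presentation technique for each parameter, a small UIMS-like layer interprets it, and semantic checks on parameter values are referred back to the application, as in Figure 1.

```python
# Compressed sketch (invented spec format): a declarative interface
# specification chooses the presentation; a tiny UIMS layer interprets it
# and refers semantic validity questions back to the application.

# --- interface specification database (declarative; editable without
# --- touching application code) ----------------------------------------
INTERFACE_SPEC = {
    "set_fan_speed": {
        "prompt_style": "form_field",   # could be "menu" or "radio_buttons"
        "parameter": "speed",
    },
}

# --- application (owns the semantics) -----------------------------------
class Application:
    def check_parameter(self, operation, name, value):
        """Reply to the UIMS's request about semantic validity."""
        if operation == "set_fan_speed" and name == "speed":
            return 0 <= value <= 10
        return False

    def perform(self, operation, **params):
        return f"performed {operation} with {params}"

# --- UIMS (owns the presentation) ----------------------------------------
class UIMS:
    def __init__(self, spec, application):
        self.spec = spec
        self.app = application

    def invoke(self, operation, value):
        entry = self.spec[operation]
        # Presentation decisions come from the spec; semantic decisions
        # are requested from the application.
        if not self.app.check_parameter(operation, entry["parameter"], value):
            return f"rejected: {value} is not a valid {entry['parameter']}"
        return self.app.perform(operation, **{entry["parameter"]: value})


uims = UIMS(INTERFACE_SPEC, Application())
print(uims.invoke("set_fan_speed", 7))    # valid -> performed
print(uims.invoke("set_fan_speed", 42))   # rejected by the application's check
```

Changing "form_field" to "menu" in the specification would change the presentation without touching the application, which is the separation the chapter is describing.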
From page 169...
... The UIMS model can speed up the modification part of the loop, since interface modification can be done through modification of the declarative interface specification rather than reprogramming in a conventional sense.
From page 170...
... The way around this difficulty may be to include a procedural component in the interface specification formalism, but organize it at as high a level of abstraction as possible from the interface point of view. The procedural component could then be seen as a highly specialized programming language for interface specification.
From page 171...
... The rapid prototyping facilities most useful from this point of view allow interfaces to be seen and interacted with as they are developed, rather than forcing the interface developer to create the interface through working in a programming language or other formalism distinct from the interface itself. Examples of this approach include (Gould and Finzer, 1984; Myers and Buxton, 1986).
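In the same spirit, a toy illustration (not any particular tool) of why an interpreted, declarative specification supports this kind of prototyping: the specification can be edited and the interface regenerated while the developer watches, with no separate programming step in between.

```python
# Toy illustration (not any particular tool): because the interface is
# generated by interpreting a declarative spec, editing the spec and
# regenerating gives immediate feedback, with no reprogramming step.

def render_menu(spec):
    return "\n".join(f"[{i + 1}] {label}" for i, label in enumerate(spec["menu"]))

spec = {"menu": ["Show schedule", "Open valve"]}
print(render_menu(spec))

# The developer edits the spec "live" and immediately sees the new interface.
spec["menu"].append("List logs")
print(render_menu(spec))
```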
From page 172...
... We conclude by listing several broad areas in which we particularly recommend the support of further work:
- investigation of speech recognition techniques and natural language processing techniques for use with spoken input, plus the integration of both of these modalities with direct manipulation interfaces;
- exploration of innovative I/O devices suitable for the space station environment;
- work on the user and task modeling needed to support conversational interfaces and the integration of such interfaces with machine-like direct manipulation interfaces;
- continued development of the UIMS concept, coupled with highly interactive interface development environments for all interface modalities.
From page 173...
... J. 1986. Steps towards Integrating Natural Language and Graphical Interaction for Knowledge-based Systems.
From page 174...
... A 1986 Direct manipulation interfaces.
From page 175...
... 1983. Some Issues in Future User Interface Management System (UIMS) Development.

